WO2021054659A2 - Device and method for surgical navigation - Google Patents


Info

Publication number
WO2021054659A2
WO2021054659A2 (application PCT/KR2020/011896)
Authority
WO
WIPO (PCT)
Prior art keywords
marker
surgical
robot
posture
correlation
Application number
PCT/KR2020/011896
Other languages
French (fr)
Korean (ko)
Other versions
WO2021054659A3 (en)
Inventor
김봉오
임흥순
Original Assignee
큐렉소 주식회사 (Curexo, Inc.)
Application filed by 큐렉소 주식회사 (Curexo, Inc.)
Publication of WO2021054659A2
Publication of WO2021054659A3

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/00234: Surgical instruments, devices or methods for minimally invasive surgery
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61F: FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00: Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/02: Prostheses implantable into the body
    • A61F 2/30: Joints
    • A61F 2/46: Special tools or methods for implanting or extracting artificial joints, accessories, bone grafts or substitutes, or particular adaptations therefor
    • A61F 2/4603: Special tools or methods for insertion or extraction of endoprosthetic joints or of accessories thereof
    • A61F 2/461: Special tools or methods for insertion or extraction of endoprosthetic joints of knees
    • A61B 2017/00017: Electrical control of surgical instruments
    • A61B 2017/00115: Electrical control of surgical instruments with audible or visual output
    • A61B 2017/00119: Electrical control with audible or visual output alarm; indicating an abnormal situation
    • A61B 2017/00128: Electrical control with audible or visual output related to intensity or progress of surgical action
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2034/2068: Tracking using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2034/2072: Reference field transducer attached to an instrument or patient
    • A61B 2090/3904: Markers specially adapted for marking specified tissue
    • A61B 2090/3916: Markers for bone tissue
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery
    • A61F 2002/4632: Implanting or extracting artificial joints using computer-controlled surgery, e.g. robotic surgery
    • A61F 2002/4635: Implanting or extracting artificial joints using minimally invasive surgery

Definitions

  • The present invention relates to a surgical navigation device and method, and more particularly, to a surgical navigation device and method for detecting whether a marker attached to a surgical object has been deformed.
  • In robotic surgery, the surgical path is determined in advance, an optical marker is mounted on the surgical object, and the position and posture of the optical marker are tracked with an optical sensor; the operation is performed while monitoring the positions of the surgical object and the robot.
  • In practice, a marker mounted on a surgical object is often deformed during surgery by external force or the like.
  • When the marker is deformed, errors occur in estimating the position and posture of the surgical object.
  • Conventionally, an additional marker or instrument had to be attached to the surgical object to determine whether the marker was deformed.
  • The present invention has been proposed to solve the above problems, and aims to provide a surgical navigation device and method capable of easily detecting, during surgery, whether a marker attached to a surgical object has been deformed.
  • The present invention also aims to provide a surgical navigation device and method that allow easy recovery when the marker is deformed.
  • A surgical navigation device for achieving the above object includes: an object matching unit that matches a first image, which includes an object marker attached to a surgical object, with a second image of the surgical object taken before surgery, and derives a correlation for the position and posture between the object marker attached to the surgical object and the surgical object;
  • a reference position storage unit that sets and stores a reference position for at least one reference point of the surgical object;
  • a position calculating unit that receives the position and posture information of the object marker from a tracker and calculates the position of the reference point of the surgical object based on the derived correlation;
  • and a marker deformation determination unit that determines that the object marker is deformed when the position of the reference point calculated by the position calculating unit deviates from the reference position.
  • A surgical navigation device according to another aspect includes: an object matching unit that matches a first image, which includes an object marker attached to a surgical object, with a second image of the surgical object taken before surgery, and derives a correlation for the position and posture between the object marker attached to the surgical object and the surgical object;
  • a reference position storage unit that, taking at least one reference point of the surgical object as a reference position, calculates and stores a reference position relationship for the positions the object marker can assume;
  • and a marker deformation determination unit that receives the position and posture information of the object marker from a tracker and determines that the object marker is deformed when the position and posture of the object marker deviate from the reference position relationship.
  • In addition, a correlation for the position and posture between the robot marker and the surgical object, and a correlation for the position and posture between the robot marker and the reference point, may be derived.
  • A surgical navigation method for achieving the above object includes: matching a first image, which includes an object marker attached to a surgical object, with a second image of the surgical object taken before surgery, and deriving a correlation for the position and posture between the object marker attached to the surgical object and the surgical object; setting and storing a reference position for at least one reference point of the surgical object; receiving the position and posture information of the object marker from a tracker and calculating the position of the reference point of the surgical object based on the derived correlation; and determining that the object marker is deformed when the calculated position of the reference point deviates from the reference position.
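  • The steps above can be sketched in Python with homogeneous transforms. This is a minimal illustration of the idea, not the patented implementation: the function names, the 4x4 matrix representation, and the 1 mm tolerance are all assumptions introduced here.

```python
import numpy as np

# Assumed threshold; the patent does not specify a numeric tolerance.
TOLERANCE_MM = 1.0

def check_marker_deformation(marker_pose_world, T_marker_to_ref, stored_ref_pos):
    """Return True if the object marker appears deformed.

    marker_pose_world : 4x4 pose of the object marker in the tracker frame
    T_marker_to_ref   : 4x4 transform from the marker frame to the reference
                        point (the correlation derived during registration)
    stored_ref_pos    : length-3 reference-point position stored after fixation
    """
    # Chain the tracked marker pose with the registered correlation to get
    # the current reference-point position in the tracker frame.
    ref_pose_world = marker_pose_world @ T_marker_to_ref
    current_ref_pos = ref_pose_world[:3, 3]
    # The joint center is physically fixed, so any deviation beyond the
    # tolerance means the marker itself has moved relative to the bone.
    deviation = np.linalg.norm(current_ref_pos - np.asarray(stored_ref_pos))
    return bool(deviation > TOLERANCE_MM)
```

Because the reference point is fixed, this check needs no extra marker or instrument: the stored reference position alone serves as the ground truth.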
  • According to the present invention, it is possible to easily detect in real time whether a marker is deformed, by setting a reference position for a reference point on the surgical object and tracking the position of that reference point during surgery.
  • FIG. 1 schematically shows a surgical navigation system according to an embodiment of the present invention.
  • FIG. 2 is a control block diagram of a surgical navigation device according to an embodiment of the present invention.
  • FIG. 3 is a view for explaining the operation of the control unit of the surgical navigation device according to an embodiment of the present invention.
  • FIG. 4 is a view for explaining the operation of the marker deformation determination unit according to an embodiment of the present invention.
  • FIG. 5 is a block diagram of a control unit of a surgical navigation device according to an embodiment of the present invention.
  • FIG. 6 is a view for explaining the operation of the robot matching unit according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a method of determining whether an object marker is deformed by the surgical navigation device according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a method of determining whether an object marker is deformed by a surgical navigation device according to another embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method of determining whether an object marker is deformed, and a recovery method, by a surgical navigation device according to another embodiment of the present invention.
  • The surgical navigation system according to an embodiment of the present invention includes object markers 10 and 20 attached to surgical objects 1 and 2, a surgical robot 30, a tracker 40, and a surgical navigation device 100.
  • The surgical objects 1 and 2 refer to the objects of surgery. In the embodiment of the present invention, the surgical object is the knee joint of the femur 1 (Femur) and the tibia 2 (Tibia), and an example will be described in which the object markers 10 and 20 are attached to the femur 1 and the tibia 2, respectively.
  • the surgical robot 30 is for joint surgery and includes a robot base and an arm, and a cutting tool may be positioned on an end effector of the arm.
  • a robot marker 31 is attached to the base of the surgical robot 30.
  • Optical markers may be used for the object markers 10 and 20 and the robot marker 31; three or four bars extending in different directions from a center point are formed, and a highly reflective ball marker may be formed at the end of each bar.
  • The tracker 40 tracks the positions and postures of the robot marker 31 attached to the surgical robot 30 and of the object markers 10 and 20 attached to the surgical objects 1 and 2; it senses the position and posture of the markers in three-dimensional coordinate space and transmits them to the surgical navigation device 100, described later.
  • In the embodiment of the present invention, the tracker 40 will be described as an example implemented as an optical tracking system (OTS).
  • The surgical navigation device 100 performs registration of the surgical objects 1 and 2 and registration of the surgical robot 30, and receives signals from the tracker 40 to determine whether the object markers 10 and 20 are deformed. As shown in FIG. 1, it may be implemented with a computer or microprocessor and a display. In FIG. 1 the surgical navigation device 100 is shown as a device separate from the surgical robot 30, but in some cases the computer or microprocessor of the surgical navigation device may be installed in the surgical robot 30, and the display of the surgical navigation device 100 may be connected to and installed together with the tracker 40. In this case, the surgical robot 30 is connected to the tracker 40, receives the position/posture information of the markers, processes it, and provides it to the display.
  • The surgical navigation device 100 includes a signal receiving unit 110, a user input unit 120, a display unit 130, a memory unit 140, and a control unit 150.
  • The signal receiving unit 110 receives signals from the outside and may include, for example, an HDMI (High Definition Multimedia Interface) connector, a D-sub connector, or a communication module for connecting to the Internet or other wired/wireless networks. In particular, the signal receiving unit 110 may include a wired/wireless communication module for interworking with the tracker 40 and the surgical robot 30.
  • the user input unit 120 is for receiving a command from the user and transmitting it to the control unit 150 to be described later, and may include at least one of various user input means such as a keyboard, a mouse, and a button.
  • The display unit 130 displays images on a screen and may be implemented as, for example, a liquid crystal display (LCD) panel, a light-emitting diode (LED) panel, or an organic light-emitting diode (OLED) panel.
  • the memory unit 140 may store various OSs, middleware, platforms, and various applications of the surgical navigation device 100, and store program codes, signal-processed image signals, audio signals, and data.
  • The memory unit 140 stores an image of the surgical objects 1 and 2 acquired before surgery, such as a patient CT image.
  • the memory unit 140 may be implemented as a read only memory (ROM), an erasable programmable read-only memory (EPROM), a random access memory (RAM), or the like.
  • the control unit 150 is in charge of overall control of the surgical navigation device 100 by a user command or an internal program input through the user input unit 120.
  • the control unit 150 may be implemented by including a computer program code for signal processing and control and a microprocessor executing the computer program.
  • The control unit 150 performs image matching and position tracking using the position/posture information received from the tracker 40 through the signal receiving unit 110, and detects whether the object markers 10 and 20 are deformed. The control unit 150 also performs recovery when the object markers 10 and 20 are deformed.
  • The control unit 150 includes an object matching unit 151, a reference position storage unit 153, a position calculating unit 154, and a marker deformation determination unit 155.
  • The object matching unit 151 matches a first image (an optical image) including the object markers 10 and 20 obtained from the tracker 40 with a second image (e.g. a 3D CT image) of the patient's surgical objects 1 and 2 taken before surgery, and derives a correlation for the position/posture between the object markers 10 and 20 (bone markers) attached to the surgical objects 1 and 2 and the surgical objects themselves. After the object markers 10 and 20 are attached to the surgical objects 1 and 2, a probe is used to touch or scratch a plurality of points on the surgical objects 1 and 2, and the positions/postures of those points and of the object markers 10 and 20 are recognized.
  • The object matching unit 151 receives an optical image of the positions/postures indicated by the probe through the tracker 40 and matches it with the patient's 3D data, for example a CT image previously stored in the memory unit 140, so that a correlation for the position/posture between the object markers 10 and 20 and the surgical objects 1 and 2 can be derived.
  • the object matching unit 151 may be implemented including a software algorithm for image matching.
  • Herein, 'surgical object' is used in the broad sense of an object to be operated on, such as the femur 1 and the tibia 2, and is also used, by convention, to indicate a specific location or surgical site on it.
  • the object to be operated is a knee joint of the femur 1 and the tibia 2, and the object markers 10 and 20 are attached to the femur 1 and the tibia 2.
  • FMC: Femur Marker Coordinate system
  • HC: Hip Coordinate system
  • HJC: Hip Joint Center
  • Through image registration, the object matching unit 151 derives a correlation between the position/posture of the femur marker 10 and the position/posture of the hip joint center (HJC) of the femur 1, for example a transformation matrix (F T H) expressing the coordinate transformation from the coordinate system based on the femur marker 10 to the coordinate system based on the hip joint center (HJC) of the femur 1.
  • The control unit 150 obtains the position and posture information of the femur marker 10 from the tracker 40 and multiplies it by the transformation matrix (F T H) derived by the object matching unit 151, so that the position and posture of the hip joint center (HJC) of the femur 1 can be derived.
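  • As a concrete sketch of this multiplication, the pose of the hip joint center in the tracker frame can be obtained by chaining the tracked marker pose with the registered transform. All numeric values and variable names below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the femur marker (FMC) in the tracker frame, as the tracker reports it.
W_T_F = make_pose(np.eye(3), [120.0, 40.0, 850.0])

# Registration result F_T_H: the hip joint center (HJC) frame expressed
# in the femur marker frame.
F_T_H = make_pose(np.eye(3), [0.0, 0.0, -430.0])

# Chaining the two gives the HJC pose in the tracker frame.
W_T_H = W_T_F @ F_T_H
hjc_position = W_T_H[:3, 3]  # [120., 40., 420.] with these illustrative numbers
```

The same chaining applies to any rigidly attached point once its transform from the marker frame is known.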
  • TMC: Tibia Marker Coordinate system
  • AJC: Ankle Joint Center
  • Similarly, a transformation matrix (T T A), which is the coordinate transformation relationship from the coordinate system based on the tibia marker 20 to the coordinate system based on the ankle joint center (AJC) of the tibia 2, is derived.
  • The control unit 150 obtains the position and posture information of the tibia marker 20 from the tracker 40 and multiplies it by the transformation matrix (T T A) derived by the object matching unit 151, so that the position and posture of the ankle joint center (AJC) of the tibia 2 can be derived.
  • In addition, the object matching unit 151 may derive correlations for the position/posture of a plurality of points, such as an implant origin. Accordingly, once the positions/postures of the object markers 10 and 20 are tracked by the tracker 40, the positions and postures of a plurality of points on the surgical objects 1 and 2 can be derived.
  • When the registration of the surgical objects 1 and 2 is completed, the surgical robot 30 is moved close to the surgical objects 1 and 2 to prepare for surgery. At this time, the surgical objects 1 and 2 are fixed so that the hip joint center and the ankle joint center do not physically move; the femur 1 and the tibia 2 themselves may be in a moving or fixed state.
  • The reference position storage unit 153 sets and stores a reference position for at least one reference point of the surgical objects 1 and 2, based on the correlation between the object markers 10 and 20 and the surgical objects 1 and 2 derived by the object matching unit 151.
  • the reference location storage unit 153 may be implemented by a memory or a register.
  • the reference point refers to a point used as a reference for movement in the surgical objects 1 and 2, and may be set differently according to the type of the surgical objects 1 and 2 or the type of surgery.
  • For the femur 1, the hip joint center is set as the reference point, and the position and posture of that point after the surgical objects 1 and 2 are fixed are stored as the reference position; for the tibia 2, the ankle joint center is used as the reference point.
  • the reference position of the reference point of the surgical object 1 and 2 stored in the reference position storage unit 153 means a position and posture based on the coordinate system of the tracker 40.
  • The position calculating unit 154 receives the position and posture information of the object markers 10 and 20 from the tracker 40 and calculates the position of the reference point of the surgical objects 1 and 2 based on the correlation derived by the object matching unit 151.
  • the location calculation unit 154 may be implemented by including a software algorithm for location calculation.
  • For example, the position calculating unit 154 calculates the position of the hip joint center by tracking the position and posture of the femur marker 10, and calculates the position of the ankle joint center by tracking the position and posture of the tibia marker 20.
  • The position calculating unit 154 continuously calculates the position of the reference point at a fixed period during the operation; the position and posture of the reference point are calculated based on the coordinate system of the tracker.
  • The marker deformation determination unit 155 determines whether a marker is deformed, and may be implemented including a software algorithm. When the position of the reference point calculated by the position calculating unit 154 deviates from the previously stored reference position, it determines that the object markers 10 and 20 are deformed; when the calculated position is the same as the previously stored reference position, it determines that they are normal. Referring to FIG. 3, the femur 1 is rotatable about the hip joint center as a reference point, and the tibia 2 is rotatable about the ankle joint center as a reference point.
  • Accordingly, the femur marker 10 can only be located on the spherical surface A1, and the tibia marker 20 only on the spherical surface A2. Since the correlation of position and posture between the object markers 10 and 20 and the surgical object does not change, even if the object markers 10 and 20 move during surgery, the calculated position of the reference point, e.g. the hip joint center or the ankle joint center, should remain the same as the stored reference position.
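  • Because the joint center is fixed and the marker-to-center offset is rigid, an undeformed marker can only lie on a sphere of fixed radius (A1 or A2 in FIG. 3) centred on the stored reference point. A minimal sketch of such a geometric check, with a hypothetical function name and an assumed 1 mm tolerance:

```python
import numpy as np

def on_reference_sphere(marker_pos, ref_center, radius_mm, tol_mm=1.0):
    """True if the marker lies on the sphere of the given radius centred on
    the fixed reference point, i.e. the marker is consistent with an
    undeformed mounting (within the assumed tolerance)."""
    distance = np.linalg.norm(np.asarray(marker_pos, dtype=float) -
                              np.asarray(ref_center, dtype=float))
    return bool(abs(distance - radius_mm) <= tol_mm)

# The radius would be fixed once at registration time, e.g. as the distance
# between the femur marker and the hip joint center.
```

This position-only test is weaker than recomputing the reference point from the full marker pose, since a marker that slides along the sphere passes it; it illustrates the constraint, not the complete determination.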
  • Accordingly, the marker deformation determination unit 155 determines whether the position of the reference point calculated by the position calculating unit 154 is the same as the reference position stored in the reference position storage unit 153; if they are the same, the marker is determined to be normal, and if not, the marker is determined to have been deformed.
  • For example, the position calculating unit 154 acquires the position and posture information of the femur marker 10 from the tracker 40 and calculates the reference point of the surgical objects 1 and 2, for example the position of the hip joint center, using the coordinate correlation (F T H) calculated by the object matching unit 151.
  • Referring to FIG. 4, the femur marker 10 rotates to the right from its original position, moving from the posture of the existing FMC coordinate system to that of the FMC' coordinate system; accordingly, the computed hip joint center moves from the existing HJC position to the HJC' position.
  • The marker deformation determination unit 155 determines whether the HJC position, i.e. the reference point position previously stored in the reference position storage unit 153, and the HJC' position, i.e. the currently tracked reference point position, are the same. As shown in FIG. 4, since the reference position HJC and the current position HJC' of the reference point differ, the marker deformation determination unit 155 may determine that the marker has been deformed by an external force or the like.
  • the position calculation unit 154 acquires the position and posture information of the tibia marker 20 from the tracker 40 and, using the coordinate correlation (TTA) calculated by the object registration unit 151, calculates the position of the reference point of the surgical objects 1 and 2, for example, the ankle joint center.
  • the tibia marker 20 moves in parallel downward from the origin of the existing TMC coordinate system and moves to the origin of the TMC' coordinate system, and accordingly, the ankle joint center is moved from the existing AJC position to the AJC' position.
  • the marker deformation determination unit 155 determines whether the AJC, which is the position of the reference point previously stored in the reference position storage unit 153, and the AJC′, which is the position of the currently tracked reference point, are the same. As shown in FIG. 4, since the reference position AJC and the current position AJC' of the reference point are different, the marker deformation determination unit 155 may determine that the marker is deformed due to an external force or the like.
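The reference-point check described above can be sketched in code. The following Python fragment is an illustrative sketch only: the function names, the 4x4 homogeneous-transform representation, and the 1 mm tolerance are assumptions, not part of the patent. It chains the tracked marker pose with the registered marker-to-reference transform and compares the result against the stored reference position.

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def reference_point(tracker_T_marker, marker_T_ref):
    """Reference point (e.g. HJC or AJC) in tracker coordinates: chain the
    tracked tracker->marker pose with the registered marker->reference transform."""
    return (tracker_T_marker @ marker_T_ref)[:3, 3]

def marker_deformed(tracker_T_marker, marker_T_ref, stored_ref, tol_mm=1.0):
    """Deformed if the recomputed reference point drifts from the stored
    reference position by more than the tolerance."""
    drift = np.linalg.norm(reference_point(tracker_T_marker, marker_T_ref) - stored_ref)
    return drift > tol_mm
```

As in FIG. 4, a pure rotation of the marker moves the computed reference point away from the stored HJC/AJC, so the check fires even though the marker itself has only rotated, not translated.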
  • the surgical navigation device may further include a GUI generator 156.
  • When a marker deformation is detected, the GUI generator 156 generates a message informing the user and transmits it to be displayed on the display unit 130.
  • the GUI generator may include a graphic processing module, for example, a graphic card, which processes data and generates an image. The user can know that the marker has been deformed through the message displayed on the screen.
  • FIG. 5 is a block diagram of the control unit 150 of the surgical navigation device according to an embodiment of the present invention.
  • compared to the above-described embodiment, the surgical navigation device according to the present embodiment may further include a robot matching unit 152b as part of the matching unit 152, and may further include a recovery unit 157.
  • the robot matching unit 152b derives the correlation of position and posture between the robot marker 31 and the surgical objects 1 and 2, and the correlation of position and posture between the robot marker 31 and the reference point.
  • the robot matching unit may be implemented including a software algorithm for position matching.
  • the robot matching unit 152b, with the robot marker 31 attached to the base of the robot, moves the robot arm while tracking the position and posture of the robot marker 31 through the tracker 40, thereby deriving the correlation between the position/posture of the robot base and the position/posture of the robot marker 31 to perform robot registration.
  • the surgical robot 30 is placed in the operable area, and the correlations of position and posture among the robot marker 31 installed on the base of the surgical robot 30, the object markers 10 and 20, and the surgical objects 1 and 2 are derived through the tracker 40.
  • in this way, the robot matching unit 152b can derive the correlations of position and posture among the surgical objects 1 and 2, the object markers 10 and 20, and the robot marker 31, based on the coordinate system of the surgical robot 30 or of the robot marker 31.
  • FIG. 6 is for explaining the operation of the robot matching unit 152b according to an embodiment of the present invention.
  • RM: Robot Marker
  • RMC: Robot Marker Coordinate system
  • the robot matching unit 152b derives the correlations (RTF, RTT) of position and posture between the robot marker 31 and the object markers 10 and 20, based on the position and posture information of the markers acquired from the tracker 40. Then, based on the correlations (FTH, TTA) of position and posture between the object markers 10 and 20 and the reference points calculated by the object matching unit 151, it can derive the correlations (RTH, RTA) of position and posture between the robot marker 31 and the surgical objects 1 and 2.
  • RTH = RTF x FTH
  • RTA = RTT x TTA
  • Here, RTF is the transformation matrix of the femur marker 10 with respect to the robot marker 31, RTT is the transformation matrix of the tibia marker 20 with respect to the robot marker 31, RTH is the transformation matrix of the hip joint center with respect to the robot marker 31, and RTA denotes the transformation matrix of the ankle joint center with respect to the robot marker 31.
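The compositions above amount to multiplying 4x4 homogeneous transforms. The sketch below uses hypothetical numeric values (identity rotations for readability); the variable names mirror the patent's notation but are otherwise assumptions.

```python
import numpy as np

def translation(t):
    """Homogeneous transform with identity rotation and translation t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Hypothetical poses: femur marker 10 seen from robot marker 31 (RTF),
# and hip joint center seen from femur marker 10 (FTH, from registration).
R_T_F = translation([200.0, 0.0, 0.0])
F_T_H = translation([0.0, 80.0, 0.0])

# RTH = RTF x FTH: hip joint center expressed in robot-marker coordinates.
R_T_H = R_T_F @ F_T_H
```

With these values R_T_H carries the translation [200, 80, 0]; RTA = RTT x TTA composes the same way for the ankle side.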
  • the correlations (RTH, RTA) of position and posture between the robot marker 31 and the reference positions of the reference points of the surgical objects 1 and 2, calculated by the robot matching unit 152b, are stored in the reference position storage unit 153.
  • the correlations (RTF, RTT) of position and posture between the robot marker 31 and the object markers 10 and 20, calculated by the robot matching unit 152b, are also stored in the reference position storage unit 153.
  • that is, the reference position storage unit 153 stores the reference positions of the reference points of the surgical objects 1 and 2, and may additionally store position and posture information of the reference points based on the coordinate system of the robot marker 31 or of the base of the surgical robot 30.
  • for recovery, position and posture information of a reference point based on a fixed third position that does not move is required. Since the tracker 40 may be moved during surgery, in the present embodiment the position and posture information of the reference point used for recovery is stored based on a coordinate system that does not move, namely the coordinate system of the base of the surgical robot 30 or the coordinate system of the robot marker 31. In addition, in the present embodiment, the marker deformation determination unit 155 may calculate the position of the reference point based on the coordinate system of the robot marker 31 or of the base of the surgical robot 30, and determine whether the marker is deformed by comparing it with the value stored in the reference position storage unit 153.
  • of course, the marker deformation determination unit 155 may also calculate the position of the reference point based on the coordinate system of the tracker 40 and determine whether the marker is deformed by comparing it with a value stored in the reference position storage unit 153 (e.g., a reference position based on the tracker coordinate system).
  • the recovery unit 157 derives the correlation of position and posture of the object markers 10 and 20 before and after their deformation, based on the change in the correlation between the object markers 10 and 20 and the robot marker 31 before and after the deformation, and re-establishes the correlation between the object markers 10 and 20 and the surgical objects 1 and 2 based on that before/after correlation.
  • the recovery unit 157 may be implemented including a software algorithm for calculating a position.
  • FIG. 7 is for explaining the operation of the recovery unit 157 according to an embodiment of the present invention.
  • in FIG. 7, it is assumed that the femur marker 10 has rotated relative to the femur 1, and that the tibia marker 20 has translated vertically relative to the tibia 2.
  • the marker deformation determination unit 155 determines whether the positions of the reference points calculated by the position calculation unit 154, for example the hip joint center position (HJC') and the ankle joint center position (AJC'), are the same as the reference positions HJC and AJC, and determines that deformation has occurred in both the femur marker 10 and the tibia marker 20.
  • in this case, the recovery unit 157 resets the correlation of position and posture between the object markers 10 and 20 and the surgical objects 1 and 2 based on the changed markers.
  • the correlation between the femur marker 10 after deformation and the robot marker 31 can be calculated from the positions/postures acquired through the tracker 40; in FIG. 7 it is represented by the transformation matrix (RTF') of the femur marker 10 with respect to the robot marker 31 after the deformation of the femur marker 10.
  • meanwhile, the reference position storage unit 153 stores the transformation matrix (RTF) of the femur marker 10 with respect to the robot marker 31 before the marker deformation occurred.
  • accordingly, the recovery unit 157, using the transformation matrices of the femur marker 10 with respect to the robot marker 31 before and after the deformation of the femur marker 10, can derive the correlation for the change in the position and posture of the femur marker 10 before and after the deformation as follows.
  • FTF' = inv(RTF) x RTF' ... (1)
  • Here, FTF' is the transformation matrix for the change in position and posture of the femur marker 10 before and after its deformation, RTF' is the transformation matrix of the femur marker 10 with respect to the robot marker 31 after the deformation, RTF is the transformation matrix of the femur marker 10 with respect to the robot marker 31 before the deformation, RTH is the transformation matrix of the hip joint center (reference position) with respect to the robot marker 31 before the deformation, and FTH is the transformation matrix of the hip joint center (reference position) with respect to the femur marker 10 before the deformation.
  • based on the correlation (FTF') for the change in the position and posture of the femur marker 10 before and after its deformation, the recovery unit 157 can perform the reset by deriving the correlation between the femur marker 10 after deformation and the hip joint center (HJC).
  • F'TH = inv(FTF') x FTH ... (2)
  • Here, F'TH denotes the transformation matrix of the hip joint center (HJC) with respect to the femur marker 10 after its deformation.
  • similarly, the recovery unit 157, using the transformation matrices of the tibia marker 20 with respect to the robot marker 31 before and after the deformation of the tibia marker 20, can derive the correlation for the change in the position and posture of the tibia marker 20 before and after the deformation as follows.
  • TTT' = inv(RTT) x RTT' ... (3)
  • Here, RTT' is the transformation matrix of the tibia marker 20 with respect to the robot marker 31 after the deformation, RTT is the transformation matrix of the tibia marker 20 with respect to the robot marker 31 before the deformation, RTA is the transformation matrix of the ankle joint center (reference position) with respect to the robot marker 31 before the deformation, and TTA denotes the transformation matrix of the ankle joint center (reference position) with respect to the tibia marker 20 before the deformation.
  • based on the correlation (TTT') for the change in the position and posture of the tibia marker 20 before and after its deformation, the recovery unit 157 can perform the reset by deriving the correlation between the tibia marker 20 after deformation and the ankle joint center (AJC).
  • T'TA = inv(TTT') x TTA ... (4)
  • Here, T'TA denotes the transformation matrix of the ankle joint center (AJC) with respect to the tibia marker 20 after its deformation.
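The recovery relations above can be checked numerically. In the sketch below (illustrative; function and variable names are assumptions), the bone, and hence the hip joint center, stays fixed in the robot-marker frame while the femur marker rotates; composing the tracked post-deformation pose with the recovered marker-to-reference transform reproduces the original RTH.

```python
import numpy as np

def rigid_inverse(T):
    """Closed-form inverse of a rigid-body homogeneous transform."""
    R, t = T[:3, :3], T[:3, 3]
    inv_T = np.eye(4)
    inv_T[:3, :3] = R.T
    inv_T[:3, 3] = -R.T @ t
    return inv_T

def recover(R_T_F, R_T_Fp, F_T_H):
    """Relations (1) and (2): derive the marker change before/after
    deformation, then the deformed marker's transform to the reference point."""
    F_T_Fp = rigid_inverse(R_T_F) @ R_T_Fp   # (1) marker change, before -> after
    return rigid_inverse(F_T_Fp) @ F_T_H     # (2) new marker -> reference point
```

Because the reference point has not moved in the robot-marker frame, R_T_Fp composed with the recovered transform equals the stored RTH = RTF x FTH; the tibia case (3)-(4) is identical in structure.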
  • the correlations between the object markers 10 and 20 reset by the recovery unit 157 and the surgical objects 1 and 2 are stored in the object matching unit 152a and the robot matching unit 152b, and the position calculation unit 154 may track the position and posture of the surgical objects 1 and 2 based on the newly set correlations between the object markers 10 and 20 and the surgical objects 1 and 2.
  • in case the robot marker 31 itself is deformed by an external force or the like, it is preferable to separately provide a marker for verifying the robot marker 31.
  • the marker deformation determination unit 155 acquires the position and posture information of the object markers 10 and 20 from the tracker 40, and can determine whether the markers are deformed by determining whether that position and posture information satisfies the reference positional relationship stored in the reference position storage unit 153.
  • the object markers 10 and 20 can be positioned only on the spherical surfaces A1 and A2 defined around the reference positions of the reference points; for example, the spherical surfaces A1 and A2 may serve as the reference positional relationship.
  • in the above-described embodiment, the position of the reference point is calculated from the position and posture information of the object markers 10 and 20, whereas the present embodiment differs in that it checks whether the position and posture information of the object markers 10 and 20 satisfies the reference positional relationship.
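The reference positional relationship of this embodiment reduces to a distance test: a rigidly mounted marker keeps a fixed distance to its reference point, so deformation shows up as the marker leaving the registered sphere. A minimal sketch follows; the function name, the radius argument, and the 1 mm tolerance are assumptions for illustration.

```python
import numpy as np

def satisfies_sphere_relation(marker_pos, ref_point, radius_mm, tol_mm=1.0):
    """True while the marker stays on the sphere of the registered radius
    centred at the reference point (e.g. surface A1 around the hip joint
    center); False signals marker deformation."""
    dist = np.linalg.norm(np.asarray(marker_pos, float) - np.asarray(ref_point, float))
    return abs(dist - radius_mm) <= tol_mm
```

Unlike the first embodiment, this checks the marker position itself against the stored relationship rather than recomputing the reference point from the marker pose.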
  • first, matching is performed through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe: the first image of the markers acquired by the tracker 40 and the second image of the surgical objects 1 and 2 acquired before surgery are matched (S10).
  • through S10, a correlation between the object markers 10 and 20 and the surgical objects 1 and 2 is derived (S11).
  • next, among the plurality of points of the surgical objects 1 and 2, a reference position is set for a reference point that serves as a reference for the movement of the object, and the corresponding position value is stored (S12).
  • the surgical objects 1 and 2 include the femur 1 and the tibia 2, and the reference point includes the hip joint center and the ankle joint center.
  • the reference position of the reference point is stored as a position value based on the coordinate system of the tracker 40.
  • during surgery, the positions and postures of the object markers 10 and 20 are acquired through the tracker 40 (S13), and the position and posture of the reference point is derived using the correlation of position and posture between the object markers 10 and 20 and the reference points of the surgical objects 1 and 2 calculated in the above-described registration process (S14).
  • the correlation between the object markers 10 and 20 attached to the surgical objects 1 and 2 and the reference point is not changed by the movement of the tracker 40 or the movement of the surgical objects 1 and 2.
  • the marker deformation determination unit 155 determines whether the position of the reference point is the same as the reference position stored in the reference position storage unit 153 (S15). If there is a discrepancy, it is determined that the object markers 10 and 20 are deformed (S16), and if they do match, it is determined that the object markers 10 and 20 are in a normal state (S17).
  • FIG. 9 is a flowchart illustrating a method of determining whether object markers 10 and 20 are deformed by the surgical navigation device according to another embodiment of the present invention. Descriptions overlapping with the above-described embodiment will be omitted.
  • first, matching is performed through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe: the first image of the markers acquired by the tracker 40 and the second image of the surgical objects 1 and 2 acquired before surgery are matched (S20).
  • through S20, a correlation between the object markers 10 and 20 and the surgical objects 1 and 2 is derived (S21).
  • next, the reference positional relationship for the changeable positions of the object markers 10 and 20 is calculated based on the position of the reference point, which serves as the reference for the movement of the object, among the plurality of points of the surgical objects 1 and 2, and is stored in the reference position storage unit 153 (S22).
  • the spherical surfaces of A1 and A2 may be the reference positional relationship.
  • during surgery, the positions and postures of the object markers 10 and 20 are acquired through the tracker 40 (S23), and it is determined whether the positions and postures of the object markers 10 and 20 satisfy the reference positional relationship stored in the reference position storage unit 153 (S24). If the position and posture of the object markers 10 and 20 are outside the reference positional relationship, it is determined that the object markers 10 and 20 are deformed (S25); if the reference positional relationship is satisfied, the object markers 10 and 20 are determined to be in a normal state (S26).
  • FIG. 10 is a flowchart illustrating a method of determining whether object markers 10 and 20 are deformed and recovering by a surgical navigation device according to another embodiment of the present invention.
  • first, matching is performed through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe: the first image of the markers acquired by the tracker 40 and the second image of the surgical objects 1 and 2 acquired before surgery are matched (S30).
  • the correlation of the position and posture between the robot and the robot marker 31 is derived through the registration of the robot.
  • the robot is moved to the operable area and placed, and the surgical objects 1 and 2 are fixed.
  • then, the correlations of position and posture between the robot marker 31 and the object markers 10 and 20, and between the robot marker 31 and the surgical objects 1 and 2, are derived through the tracker 40 (S31).
  • next, among the plurality of points of the surgical objects 1 and 2, a reference position is set for a reference point that serves as a reference for the movement of the object, and the corresponding position value is stored (S32).
  • the reference position with respect to the reference point includes position and posture information based on at least one of the coordinate system of the tracker 40, the coordinate system of the robot marker 31, and the coordinate system of the base of the surgical robot 30.
  • the reference position of the reference point based on the coordinate system of the robot marker 31 or of the surgical robot 30 may be used for recovery, and the determination of whether the object marker is deformed may be based on any one of the coordinate system of the tracker 40, the coordinate system of the robot marker 31, and the coordinate system of the surgical robot 30.
  • in addition, the correlations (RTH, RTA) of position and posture between the robot marker 31 and the reference positions of the reference points calculated by the robot matching unit 152b, and the correlations (RTF, RTT) between the robot marker 31 and the object markers 10 and 20, are stored in the reference position storage unit 153 and used at the time of recovery.
  • during surgery, the positions and postures of the object markers 10 and 20 are acquired through the tracker 40 (S33), and the position and posture of the reference point is calculated using the correlation of position and posture between the object markers 10 and 20 and the reference points of the surgical objects 1 and 2 calculated in the above-described process (S34).
  • the marker deformation determination unit 155 determines whether the position of the reference point is the same as the reference position stored in the reference position storage unit 153 (S35). If there is a discrepancy, it is determined that the object markers 10 and 20 are deformed (S36), and recovery is performed by the recovery unit 157. Meanwhile, when the position of the reference point is the same as the reference position, it is determined that the object markers 10 and 20 are in a normal state (S39).
  • when deformation is detected, the recovery unit 157 resets the correlation of position and posture between the object markers 10 and 20 and the surgical objects 1 and 2 based on the changed object markers 10 and 20, using the robot marker 31. The recovery unit 157 derives the correlation for the change in the position and posture of the object markers 10 and 20 based on the change in the correlation between the object markers 10 and 20 and the robot marker 31 before and after the deformation (S37), and based on this, performs the reset by deriving the correlation between the object markers 10 and 20 after deformation and the surgical objects 1 and 2, for example, the reference points of the surgical objects 1 and 2 (S38).
  • the correlations between the object markers 10 and 20 reset by the recovery unit 157 and the surgical objects 1 and 2 are stored in the object matching unit 152a and the robot matching unit 152b, and the position calculation unit 154 may track the position and posture of the surgical objects 1 and 2 based on the newly set correlations between the object markers 10 and 20 and the surgical objects 1 and 2.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Transplantation (AREA)
  • Pathology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Cardiology (AREA)
  • Vascular Medicine (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Manipulator (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a device and a method for surgical navigation. A device for surgical navigation according to the present invention comprises: an object matching unit for matching a first image including an object marker attached to a surgical object and a second image regarding the surgical object before surgery, thereby deriving a correlation regarding positions and postures between the object marker attached to the surgical object and the surgical object; a reference position storage unit for configuring and storing a reference position regarding at least one reference point of the surgical object on the basis of the correlation between the object marker and the surgical object derived by the object matching unit; a position calculation unit for receiving information regarding the position and posture of the object marker and calculating the position of the reference point of the surgical object on the basis of the derived correlation; and a marker deformation determination unit for determining that, when the position of the reference point calculated by the position calculation unit has deviated from the reference position, the object marker has deformed. Accordingly, whether the object marker has deformed can be easily detected in real time.

Description

Surgical navigation device and method thereof
The present invention relates to a surgical navigation device and method, and more particularly, to a surgical navigation device and method for detecting whether a marker attached to a surgical object has been deformed.
With the recent development of medical technology, navigation surgery using robots and computer systems is being actively introduced, and such surgery is also being applied in the field of artificial joint surgery.
In the case of the knee joint, pain or impaired movement caused by trauma, infection, or disease is treated by orthopedic surgery that replaces all or part of the joint; of these patients, about 10-30% suffer wear of the medial knee joint and undergo partial knee replacement surgery.
Among the robots used in such orthopedic joint surgery are robots that perform the entire surgical process automatically; these surgical robots cut bone automatically, without human intervention, along a pre-planned path.
When performing knee joint surgery with a conventional orthopedic surgical robot, the surgical path is determined in advance, optical markers are mounted on the surgical object, and the position and posture of the optical markers are tracked with an optical sensor; the operation proceeds while the positions of the patient and the robot are monitored along the surgical path.
Meanwhile, a marker mounted on a surgical object is often deformed during surgery by an external force or the like. When such marker deformation occurs, problems arise in estimating the position and posture of the surgical object. To solve this, an additional marker or instrument has conventionally been attached to the surgical object to check for deformation.
However, this conventional technique requires attaching additional markers or instruments to the surgical object, e.g., the bone, which imposes a burden such as damage to the patient's bone.
The present invention has been proposed to solve the above problems, and provides a surgical navigation device and method capable of easily detecting, during surgery, whether a marker attached to a surgical object has been deformed.
The present invention also provides a surgical navigation device and method capable of easy recovery when a marker has been deformed.
A surgical navigation device for achieving the above object comprises: an object matching unit that matches a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery, thereby deriving a correlation of position and posture between the object marker attached to the surgical object and the surgical object; a reference position storage unit that sets and stores a reference position for at least one reference point of the surgical object; a position calculation unit that receives position and posture information of the object marker from a tracker and calculates the position of the reference point of the surgical object based on the derived correlation; and a marker deformation determination unit that determines that the object marker has been deformed when the position of the reference point calculated by the position calculation unit deviates from the reference position.
A surgical navigation device for achieving the above object comprises: an object matching unit that matches a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery, thereby deriving a correlation of position and posture between the object marker attached to the surgical object and the surgical object; a reference position storage unit that calculates and stores a reference positional relationship for the changeable positions of the object marker, using at least one reference point of the surgical object as a reference position; and a marker deformation determination unit that receives position and posture information of the object marker from a tracker and determines that the object marker has been deformed when the position and posture of the object marker deviate from the reference positional relationship.
The device may further comprise: a robot matching unit that derives, based on an image including a robot marker attached to a surgical robot, a correlation of position and posture between the robot marker and the surgical object and a correlation of position and posture between the robot marker and the reference point; and a recovery unit that derives a correlation for the position and posture of the object marker before and after its deformation based on the change in the correlation between the object marker and the robot marker before and after the deformation, and resets the correlation between the object marker and the surgical object based on that before/after correlation.
A surgical navigation method for achieving the above object comprises: matching a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery, thereby deriving a correlation of position and posture between the object marker attached to the surgical object and the surgical object; setting and storing a reference position for at least one reference point of the surgical object; receiving position and posture information of the object marker from a tracker and calculating the position of the reference point of the surgical object based on the derived correlation; and determining that the object marker has been deformed when the calculated position of the reference point deviates from the reference position.
이상에서 설명된 바와 같이, 본 발명에 따르면, 수술 대상체에 기준 포인트에 관한 기준 위치를 설정하고 수술 중 기준 포인트의 위치를 추적함으로써 실시간으로 마커의 변형 여부를 쉽게 검출할 수 있다. As described above, according to the present invention, it is possible to easily detect whether a marker is deformed in real time by setting a reference position with respect to a reference point on an object to be operated and tracking the position of the reference point during surgery.
In addition, according to the present invention, when an object marker is deformed, registration can easily be recovered using the robot marker.
FIG. 1 schematically shows a surgical navigation system according to an embodiment of the present invention.
FIG. 2 is a control block diagram of a surgical navigation device according to an embodiment of the present invention.
FIG. 3 is a view for explaining the operation of the control unit of the surgical navigation device according to an embodiment of the present invention.
FIG. 4 illustrates the operation of the marker deformation determination unit according to an embodiment of the present invention.
FIG. 5 is a block diagram of the control unit of the surgical navigation device according to an embodiment of the present invention.
FIG. 6 illustrates the operation of the robot registration unit according to an embodiment of the present invention.
FIG. 7 illustrates the operation of the recovery unit according to an embodiment of the present invention.
FIG. 8 is a flowchart illustrating a method of determining whether an object marker has been deformed, performed by the surgical navigation device according to an embodiment of the present invention.
FIG. 9 is a flowchart illustrating a method of determining whether an object marker has been deformed, performed by a surgical navigation device according to another embodiment of the present invention.
FIG. 10 is a flowchart illustrating a method of determining whether an object marker has been deformed, and a recovery method, performed by a surgical navigation device according to another embodiment of the present invention.
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings. Detailed descriptions of known functions or configurations that could obscure the subject matter of the present invention are omitted from the following description and the accompanying drawings. Note also that, throughout the drawings, identical components are denoted by the same reference numerals wherever possible.
The terms and words used in this specification and the claims below should not be construed as limited to their ordinary or dictionary meanings; on the principle that an inventor may appropriately define terms to describe his or her own invention in the best way, they should be interpreted with meanings and concepts that accord with the technical idea of the present invention. The embodiments described in this specification and the configurations shown in the drawings are therefore merely preferred examples of the invention and do not represent its full technical spirit, and it should be understood that various equivalents and modifications could have substituted for them at the time of filing.
A surgical navigation device according to an embodiment of the present invention will now be described with reference to FIGS. 1 to 4.
FIG. 1 schematically shows a surgical navigation system according to an embodiment of the present invention. Referring to FIG. 1, the surgical navigation system according to an embodiment of the present invention includes object markers 10 and 20 attached to surgical objects 1 and 2, a surgical robot 30, a tracker 40, and a surgical navigation device 100.
The surgical objects 1 and 2 are the targets of the operation. In the embodiments described herein, the surgical object is, by way of example, the knee joint formed by the femur 1 and the tibia 2, with the object markers 10 and 20 attached to the femur 1 and the tibia 2, respectively.
The surgical robot 30 is intended for joint surgery and includes a robot base and an arm; a cutting tool may be mounted on the end effector of the arm. A robot marker 31 is attached to the base of the surgical robot 30.
Optical markers may be used for the object markers 10 and 20 and the robot marker 31. Each marker may have three or four branch-shaped bars extending in different directions from a center point, with a highly reflective ball marker formed at the end of each bar.
The tracker 40 tracks the positions and postures of the robot marker 31 attached to the surgical robot 30 and of the object markers 10 and 20 attached to the surgical objects 1 and 2. It senses the position and posture of each marker in three-dimensional space coordinates and transmits them to the surgical navigation device 100 described below. In the embodiments herein, the tracker 40 is described, by way of example, as implemented as an optical tracking system (OTS).
The surgical navigation device 100 performs registration of the surgical objects 1 and 2 and registration of the surgical robot 30, and receives signals from the tracker 40 to determine whether the object markers 10 and 20 have been deformed. As shown in FIG. 1, it may be implemented with a computer or microprocessor and a display. Although FIG. 1 shows the surgical navigation device 100 as a device separate from the surgical robot 30, in some cases the computer or microprocessor of the navigation device may be installed inside the surgical robot 30, and the display of the surgical navigation device 100 may be connected to, and installed together with, the tracker 40. In that case, the surgical robot 30 is connected to the tracker 40, receives the position/posture information of the markers, processes it, and provides it to the display.
FIG. 2 is a control block diagram of the surgical navigation device 100 according to an embodiment of the present invention. Referring to FIG. 2, the surgical navigation device 100 according to an embodiment of the present invention includes a signal receiving unit 110, a user input unit 120, a display unit 130, a memory unit 140, and a control unit 150.
The signal receiving unit 110 receives signals from the outside and may include, for example, an HDMI (High Definition Multimedia Interface) connector 11 or a D-sub connector for connection with an external device, or a communication module for connecting to wired/wireless networks including the Internet. The signal receiving unit 110 may include wired/wireless communication modules for interworking with the tracker 40 and the surgical robot 30.
The user input unit 120 receives commands from the user and transmits them to the control unit 150 described below, and may include at least one of various user input means such as a keyboard, a mouse, and buttons.
The display unit 130 displays images on a screen and may be implemented as a liquid crystal display (LCD) panel, a light emitting diode (LED) panel, an organic light emitting diode (OLED) panel, or the like.
The memory unit 140 may store the various OSs, middleware, platforms, and applications of the surgical navigation device 100, as well as program code and signal-processed image signals, audio signals, and data. The memory unit 140 stores images of the surgical objects 1 and 2 acquired before surgery, such as patient CT images. The memory unit 140 may be implemented as ROM (Read Only Memory), EPROM (Erasable Programmable Read-Only Memory), RAM (Random Access Memory), or the like.
The control unit 150 is responsible for the overall control of the surgical navigation device 100 according to user commands entered through the user input unit 120 or according to internal programs. The control unit 150 may be implemented with computer program code for signal processing and control and a microprocessor that executes that code. Using the position/posture information received from the tracker 40 through the signal receiving unit 110, the control unit 150 performs image registration and position tracking and detects whether the object markers 10 and 20 have been deformed. The control unit 150 also performs recovery when the object markers 10 and 20 have been deformed.
Referring to FIG. 2, the control unit 150 includes an object registration unit 151, a reference position storage unit 153, and a marker deformation determination unit 155.
The object registration unit 151 registers a first image (an optical image) including the object markers 10 and 20, obtained from the tracker 40, with a second image of the patient's surgical objects 1 and 2 taken before surgery (e.g., a 3D CT image), thereby deriving a correlation regarding the position/posture between the object markers 10 and 20 (bone markers) attached to the surgical objects 1 and 2 and the surgical objects 1 and 2 themselves. After the object markers 10 and 20 are attached to the surgical objects 1 and 2, a probe is brought into contact with, and dragged across, multiple points on the surgical objects 1 and 2 so that the positions/postures of the surgical objects 1 and 2 and of the object markers 10 and 20 are recognized. The object registration unit 151 receives, through the tracker 40, an optical image of the positions/postures indicated by the probe, and performs registration with the patient's 3D data previously stored in the memory unit 140, such as a CT image, to derive the correlation regarding the position/posture between the object markers 10 and 20 and the surgical objects 1 and 2. The object registration unit 151 may be implemented with a software algorithm for image registration.
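For illustration only, the paired-point registration described above — aligning probe-sampled bone-surface points with the pre-operative CT model — is commonly posed as a least-squares rigid fit. The sketch below uses the classical Kabsch/SVD method under that assumption; the function name and data are illustrative and are not taken from this specification, which does not name a particular algorithm.

```python
import numpy as np

def rigid_register(probe_pts, ct_pts):
    """Least-squares rigid fit (Kabsch/SVD): find R, t with ct ≈ R @ probe + t."""
    P = np.asarray(probe_pts, dtype=float)
    Q = np.asarray(ct_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # proper rotation, det(R) = +1
    t = cq - R @ cp
    return R, t
```

A rotation and translation recovered this way plays the role of the stored position/posture correlation between a bone marker and the surgical object.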
In this specification, "surgical object" is used both in the broad sense of the target of surgery, such as the femur 1 and the tibia 2, and in the narrow sense of a point on the surgical objects 1 and 2 — a specific location or surgical site such as the implant origin or a joint center.
FIG. 3 is a view for explaining the operation of the control unit 150 according to an embodiment of the present invention. Referring to FIG. 3, the surgical object is the knee joint of the femur 1 and the tibia 2, and the object markers 10 and 20 are attached to the femur 1 and the tibia 2.
In FIG. 3, FMC (Femur Marker Coordinate) denotes the coordinate system referenced to the position and posture of the femur marker 10, and HC (Hip Coordinate) denotes the coordinate system referenced to the position and posture of the hip joint center (HJC) of the femur 1. Through image registration, the object registration unit 151 can derive a correlation between the position/posture of the femur marker 10 and the position/posture of the hip joint center (HJC) of the femur 1, for example a transformation matrix FTH representing the coordinate transformation from the coordinate system referenced to the femur marker 10 to the coordinate system referenced to the hip joint center (HJC) of the femur 1. The control unit 150 can therefore obtain the position and posture information of the femur marker 10 from the tracker 40 and multiply the obtained position and posture by the transformation matrix FTH derived by the object registration unit 151 to obtain the position and posture of the hip joint center (HJC) of the femur 1.
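The multiplication described above can be written with 4x4 homogeneous transforms. The following sketch chains a tracker-measured marker pose with the registration result FTH; the frame names follow the specification, but the numeric poses are purely illustrative.

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the femur marker (FMC) as reported by the tracker, in tracker coordinates.
OTS_T_F = make_transform(np.eye(3), [100.0, 50.0, 200.0])
# Registration result FTH: hip joint center (HC) relative to the femur marker.
F_T_H = make_transform(np.eye(3), [0.0, 0.0, -400.0])
# Chaining the two gives the hip joint center in tracker coordinates.
OTS_T_H = OTS_T_F @ F_T_H
hjc_position = OTS_T_H[:3, 3]   # -> [100., 50., -200.]
```

The same chaining with TTA yields the ankle joint center of the tibia in tracker coordinates.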
In FIG. 3, TMC (Tibia Marker Coordinate) denotes the coordinate system referenced to the position and posture of the tibia marker 20, and AC (Ankle Coordinate) denotes the coordinate system referenced to the position and posture of the ankle joint center (AJC) of the tibia 2. In the same way, the object registration unit 151 derives, through image registration, a correlation regarding the position/posture of the ankle joint center (AJC) with respect to the tibia marker 20 attached to the tibia 2, for example a transformation matrix TTA representing the coordinate transformation from the coordinate system of the tibia marker 20 to the coordinate system referenced to the ankle joint center (AJC) of the tibia 2. The control unit 150 can therefore obtain the position and posture information of the tibia marker 20 from the tracker 40 and multiply it by the transformation matrix TTA derived by the object registration unit 151 to obtain the position and posture of the ankle joint center (AJC) of the tibia 2.
In addition to the hip joint center of the femur 1 and the ankle joint center of the tibia 2 described above, the object registration unit 151 can derive position/posture correlations for a number of points, such as the implant origin. Accordingly, by tracking the positions/postures of the object markers 10 and 20 with the tracker 40, the positions and postures of multiple points on the surgical objects 1 and 2 can be derived.
When registration of the surgical objects 1 and 2 is complete, the surgical robot 30 is moved close to the surgical objects 1 and 2 in preparation for surgery. At this point the surgical objects 1 and 2 are fixed: for example, in FIG. 3 the hip joint center and the ankle joint center are fixed so that they cannot physically move, and with the hip joint center and the ankle joint center fixed, the femur 1 and the tibia 2 may be either moving or stationary.
The reference position storage unit 153 sets and stores a reference position for at least one reference point of the surgical objects 1 and 2, based on the correlation between the object markers 10 and 20 and the surgical objects 1 and 2 derived by the object registration unit 151. The reference position storage unit 153 may be implemented with a memory, registers, or the like.
Here, a reference point is a point on the surgical objects 1 and 2 that serves as a reference for movement, and may be set differently depending on the type of surgical object or the type of surgery. In this embodiment, for the femur 1 the hip joint center is set as the reference point, and after the surgical objects 1 and 2 are fixed, the position and posture of that point are stored as its reference position; for the tibia 2 the ankle joint center is set as the reference point, and its position and posture are stored as its reference position. The reference position of a reference point stored in the reference position storage unit 153 is a position and posture referenced to the coordinate system of the tracker 40.
The position calculation unit 154 receives the position and posture information of the object markers 10 and 20 from the tracker 40 and calculates the positions of the reference points of the surgical objects 1 and 2 based on the correlations derived by the object registration unit 151. The position calculation unit 154 may be implemented with a software algorithm for position calculation.
Referring to FIG. 3, the position calculation unit 154 tracks the position and posture of the femur marker 10 to calculate the position of the hip joint center, and tracks the position and posture of the tibia marker 20 to calculate the position of the ankle joint center. The position calculation unit 154 keeps calculating the positions of the reference points at regular intervals during surgery. In doing so, it computes the position and posture of each reference point with respect to the coordinate system of the tracker.
The marker deformation determination unit 155 determines whether a marker has been deformed and may be implemented with a software algorithm. If the position of a reference point calculated by the position calculation unit 154 deviates from the previously stored reference position, the marker deformation determination unit 155 determines that the object marker 10 or 20 has been deformed; if it matches the previously stored reference position, the unit determines that the marker is in its normal state. Referring to FIG. 3, the femur 1 can rotate about the hip joint center as its reference point, and the tibia 2 can rotate about the ankle joint center as its reference point. Accordingly, the femur marker 10 can lie on the spherical surface A1, and the tibia marker 20 can lie on the spherical surface A2. Because the position/posture correlation between an object marker 10 or 20 and its object does not change, even if the object markers 10 and 20 move during surgery, the positions of the reference points — e.g., the hip joint center and the ankle joint center — must remain identical to the reference positions recorded when the objects were first fixed. The marker deformation determination unit 155 therefore checks whether the position of the reference point calculated by the position calculation unit 154 equals the reference position stored in the reference position storage unit 153: if they are equal, the state is judged normal; if not, a marker deformation is judged to have occurred.
FIG. 4 illustrates the operation of the marker deformation determination unit 155 according to an embodiment of the present invention, showing a case in which the object markers 10 and 20 are deformed during surgery by an external force or the like. For the femur 1, it is assumed that the femur marker 10 has undergone a rotation. The position calculation unit 154 obtains the position and posture information of the femur marker 10 from the tracker 40 and calculates the position of the reference point of the surgical objects 1 and 2, e.g., the hip joint center, using the coordinate correlation FTH computed by the object registration unit 151. The femur marker 10 has rotated to the right from its original pose, referenced to the FMC coordinate system, into the pose of the FMC' coordinate system, and the hip joint center has accordingly moved from the original HJC position to the HJC' position. The marker deformation determination unit 155 checks whether the HJC position, i.e., the reference point position previously stored in the reference position storage unit 153, equals the HJC' position, i.e., the currently tracked reference point position. As shown in FIG. 4, because the reference position (HJC) and the current position (HJC') of the reference point differ, the marker deformation determination unit 155 can determine that the marker has been deformed by an external force or the like.
For the tibia 2, it is assumed that the tibia marker 20 has undergone a vertical translation. The position calculation unit 154 obtains the position and posture information of the tibia marker 20 from the tracker 40 and calculates the position of the reference point of the surgical objects 1 and 2, e.g., the ankle joint center, using the coordinate correlation TTA computed by the object registration unit 151. The tibia marker 20 has translated downward from the origin of the original TMC coordinate system to the origin of the TMC' coordinate system, and the ankle joint center has accordingly moved from the original AJC position to the AJC' position. The marker deformation determination unit 155 checks whether AJC, i.e., the reference point position previously stored in the reference position storage unit 153, equals AJC', i.e., the currently tracked reference point position. As shown in FIG. 4, because the reference position (AJC) and the current position (AJC') of the reference point differ, the marker deformation determination unit 155 can determine that the marker has been deformed by an external force or the like.
The surgical navigation device according to an embodiment of the present invention may further include a GUI generation unit 156. When a marker deformation is detected, the GUI generation unit 156 generates a message announcing it and passes the message to the display unit 130 for display. The GUI generation unit may include a graphics processing module, such as a graphics card, that processes data into images. Through the message displayed on the screen, the user can learn that a marker has been deformed.
A recovery method for use when a marker deformation occurs in a surgical navigation device according to another embodiment of the present invention will now be described with reference to FIGS. 5 to 7. Descriptions that duplicate the foregoing embodiments are omitted where appropriate. FIG. 5 is a block diagram of the control unit 150 of the surgical navigation device according to an embodiment of the present invention. As shown in FIG. 5, compared with the foregoing embodiments, the surgical navigation device of this embodiment further includes a robot registration unit 152b as part of a registration unit 152, and further includes a recovery unit 157.
Through registration, the robot registration unit 152b derives a correlation regarding the position and posture between the robot marker 31 and the surgical objects 1 and 2, and a correlation regarding the position and posture between the robot marker 31 and the reference points. The robot registration unit may be implemented with a software algorithm for position registration.
The robot registration unit 152b first performs robot registration: with robot markers 31 attached to the robot arm and the robot base, the robot arm is moved while the tracker 40 tracks the position and posture of the robot marker 31, from which the correlation between the position/posture of the robot base and the position/posture of the robot marker 31 is derived. When preparation for surgery is complete, the surgical robot 30 is placed in the operable area, and correlations regarding position and posture between the robot marker 31 installed on the base of the surgical robot 30, the object markers 10 and 20, and the surgical objects 1 and 2 are derived through the tracker 40. Here, the robot registration unit 152b can perform the registration, and derive the position/posture correlations among the surgical objects 1 and 2, the object markers 10 and 20, and the robot marker 31, with reference to the coordinate system of the surgical robot 30 or the coordinate system of the robot marker 31. FIG. 6 illustrates the operation of the robot registration unit 152b according to an embodiment of the present invention. Referring to FIG. 6, RM (Robot Marker) denotes the position/posture of the robot marker 31, and RMC (Robot Marker Coordinate) is the coordinate system referenced to the position of the robot marker 31, serving as the reference for the position and posture of the robot marker 31. Based on the position and posture information of the markers obtained from the tracker 40, the robot registration unit 152b derives correlations regarding position and posture between the robot marker 31 and the object markers 10 and 20 (RTF, RTT). The robot registration unit 152b can then derive correlations regarding position and posture between the robot marker 31 and the surgical objects 1 and 2 (RTH, RTA), based on the position/posture correlations between the object markers 10 and 20 and the reference points (FTH, TTA) computed by the object registration unit 151.
For example, the correlation in position and posture between the robot marker 31 and the reference position HJC of the hip joint center can be derived as RTH = RTF x FTH, and the correlation in position and posture between the robot marker 31 and the reference position AJC of the ankle joint center as RTA = RTT x TTA. Here, RTF is the transformation matrix of the femur marker 10 with respect to the robot marker 31, RTT is the transformation matrix of the tibia marker 20 with respect to the robot marker 31, RTH is the transformation matrix of the hip joint center with respect to the robot marker 31, and RTA is the transformation matrix of the ankle joint center with respect to the robot marker 31.
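For illustration only (not part of the claimed apparatus), the chained compositions above, e.g., RTH = RTF x FTH, can be sketched with 4x4 homogeneous transformation matrices. The helper name and the numeric poses below are hypothetical examples, not values from the specification.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses: robot marker -> femur marker (R_T_F) and
# femur marker -> hip joint center (F_T_H), translations in millimeters.
R_T_F = make_transform(np.eye(3), np.array([100.0, 0.0, 0.0]))
F_T_H = make_transform(np.eye(3), np.array([0.0, 50.0, 0.0]))

# Chain the transforms: robot marker -> hip joint center (RTH = RTF x FTH).
R_T_H = R_T_F @ F_T_H

print(R_T_H[:3, 3])  # hip joint center expressed in the robot-marker frame
```

With identity rotations the chained translation is simply the sum of the two offsets, which makes the composition easy to sanity-check.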
The correlations RTH and RTA in position and posture between the robot marker 31 and the reference positions of the reference points of the surgical objects 1 and 2, calculated by the robot registration unit 152b, are stored in the reference position storage unit 153. The correlations RTF and RTT in position and posture between the robot marker 31 and the object markers 10 and 20, also calculated by the robot registration unit 152b, are likewise stored in the reference position storage unit 153 for recovery. Meanwhile, the reference position storage unit 153 stores the reference positions of the reference points of the surgical objects 1 and 2, and may further store the position and posture information of the reference points with respect to either the coordinate system of the robot marker 31 or the coordinate system of the base of the surgical robot 30. Recovery requires position and posture information of the reference points with respect to a fixed third position that does not move.
Since the tracker 40 may be moved during surgery, in the present embodiment the position and posture information of the reference points for recovery is stored with respect to a coordinate system fixed to the stationary surgical robot 30, that is, the coordinate system of the base of the surgical robot 30 or the coordinate system of the robot marker 31, and is used at the time of recovery. In addition, in the present embodiment, the marker deformation determination unit 155 may calculate the positions of the reference points with respect to the coordinate system of the robot marker 31 or of the base of the surgical robot 30 and compare them with the values stored in the reference position storage unit 153 to determine whether a marker has been deformed. In another example, the marker deformation determination unit 155 may calculate the positions of the reference points with respect to the coordinate system of the tracker 40 and compare them with the values stored in the reference position storage unit 153 (e.g., reference positions with respect to the tracker coordinate system) to determine whether a marker has been deformed.
Based on the change in the correlation between the object markers 10 and 20 and the robot marker 31 before and after deformation of the object markers 10 and 20, the recovery unit 157 derives the correlation in position and posture of the object markers 10 and 20 before and after the deformation, and resets the correlation between the object markers 10 and 20 and the surgical objects 1 and 2 based on the correlation before and after the deformation. The recovery unit 157 may be implemented to include a software algorithm for position calculation.
FIG. 7 illustrates the operation of the recovery unit 157 according to an embodiment of the present invention. Referring to FIG. 7, as in FIG. 4, it is assumed that a rotation has occurred in the femur marker 10 on the femur 1 and a vertical translation has occurred in the tibia marker 20 on the tibia 2. The marker deformation determination unit 155 determines whether the positions of the reference points calculated by the position calculation unit 154, e.g., the hip joint center position HJC' and the ankle joint center position AJC', are identical to the reference positions, e.g., HJC and AJC, and determines that deformation has occurred in both the femur marker 10 and the tibia marker 20. When the marker deformation determination unit 155 detects that a marker has been deformed, the recovery unit 157 resets the correlations in position and posture between the object markers 10 and 20 and the surgical objects 1 and 2 with respect to the changed markers.
The correlation between the object markers 10 and 20 and the robot marker 31 after deformation of the femur marker 10 can be calculated from the positions/postures acquired through the tracker 40, and in FIG. 7 it is denoted by the transformation matrix RTF' of the femur marker 10 with respect to the robot marker 31 after the deformation of the femur marker 10. As described above, the reference position storage unit 153 stores the transformation matrix of the femur marker 10 with respect to the robot marker 31 before the deformation of the marker occurred.
Using the transformation matrices of the femur marker 10 with respect to the robot marker 31 before and after the deformation of the femur marker 10, the recovery unit 157 can derive the correlation regarding the change in position and posture of the femur marker 10 before and after the deformation as follows.
[Equation 1]
FTF' = inv(RTF) x RTF'   (1)
FTF' = inv(RTH x inv(FTH)) x RTF'   (2)
(Here, FTF' is the transformation matrix regarding the change in position and posture of the femur marker 10 before and after its deformation, RTF' is the transformation matrix of the femur marker 10 with respect to the robot marker 31 after the deformation of the femur marker 10, RTF is the transformation matrix of the femur marker 10 with respect to the robot marker 31 before the deformation, RTH is the transformation matrix of the hip joint center (reference position) with respect to the robot marker 31 before the deformation of the femur marker 10, FTH is the transformation matrix of the hip joint center (reference position) with respect to the femur marker 10 before the deformation, and inv denotes the matrix inverse.) At this time, the recovery unit 157 uses Equation (1) when the value of RTF is stored in the reference position storage unit 153, and may calculate the value of FTF' using Equation (2) when the values of RTH and FTH are stored in the reference position storage unit 153.
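As an illustrative sketch of Equations (1) and (2) (the function and variable names are ours, not the patent's, and the numeric poses are hypothetical):

```python
import numpy as np

inv = np.linalg.inv  # "inv" in the equations is the matrix inverse

def delta_marker_transform(R_T_F_new, R_T_F_old=None, R_T_H=None, F_T_H=None):
    """Change of the femur-marker pose expressed in its pre-deformation frame.

    Uses Equation (1) when the pre-deformation marker transform RTF is
    stored, and Equation (2) when only RTH and FTH are stored.
    """
    if R_T_F_old is not None:                      # Equation (1)
        return inv(R_T_F_old) @ R_T_F_new
    return inv(R_T_H @ inv(F_T_H)) @ R_T_F_new     # Equation (2)

# Hypothetical poses: the marker slid by (5, 0, 0) after being bumped.
R_T_F_old = np.eye(4); R_T_F_old[:3, 3] = [100.0, 0.0, 0.0]
R_T_F_new = np.eye(4); R_T_F_new[:3, 3] = [105.0, 0.0, 0.0]
F_T_H = np.eye(4); F_T_H[:3, 3] = [0.0, 50.0, 0.0]
R_T_H = R_T_F_old @ F_T_H

F_T_Fp = delta_marker_transform(R_T_F_new, R_T_F_old=R_T_F_old)
# Both branches agree because RTF = RTH x inv(FTH).
assert np.allclose(F_T_Fp, delta_marker_transform(R_T_F_new, R_T_H=R_T_H, F_T_H=F_T_H))
```

The final assertion checks the equivalence of the two equations, which holds by substituting RTF = RTH x inv(FTH) into Equation (1).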
Based on the correlation FTF' regarding the change in position and posture of the femur marker 10 before and after its deformation, the recovery unit 157 can derive and reset the correlation between the deformed femur marker 10 and the hip joint center HJC.
[Equation 2]
F'TH = inv(FTF') x FTH
(Here, F'TH denotes the transformation matrix of the hip joint center HJC with respect to the femur marker 10 after the deformation of the femur marker 10.)
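A minimal sketch of this re-registration step, under the same hypothetical numeric assumptions as above (a marker frame that slid by 5 mm along x, with the joint center 50 mm along y in the old marker frame):

```python
import numpy as np

inv = np.linalg.inv

def reset_reference_point(F_T_Fp, F_T_H):
    """Re-register the hip joint center to the deformed marker frame:
    F'TH = inv(FTF') x FTH  (Equation 2)."""
    return inv(F_T_Fp) @ F_T_H

# Hypothetical values (illustration only).
F_T_Fp = np.eye(4); F_T_Fp[:3, 3] = [5.0, 0.0, 0.0]   # marker slid by (5, 0, 0)
F_T_H = np.eye(4); F_T_H[:3, 3] = [0.0, 50.0, 0.0]    # HJC in old marker frame

Fp_T_H = reset_reference_point(F_T_Fp, F_T_H)
# Seen from the new marker frame, the joint center offset compensates the slide.
print(Fp_T_H[:3, 3])  # [-5. 50.  0.]
```

The compensated offset is what lets navigation continue without repeating the full image registration.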
In the same manner, using the transformation matrices of the tibia marker 20 with respect to the robot marker 31 before and after the deformation of the tibia marker 20, the recovery unit 157 can derive the correlation regarding the change in position and posture of the tibia marker 20 before and after the deformation as follows.
[Equation 3]
TTT' = inv(RTT) x RTT'   (3)
TTT' = inv(RTA x inv(TTA)) x RTT'   (4)
(Here, TTT' is the transformation matrix regarding the change in position and posture of the tibia marker 20 before and after its deformation, RTT' is the transformation matrix of the tibia marker 20 with respect to the robot marker 31 after the deformation of the tibia marker 20, RTT is the transformation matrix of the tibia marker 20 with respect to the robot marker 31 before the deformation, RTA is the transformation matrix of the ankle joint center (reference position) with respect to the robot marker 31 before the deformation of the tibia marker 20, and TTA is the transformation matrix of the ankle joint center (reference position) with respect to the tibia marker 20 before the deformation.) At this time, the recovery unit 157 uses Equation (3) when the value of RTT is stored in the reference position storage unit 153, and may calculate the value of TTT' using Equation (4) when the values of RTA and TTA are stored in the reference position storage unit 153.
Based on the correlation TTT' regarding the change in position and posture of the tibia marker 20 before and after its deformation, the recovery unit 157 can derive and reset the correlation between the deformed tibia marker 20 and the ankle joint center AJC.
[Equation 4]
T'TA = inv(TTT') x TTA
(Here, T'TA denotes the transformation matrix of the ankle joint center AJC with respect to the tibia marker 20 after the deformation of the tibia marker 20.)
The correlations between the object markers 10 and 20 and the surgical objects 1 and 2 reset by the recovery unit 157 are stored in the object registration unit 152a and the robot registration unit 152b, and the position calculation unit 154 can track the positions and postures of the surgical objects 1 and 2 based on the newly set correlations between the object markers 10 and 20 and the surgical objects 1 and 2.
If the robot marker 31 is deformed by an external force or the like, it is preferable to separately provide a restoration marker for checking the robot marker 31.
In the embodiment described above, the reference positions of the reference points on the object markers 10 and 20 are stored, the positions of the object markers 10 and 20 are tracked during surgery, and whether the reference points have changed is checked therefrom to determine whether the object markers 10 and 20 have been deformed. According to another embodiment of the present invention, however, a reference position relationship regarding the possible positions of the object markers 10 and 20 with respect to the reference positions of the reference points may be calculated and stored in the reference position storage unit 153.
The marker deformation determination unit 155 may acquire the position and posture information of the object markers 10 and 20 from the tracker 40 and determine whether a marker has been deformed by determining whether the position and posture information of the object markers 10 and 20 satisfies the reference position relationship stored in the reference position storage unit 153. Referring to FIG. 3, the object markers 10 and 20 can only be located on the spherical surfaces A1 and A2 with respect to the reference positions of the reference points. For example, the spherical surfaces A1 and A2 may serve as the reference position relationship.
The embodiment described above calculates the position and posture of the reference points from the position and posture information of the object markers 10 and 20, whereas the present embodiment differs in that it checks whether the position and posture information of the object markers 10 and 20 satisfies the reference position relationship.
FIG. 8 is a flowchart illustrating a method of determining whether the object markers 10 and 20 have been deformed by the surgical navigation apparatus according to an embodiment of the present invention. Descriptions overlapping with the embodiment described above will be omitted. Referring to FIG. 8, the surgical navigation method according to an embodiment of the present invention first performs registration between a first image of the markers acquired by the tracker 40 and a second image of the surgical objects 1 and 2 acquired before surgery, through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe (S10). Through this registration process, the correlations between the object markers 10 and 20 and the surgical objects 1 and 2 are derived (S11).
At this time, among the multiple points of the surgical objects 1 and 2, reference positions are set for the reference points that serve as the reference for the movement of the objects, and the corresponding position values are stored (S12). In the present embodiment, the surgical objects 1 and 2 include the femur 1 and the tibia 2, and the reference points include the hip joint center and the ankle joint center. In addition, the reference positions of the reference points are stored as position values with respect to the coordinate system of the tracker 40.
During surgery, the positions and postures of the object markers 10 and 20 are acquired through the tracker 40 (S13), and the positions and postures of the reference points are derived using the correlations in position and posture between the object markers 10 and 20 and the reference points of the surgical objects 1 and 2 calculated in the registration process described above (S14). The correlations between the object markers 10 and 20 attached to the surgical objects 1 and 2 and the reference points are not changed by movement of the tracker 40 or movement of the surgical objects 1 and 2.
The marker deformation determination unit 155 determines whether the positions of the reference points are identical to the reference positions stored in the reference position storage unit 153 (S15). If they do not match, it is determined that the object markers 10 and 20 have been deformed (S16); if they match, it is determined that the object markers 10 and 20 are in a normal state (S17).
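Steps S13 to S17 can be sketched as follows. This is an illustration under our own assumptions, not the claimed implementation; in particular, a deployed system would compare against a clinically validated tolerance rather than the hypothetical 1 mm threshold used here.

```python
import numpy as np

TOLERANCE_MM = 1.0  # hypothetical threshold for treating positions as identical

def reference_point_position(tracker_T_marker, marker_T_ref):
    """S13-S14: current reference-point position in the tracker frame."""
    return (tracker_T_marker @ marker_T_ref)[:3, 3]

def marker_deformed(tracker_T_marker, marker_T_ref, stored_ref_position):
    """S15-S17: deformed if the computed reference point leaves the stored one."""
    current = reference_point_position(tracker_T_marker, marker_T_ref)
    return np.linalg.norm(current - stored_ref_position) > TOLERANCE_MM

# Hypothetical data: marker pose measured by the tracker, the fixed
# marker -> hip-joint-center transform from registration, and the stored
# reference position (all in millimeters).
tracker_T_marker = np.eye(4); tracker_T_marker[:3, 3] = [200.0, 0.0, 0.0]
marker_T_hjc = np.eye(4); marker_T_hjc[:3, 3] = [0.0, 50.0, 0.0]
stored_hjc = np.array([200.0, 50.0, 0.0])

print(marker_deformed(tracker_T_marker, marker_T_hjc, stored_hjc))  # False
```

Because the marker-to-reference-point transform is rigid, any disagreement beyond tracking noise can only come from the marker itself having moved.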
FIG. 9 is a flowchart illustrating a method of determining whether the object markers 10 and 20 have been deformed by the surgical navigation apparatus according to another embodiment of the present invention. Descriptions overlapping with the embodiment described above will be omitted.
Referring to FIG. 9, the surgical navigation method according to another embodiment of the present invention first performs registration between a first image of the markers acquired by the tracker 40 and a second image of the surgical objects 1 and 2 acquired before surgery, through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe (S20). Through this registration process, the correlations between the object markers 10 and 20 and the surgical objects 1 and 2 are derived (S21).
At this time, among the multiple points of the surgical objects 1 and 2, a reference position relationship regarding the possible positions of the object markers 10 and 20 is calculated with the positions of the reference points, which serve as the reference for the movement of the objects, as reference positions, and is stored in the reference position storage unit 153 (S22). Referring to FIG. 3, the spherical surfaces A1 and A2 may serve as the reference position relationship.
During surgery, the positions and postures of the object markers 10 and 20 are acquired through the tracker 40 (S23), and it is determined whether the positions and postures of the object markers 10 and 20 satisfy the reference position relationship stored in the reference position storage unit 153 (S24). If the positions and postures of the object markers 10 and 20 deviate from the reference position relationship, it is determined that the object markers 10 and 20 have been deformed (S25); if the reference position relationship is satisfied, it is determined that the object markers 10 and 20 are in a normal state (S26).
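Because a rigidly attached marker can only move on a sphere centered on its joint reference point (the surfaces A1 and A2 of FIG. 3), the check in S24 can be read as a distance test against a stored radius. The sketch below is our interpretation with hypothetical values, not the specification's implementation:

```python
import numpy as np

TOLERANCE_MM = 1.0  # hypothetical allowance for tracking noise

def satisfies_reference_relationship(marker_position, reference_position, radius):
    """S24: the marker must stay on the sphere of the stored radius around the
    reference point (e.g., surface A1 around the hip joint center)."""
    distance = np.linalg.norm(np.asarray(marker_position) - np.asarray(reference_position))
    return abs(distance - radius) <= TOLERANCE_MM

# Hypothetical values: hip joint center at the origin and a femur marker
# registered 120 mm away.
hjc = np.array([0.0, 0.0, 0.0])
radius = 120.0

print(satisfies_reference_relationship([120.0, 0.0, 0.0], hjc, radius))  # True
print(satisfies_reference_relationship([130.0, 0.0, 0.0], hjc, radius))  # False
```

Any pose on the sphere passes (the bone may rotate about the joint), while a marker that slid along the bone changes the radius and is flagged.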
FIG. 10 is a flowchart illustrating a method of determining whether the object markers 10 and 20 have been deformed and a recovery method by the surgical navigation apparatus according to another embodiment of the present invention.
Referring to FIG. 10, the surgical navigation method according to an embodiment of the present invention performs registration between a first image of the markers acquired by the tracker 40 and a second image of the surgical objects 1 and 2 acquired before surgery, through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe (S30). Meanwhile, the correlation in position and posture between the robot and the robot marker 31 is derived through robot registration. In addition, the robot is moved into and placed in the operable area, and the surgical objects 1 and 2 are fixed. Through the tracker 40, the correlations in position and posture between the robot marker 31 and the object markers 10 and 20, and between the robot marker 31 and the surgical objects 1 and 2, are derived (S31).
At this time, among the multiple points of the surgical objects 1 and 2, reference positions are set for the reference points that serve as the reference for the movement of the objects, and the corresponding position values are stored (S32). Here, the reference positions of the reference points include position and posture information with respect to at least one of the coordinate system of the tracker 40, the coordinate system of the robot marker 31, and the coordinate system of the base of the surgical robot 30. In the present embodiment, the reference positions of the reference points with respect to the coordinate system of the robot marker 31 or of the surgical robot 30 may be used at the time of recovery, and whether the object markers have been deformed may be determined with respect to any one of the coordinate system of the tracker 40, the coordinate system of the robot marker 31, and the coordinate system of the surgical robot 30. In addition, the correlations RTH and RTA in position and posture between the robot marker 31 and the reference positions of the reference points, and the correlations RTF and RTT in position and posture between the robot marker 31 and the object markers 10 and 20, calculated by the robot registration unit 152b, are stored in the reference position storage unit 153 and used at the time of recovery.
During surgery, the positions and postures of the object markers 10 and 20 are acquired through the tracker 40 (S33), and the positions and postures of the reference points are calculated using the correlations in position and posture between the object markers 10 and 20 and the reference points of the surgical objects 1 and 2 calculated in the process described above (S34).
The marker deformation determination unit 155 determines whether the positions of the reference points are identical to the reference positions stored in the reference position storage unit 153 (S35). If they do not match, it is determined that the object markers 10 and 20 have been deformed (S36), and recovery is performed by the recovery unit 157. On the other hand, if the positions of the reference points are identical to the reference positions, it is determined that the object markers 10 and 20 are in a normal state (S39).
Using the robot marker 31, the recovery unit 157 resets the correlations in position and posture between the object markers 10 and 20 and the surgical objects 1 and 2 with respect to the changed object markers 10 and 20. Based on the change in the correlation between the object markers 10 and 20 and the robot marker 31 before and after the deformation of the object markers 10 and 20, the recovery unit 157 derives the correlation regarding the change in position and posture of the object markers 10 and 20 (S37), and based on this, derives and resets the correlations between the deformed object markers 10 and 20 and the surgical objects 1 and 2, e.g., the reference points of the surgical objects 1 and 2 (S38).
The correlations between the object markers 10 and 20 and the surgical objects 1 and 2 reset by the recovery unit 157 are stored in the object registration unit 152a and the robot registration unit 152b, and the position calculation unit 154 can track the positions and postures of the surgical objects 1 and 2 based on the newly set correlations between the object markers 10 and 20 and the surgical objects 1 and 2.

Claims (10)

  1. A surgical navigation apparatus comprising:
    an object registration unit configured to register a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery to derive a correlation in position and posture between the object marker attached to the surgical object and the surgical object;
    a reference position storage unit configured to set and store a reference position of at least one reference point of the surgical object;
    a position calculation unit configured to receive position and posture information of the object marker from a tracker and to calculate the position of the reference point of the surgical object based on the derived correlation; and
    a marker deformation determination unit configured to determine that the object marker has been deformed when the position of the reference point calculated by the position calculation unit deviates from the reference position.
  2. A surgical navigation apparatus comprising:
    an object registration unit configured to register a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery to derive a correlation in position and posture between the object marker attached to the surgical object and the surgical object;
    a reference position storage unit configured to calculate and store a reference position relationship regarding possible positions of the object marker with at least one reference point of the surgical object as a reference position; and
    추적기로부터 상기 대상체 마커의 위치 및 자세정보를 수신하고, 상기 대상체 마커의 위치 및 자세가 상기 기준위치관계를 벗어난 경우 상기 대상체 마커의 변형으로 판단하는 마커 변형 판단부를 포함하는 것을 특징으로 하는 수술 내비게이션장치.A surgical navigation apparatus comprising: a marker deformation determination unit configured to receive position and posture information of the target marker from a tracker, and determine as a deformation of the target marker when the position and posture of the target marker deviates from the reference position relationship. .
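One concrete form the "reference positional relationship" of this claim could take: a marker pinned rigidly to the femur can only move on a sphere about the hip joint center, so a change in its distance to that center implies that the marker itself has shifted on the bone. A sketch under that assumption (function name and tolerance are illustrative):

```python
import numpy as np

def within_reference_relationship(marker_pos, hip_center, nominal_radius_mm, tol_mm=2.0):
    """Check the sphere constraint: a marker rigidly fixed to a femur that
    pivots about the hip joint center keeps a constant distance to that center.
    A radius change beyond tol_mm suggests marker deformation."""
    r = np.linalg.norm(np.asarray(marker_pos) - np.asarray(hip_center))
    return abs(r - nominal_radius_mm) <= tol_mm
```

This variant needs no per-frame reference-point calculation; it only compares the tracked marker pose against a precomputed envelope of admissible positions.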
  3. The surgical navigation apparatus according to claim 1 or 2, further comprising:
    a robot registration unit which, based on an image including a robot marker attached to a surgical robot, derives a correlation in position and posture between the robot marker and the surgical object, and a correlation in position and posture between the robot marker and the reference point; and
    a recovery unit which derives a correlation between the positions and postures of the object marker before and after its deformation, based on the change in the correlation between the object marker and the robot marker before and after the deformation, and resets the correlation between the object marker and the surgical object based on the derived before-and-after correlation.
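The recovery step can be read as re-expressing the bone in the robot-marker frame, which is assumed stable across the marker shift, and then recomputing the marker-to-bone registration from the post-shift marker pose. A hedged sketch using 4x4 homogeneous transforms (the naming convention `T_a_b`, mapping frame b into frame a, is an illustrative choice):

```python
import numpy as np

def recover_registration(T_trk_mrk_before, T_trk_mrk_after,
                         T_trk_rob_before, T_trk_rob_after,
                         T_mrk_obj_before):
    """Re-derive the object-marker-to-bone transform after the marker shifts,
    using the robot marker as a frame assumed stable across the shift."""
    # Bone pose expressed in the robot-marker frame; invariant as long as the
    # bone and the robot marker have not moved relative to each other.
    T_rob_obj = np.linalg.inv(T_trk_rob_before) @ T_trk_mrk_before @ T_mrk_obj_before
    # New marker-to-bone registration from the post-shift marker pose.
    return np.linalg.inv(T_trk_mrk_after) @ T_trk_rob_after @ T_rob_obj
```

The key invariant is that composing the post-shift marker pose with the recovered registration reproduces the same bone pose in the tracker frame as before the shift, so navigation can continue without repeating the image registration.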
  4. The surgical navigation apparatus according to claim 3, wherein:
    the surgical object includes at least one of a femur and a tibia; and
    the reference point includes at least one of a hip joint center and an ankle joint center.
  5. The surgical navigation apparatus according to claim 4, wherein
    the reference position of the at least one reference point of the surgical object is a position defined with respect to at least one of a coordinate system of the tracker, a coordinate system of the surgical robot, and a coordinate system of the robot marker.
  6. A surgical navigation method comprising:
    registering a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery, thereby deriving a correlation in position and posture between the object marker attached to the surgical object and the surgical object;
    setting and storing a reference position for at least one reference point of the surgical object;
    receiving position and posture information of the object marker from a tracker, and calculating the position of the reference point of the surgical object based on the derived correlation; and
    determining that the object marker has been deformed when the calculated position of the reference point deviates from the reference position.
  7. A surgical navigation method comprising:
    registering a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery;
    deriving a correlation in position and posture between the object marker attached to the surgical object and the surgical object;
    calculating and storing a reference positional relationship describing the possible positions of the object marker relative to at least one reference point taken as a reference position;
    receiving position and posture information of the object marker from a tracker; and
    determining that the object marker has been deformed when the position and posture of the object marker deviate from the reference positional relationship.
  8. The surgical navigation method according to claim 6 or 7, further comprising:
    deriving, based on an image including a robot marker attached to a surgical robot, a correlation in position and posture between the robot marker and the surgical object, and a correlation in position and posture between the robot marker and the reference point;
    deriving a correlation between the positions and postures of the object marker before and after its deformation, based on the change in the correlation in position and posture between the object marker and the robot marker before and after the deformation; and
    resetting the correlation in position and posture between the object marker and the reference point based on the derived before-and-after correlation.
  9. The surgical navigation method according to claim 6 or 7, further comprising:
    generating and outputting a warning message when deformation of the object marker has occurred.
  10. The surgical navigation method according to claim 6 or 7, wherein:
    the surgical object includes at least one of a femur and a tibia;
    the reference point includes at least one of a hip joint center and an ankle joint center; and
    the reference position of the reference point is a position defined with respect to at least one of a coordinate system of the tracker, a coordinate system of the surgical robot, and a coordinate system of the robot marker.
PCT/KR2020/011896 2019-09-18 2020-09-03 Device and method for surgical navigation WO2021054659A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190114451A KR102274175B1 (en) 2019-09-18 2019-09-18 Surgical navigation apparatus and the method thereof
KR10-2019-0114451 2019-09-18

Publications (2)

Publication Number Publication Date
WO2021054659A2 true WO2021054659A2 (en) 2021-03-25
WO2021054659A3 WO2021054659A3 (en) 2021-05-14

Family

ID=74884632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/011896 WO2021054659A2 (en) 2019-09-18 2020-09-03 Device and method for surgical navigation

Country Status (2)

Country Link
KR (1) KR102274175B1 (en)
WO (1) WO2021054659A2 (en)


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10145587B4 (en) * 2001-09-15 2007-04-12 Aesculap Ag & Co. Kg Method and device for testing a marking element for displacement
KR101195994B1 (en) * 2011-02-11 2012-10-30 전남대학교산학협력단 Bone motion monitoring and path regenerating system using three-dimensional optical tracker
EP2861173B1 (en) * 2012-06-19 2016-04-06 Brainlab AG Method and apparatus for detecting undesirable rotation of medical markers
US9541630B2 (en) 2013-02-15 2017-01-10 Qualcomm Incorporated Method and apparatus for determining a change in position of a location marker
KR102296451B1 (en) * 2014-12-08 2021-09-06 큐렉소 주식회사 CT-Robot Registration System for Interventional Robot
KR101650821B1 (en) * 2014-12-19 2016-08-24 주식회사 고영테크놀러지 Optical tracking system and tracking method in optical tracking system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115607286A (en) * 2022-12-20 2023-01-17 北京维卓致远医疗科技发展有限责任公司 Knee joint replacement surgery navigation method, system and equipment based on binocular calibration
CN117243699A (en) * 2023-11-14 2023-12-19 杭州三坛医疗科技有限公司 Displacement detection method and device
CN117243699B (en) * 2023-11-14 2024-03-15 杭州三坛医疗科技有限公司 Displacement detection method and device

Also Published As

Publication number Publication date
WO2021054659A3 (en) 2021-05-14
KR20210033563A (en) 2021-03-29
KR102274175B1 (en) 2021-07-12

Similar Documents

Publication Publication Date Title
US11717351B2 (en) Navigation surgical system, registration method thereof and electronic device
WO2018080086A2 (en) Surgical navigation system
WO2021054659A2 (en) Device and method for surgical navigation
JP4331113B2 (en) How to determine the position of a joint point in a joint
US11259875B2 (en) Proximity-triggered computer-assisted surgery system and method
US5249581A (en) Precision bone alignment
WO2020101283A1 (en) Surgery assisting device using augmented reality
WO2014077613A1 (en) Robot for repositioning procedure, and method for controlling operation thereof
US20090306499A1 (en) Self-detecting kinematic clamp assembly
EP3273854A1 (en) Methods and systems for computer-aided surgery using intra-operative video acquired by a free moving camera
US11234770B2 (en) Femoral medial condyle spherical center tracking
CN110464457B (en) Surgical implant planning computer and method performed thereby, and surgical system
WO2019132427A1 (en) Laser target projection apparatus and control method thereof, and laser surgery induction system comprising laser target projection apparatus
EP2720634A1 (en) Method and device for determining the mechanical axis of a bone
WO2017090924A1 (en) System for identifying position of marker for orthopedic surgery and method for identifying same
WO2021206372A1 (en) Two-dimensional medical image-based spinal surgery planning apparatus and method
WO2021162287A1 (en) Method for verifying matching of surgery target, apparatus therefor, and system including same
WO2021045546A2 (en) Device for guiding position of robot, method therefor, and system comprising same
JP2022016415A (en) Instruments for navigated orthopedic surgery
WO2021153973A1 (en) Device for providing joint replacement robotic surgery information and method for providing same
WO2012108578A1 (en) System for bone motion monitoring and path correction using a three-dimensional optical measuring unit
CN114795376A (en) Auxiliary osteotomy system for joint replacement
WO2013105738A1 (en) Surgical robot control apparatus and method for controlling same
US20220218425A1 (en) System and method for ligament balancing with robotic assistance
WO2018124499A1 (en) Laser target projection apparatus and c-arm image matching method, recording medium for executing same, and laser surgery guidance system comprising matching tool

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20865697

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20865697

Country of ref document: EP

Kind code of ref document: A2