CN113303907A - System for robot-assisted revision procedures - Google Patents
System for robot-assisted revision procedures
- Publication number
- CN113303907A (application number CN202110589543.1A)
- Authority
- CN
- China
- Prior art keywords
- bone
- implant
- cutting tool
- surgical
- implant component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B34/20—Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B17/14—Surgical saws; accessories therefor
- A61B17/15—Guides for surgical saws
- A61B17/1703—Guides or aligning means for drills, mills, pins or wires using imaging means, e.g. by X-rays
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/30—Surgical robots
- A61B34/76—Manipulators having means for providing feel, e.g. force or tactile feedback
- A61B5/055—Detecting, measuring or recording for diagnosis involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
- A61B6/032—Transmission computed tomography [CT]
- A61B8/0875—Detecting organic movements or changes for diagnosis of bone
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B90/37—Surgical systems with images on a monitor during operation
- A61F2/4607—Special tools or methods for insertion or extraction of hip femoral endoprostheses
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
- A61B2090/3983—Reference marker arrangements for use with image-guided surgery
- A61F2002/4632—Implanting or extracting artificial joints using computer-controlled surgery, e.g. robotic surgery
Abstract
A method of performing a revision surgery using a robot-assisted surgery system includes: determining information related to an interface region between an implant component and the bone; and generating, based at least in part on the information related to the interface region, a planned virtual boundary in a representation of the implant and the bone, the boundary associated with the portion of the interface region to be removed. The method further includes: tracking movement of the cutting tool in physical space such that the movement of the cutting tool is correlated with movement of a virtual tool; and constraining the cutting tool while the cutting tool removes the portion of the interface region. The constraint is based on a relationship between the virtual tool and the planned virtual boundary. The portion of the interface region is removed in order to remove the implant component from the bone.
Description
The present application is a divisional application of Chinese patent application No. 201780056264.0, filed July 13, 2017, and entitled "System for robot-assisted revision procedures."
Cross reference to related patent applications
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/363,037, filed July 15, 2016, which is hereby incorporated by reference in its entirety.
Technical Field
The present invention relates to robot-assisted orthopaedic surgery, and in particular to robot-assisted revision surgery.
Background
Currently, surgeons perform revision surgeries, such as knee and hip revision procedures, manually. Such manual surgery is not always accurate, is difficult to perform, and may result in greater bone loss than desired, which reduces the strength and integrity of the bone. Limited access, imprecise cutting, implant removal, and implant cementation can all cause significant bone loss. During the procedure, the surgeon may use a chisel and a small saw to manually cut around the implant. The surgeon must perform this work very slowly in order to protect the bone; however, because of the duration of anesthesia, the timing of the surgery may be critical for the patient. In addition, performing such procedures requires a significant amount of training.
Disclosure of Invention
According to one exemplary embodiment, there is a method of performing a revision surgery using a robot-assisted surgery system. The method includes determining, by processing circuitry associated with a computer, information related to an interface region between an implant component and the bone in which the implant component is implanted. The method also includes generating, by the processing circuitry, a planned virtual boundary in a representation of the implant component and the bone based at least in part on the information related to the interface region, the planned virtual boundary associated with the portion of the interface region to be removed. The method further includes: tracking, by a navigation system associated with the computer, movement of the cutting tool in physical space such that the movement of the cutting tool is correlated with movement of a virtual tool; and providing a constraint on the cutting tool while the cutting tool removes the portion of the interface region, the constraint based on a relationship between the virtual tool and the planned virtual boundary. The portion of the interface region is removed in order to remove the implant component from the bone.
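The boundary-constraint step above can be sketched in code. The following is an illustrative sketch only, not the patent's implementation: it assumes the simplest case of a single planar virtual boundary and a point tool tip, and the names `clamp_to_boundary`, `plane_point`, and `normal` are hypothetical.

```python
# Illustrative sketch: constrain a commanded tool-tip motion so the tip
# never crosses a planar virtual boundary (normal points into the
# allowed half-space). Not the patented implementation.

def clamp_to_boundary(tip, motion, plane_point, normal):
    """Clamp a commanded motion so the tool tip stays on the allowed
    side of a planar virtual boundary; violating motion slides along
    the plane instead of penetrating it."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    target = [t + m for t, m in zip(tip, motion)]
    # Signed distance of the target position from the boundary plane.
    d = dot([t - p for t, p in zip(target, plane_point)], normal)
    if d >= 0:
        return target  # motion stays in the allowed region
    # Project the violating component out of the motion.
    n2 = dot(normal, normal)
    return [t - d * n / n2 for t, n in zip(target, normal)]
```

In a real system this check would run at the servo rate against the full planned boundary mesh rather than a single plane.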
In some embodiments, determining information related to the interface region includes receiving an image of the bone and an implant component implanted on the bone. In some embodiments, the images are obtained in connection with a first procedure during which the implant component is implanted on the bone. In some embodiments, the image is received by at least one imaging modality from the group consisting of: CT, X-ray, fluoroscopy, MRI, ultrasound, video camera, and tracking markers. In some embodiments, determining information related to the interface region includes digitizing the interface region with a tracked probe.
In some embodiments, the method further includes receiving input for adjusting the virtual boundary relative to the representation of the implant and the bone. In some embodiments, the virtual boundary is a haptic boundary, and providing the constraint comprises providing haptic feedback to the cutting tool. In some embodiments, the virtual boundary is an autonomous control boundary, and providing the constraint includes autonomously controlling the surgical tool to remain within the control boundary. In some embodiments, the cutting tool is one or more tools selected from the group including, but not limited to: flat saws, curved saws, lasers, water jets, ultrasonic devices, and abrasive burrs.
In some embodiments, the method further comprises determining, by the processing circuitry, information related to at least one of a size, a number, and a location of bone defects in the vicinity of the interface that require an augment. In some embodiments, the information is determined preoperatively. In some embodiments, the information is determined by digitizing the bone defect with a tracked probe.
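The patent does not specify how defect size and location are computed from tracked-probe points; as a hedged illustration only, one simple approach is a centroid-plus-bounding-box summary of the digitized point cloud. The function name `defect_summary` is an assumption.

```python
# Hedged illustration: summarize a digitized bone defect from tracked
# probe points. The bounding-box representation is an assumption made
# for illustration, not the patent's method.

def defect_summary(points):
    """Return centroid (location) and axis-aligned extents (size) of a
    cloud of digitized defect points, each an (x, y, z) tuple."""
    n = len(points)
    centroid = tuple(sum(p[i] for p in points) / n for i in range(3))
    extent = tuple(max(p[i] for p in points) - min(p[i] for p in points)
                   for i in range(3))
    return centroid, extent
```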
In some embodiments, the method further comprises: obtaining an image of the bone using the camera after the implant component has been removed; and generating a bone model of the bone based on the image for planning the replacement of the implant component.
In some embodiments, the method further comprises determining a desired pose of the replacement implant component to be implanted on the bone. In some embodiments, the method further includes determining, by the processing circuitry, a second planned virtual boundary in the representation of the bone representative of one or more cuts in the bone to prepare the bone to receive a replacement implant. In some embodiments, the method further includes providing a constraint to the cutting tool while the cutting tool performs one or more cuts to prepare the bone, the constraint based on a relationship between the virtual tool and the second planned virtual boundary.
In another exemplary embodiment, there is a system for performing a revision surgery. The system comprises: a robotic system including an articulated arm and a surgical tool coupled to the articulated arm; a navigation system configured to characterize movement of at least one of the articulated arm, the surgical tool, and the portion of the patient's anatomy undergoing revision; and a processor operatively coupled to the robotic system and the navigation system. The processor is configured to: determine information about an interface region between an implant component and the bone in which the implant component is implanted; generate, based at least in part on the information related to the interface region, a planned virtual boundary in a representation of the implant component and the bone, the planned virtual boundary associated with the portion of the interface region to be removed; track, using the navigation system, movement of the cutting tool in physical space such that the movement of the cutting tool is correlated with movement of a virtual tool; and provide a constraint on the cutting tool while the cutting tool removes the portion of the interface region, the constraint based on a relationship between the virtual tool and the planned virtual boundary.
In some embodiments, the system further comprises an imaging system operatively coupled to the processor to determine information related to the interface region, wherein the imaging system comprises at least one imaging modality from the group consisting of: CT, X-ray, fluoroscopy, MRI, ultrasound, video camera, and tracking markers. In some embodiments, the system further comprises a tracking probe for digitizing the interface region.
In some embodiments, a surgical tool coupled to an articulated arm includes an end effector. The end effector includes: at least one flexible bending element movable in two degrees of freedom, the bending element comprising a distal end, a proximal end, and an internal channel; a shaft coupled to the proximal end of the flexible bending element and configured to secure the end effector to a surgical system; and a motor housed in the shaft and coupled to the cutting tool to provide power to the cutting tool. A cutting element is coupled to the distal end of the flexible bending element.
In one embodiment, the robotic system is used to assist in knee or hip revision procedures. The robotic system may include a navigation system to register the real bone to the pre-scan CT image and to precisely guide the robotic arm to navigate through the patient anatomical space. The robotic system may have haptic capabilities where the user can shape the haptic volume based on patient anatomical features to protect important bone structures and soft tissues (e.g., ligaments, nerves, and veins). The system may also include a flexible end effector having multiple degrees of freedom and bendable 90 degrees in any direction to allow a cutting tool attached to the flexible arm to access a small area to resect the bone implant. The system may have a large database that holds patient bone and implant models and planning history during their first knee or hip procedure, which information is available for use in revision cases.
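The shaped haptic volume described above can be thought of as an allowed region bounded by faces. As a hedged illustration, and not the system's actual representation, a convex allowed region built from planar faces might be queried like this:

```python
# Illustrative sketch: represent a convex haptic volume as a list of
# planar faces (point_on_plane, inward_normal) and test whether a
# tool-tip position lies inside it. Names and the convex-faces
# representation are assumptions for illustration.

def inside_haptic_volume(tip, faces):
    """Return True if tip lies on the allowed side of every face."""
    for point, normal in faces:
        d = sum((t - p) * n for t, p, n in zip(tip, point, normal))
        if d < 0:  # tip is outside this face's half-space
            return False
    return True
```

A surgeon-shaped volume protecting soft tissue would use many such faces (or a triangle mesh) fitted to the patient's anatomy rather than the unit-cube example below.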
In another embodiment, a robotic system using a patient's previous/first knee or hip information can assist in a revision case by: creating a haptic revision boundary from the patient's previous/first knee or hip implant model to haptically guide the revision procedure; registering the bones to the robot coordinates using the patient's previous/first knee or hip bone model, so that the patient need not undergo additional CT imaging for the revision case; and using the patient's previous/first knee or hip planning information to identify the relative position between the bone and the implant in the revision case, such that where there is no relative movement between the bone and the implant, the implant surface can be used to register the bone to the robot coordinates.
In some embodiments, the robotic system creates a customized revision haptic boundary to prevent bone from being over cut and minimize bone loss during a revision procedure.
In some embodiments, based on the first knee or hip implant model, the robotic system accurately creates a revision haptic boundary around the first implant to constrain the cutting tool and minimize excessive cutting of the bone. In some embodiments, the implant and bone are registered to the first knee or hip CT images during revision using one of the following methods: digitizing the implant surface with a trackable probe and then registering the bone to the first CT image; taking a plurality of fluoroscopic images; and/or attaching a camera or optical sensor to the robot to scan the implant surface and then register it to the first CT bone model. In some embodiments, the robotic system includes a smart flexible end-effector system having multiple degrees of freedom and bendable in all directions, allowing the robot to cut bone even where access space is limited; the flexible end effector carries a high-speed rotary burr for cutting bone. In some embodiments, a camera or ultrasound device is used to generate an initial bone or implant model and/or to register the bone with the bone model.
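Registering digitized implant-surface points to the first CT bone model is a rigid point-set alignment problem. One standard approach, not specified by the patent, is the Kabsch/SVD method, sketched here under the simplifying assumption of known point correspondences (in practice an ICP-style loop would estimate correspondences iteratively):

```python
# Illustrative sketch of rigid point-set registration (Kabsch/SVD).
# This is a standard algorithm offered as an example; the patent does
# not state which registration method is used.
import numpy as np

def rigid_register(probe_pts, model_pts):
    """Find rotation R and translation t mapping probe_pts onto
    model_pts (corresponding Nx3 arrays) in the least-squares sense."""
    P = np.asarray(probe_pts, dtype=float)
    Q = np.asarray(model_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```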
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate several embodiments and together with the detailed description, serve to explain the principles and features of the invention.
Fig. 1 illustrates a perspective view of an embodiment of a surgical system according to an exemplary embodiment.
FIG. 2 shows a block diagram of a computing system, according to an example embodiment.
Fig. 3A-3B illustrate X-rays showing a femur, a tibia, a femoral implant, and a tibial implant, according to an exemplary embodiment.
Fig. 4 shows the bone model and the implant model shown on the user interface during the first partial knee procedure.
Fig. 5A illustrates a flexible end effector for use with the surgical system of fig. 1, according to an exemplary embodiment.
Fig. 5B illustrates the flexible end effector of fig. 5A according to an exemplary embodiment.
Fig. 5C illustrates a close-up view of the flexible portion of the flexible end effector of fig. 5A, according to an exemplary embodiment.
Fig. 6A-6C illustrate various views of a femur, femoral implant, and end effector, according to an exemplary embodiment.
Fig. 7A and 7B illustrate the femoral implant and femur after non-robotic or manual removal.
FIG. 8 is a flowchart of a method of performing a revision surgery according to an exemplary embodiment.
Detailed Description
Before turning to the figures, which illustrate exemplary embodiments in detail, it should be understood that the application is not limited to the details or methodology set forth in the description or illustrated in the figures. It is also to be understood that the terminology is for the purpose of description and should not be regarded as limiting.
The present disclosure introduces a robot-assisted method to support revision procedures of joints, such as knee and hip joints, by allowing precise removal of a primary implant with minimal bone loss while reducing the time required to remove the primary implant. When bone loss is minimized, the number of revision procedures that may be performed on individual patients throughout their lifetime increases.
Although the present disclosure refers to revision of knee and hip joints, the systems and methods disclosed herein are equally applicable to orthopedic revision procedures for other bones or joints, including but not limited to the shoulder, wrist, ankle, and spine.
The robotic-assisted surgical system of the present disclosure is designed to assist in revision procedures to minimize the amount of bone removed and/or damage to the bone. The robotic-assisted surgical system is also designed to shorten the lengthy learning curve for the surgeon to perform the revision procedure. A robotic-assisted surgical system may help reduce the time to make a revision and allow better bone recovery because bone may be less "damaged" due to the use of the robotic system. Furthermore, the present disclosure addresses one of the major problems of previously used systems, namely the visibility of interface failure progression. In some embodiments, the robotic system can provide a plan to the user to completely remove the interface and then assist the user in executing the plan while providing feedback during the removal process.
Exemplary robot System
Various features of the robot-assisted surgical systems and methods according to the present disclosure will now be described in more detail. Fig. 1 provides a schematic diagram of an exemplary Computer Assisted Surgery (CAS) system 100 in which processes and features associated with certain disclosed embodiments may be implemented. Surgical system 100 may be configured to perform various orthopaedic surgical procedures, such as, for example, knee revision procedures. The surgical system 100 includes a tracking system 101, a computing system 102, one or more display devices 103a, 103b, and a robotic system 104. It should be appreciated that the system 100 and the methods and processes described herein may be applicable to many different types of joint revision procedures. Although certain disclosed embodiments may be described with respect to knee revision procedures, the concepts and methods described herein may be applicable to other types of orthopaedic surgery, such as hip revision, shoulder revision procedures, and other types of orthopaedic procedures. Moreover, surgical system 100 may include additional components or fewer components than those described to facilitate surgery (e.g., surgical beds, etc.).
The surgeon may use the robotic system 104 in an interactive manner to perform a surgical procedure, such as a revision procedure, on the patient. As shown in fig. 1, the robotic system 104 includes a base 105, an articulated arm 106, a force system (not shown), and a controller (not shown). A surgical tool 110 (e.g., an end effector having an operating member, such as a saw, reamer, or burr) may be coupled to the articulated arm 106. The surgeon may manipulate the surgical tool 110 by grasping and manually moving the articulated arm 106 and/or the surgical tool 110.
The force system and controller are configured to provide cutting constraint guidance to the surgeon by controlling or guiding movement of the surgical tool. The force system is configured to provide at least some of the force applied to the surgical tool via the articulated arm 106, and the controller is programmed to generate control signals for controlling the force system. In one embodiment, the force system includes an actuator and a back-drivable transmission that provide haptic (or force) feedback to constrain or inhibit a surgeon from manually moving a surgical tool outside of predefined haptic boundaries defined by haptic objects, such as described in U.S. patent No.8,010,180 and/or U.S. patent application serial No.12/654,519 (U.S. patent application publication No.2010/0170362), filed December 22, 2009, each of which is hereby incorporated by reference in its entirety. The force system and controller may be housed within the robotic system 104. In some embodiments, guidance for cutting is provided by a hand-held robotic device, e.g., as described in U.S. patent No.9,399,298 entitled "Apparatus and Method for Providing an Adjustable Positive Stop in Space," U.S. patent No.9,060,794 entitled "System and Method for Robotic Surgery," and U.S. patent application publication No.2013/0060278, each of which is incorporated herein by reference in its entirety.
The tracking system 101 is configured to determine the pose (i.e., position and orientation) of one or more objects during a surgical procedure to detect movement of the objects. For example, the tracking system 101 may comprise a detection device that obtains the pose of the object with respect to a reference coordinate system of the detection device. As the object moves in the reference coordinate system, the detection device tracks the pose of the object to detect (or enable the surgical system 100 to determine) the movement of the object. As a result, the computing system 102 may capture data in response to movement of the tracked object or objects. The tracked objects may include, for example, tools/instruments, patient anatomy, implants/prosthetic devices, and components of the surgical system 100. Using the pose data from the tracking system 101, the surgical system 100 can also register (or map or correlate) coordinates in one space to coordinates in another space to achieve spatial alignment or correspondence (e.g., using well-known coordinate transformation procedures). The objects in physical space may be registered to any suitable coordinate system, such as a coordinate system used by a process running on a surgical controller and/or computer device of the robotic system 104. For example, using the pose data from the tracking system 101, the surgical system 100 can associate a physical anatomical feature, such as a patient's tibia, with a representation of the anatomical feature (such as an image displayed on the display device 103). Based on the tracked object and the registration data, the surgical system 100 can determine, for example, a spatial relationship between the image of the anatomical feature and the relevant anatomical feature.
The registration may include any known registration technique, such as, for example, image-to-image registration (e.g., unimodal registration, in which images of the same type or modality, e.g., fluoroscopic images or MR images, are registered, and/or multimodal registration, in which images of different types or modalities, e.g., MRI and CT, are registered), image-to-physical space registration (e.g., image-to-patient registration, in which a digital dataset of patient anatomical features obtained by conventional imaging techniques is registered with actual anatomical features of the patient), combined image-to-image and image-to-physical space registration (e.g., registration of pre-operative CT and MRI images with an intra-operative scene), and/or registration using a camera or ultrasound. Computing system 102 may also include a coordinate transformation process to map (or transform) coordinates in one space to coordinates in another space to achieve spatial alignment or correspondence. For example, the surgical system 100 may use a coordinate transformation process to map the location of a tracked object (e.g., a patient anatomical feature, etc.) into a coordinate system used by a process running on a computer and/or surgical controller of the haptic device. As is well known, the coordinate transformation process may include any suitable transformation technique, such as, for example, a rigid body transformation, a non-rigid body transformation, an affine transformation, and the like. In some embodiments, a camera is used with a tracker and a bone scan to obtain a model and register the model. For example, an initial 3D model may be created and automatically registered. In some embodiments, a camera may be used to register the 3D model corresponding to the CT scan. According to some embodiments, a camera or ultrasound may be used for initial model creation and registration.
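The rigid body transformation mentioned above is the simplest case of such a coordinate mapping. The following sketch (not part of the patent disclosure; the rotation, translation, and probe-tip values are hypothetical) shows how tracked coordinates in one space can be mapped into another using a rotation matrix and a translation vector:

```python
import numpy as np

def rigid_transform(points, R, t):
    """Map Nx3 points from one coordinate space to another
    using a rigid-body transform (rotation R, translation t)."""
    points = np.asarray(points, dtype=float)
    return points @ R.T + t

# Illustrative example: map a tracked probe tip from tracker space
# into image space. R and t would normally come from registration.
theta = np.deg2rad(90.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([10.0, 0.0, 0.0])

tip_tracker = np.array([[1.0, 0.0, 0.0]])
tip_image = rigid_transform(tip_tracker, R, t)
# 90 degree rotation about z maps (1,0,0) to (0,1,0); translation adds (10,0,0)
```

Non-rigid and affine transformations generalize this by allowing scaling, shear, or local deformation, but the registered-coordinate bookkeeping is the same.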
The tracking system 101 may be any tracking system that enables the surgical system 100 to continuously determine (or track) the pose of relevant anatomical features of the patient. For example, the tracking system 101 may include a non-mechanical tracking system, a mechanical tracking system, or any combination of non-mechanical and mechanical tracking systems suitable for use in a surgical environment. The non-mechanical tracking system may include an optical (or visual), magnetic, radio, or acoustic tracking system. Such systems typically comprise a detection device adapted to locate in a predefined coordinate space a specifically identifiable trackable element (or tracker) detectable by the detection device and configured to be attached to or be an inherent part of the object to be tracked. For example, the trackable element may include an array of markers having a unique geometric arrangement and known geometric relationships relative to the tracked object when the trackable element is attached to the tracked object. The known geometric relationship may be, for example, a predefined geometric relationship between the trackable elements and the end points and axes of the tracked object. Thus, the detection device may identify a particular tracked object at least in part from the geometry of the marker (if unique), the orientation of the axis, and the location of the endpoint within the reference frame inferred from the position of the marker.
The markers may include any known markers such as, for example, external markers (or fiducials) and/or internal features of the tracked object. External markers are artificial objects attached to the patient (e.g., markers attached to the skin, markers implanted in bone, stereotactic frames, etc.) and are designed to be visible to and accurately detectable by the detection device. Internal features are prominent and precisely localizable portions of the tracked object that are well defined and are discernable to function as identifiable markers (e.g., landmarks, contours of anatomical features, shapes, colors, or any other sufficiently identifiable visual indication). The markers may be located using any suitable detection method, such as, for example, well-known optical, electromagnetic, radio or acoustic methods. For example, an optical tracking system with a fixed stereo camera pair sensitive to infrared radiation may be used to track markers that emit infrared radiation either actively (such as a light emitting diode or LED) or passively (such as a spherical marker with a surface that reflects infrared radiation). Similarly, a magnetic tracking system may include a fixed field generator that emits a spatially varying magnetic field that is sensed by a small coil integrated into the tracked object.
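Because the marker array has a known geometric relationship to the tracked object, its pose can be recovered from the detected marker positions. The sketch below (illustrative only, not from the patent; the four-marker array geometry and observed pose are hypothetical) uses the standard Kabsch/SVD method for matched point sets:

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Estimate the rigid transform (R, t) that maps a marker array's
    known geometry (model_pts) onto its detected positions
    (observed_pts), using the Kabsch/SVD method on matched pairs."""
    P = np.asarray(model_pts, float)
    Q = np.asarray(observed_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical 4-marker array (known geometry, mm) and a synthetic
# observation of it after a 90 degree rotation plus translation.
model = np.array([[0, 0, 0], [50, 0, 0], [0, 30, 0], [0, 0, 20]], float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
t_true = np.array([100.0, 200.0, 50.0])
observed = model @ R_true.T + t_true

R_est, t_est = estimate_pose(model, observed)
```

With noiseless synthetic data the estimated pose matches the true pose; in practice the least-squares fit also averages out detection noise across the markers.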
The computing system 102 may be communicatively connected to the tracking system 101 and may be configured to receive tracking data from the tracking system 101. Based on the received tracking data, the computing system 102 may determine a position and orientation associated with one or more registration features of the surgical environment, such as the surgical tool 110 or a portion of a patient anatomical feature. The computing system 102 may also include surgical planning and surgical assistance software that may be used by a surgeon or surgical support personnel during a surgical procedure. For example, during a joint replacement procedure, the computing system 102 may display images related to a surgical procedure on one or both of the display devices 103a, 103b.
Computing system 102 (and/or one or more components of surgical system 100) may include hardware and software for operating and controlling surgical system 100. Such hardware and/or software is configured to enable system 100 to perform the techniques described herein.
FIG. 2 illustrates a block diagram of computing system 102, according to an example embodiment. The computing system 102 includes a surgical controller 112, a display device 103 (e.g., display devices 103a and 103b), and an input device 116.
The surgical controller 112 may be any known computing system, but is preferably a programmable, processor-based system. For example, surgical controller 112 may include a microprocessor, a hard disk drive, Random Access Memory (RAM), Read Only Memory (ROM), input/output (I/O) circuitry, and any other known computer components. The surgical controller 112 is preferably adapted for use with various types of storage devices (permanent and removable) such as, for example, a portable drive, magnetic storage, solid state storage (e.g., flash memory cards), optical storage, and/or network/internet storage. Surgical controller 112 may comprise one or more computers, including, for example, a personal computer or workstation operating under a suitable operating system, and may include a Graphical User Interface (GUI).
Still referring to fig. 2, in an exemplary embodiment, the surgical controller 112 includes a processing circuit 120 having a processor 122 and a memory 124. The processor 122 may be implemented as a general-purpose processor executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit), a set of processing elements, or other suitable electronic processing elements. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Memory 124 (e.g., memory unit, storage device, etc.) includes one or more devices (e.g., RAM, ROM, flash memory, hard disk storage, etc.) for storing data and/or computer code that completes or facilitates the various processes described herein. The memory 124 may be or include volatile memory or non-volatile memory. Memory 124 may include database components, object code components, script components, or any other type of information structure for supporting the various activities described herein. According to an exemplary embodiment, memory 124 is communicatively connected to processor 122 and includes computer code for performing one or more of the processes described herein. Memory 124 may contain various modules, each capable of storing data and/or computer code related to a particular type of function. In one embodiment, the memory 124 contains several modules related to the surgical procedure, such as a planning module 124a, a navigation module 124b, a registration module 124c, and a robot control module 124d.
Alternatively or in addition, the program instructions may be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. The computer storage medium may be or be included in a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Further, although the computer storage medium is not a propagated signal, the computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium may also be or be included in one or more separate components or media (e.g., multiple CDs, disks, or other storage devices). Thus, computer storage media may be tangible and non-transitory.
A computer program (also known as a program, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic disks, magneto-optical disks, or optical disks. However, a computer need not have such devices. Furthermore, the computer may be embedded in another device, e.g., a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a Universal Serial Bus (USB) flash drive), to name a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, such as internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM discs. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an embodiment of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
Referring to the embodiment of the surgical system 100 depicted in fig. 2, the surgical controller 112 also includes a communication interface 130. The communication interface 130 of the computing system 102 is connected via an interface to a computing device (not shown) of the robotic system 104 and via an interface to the tracking system 101. These interfaces may include physical interfaces and software interfaces. The physical interface of communication interface 130 may be or include a wired or wireless interface (e.g., jack, antenna, transmitter, receiver, transceiver, wired terminal, etc.) for communicating data with an external source via a direct connection or a network connection (e.g., an internet connection, LAN, WAN or WLAN connection, etc.). The software interface may reside on the surgical controller 112, a computing device (not shown) of the robotic system 104, and/or the tracking system 101. In some embodiments, the surgical controller 112 and the computing device (not shown) are the same computing device. The software may also operate on a remote server housed in the same building as surgical system 100, or on an external server site.
The display device 103 may be used to display any information useful to a medical procedure, such as, for example, images of anatomical features generated from an image dataset obtained using conventional imaging techniques, graphical models (e.g., CAD models of implants, instruments, anatomical features, etc.), graphical representations of tracked objects (e.g., anatomical features, tools, implants, etc.), constraint data (e.g., axes, articular surfaces, etc.), representations of implant components, digital or video images, registration information, calibration information, patient data, user data, measurement data, software menus, selection buttons, status information, and the like.
In addition to the display device 103, the computing system 102 may include an acoustic device (not shown) for providing audible feedback to the user. The acoustic device is connected to the surgical controller 112 and may be any known device for generating sound. For example, the acoustic device may include a speaker and sound card, a motherboard with integrated audio support, and/or an external sound controller. In operation, the acoustic device may be adapted to convey information to a user. For example, the surgical controller 112 may be programmed to signal the acoustic device to produce a sound, such as a speech synthesis to verbally indicate "complete," to indicate that the steps of the surgical procedure are complete. Similarly, the acoustic device may be used to alert the user to sensitive conditions, such as producing a tone to indicate that the surgical cutting tool is approaching a critical portion of soft tissue or is approaching a virtual control boundary.
To provide for other interactions with the user, embodiments of the subject matter described in this specification can be implemented on a computer having an input device 116, the input device 116 enabling the user to communicate with the surgical system 100. The input device 116 is connected to the surgical controller 112 and may include any device that enables a user to provide input to a computer. For example, the input device 116 may be a known input device such as a keyboard, mouse, trackball, touch screen, touch pad, voice recognition hardware, dials, switches, buttons, trackable probe, foot pedal, remote control device, scanner, camera, microphone, and/or joystick. For example, the input device 116 may allow a user to manipulate the virtual control boundary. Other types of devices may also be used to provide interaction with the user, for example, feedback provided to the user may be any form of sensory feedback, such as visual feedback, auditory feedback, or tactile feedback, and input from the user may be received in any form, including acoustic, speech, or tactile input. In addition, the computer may interact with the user by sending documents to and receiving documents from a device used by the user, for example, by sending web pages to a web browser on the user's client device in response to requests received from the web browser.
General surgical planning and navigation to perform the exemplary methods described above, including the haptic control and feedback described in connection with the surgical system 100, may be performed by a computerized surgical system, such as the system described in U.S. patent No.8,010,180 to Quaid et al., entitled "Haptic Guidance System and Method," which is hereby incorporated by reference in its entirety.
Virtual object for robot-assisted surgery
Fig. 3A-3B illustrate example X-rays showing a femur (F), a tibia (T), a femoral implant 302, and a tibial implant 306, according to an example embodiment. Although X-ray images are shown in figs. 3A and 3B, other images may be acquired and used to generate the bone model using any of a variety of imaging techniques (e.g., CT, MRI, ultrasound, camera, etc.). As shown, the femoral implant 302 includes a protrusion, such as a peg 304 that extends into the femur F, and the tibial implant 306 includes, for example, a keel 308. During implantation, cement is disposed under the planar portion of the baseplate of the tibial implant 306 and along the planar surfaces of the femoral implant 302. In some embodiments, the femoral implant 302 includes five flat portions, namely portion ab, portion bc, portion cd, portion de, and portion ef. In some embodiments, the cement is located on some or all of the flat portions of the femoral implant 302. During revision surgery, the implants 302 and 306 (including the peg 304 and the keel 308) must be cut around for removal. Over time, however, bone of the tibia T may grow into the keel 308, which can cause pieces of bone to break off during removal. To reduce bone loss during removal, images of the implant (e.g., obtained by CT, MRI, video, ultrasound, etc.) may be used to create a model of the bone and implant to generate a surgical plan for removal. In some embodiments, a tracking probe may be used to probe regions near points a, b, c, d, e, and f, or along the edges of portions ab, bc, cd, de, and ef, for example, to create a model of the interface between the femoral implant 302 and the bone.
Fig. 4 shows a graphical user interface showing a model of a bone 402 and a model of an implant 404 during a first partial knee procedure, according to an example embodiment. Specifically, fig. 4 depicts the distal end of a femur 402 receiving a femoral implant 404. As shown, the femoral implant 404 includes an elongated projection 406 (e.g., a peg, screw, keel, etc.) that is received by an aperture in the femur. The elongated projection 406 further secures the femoral implant 404 to the bone 402 and helps prevent movement between the implant 404 and the bone 402. The bone may have been prepared with a keel slot (not shown) that interfaces with a keel on the femoral implant 404 to improve fixation between the bone 402 and the implant 404. The model may allow the user to modify the view of the implant model by rotating the model or selecting different viewing modes. In some embodiments, the model may allow the user to view different cross-sectional views of the implant, bone, or a combination thereof. In some embodiments, the model may also provide information to help plan revision surgeries (e.g., size, location, materials, etc.).
The surgical system 100 of fig. 1 may be configured to establish a virtual control object associated with a current prosthetic implant component and associated with or related to one or more features of a patient's anatomy. Surgical system 100 may be configured to create a virtual representation of a surgical site including, for example, virtual representations of patient anatomical features, surgical instruments used during a surgical procedure, probe tools used to register other objects within the surgical site, and any other objects associated with the surgical site.
In addition to physical objects, the surgical system 100 may be configured to generate virtual objects that exist in software and are useful during execution of a surgical procedure. For example, the surgical system 100 may be configured to generate a virtual boundary or virtual control boundary that corresponds to a plan for a surgeon to prepare a bone, such as a boundary that defines an area of bone that the surgeon plans to cut, remove, or otherwise alter. In the case of revision surgery, the virtual boundary may correspond to the surgeon's plan for removing the necessary bone and the cement that constitutes the interface between the implanted prosthetic component and the bone on which the prosthetic component is implanted. Alternatively or additionally, the surgical system 100 may define a virtual object that corresponds to a desired path or route over which a portion of the surgical tool 110 (e.g., the end effector 200) should navigate to perform a particular task.
The surgical system 100 may also be configured to generate virtual objects or boundaries as part of a particular surgical plan. In some embodiments, a surgical plan is generated based on a database of implants, wherein the surgical plan corresponds to a registered model of an implant or bone. If the implant is known in the database, the surgical plan may be suggested to the user. The surgical plan may include which tools should be used, what access is needed to bypass portions of the implant, virtual boundaries, and the like. The proposed surgical plan may include virtual objects around the keels and pegs, and tool changes may be proposed to cut around these implant features. In some embodiments, the surgical plan may be modified by the user, including but not limited to the tools to be used, the desired access, the shape of the implant, and the virtual boundaries. In some embodiments, the general surgical plan may be modified based on model capture of patient anatomical features or implants or automatically based on specific implants.
Virtual boundaries and other virtual objects may define points, lines or planes within a virtual coordinate space (typically defined relative to the patient's anatomy) that serve as boundaries at which constraints are provided to the surgical instrument when its tracked position interacts with the virtual boundary or object. In some embodiments, the constraint is provided by tactile or force feedback. For example, as the surgeon performs a bone cutting operation, the tracking system of the surgical system 100 tracks the position of the cutting tool and, in most cases, allows the surgeon to freely move the tool in the workspace. However, as the tool approaches the virtual boundary (which has been registered with the patient's anatomical features), the surgical system 100 controls the force feedback system to provide a guide that limits the surgeon from penetrating the virtual boundary with the cutting tool. For example, the virtual boundary may be associated with the geometry of a virtual model of the prosthetic implant, and the haptic guidance may include forces and/or moments that are mapped to the virtual boundary and experienced as resistance by the surgeon to limit tool movement to penetrate the virtual boundary. Thus, the surgeon may feel as if the cutting tool has encountered a physical object, such as a wall. The force feedback system of the surgical system 100 then communicates information to the surgeon regarding the position of the tool relative to the virtual boundary and provides physical force feedback to guide the cutting tool during the actual cutting process. In this way, the virtual boundary functions as a virtual cutting guide. The force feedback system of the surgical system 100 may also be configured to limit the user's ability to manipulate the surgical tool. A robotic system or hand tool may be attached to the implant to measure the force applied for removal. 
Monitoring the position of the implant relative to the bone and the applied force can give the surgeon an indication of how easily the implant can be removed. This may indicate that additional cutting is required to minimize unintentional bone loss. In some embodiments, the virtual boundary defines an autonomous cutting control that allows the surgical robot to autonomously perform all or some of the steps of the surgical plan. In some embodiments, the virtual boundary defines a combination of an autonomous and a manual cutting boundary. In some embodiments, when autonomous cutting control is used, feedback may be used to indicate contact with the implant (e.g., contact with the peg when the tool is cutting along a flat interface surface), and the surgical plan or boundary may be adjusted to avoid portions of the implant based on the feedback. This is particularly useful, for example, where the shape of the keel is unknown or unrecognizable before cutting begins, such that the original boundary does not take the keel into account. The surgical plan or virtual boundary may be modified based on a detected difference between the surgical plan and/or virtual boundary and the keel. In some implementations, the virtual boundary corresponds to a haptic boundary that defines a haptic object. In some embodiments, the haptic boundary is configured to provide haptic feedback when the haptic boundary is encountered. The haptic boundary may generate feedback that is tactile, audible, visual, olfactory (i.e., odor-based), or provided by other means.
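The virtual-wall behavior described above can be sketched with a simple spring model (illustrative only, not the patent's implementation; the stiffness value and plane geometry are hypothetical): when the tracked tool tip penetrates a planar virtual boundary, a restoring force proportional to penetration depth pushes it back.

```python
import numpy as np

def boundary_force(tool_tip, plane_point, plane_normal, stiffness=2000.0):
    """Virtual-wall sketch: if the tracked tool tip crosses a planar
    virtual boundary, return a restoring force proportional to
    penetration depth (a simple spring model); otherwise zero force.
    plane_normal points into the allowed region."""
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    # depth > 0 means the tip has crossed to the forbidden side
    depth = np.dot(np.asarray(plane_point, float)
                   - np.asarray(tool_tip, float), n)
    if depth <= 0.0:
        return np.zeros(3)
    return stiffness * depth * n  # push back along the boundary normal

# Boundary plane at z = 0; the allowed region is z > 0.
f_free = boundary_force([0.0, 0.0, 5.0], [0, 0, 0], [0, 0, 1])
f_pen = boundary_force([0.0, 0.0, -0.001], [0, 0, 0], [0, 0, 1])
```

Real haptic controllers add damping and run this loop at kilohertz rates, but the core idea is the same: force is zero in free space and rises steeply at the boundary, so the surgeon feels a "wall."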
In some implementations, the surgical system 100 also creates a virtual object (not shown) representing a path from a first location to a second location. For example, the virtual object may include a virtual guide wire (e.g., a wire) that defines a path from a first location (e.g., a location of a tool in a physical space used with the surgical system 100) to a second location that includes a target (e.g., a target object such as a virtual object). The virtual object may be activated such that movement of the tool is constrained along the path defined by the virtual object. When the tool reaches the second location and activates the target object (e.g., the virtual object), the surgical system 100 may deactivate the object. When the object is activated, the tool may be automatically placed in a control mode, such as a haptic control (or drill) mode. In a preferred embodiment, the object may be deactivated to allow the tool to deviate from the path. Thus, the user can override the guidance associated with the object to deviate from the guide wire path and maneuver the tool around untracked objects (e.g., screws, distractors, lights, etc.) that are not accounted for when generating the virtual guide wire.
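One way to realize a guide-wire constraint is to project the tool position onto the path segment and resist motion away from it. The following sketch (illustrative only; the patent describes the behavior, not this particular math) computes the constrained position on a straight guide-wire segment:

```python
import numpy as np

def constrain_to_path(tool_pos, start, end):
    """Virtual guide-wire sketch: project the tool position onto the
    line segment from start to end, returning the nearest point on
    the path (the position the constraint pulls the tool toward)."""
    p = np.asarray(tool_pos, float)
    a, b = np.asarray(start, float), np.asarray(end, float)
    ab = b - a
    # parametric position along the segment, clamped to its endpoints
    s = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + s * ab

# Tool displaced sideways from a segment along the x-axis
pos = constrain_to_path([5.0, 3.0, 0.0], [0, 0, 0], [10, 0, 0])
# nearest point on the segment is (5, 0, 0)
```

Deactivating the guide wire, as described above, simply means this projection stops being enforced, so the tool can leave the path.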
In the control mode, the robotic system 104 is configured to provide guidance to the user during surgical activities such as bone preparation. In one embodiment, the presented application may include a virtual object defining a cut volume on a tibia T. The virtual object may have a shape that substantially corresponds to the shape of the surface of the tibial component, such as when prepared for implantation. In revision surgery, the virtual object may have a shape that substantially corresponds to, for example, the shape of the interface between the tibial component and the tibia on which the tibial component is implanted or the path that bone removal is to follow. For example, the robotic system 104 may automatically enter the control mode when the tip of the tool approaches a predetermined point associated with the feature of interest. In some implementations, a tool can be disabled whenever the tool is outside of a virtual object. In another embodiment, the tool may be disabled unless the robotic system 104 generates a control feedback force.
In operation, the surgical system 100 may be used for surgical planning and navigation. In addition to preparing for revision surgeries, the surgical system 100 may, for example, be used to perform knee replacement procedures or other joint replacement procedures in which an implant installation is revised. The implant may include any implant or prosthetic device, such as a total knee implant; a unicondylar knee implant; a modular knee implant; implants for other joints including the hip, shoulder, elbow, wrist, ankle, and spine; and/or any other orthopedic and/or musculoskeletal implant, including implants of traditional materials and more unique implants such as, for example, orthopedic biologics, drug delivery implants, and cell delivery implants.
Robotic revision surgery
Revision surgery, such as knee revision, is a complicated procedure and requires a very high level of expertise. The procedure is complicated for several reasons. The surgeon must remove the original implant, which may be cemented or non-cemented. Bone may have grown into the implant, and the surgeon must work to preserve as much bone as possible while removing the original implant. Further, the implant may include surfaces, keels, pegs, screws, or other components that require a circumferential cut or a through-cut. In some embodiments, the implant may be a plurality of implants that need to be individually cut and removed. The surgeon must ensure that most of the bond between the cement and the bone and/or between the implant and the bone is broken, which makes for a time-consuming and complex process. Prior solutions required the surgeon to chisel with a manual or powered instrument at the bone-implant or bone-cement interface. Manual instruments include osteotomes, Gigli saws, and punches. Powered instruments, such as powered saws and drills or ultrasonic devices, may also be used. Despite efforts to protect bone, there is always some amount of bone loss, and the surgeon must precisely fill all bone defects resulting from bone loss during implant removal. There may also be pre-existing bone defects that need attention after the implant is removed. The robotic system and specialized instruments of the present disclosure can help address some of these issues faced during revision surgery.
The need for revision may arise from, for example, infection, misalignment, or wear. In knee revision due to infection, the surgery may be a two-stage procedure. In the first stage, the infected implant is removed and the wound is cleaned. A spacer block is added at the joint and the wound is closed. In the second stage, the spacer is removed and a new revision implant is added.
The present disclosure addresses the problems previously faced in knee and/or hip revision by using a robot-assisted approach. The present disclosure also describes a flexible end effector carrying a high-speed cutting burr. The flexible end effector is very dexterous, allowing access to small areas, such as the posterior surface of the tibia, for removal of the implant. Referring to fig. 5A-5C, a flexible end effector 200 is shown according to an exemplary embodiment; the flexible end effector 200 may be used with the robotic arm 106 to perform robot-assisted hip and knee revision procedures. In some embodiments, flexible end effector 200 may be an end effector according to any of the embodiments described in U.S. patent application No. 15/436460, which is incorporated herein by reference in its entirety.
Manual removal of the implant may be difficult due to limited access to certain regions of the joint (e.g., the posterior surface of the tibia). The flexible end effector 200 extends the reach of the robotic arm and allows access to those small areas. As shown in fig. 5A and 5B, the flexible end effector 200 includes two flexible bending elements 202 and 204. Each element has two degrees of freedom and can bend in three dimensions by less than or more than 90 degrees, as shown in fig. 5C. The end effector 200 may include a large internal channel to carry a flexible shaft. The flexible shaft is, for example, a hollow tube with a small wall thickness and is capable of rotating a cutting burr. In some embodiments, the hollow tube enables the cutting burr to rotate at 60,000 rpm. The internal channel of the flexible shaft may also be used as an irrigation or aspiration channel. In some embodiments, the flexible elements 202 and 204 provide enhanced access to areas that are otherwise difficult to reach.
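The reach of a two-element bending end effector like the one just described can be illustrated with a short planar constant-curvature forward-kinematics sketch. This is purely illustrative: the patent does not specify the actuator kinematics, and the segment lengths and bend angles below are assumed parameters.

```python
import math

def arc_transform(length, theta):
    """Planar pose change produced by one constant-curvature segment.

    Returns (dx, dy, dheading) in the segment's local frame: a straight
    segment when theta == 0, otherwise an arc of the given length whose
    tangent direction rotates by theta radians.
    """
    if abs(theta) < 1e-12:
        return length, 0.0, 0.0
    r = length / theta  # signed bend radius
    return r * math.sin(theta), r * (1.0 - math.cos(theta)), theta

def two_segment_tip(l1, t1, l2, t2):
    """Tip (x, y, heading) of two stacked bending segments, base at the origin along +x."""
    x, y, heading = 0.0, 0.0, 0.0
    for length, theta in ((l1, t1), (l2, t2)):
        dx, dy, dh = arc_transform(length, theta)
        # rotate the local displacement into the world frame, then advance
        x += dx * math.cos(heading) - dy * math.sin(heading)
        y += dx * math.sin(heading) + dy * math.cos(heading)
        heading += dh
    return x, y, heading
```

With both segments straight the tip simply lies at the summed length along the base axis; bending each segment a full 90 degrees turns the tip heading through 180 degrees, consistent with the "less than or more than 90 degrees" bending described above.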
The end effector 200 includes a housing 206 having a base 208 and a support 210. The base 208 secures the end effector 200 to the robotic arm 106 and provides stability to the end effector 200. The support 210 secures the shaft 212 of the end effector 200. The shaft 212 houses a motor 214, which powers a cutting tool 216 located at the distal end of the end effector 200. In some embodiments, the end effector 200 further includes a suction aperture 218. The suction aperture 218 is connected to the internal channel of the flexible shaft. In some embodiments, the robotic arm 106 may be stationary and the end effector 200 may move autonomously to perform a planned cut, as described below.
A variety of cutting tools 216 may be selected based on the type of bone cut to be made. A saw may be used for planar cutting, a burr for curved surfaces, a curved saw to gain access around pegs, keels, and/or screws (which may be either cut around or cut through and removed separately), or another cutting tool more suitable for accessing the bone and the type of cut to be made. For the critical posterior part of the knee, a curved tool or a tool capable of performing a curved cut is preferred. In an exemplary embodiment, a saw may be used to perform an initial cut, and then more specific cuts may be made using the dedicated end effector 200. In some embodiments, an ultrasonic tool may be used to vibrate and break bone cement for removal. In some embodiments, the cement may be dissolved using a laser. In some embodiments, the cement may be cut or broken using a water jet.
Figs. 6A-6C illustrate various views of another exemplary embodiment of a femur F, a femoral implant 302, and an end effector 200. The end effector 200 in figs. 6A-6C may include a base 208, a support 210, and a cutting tool 216. The end effector 200 may be a vibrating chisel. In some embodiments, the cutting tool is capable of cutting through and clearing away the cement between the bone and the implant. The end effector 200 may be controlled and advanced by the surgeon but constrained by a haptic boundary between the bone and the implant to reduce the skiving effect and ensure that all cement attachments are accessible, retaining as much bone as possible. During revision surgery, the end effector 200 may be used to remove an implant by cutting along portions ab, bc, cd, de, and ef. The end effector may also be used to prepare the bone for a new implant. In addition to the peg holes 310 for receiving the pegs 304, the bone may also be prepared by creating surfaces ab, bc, cd, de, and ef using the cutting tool 216 or various other cutting tools.
Figs. 7A and 7B illustrate the femoral implant 302 after removal from the femur, with or without the use of a robotic system to perform the removal. As can be seen in the figures, in some revision procedures, excess bone is removed along with the implant 302, shown as bone 312 remaining on the implant 302. When excess bone is removed, an uneven surface 314 is left on the bone. Often, excessive bone removal occurs directly around the keel or pegs, or on the posterior of the implant where the cement is difficult to cut. To properly prepare the bone for a new implant, it may be necessary to fill the defects with a reinforcing block, cone, or other filling method. After the implant is removed, video or ultrasound techniques may be used to determine the characteristics of the remaining bone to help correct the defects and plan the re-implantation.
The surgeon may perform a revision procedure using the robotic system to assist the surgeon in removing the primary implant using various methods described below.
Fig. 8 is a flow chart of a method 800 of performing a revision surgery according to an exemplary embodiment. Before starting the procedure, information about the interface region between the implanted implant component and the bone on which the implant component is implanted must be obtained. This may be done using an image of the revision site or using other tools, rather than an image, to understand the relationship. These variations for obtaining interface information, denoted by optional steps 802, 804, and 806 in fig. 8, are described below.
A first exemplary embodiment of a revision surgery method utilizes an image of the patient's anatomy to plan the revision. When the patient's primary case (e.g., the initial surgery) was performed with the robotic assistance system, the bone model and implant information may already be available, and no image recapture is required to perform the revision. In revision surgery, the bone model and implant information from the patient's primary knee or hip procedure may be used in the robotic assistance system, as shown in fig. 4. Additionally, the model of the implant may be known and stored in a library of the surgical system for use during planning.
In other cases, patient imaging data may not be available, or it may be desirable to obtain a new image. Therefore, an initial or new scan must be performed before the revision procedure is planned and performed. In such embodiments, the patient's anatomy is imaged using any preferred imaging modality, such as CT or MRI scanning, fluoroscopy, ultrasound, tracking markers, or a camera, as shown in optional step 802. The images captured by the imaging device are used to create bone and implant models for use in the planning stage. (In some embodiments, imaging may also be performed after the spacer block is implanted in the case of a two-stage revision.) Then, at optional step 804, the scan is segmented or transformed into a bone model. The scan may be segmented in a predetermined manner, or the surgeon may be able to select the segmentation parameters. In some embodiments, when a camera is used, a 3D model may be created without segmentation. In some embodiments, the 3D model may be created using imaging, statistical models, and the like. As described above, registration of the image with the physical space/anatomical features is performed by the surgical system 100.
In another exemplary embodiment, the method may capture data intraoperatively using optional step 806, without capturing or using image data of the patient's anatomical features. In this step, the periphery of the cement-to-bone or implant-to-bone interface is digitized using a tracking probe. The position data of the tracking probe is captured by a tracking system, such as tracking system 101, to determine the position of the interface to be released. Digitization of the interface may also be performed when image data, bone models, and/or implant models are available and/or used for planning. The implant may then be registered to the bone model. In yet another embodiment, a camera or optical sensor may be attached to the robotic arm to scan and register the implant surface to the bone model. In another embodiment, the camera may be moved around the patient to scan the bone and implant surfaces and create a 3D model. The surface of the implant may be probed to register known implant locations or known features of the implant. In some embodiments, if the implant is known, the probing may identify the implant and register it with the implant model.
Planning of the implant removal cut is performed at step 808. In embodiments where image data is available from a pre-operative scan (whether a recent scan or a scan from the first implantation procedure), the removal cut may be based on the image and the location of the interface between the cement and the bone or between the implant and the bone. In some embodiments, a camera is used to define the planes and the virtual boundary. In some embodiments, a probe may be used to define the planes and the virtual boundary. Alternatively, the removal cut may be planned based on the intended replacement implant. In this way, the resection may be planned to properly accommodate the new implant while also removing the current implant. In some embodiments, the planning software generates a bone preparation plan to achieve proper alignment for the patient, e.g., proper alignment of the tibia and femur. According to some embodiments, the robotic system 104 helps the surgeon plan and perform proper alignment of the knee in 3D. The bone preparation plan may be generated automatically by the planning software, or the surgeon may assist in creating the bone preparation plan. Using previous image data (X-ray, CT, MRI, fluoroscopy, video, etc.) and intraoperative landmarks, a visualization of the ideal natural anatomical features (e.g., joint line, etc.) can be constructed. The system may also use range of motion and soft tissue compliance as inputs to aid the planning procedure. In some embodiments, fiducials may be placed in the spacer blocks to speed registration for the re-implantation procedure. After implant removal, the remaining cement may be removed using a robotic or hand tool. A handheld tracking probe or a probe attachment of the robotic arm may be used to identify the remaining cement in the 3D model. The model may be used to generate another robotic cutting path for cement removal.
In other embodiments, planning an implant removal cut at step 808 may be performed based on data collected by a digitizing probe. Generally, a total knee arthroplasty implant design may include several flat bone-facing surfaces. During digitization, points on each side of the implant may be collected to identify each flat surface. Using the perimeter data, the robotic system calculates a plan to separate the interfaces of interest. The probe is used to collect points defining a plane that can be used for a virtual boundary. Transition regions of the implant may also be detected (points between two flat surfaces may help identify the virtual boundary). Intraoperative imaging or the use of a camera may create a model of the bone defects after implant removal. The existing model can be updated with the detected defects to indicate where additional bone loss exists. The defect model may be used in the revision implant plan to ensure proper implant selection to cover the defects. The defect model may also indicate whether additional augmentation devices are required to fill the defects. After the revision implant is selected, a virtual boundary is created for cutting the bone for implant insertion. Once the model and defects are created, a new plan can be created to implement the modifications and additions to the bone that are required, due to bone loss, to accommodate insertion of the new implant.
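The flat-surface identification from digitized probe points can be sketched as a least-squares plane fit. This is an assumed formulation for illustration, not an algorithm specified by the disclosure, and it presumes the probed face is not vertical in the tracking frame (so z can be modeled as a function of x and y).

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to digitized probe points.

    Solves the 3x3 normal equations directly; suitable when the probed
    surface is not parallel to the z axis of the tracking frame.
    Returns the coefficients (a, b, c).
    """
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points)
    syz = sum(p[1] * p[2] for p in points)
    M = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, float(n)]]
    rhs = [sxz, syz, sz]
    return _solve3(M, rhs)

def _solve3(M, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    A = [row[:] + [v] for row, v in zip(M, b)]  # augmented matrix
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        x[r] = (A[r][3] - sum(A[r][c] * x[c] for c in range(r + 1, 3))) / A[r][r]
    return tuple(x)
```

The fitted coefficients give the plane's orientation, from which a planar virtual boundary (as described above) could be positioned just off the probed implant face.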
As part of this planning step 808, the surgical system 100 generates control objects, such as haptic objects, as described above. The control object may define a virtual boundary corresponding to a plan for the surgeon to prepare the bone. Specifically, for revision procedures, the virtual boundary is associated with a portion of the interface region that the surgeon plans to cut, remove, or otherwise alter in order to remove the current implant from the bone. A revision virtual boundary is created around at least a portion of the interface region to allow the surgeon to precisely cut the bonding area between the bone and the implant. In a preferred embodiment, the virtual boundary for the revision is created adjacent to the implant surface to protect the bone from being over-cut. In this way, the revision boundary minimizes the bone removed, reduces the risk of bone tearing, and can increase the number of potential additional revision procedures that can be performed over the patient's lifetime. The boundary may be a planar boundary, to which the cutting tool will be constrained, or a contoured boundary of any shape. The boundary may be created automatically by the system based on the received image and location data, or may be created manually based on user input. In other embodiments, the boundary may be customizable or adjustable; for example, the surgeon may choose to move the boundary closer to or further from the interface to accommodate the quality of the bone. Using the control objects defining the virtual boundary, the robotic system can ensure that the cutting tool does not move outside of the desired cutting area, removes a minimal amount of bone, and performs the cut accurately. If the implant is known, the virtual boundary may be identified using the proposed surgical plan.
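One minimal way to realize a planar virtual boundary of the kind described above is to project any commanded tool position that crosses the plane back onto it. This is an illustrative sketch only; commercial systems use richer haptic-object representations than a single plane.

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def constrain_to_boundary(tool_pos, plane_point, normal):
    """Clamp a commanded tool position to a planar virtual boundary.

    `normal` (unit length) points into the allowed region; positions with
    a negative signed distance to the plane are projected back onto it,
    so the tool never crosses into the protected bone.
    """
    d = _dot([t - p for t, p in zip(tool_pos, plane_point)], normal)
    if d >= 0.0:
        return tuple(tool_pos)  # already inside the allowed region
    # remove the out-of-bounds normal component, landing exactly on the plane
    return tuple(t - d * n for t, n in zip(tool_pos, normal))
```

In use, the boundary plane would come from the planning step (e.g., a fitted interface plane offset toward the implant), and this clamp would run on every commanded position update.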
The proposed surgical plan may be used as a starting template for the surgeon, which may be modified to suit the particular needs and/or conditions of the surgery. In some embodiments, the proposed plan is generic. In some embodiments, the proposed plan provides proposed tools and/or proposed access locations for preparing the bone around implant features such as keels, spikes, or any other structure or shape that needs to be avoided. In some embodiments, a generic shape template may be used in the virtual boundary plan, or a custom template may be drawn, created, or selected during surgical planning. In particular, the surgical plan may be customized based on the characteristics of the bone remaining after removal of the initial implant. In some embodiments, the entry location and size may be identified in the proposed surgical plan. In some embodiments, the entry path may be outlined in the proposed surgical plan.
In some embodiments, the planning software may also determine the size and number of augment blocks needed at step 810. The planning software may select an augment size based on an information database. In another embodiment, the planning software allows the user to input the size and number of augment blocks required. For example, the surgeon may tell the system, through a graphical user interface, to add a 5mm posterior augment or a medial 20° tibial wedge, and the planning software allows this cut to be performed by a surgical tool connected to the robotic arm, rather than through a jig. By using a robotic system according to the exemplary embodiments disclosed herein, preoperative augment block size determination and planning are made possible, saving valuable operating room time and making the procedure more efficient and accurate.
In step 812, the robotic system 104 tracks the movement of the cutting tool using the navigation system 101 and guides the surgeon in performing the planned cut. The system 104 can guide the performance of the cut by providing a constraint on the cutting tool based on a relationship between a virtual tool (associated with the cutting tool) and the virtual boundary. Based on the control objects generated by the system corresponding to the surgical plan, guidance may be provided using haptics, or the system may perform the cut autonomously. When using haptics, the surgeon will receive feedback indicating when the haptic boundary is reached, preventing the surgeon from removing too much bone. In some embodiments, the surgeon may perform the cut using a combination of haptic control and autonomous action. In some embodiments, the robotic system may also provide feedback related to the process of breaking the bone-implant or bone-cement interface. For example, the robotic system may provide information to the surgeon regarding the progress of the bone cement or implant-bone interface disruption. This prevents accidental loss of bone when the implant is pulled out before the interface has been properly released.
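The haptic feedback at the boundary can be modeled, as a simplified sketch, with penalty-based rendering: a virtual spring pushes the tool back in proportion to its penetration past the boundary. The stiffness value below is an arbitrary assumption for illustration, not a value from the disclosure.

```python
def haptic_force(tool_pos, plane_point, normal, stiffness=2000.0):
    """Penalty-based restoring force (N) for a planar haptic boundary.

    `normal` is unit length and points into the allowed region. Inside the
    boundary the force is zero; past it, the force is proportional to the
    penetration depth and directed back along the normal, which is what
    the surgeon feels as resistance at the boundary.
    """
    d = sum((t - p) * n for t, p, n in zip(tool_pos, plane_point, normal))
    if d >= 0.0:
        return (0.0, 0.0, 0.0)
    penetration = -d
    return tuple(stiffness * penetration * n for n in normal)
```

A real haptic loop would also add damping and run at a high, fixed rate; this sketch shows only the geometric relationship between penetration and feedback force.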
In some implementations, the robotic system 104 can remove the hardware by bumping. The robotic system may use the force of the robotic arm to "shake" the implant loose, or use an end effector that acts like a woodpecker to tap the implant free.
The use of a robotic system allows a variety of cutting tools to be used based on the type of bone cut to be performed. A saw may be used for planar cutting, a burr for curved surfaces, a curved saw to gain access around a peg or keel and/or around or through a screw, or another cutting tool more suitable for accessing the bone and the type of cut to be made. The robotic system tracks the cutting tool and the patient to monitor the cutting procedure and provide information to the user regarding the progress of the bone cement or implant-bone interface fracture. Again, this reduces the accidental loss of bone that can occur when the implant is extracted before the interface is properly released. In some embodiments, a value, such as a percentage of the interface surface released, may be displayed. This may give the surgeon an indication of the appropriate time to attempt to remove the implant. In some embodiments, if the surgeon is concerned about bone loss in a particular region, the display may show the amount of bone removed for that particular region of interest. In some embodiments, the surgeon may identify a particular region of interest for this calculation before or during the procedure.
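The displayed percentage of the interface surface released might be computed, in a simplified sketch, by discretizing the interface into patches and marking a patch as released when the tracked tool tip has passed within a tolerance of it. The patch model and tolerance here are assumptions for illustration; an actual system's bookkeeping is not specified by the disclosure.

```python
def percent_released(patch_centers, tip_trajectory, tolerance):
    """Fraction (0-100) of interface patches visited by the tool tip.

    A patch counts as released once any trajectory sample comes within
    `tolerance` (Euclidean distance) of its center, approximating the
    portion of the bone-implant interface the tool has cut.
    """
    tol2 = tolerance * tolerance
    released = 0
    for center in patch_centers:
        for tip in tip_trajectory:
            if sum((a - b) ** 2 for a, b in zip(center, tip)) <= tol2:
                released += 1
                break  # this patch is done; check the next one
    return 100.0 * released / len(patch_centers)
```

Restricting `patch_centers` to a surgeon-selected subset would yield the per-region figure described above.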
Further, the robotic system or a manual tool may be attached to the implant to measure the force applied for removal. Monitoring the position of the implant relative to the bone and the applied force can give the surgeon an indication of ease of removal. This may indicate when additional cutting is needed to minimize unintentional bone loss.
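One simple way to turn the monitored force and displacement into an indication of ease of removal is to flag release once the force needed to keep the implant moving drops well below the peak force observed. The drop fraction and sample count below are assumed tuning parameters, not values from the disclosure.

```python
def implant_release_detected(samples, drop_fraction=0.25, min_samples=3):
    """Flag likely interface release from (displacement_mm, force_N) samples.

    Returns True when the most recent force reading has fallen below
    `drop_fraction` of the peak force seen so far, suggesting the implant
    is now moving with little resistance and further cutting is not needed.
    """
    if len(samples) < min_samples:
        return False  # not enough data to judge
    peak = max(force for _, force in samples)
    latest = samples[-1][1]
    return peak > 0.0 and latest <= drop_fraction * peak
```

The converse case, where force stays near its peak as displacement grows, would be the cue described above that additional cutting is required before extraction.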
After implant removal, if there are any bone defects that were not accounted for by the planning software, then at optional step 814 the bone defects may be digitized or identified by another means or device. At step 816, the planning software may generate sizing information for filling the defects with various implants or biomaterial fillers. The implant and/or biomaterial filler may be selected from a database of available implants or fillers. In some embodiments, the software may generate a plan and create a custom implant or filler for filling the defect. In some embodiments, the software selects the implant and/or biomaterial filler based on several factors (e.g., defect size, bone density, etc.). According to some embodiments, the robotic system 104 is able to determine the correct size of a cone for filling the defect. In other embodiments, the bone filler material may be cut to the size of the defect, or the system may be configured to inject a liquid filler into the defect, which may cure within the patient.
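Selecting an implant, cone, or filler from a database based on defect size could be sketched as choosing the least-oversized catalog entry that still covers the measured defect. The catalog entries and the two-dimension sizing rule here are hypothetical; a real selection would weigh more factors (bone density, fixation, etc.) as noted above.

```python
def select_augment(defect_depth_mm, defect_width_mm, catalog):
    """Pick the smallest catalog augment that covers the defect.

    `catalog` is a list of (name, depth_mm, width_mm) entries. Returns the
    name of the least-oversized augment whose depth and width both cover
    the defect, or None if nothing in the catalog fits (suggesting a
    custom implant or filler, as described above).
    """
    candidates = [(depth + width, name)
                  for name, depth, width in catalog
                  if depth >= defect_depth_mm and width >= defect_width_mm]
    return min(candidates)[1] if candidates else None
```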
In step 818, the planning software determines a desired pose of the replacement implant in the bone and plans the bone preparation to receive the replacement implant. The plan may be executed and control objects created in a manner similar to that described in step 808. In addition to performing the above steps, there are several additional ways in which a robot-assisted revision procedure may be performed. With respect to planning and preparing to implant a new implant component, the display device 103 may display a limb alignment/balance screen for assessing how the implant is to be installed, which helps plan the adjustment. Further, the system can help assess stem length (straight versus arcuate, cemented versus press fit) based on bone morphology, quality, adjacent hardware, and the like. The assessment may be patient specific or predicted from a predefined data set. In another embodiment, the robotic system may apply traction through leg holders, distractors, balancers, or the like to assess and define collateral ligament tension. This can be performed in multiple poses, and a graphical user interface or internal algorithm can be employed to calculate joint line placement and component dimensions to best restore motion and function. In yet another embodiment, the robotic system may be used to assist in revision surgeries from partial knee, bicondylar, or tricondylar implants to cruciate-retaining, cruciate-substituting, or posterior-stabilized implants.
The camera may also be used in step 814 to create a model of the bone after the implant has been removed. This identifies the current bone geometry without an implant, including any bone defects that need attention. The camera and images obtained from the camera may then be used to plan a bone cut in step 818, and to perform a bone cut in step 820 (described below) to prepare the bone for a replacement implant, and to place the implant. In some embodiments, the camera may be used during other stages of a procedure, for example for model creation, registration, or tracking of the position of anatomical features or surgical tools used during a procedure. In some embodiments, the model may also include identification of the incision, either by selecting edges in the system software, using color recognition, image detection of the retractor holding the incision open, or by applying a camera detectable material around the incision. The camera may then also be used to track the position of the incision during the procedure.
In step 820, cuts for preparing the bone surface are performed to place the reinforcing blocks, cones, filler, and final implant. The surgeon may perform a bone preparation cut using guidance from the system, such as haptic feedback. In another embodiment, the robotic system 104 may autonomously perform the preparation cut. As described above, the system may resect bone for the new plan as a step in removing the existing hardware. Thus, instead of sawing/chipping at the existing implant, a resection is made that both facilitates removal of the implant and creates a proper cut for the next implant.
In some embodiments, the robotic system may be used in another manner while cutting the bone. For example, the system may help perform adaptive cutting, where subsequent cuts are made based on various possible inputs, such as previous cut data or other landmarks/targets. For example, the probe may define a distal plane and a posterior tangent, and for an implant of defined dimensions, the remaining cuts (femoral, tibial, or patellar) are made. The input may be an existing resection or a target joint tangency. The calculated cut may be programmed based on the input to create the desired result. Furthermore, the system can be used for cut refinement. The surface is probed and then a skiving cut (e.g., a 0.5-1mm cut) is made. Control boundaries, such as haptics, may be updated or generated intraoperatively as the cut is made. In another embodiment, the robotic system 104 may control saw performance based on bone quality. Ultra-soft/cancellous or hard/sclerotic bone may require "lighter" or "harder" contact in terms of speed and/or feed, or even different blades.
The robotic system may also aid in implant replacement and evaluation of the implant after replacement. For example, a display device displaying a graphical user interface may be used to guide the surgeon in placing a femoral or tibial component with a stem according to an offset or angled coupling, or in slightly adjusting the anterior-posterior or medial-lateral position to achieve fewer stem-tip or cortical stress points. In addition, the robotic arm may be used to hold the implant in place relative to the bone while the cement cures. In yet another embodiment, the system may help assess whether the new construct is sufficiently stable via a range-of-motion/balance graphical user interface.
In some embodiments, the robotic system 104 may visualize the implant path or cement area when considering other aspects of the procedure, such as when translating the tibial tuberosity and cutting and moving a window of the tibia, or where the existing hardware is a trauma plate or the like positioned relative to a knee implant or the like.
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, use of materials, colors, orientations, etc.). For example, the position of elements may be reversed or otherwise varied, and the nature or number of discrete elements or positions may be altered or varied. Accordingly, all such modifications are intended to be included within the scope of this disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.
The present disclosure contemplates methods, systems, and program products on any machine-readable media for performing various operations. Embodiments of the present disclosure may be implemented using an existing computer processor, or by a special purpose computer processor for a suitable system, incorporated for this or another purpose, or by a hardwired system. Implementations within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, other magnetic storage devices, solid state storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although a particular order of method steps may be described, the order of the steps may be different from that described. Further, two or more steps may be performed simultaneously or partially simultaneously. Such variations will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the present disclosure. Likewise, a software implementation can be accomplished with standard programming techniques with rule based logic and other logic to accomplish any of the connecting steps, processing steps, comparing steps and determining steps.
Claims (20)
1. A method for intraoperatively planning and facilitating revision arthroplasty procedures, the method comprising:
capturing a position of a tracking probe while the tracking probe contacts an interface region between a bone and a first implant component implanted on the bone;
intraoperatively generating a virtual boundary using the position of the tracking probe without using preoperative medical imaging, the virtual boundary corresponding to a portion of the interface region to be removed in order to remove the first implant component from the bone; and
facilitating removal of the first implant component from the bone by providing a constraint on operation of a cutting tool while the cutting tool removes the portion of the interface region, the constraint based on a relationship between the cutting tool and the virtual boundary.
2. The method of claim 1, wherein capturing the position of the tracking probe comprises collecting points on multiple sides of the first implant component; and
wherein intraoperatively generating the virtual boundary comprises identifying one or more planar surfaces of the first implant component based on the points on the multiple sides of the first implant component.
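Claims 1 and 2 describe deriving planar implant surfaces from digitized probe points without preoperative imaging. As a minimal sketch of that step, the planar surface of one implant face could be recovered by a least-squares plane fit; the function below is illustrative and is not taken from the patent.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).

    points: (N, 3) array-like of digitized probe positions collected
    on one face of the implant component.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # SVD of the centered points: the right-singular vector with the
    # smallest singular value is the direction of least variance,
    # i.e. the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```

Repeating this fit for points collected on each side of the component yields the set of planes from which a virtual boundary enclosing the bone-implant interface could be assembled.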
3. The method of claim 1, further comprising providing a second constraint on operation of the cutting tool when the cutting tool is used to perform one or more additional cuts to prepare the bone for receiving a revision implant.
4. The method of claim 1, further comprising creating a model of a bone defect after removing the first implant component from the bone.
5. The method of claim 4, comprising creating the model of the bone defect using a camera.
6. The method of claim 4, comprising creating the model of the bone defect using intraoperative imaging.
7. The method of claim 4, further comprising updating the model of the bone defect using additional positions of the tracking probe.
8. The method of claim 4, further comprising determining whether a reinforcing block is needed and selecting a revision implant based on the model of the bone defect.
9. The method of claim 4, further comprising:
intraoperatively generating one or more additional virtual boundaries using the model of the bone defect; and
providing a second constraint to the cutting tool based on a relationship between the cutting tool and the one or more additional virtual boundaries while the cutting tool performs one or more additional cuts to prepare the bone.
10. The method of claim 1, wherein the virtual boundary is a haptic boundary, and wherein providing the constraint comprises providing haptic feedback to the cutting tool.
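The constraint of claims 1 and 10 can be pictured as a guard on commanded tool motion relative to the virtual boundary. The sketch below assumes a single planar boundary and made-up names; a real haptic controller would render feedback forces rather than simply clipping positions.

```python
def constrain_tool_step(tip, step, plane_point, plane_normal):
    """Clip a commanded tool-tip step so the tip never crosses a planar
    virtual boundary (negative signed distance is the forbidden side).

    All arguments are 3-element tuples; plane_normal must be unit length
    and point toward the allowed side.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    target = tuple(p + s for p, s in zip(tip, step))
    offset = tuple(t - q for t, q in zip(target, plane_point))
    d = dot(offset, plane_normal)
    if d >= 0.0:
        # Target stays on the allowed side of the boundary.
        return target
    # Project the offending target back onto the boundary plane.
    return tuple(t - d * n for t, n in zip(target, plane_normal))
```

Evaluating this guard at the control-loop rate keeps the cutting tool within the portion of the interface region planned for removal.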
11. A surgical system, the surgical system comprising:
a robot device;
a surgical tool coupled to the robotic device;
a probe;
a navigation system configured to track a position of the probe; and
a computing system programmed to:
obtaining position data from the navigation system, the position data representing a position of the probe when the probe contacts an interface region between a bone and a first implant component implanted on the bone;
intraoperatively generating a virtual boundary using the position of the probe without using preoperative medical imaging, the virtual boundary corresponding to a portion of the interface region to be removed in order to remove the first implant component from the bone; and
controlling the robotic device based on a relationship between a cutting tool and the virtual boundary to facilitate removal of the first implant component using the surgical tool.
12. The surgical system of claim 11, wherein the position of the probe corresponds to points on multiple sides of the first implant component; and
wherein the computing system is programmed to generate the virtual boundary by identifying one or more planar surfaces of the first implant component based on the points on the multiple sides of the first implant component.
13. The surgical system of claim 11, wherein the computing system is further configured to control the robotic device to facilitate preparation of the bone to receive a revision implant component based on a second relationship between the cutting tool and a second virtual boundary.
14. The surgical system of claim 11, wherein the computing system is further programmed to create a model of a bone defect after removing the first implant component from the bone.
15. The surgical system of claim 14, further comprising a camera, wherein the computing system is programmed to create the model of the bone defect based on input from the camera.
16. The surgical system of claim 14, wherein the computing system is further programmed to update the model of the bone defect using additional locations of the probe as tracked by the navigation system.
17. The surgical system of claim 14, wherein the computing system is further programmed to generate a plan for placement of a revision implant and a reinforcing block or filler based on the model of the bone defect.
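Claims 8 and 17 tie augment and filler selection to the intraoperative defect model. A toy decision rule is sketched below; the thresholds, sizes, and labels are invented for illustration only and carry no clinical meaning.

```python
def plan_augment(defect_depth_mm, augment_sizes_mm=(5, 10, 15)):
    """Pick the thinnest stock augment that covers the measured defect
    depth; below a small threshold, recommend cement/filler instead.

    Thresholds and sizes are illustrative placeholders, not guidance.
    """
    CEMENT_LIMIT_MM = 3.0
    if defect_depth_mm <= CEMENT_LIMIT_MM:
        return "cement-filler"
    for size in sorted(augment_sizes_mm):
        if size >= defect_depth_mm:
            return f"augment-{size}mm"
    # Defect deeper than any stock augment: flag for a custom solution.
    return "custom-augment"
```

In the claimed system, the depth input would come from the defect model built after implant removal (claims 4 through 7 and 14 through 16), and the output would feed the revision implant plan.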
18. A non-transitory computer-readable memory storing a program executable by a processor to perform actions comprising:
obtaining position data from a navigation system, the position data representing a position of a probe when the probe contacts an interface region between a bone and a first implant component implanted on the bone;
intraoperatively generating a virtual boundary using the position of the probe without using preoperative medical imaging, the virtual boundary corresponding to a portion of the interface region to be removed in order to remove the first implant component from the bone; and
generating a control configured to provide a constraint on operation of a cutting tool while the cutting tool removes the portion of the interface region to remove the first implant component from the bone.
19. The non-transitory computer-readable memory of claim 18, wherein the actions further comprise:
creating a model of a bone defect after removing the first implant component from the bone; and
planning placement of a revision implant on the bone using the model.
20. The non-transitory computer-readable memory of claim 19, wherein the actions further comprise generating a control configured to provide a second constraint on operation of the cutting tool while the cutting tool is preparing the bone according to the planned placement of the revision implant.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662363037P | 2016-07-15 | 2016-07-15 | |
US62/363,037 | 2016-07-15 | ||
CN201780056264.0A CN109688963B (en) | 2016-07-15 | 2017-07-13 | Systems for a robotic-assisted revision procedure |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780056264.0A Division CN109688963B (en) | 2016-07-15 | 2017-07-13 | Systems for a robotic-assisted revision procedure |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113303907A true CN113303907A (en) | 2021-08-27 |
Family
ID=59409763
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110589543.1A Pending CN113303907A (en) | 2016-07-15 | 2017-07-13 | Systems for a robotic-assisted revision procedure |
CN201780056264.0A Active CN109688963B (en) | 2016-07-15 | 2017-07-13 | Systems for a robotic-assisted revision procedure |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780056264.0A Active CN109688963B (en) | 2016-07-15 | 2017-07-13 | Systems for a robotic-assisted revision procedure |
Country Status (8)
Country | Link |
---|---|
US (5) | US10716630B2 (en) |
EP (1) | EP3484398A1 (en) |
JP (2) | JP7123031B2 (en) |
KR (2) | KR20190031281A (en) |
CN (2) | CN113303907A (en) |
AU (4) | AU2017295728B2 (en) |
CA (1) | CA3030831A1 (en) |
WO (1) | WO2018013848A1 (en) |
Families Citing this family (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7708741B1 (en) | 2001-08-28 | 2010-05-04 | Marctec, Llc | Method of preparing bones for knee replacement surgery |
US10932866B1 (en) * | 2014-12-08 | 2021-03-02 | Think Surgical, Inc. | Implant based planning, digitizing, and registration for total joint arthroplasty |
CA2974996C (en) * | 2015-02-02 | 2023-02-28 | The University Of Western Ontario | Navigation by bending forces |
JP2018532454A (en) * | 2015-09-04 | 2018-11-08 | メイヨ フォンデーシヨン フォー メディカル エジュケーション アンド リサーチ | System and method for medical imaging of patients with medical grafts used in revision surgery planning |
WO2018013848A1 (en) * | 2016-07-15 | 2018-01-18 | Mako Surgical Corp. | Systems for a robotic-assisted revision procedure |
US11000336B2 (en) * | 2016-09-23 | 2021-05-11 | Koninklijke Philips N.V. | Visualization of an image object relating to an instrument in an extracorporeal image |
US20200046215A1 (en) * | 2017-02-09 | 2020-02-13 | Gynisus Ltd. | A medical monitoring system and method |
US11166775B2 (en) | 2017-09-15 | 2021-11-09 | Mako Surgical Corp. | Robotic cutting systems and methods for surgical saw blade cutting on hard tissue |
US11272985B2 (en) * | 2017-11-14 | 2022-03-15 | Stryker Corporation | Patient-specific preoperative planning simulation techniques |
EP3740149A1 (en) | 2018-01-17 | 2020-11-25 | Mako Surgical Corp. | Systems and methods for robotic infection treatment of a prosthesis |
JP7314175B2 (en) | 2018-05-18 | 2023-07-25 | オーリス ヘルス インコーポレイテッド | Controller for robotic remote control system |
US11160672B2 (en) | 2018-09-24 | 2021-11-02 | Simplify Medical Pty Ltd | Robotic systems and methods for distraction in intervertebral disc prosthesis implantation |
US20200188134A1 (en) * | 2018-12-14 | 2020-06-18 | Howmedica Osteonics Corp. | Augmented, Just-in-Time, Patient-Specific Implant Manufacture |
US11478307B2 (en) | 2018-12-18 | 2022-10-25 | Mako Surgical Corp. | Systems and methods for fiber optic tracking |
CN109620415B (en) * | 2019-02-14 | 2024-03-26 | 北京水木天蓬医疗技术有限公司 | Robot-assisted ultrasonic bone power system |
EP3934519A4 (en) * | 2019-03-05 | 2022-11-23 | MAKO Surgical Corp. | Systems and methods for surgical registration |
EP3934558A4 (en) * | 2019-03-07 | 2022-12-14 | PROCEPT BioRobotics Corporation | Robotic arms and methods for tissue resection and imaging |
AU2020241316A1 (en) | 2019-03-15 | 2021-10-07 | Mako Surgical Corp. | Robotic surgical system and methods utilizing a cutting bur for bone penetration and cannulation |
FR3095331A1 (en) * | 2019-04-26 | 2020-10-30 | Ganymed Robotics | Computer-assisted orthopedic surgery procedure |
US20220338935A1 (en) * | 2019-06-18 | 2022-10-27 | Smith & Nephew, Inc. | Computer controlled surgical rotary tool |
US20210038328A1 (en) * | 2019-08-09 | 2021-02-11 | Olivier Boisvert | Revision robotics |
AU2020331936B2 (en) * | 2019-08-16 | 2023-10-26 | Howmedica Osteonics Corp. | Pre-operative planning of surgical revision procedures for orthopedic joints |
WO2021041155A1 (en) | 2019-08-29 | 2021-03-04 | Mako Surgical Corp. | Robotic surgery system for augmented hip arthroplasty procedures |
US11426178B2 (en) | 2019-09-27 | 2022-08-30 | Globus Medical Inc. | Systems and methods for navigating a pin guide driver |
US11864857B2 (en) | 2019-09-27 | 2024-01-09 | Globus Medical, Inc. | Surgical robot with passive end effector |
US11071601B2 (en) | 2019-11-11 | 2021-07-27 | Procept Biorobotics Corporation | Surgical probes for tissue resection with robotic arms |
AU2021224529B2 (en) * | 2020-02-18 | 2024-02-15 | Howmedica Osteonics Corp. | Computer-implemented surgical planning based on bone loss during orthopedic revision surgery |
JP7361225B2 (en) | 2020-03-26 | 2023-10-13 | ローズマウント インコーポレイテッド | Two-wire industrial process field device power supply circuit |
WO2021207757A2 (en) * | 2020-04-10 | 2021-10-14 | Wright Medical Technology, Inc. | Devices and methods for removing bone |
EP4146111A1 (en) * | 2020-05-04 | 2023-03-15 | Howmedica Osteonics Corp. | Surgical system for revision orthopedic surgical procedures |
WO2021231994A1 (en) * | 2020-05-15 | 2021-11-18 | Jeffrey Wilde | Joint implant extraction and placement system and localization device used therewith |
US11096753B1 (en) | 2020-06-26 | 2021-08-24 | Procept Biorobotics Corporation | Systems and methods for defining and modifying range of motion of probe used in patient treatment |
US11877818B2 (en) | 2020-06-26 | 2024-01-23 | Procept Biorobotics Corporation | Integration of robotic arms with surgical probes |
WO2022046966A1 (en) * | 2020-08-26 | 2022-03-03 | Smith & Nephew, Inc. | Methods for protecting anatomical structures from resection and devices thereof |
DE102020128199A1 (en) * | 2020-10-27 | 2022-04-28 | Carl Zeiss Meditec Ag | Individualization of generic reference models for operations based on intraoperative status data |
AU2021261864A1 (en) | 2020-11-03 | 2022-05-19 | Mako Surgical Corp. | Multi-component locking implant |
CN112381872A (en) * | 2020-11-13 | 2021-02-19 | 山东中医药大学附属医院 | Maximum bearing capacity detection method and system based on CT value |
CN112998863B (en) * | 2021-03-12 | 2022-05-06 | 杭州柳叶刀机器人有限公司 | Robot safety boundary interaction device, electronic apparatus, and storage medium |
KR20230007736A (en) * | 2021-07-06 | 2023-01-13 | 가톨릭대학교 산학협력단 | Method to readout artificial knee joint loosening |
US20230010852A1 (en) * | 2021-07-12 | 2023-01-12 | Simplex Designs, Llc | System and Method for Revision Hip Surgery |
US11925426B2 (en) | 2021-07-16 | 2024-03-12 | DePuy Synthes Products, Inc. | Surgical robot with anti-skive feature |
CN113558771B (en) * | 2021-07-29 | 2022-12-16 | 杭州柳叶刀机器人有限公司 | Robot plane limit control method and device and surgical robot |
CN113842217B (en) * | 2021-09-03 | 2022-07-01 | 北京长木谷医疗科技有限公司 | Method and system for limiting motion area of robot |
CN113855236B (en) * | 2021-09-03 | 2022-05-31 | 北京长木谷医疗科技有限公司 | Method and system for tracking and moving surgical robot |
CN113907889A (en) * | 2021-09-03 | 2022-01-11 | 北京长木谷医疗科技有限公司 | Control method and system for robot mechanical arm |
CN113842213B (en) * | 2021-09-03 | 2022-10-11 | 北京长木谷医疗科技有限公司 | Surgical robot navigation positioning method and system |
ES2898124B2 (en) * | 2021-09-15 | 2022-10-13 | Gonzalez Gonzalez Igor | SYSTEM FOR NAVIGATION IN THE IMPLANTATION OF KNEE REVISION PROSTHESIS |
US11464650B1 (en) | 2021-10-06 | 2022-10-11 | Ix Innovation Llc | Synchronized placement of surgical implant hardware |
CN113942015A (en) * | 2021-11-23 | 2022-01-18 | 杭州柳叶刀机器人有限公司 | Cutting tool activity space limiting method and device and terminal equipment |
EP4245242A1 (en) | 2022-03-18 | 2023-09-20 | Stryker Australia PTY LTD | Bone resection scoring and planning |
US20240024035A1 (en) * | 2022-07-21 | 2024-01-25 | Jpi Consulting | Preoperative imaging combined with intraoperative navigation before and after alteration of a surgical site to create a composite surgical three dimensional structural dataset |
KR20240041681A (en) * | 2022-09-23 | 2024-04-01 | 큐렉소 주식회사 | Apparatus for planning cutting path of surgical robot, and mehtod thereof |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104869918A (en) * | 2012-12-21 | 2015-08-26 | 玛口外科股份有限公司 | Methods and systems for planning and performing osteotomy |
Family Cites Families (247)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5716361A (en) | 1995-11-02 | 1998-02-10 | Masini; Michael A. | Bone cutting guides for use in the implantation of prosthetic joint components |
US5769092A (en) | 1996-02-22 | 1998-06-23 | Integrated Surgical Systems, Inc. | Computer-aided system for revision total hip replacement surgery |
US5880976A (en) * | 1997-02-21 | 1999-03-09 | Carnegie Mellon University | Apparatus and method for facilitating the implantation of artificial components in joints |
US6917827B2 (en) | 2000-11-17 | 2005-07-12 | Ge Medical Systems Global Technology Company, Llc | Enhanced graphic features for computer assisted surgery system |
US6685711B2 (en) | 2001-02-28 | 2004-02-03 | Howmedica Osteonics Corp. | Apparatus used in performing femoral and tibial resection in knee surgery |
US20020133162A1 (en) | 2001-03-17 | 2002-09-19 | Axelson Stuart L. | Tools used in performing femoral and tibial resection in knee surgery |
US20040162619A1 (en) | 2001-08-27 | 2004-08-19 | Zimmer Technology, Inc. | Tibial augments for use with knee joint prostheses, method of implanting the tibial augment, and associated tools |
US7708741B1 (en) | 2001-08-28 | 2010-05-04 | Marctec, Llc | Method of preparing bones for knee replacement surgery |
US6711431B2 (en) | 2002-02-13 | 2004-03-23 | Kinamed, Inc. | Non-imaging, computer assisted navigation system for hip replacement surgery |
US8010180B2 (en) | 2002-03-06 | 2011-08-30 | Mako Surgical Corp. | Haptic guidance system and method |
US7831292B2 (en) * | 2002-03-06 | 2010-11-09 | Mako Surgical Corp. | Guidance system and method for surgical procedures with improved feedback |
US7206627B2 (en) | 2002-03-06 | 2007-04-17 | Z-Kat, Inc. | System and method for intra-operative haptic planning of a medical procedure |
US9155544B2 (en) | 2002-03-20 | 2015-10-13 | P Tech, Llc | Robotic systems and methods |
ATE409006T1 (en) | 2002-05-21 | 2008-10-15 | Plus Orthopedics Ag | ARRANGEMENT FOR DETERMINING FUNCTIONAL GEOMETRIC SIZE OF A JOINT OF A VERTEBRATE |
US7736368B2 (en) | 2002-08-23 | 2010-06-15 | Orthosoft Inc. | Surgical universal positioning block and tool guide |
EP1545368B1 (en) | 2002-10-04 | 2009-03-11 | Orthosoft Inc. | Computer-assisted hip replacement surgery |
US7559931B2 (en) | 2003-06-09 | 2009-07-14 | OrthAlign, Inc. | Surgical orientation system and method |
CA2439850A1 (en) | 2003-09-04 | 2005-03-04 | Orthosoft Inc. | Universal method for determining acetabular and femoral implant positions during navigation |
US7835778B2 (en) | 2003-10-16 | 2010-11-16 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation of a multiple piece construct for implantation |
US7392076B2 (en) | 2003-11-04 | 2008-06-24 | Stryker Leibinger Gmbh & Co. Kg | System and method of registering image data to intra-operatively digitized landmarks |
US20050187562A1 (en) | 2004-02-03 | 2005-08-25 | Grimm James E. | Orthopaedic component inserter for use with a surgical navigation system |
FR2865928B1 (en) | 2004-02-10 | 2006-03-17 | Tornier Sa | SURGICAL DEVICE FOR IMPLANTATION OF A TOTAL HIP PROSTHESIS |
EP1729665A1 (en) | 2004-03-31 | 2006-12-13 | Smith and Nephew, Inc. | Methods and apparatuses for providing a reference array input device |
US7333013B2 (en) | 2004-05-07 | 2008-02-19 | Berger J Lee | Medical implant device with RFID tag and method of identification of device |
US9808262B2 (en) | 2006-02-15 | 2017-11-07 | Howmedica Osteonics Corporation | Arthroplasty devices and related methods |
US8337426B2 (en) | 2009-03-24 | 2012-12-25 | Biomet Manufacturing Corp. | Method and apparatus for aligning and securing an implant relative to a patient |
JP2009529954A (en) | 2006-03-14 | 2009-08-27 | マコ サージカル コーポレーション | Prosthetic device and system and method for implanting a prosthetic device |
US8337508B2 (en) | 2006-03-20 | 2012-12-25 | Perception Raisonnement Action En Medecine | Distractor system |
US7949386B2 (en) | 2006-03-21 | 2011-05-24 | A2 Surgical | Computer-aided osteoplasty surgery system |
JP2009537229A (en) | 2006-05-19 | 2009-10-29 | マコ サージカル コーポレーション | Method and apparatus for controlling a haptic device |
GB0610079D0 (en) | 2006-05-22 | 2006-06-28 | Finsbury Dev Ltd | Method & system |
US7594933B2 (en) | 2006-08-08 | 2009-09-29 | Aesculap Ag | Method and apparatus for positioning a bone prosthesis using a localization system |
US8400312B2 (en) | 2006-10-10 | 2013-03-19 | Saga University | Operation assisting system |
CA2673987C (en) | 2006-12-27 | 2015-04-28 | Mako Surgical Corp. | Apparatus and method for providing an adjustable positive stop in space |
EP2166992B1 (en) | 2007-06-07 | 2016-10-12 | Sam Hakki | Apparatus and method of determining acetabular center axis |
JP2009056299A (en) | 2007-08-07 | 2009-03-19 | Stryker Leibinger Gmbh & Co Kg | Method of and system for planning surgery |
US9179983B2 (en) | 2007-08-14 | 2015-11-10 | Zimmer, Inc. | Method of determining a contour of an anatomical structure and selecting an orthopaedic implant to replicate the anatomical structure |
CA2696990C (en) | 2007-08-21 | 2015-05-12 | Socovar Societe En Commandite | Acetabular reamer with locking collar |
US9017335B2 (en) | 2007-11-19 | 2015-04-28 | Blue Ortho | Hip implant registration in computer assisted surgery |
US8160345B2 (en) | 2008-04-30 | 2012-04-17 | Otismed Corporation | System and method for image segmentation in generating computer models of a joint to undergo arthroplasty |
US10687856B2 (en) | 2007-12-18 | 2020-06-23 | Howmedica Osteonics Corporation | System and method for image segmentation, bone model generation and modification, and surgical planning |
ES2595366T3 (en) | 2008-01-09 | 2016-12-29 | Stryker European Holdings I, Llc | Computer-assisted stereotactic surgery system based on a three-dimensional visualization |
EP2237729B1 (en) | 2008-01-16 | 2016-04-13 | Orthosoft, Inc. | Pinless system for computer assisted orthopedic surgery |
FR2932677B1 (en) | 2008-06-20 | 2010-06-25 | Univ Bretagne Occidentale | SYSTEM FOR ASSISTING THE IMPLANTATION OF A HIP PROSTHESIS ON AN INDIVIDUAL. |
EP2344078B1 (en) | 2008-07-24 | 2018-04-18 | OrthAlign, Inc. | Systems for joint replacement |
ES2750264T3 (en) | 2008-09-10 | 2020-03-25 | Orthalign Inc | Hip surgery systems |
US8249318B2 (en) | 2008-09-26 | 2012-08-21 | OsteoWare, Inc. | Method for identifying implanted reconstructive prosthetic devices |
EP2379284B1 (en) | 2008-12-23 | 2018-02-21 | Mako Surgical Corp. | Transmission with connection mechanism for varying tension force |
WO2010099123A2 (en) | 2009-02-24 | 2010-09-02 | Mako Surgical Corp. | Prosthetic device, method of planning bone removal for implantation of prosthetic device, and robotic system |
CN102647962A (en) | 2009-11-24 | 2012-08-22 | 株式会社力克赛 | Preoperative planning program for artificial hip joint replacement surgery and surgery support jig |
US8709016B2 (en) | 2009-12-11 | 2014-04-29 | Curexo Technology Corporation | Surgical guide system using an active robot arm |
EP2512354A4 (en) | 2009-12-18 | 2015-09-09 | Biomimedica Inc | Method, device, and system for shaving and shaping of a joint |
GB0922640D0 (en) | 2009-12-29 | 2010-02-10 | Mobelife Nv | Customized surgical guides, methods for manufacturing and uses thereof |
JP2011217787A (en) | 2010-04-05 | 2011-11-04 | Ntn Corp | Remotely operated actuator |
CN102933163A (en) * | 2010-04-14 | 2013-02-13 | 史密夫和内修有限公司 | Systems and methods for patient- based computer assisted surgical procedures |
US8679125B2 (en) * | 2010-09-22 | 2014-03-25 | Biomet Manufacturing, Llc | Robotic guided femoral head reshaping |
EP2651344A4 (en) | 2010-12-17 | 2015-08-19 | Intellijoint Surgical Inc | Method and system for aligning a prosthesis during surgery |
WO2012097874A1 (en) | 2011-01-20 | 2012-07-26 | Brainlab Ag | Method for planning the positioning of a ball joint prosthesis |
US8979859B2 (en) | 2011-02-14 | 2015-03-17 | Mako Surgical Corporation | Depth of Impaction |
US9125669B2 (en) | 2011-02-14 | 2015-09-08 | Mako Surgical Corporation | Haptic volumes for reaming during arthroplasty |
WO2012154407A2 (en) | 2011-05-09 | 2012-11-15 | Smith & Nephew, Inc. | Patient specific instruments |
WO2012166888A2 (en) | 2011-06-03 | 2012-12-06 | Smith & Nephew, Inc. | Prosthesis guide comprising patient-matched features |
US9220510B2 (en) | 2011-06-15 | 2015-12-29 | Perception Raisonnement Action En Medecine | System and method for bone preparation for an implant |
US9498231B2 (en) | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US8498744B2 (en) | 2011-06-30 | 2013-07-30 | Mako Surgical Corporation | Surgical robotic systems with manual and haptic and/or active control modes |
US8597365B2 (en) | 2011-08-04 | 2013-12-03 | Biomet Manufacturing, Llc | Patient-specific pelvic implants for acetabular reconstruction |
CN103841924B (en) | 2011-08-15 | 2016-02-17 | 康复米斯公司 | For revising update the system, the tool and method of junction surface joint replacement implants |
WO2013025927A2 (en) | 2011-08-17 | 2013-02-21 | New York Society For The Ruptured And Crippled Maintaining The Hospital For Special Surgery | Method for orienting an acetabular cup and instruments for use therewith |
CA2847182C (en) | 2011-09-02 | 2020-02-11 | Stryker Corporation | Surgical instrument including a cutting accessory extending from a housing and actuators that establish the position of the cutting accessory relative to the housing |
EP2760362B1 (en) | 2011-09-29 | 2018-11-07 | ArthroMeda, Inc. | System for precise prosthesis positioning in hip arthroplasty |
US9386993B2 (en) | 2011-09-29 | 2016-07-12 | Biomet Manufacturing, Llc | Patient-specific femoroacetabular impingement instruments and methods |
US9060794B2 (en) | 2011-10-18 | 2015-06-23 | Mako Surgical Corp. | System and method for robotic surgery |
US9044173B2 (en) | 2011-10-23 | 2015-06-02 | Eron D Crouch | Implanted device x-ray recognition and alert system (ID-XRAS) |
US8556074B2 (en) | 2011-10-25 | 2013-10-15 | Warsaw Orthopedic, Inc | Encapsulated data carrier tag for track and trace purposes |
US9301812B2 (en) | 2011-10-27 | 2016-04-05 | Biomet Manufacturing, Llc | Methods for patient-specific shoulder arthroplasty |
US20130105577A1 (en) | 2011-10-28 | 2013-05-02 | Warsaw Orthopedic, Inc. | Tracability tag sorter |
EP3181052A1 (en) | 2011-11-30 | 2017-06-21 | Rush University Medical Center | System and methods for identification of implanted medical devices and/or detection of retained surgical foreign objects from medical images |
US9639156B2 (en) | 2011-12-29 | 2017-05-02 | Mako Surgical Corp. | Systems and methods for selectively activating haptic guide zones |
WO2013101753A1 (en) | 2011-12-30 | 2013-07-04 | Mako Surgical Corp. | Systems and methods for customizing interactive haptic boundaries |
US8891847B2 (en) | 2012-01-23 | 2014-11-18 | Medtronic Navigation, Inc. | Automatic implant detection from image artifacts |
EP2849683A4 (en) | 2012-05-18 | 2015-11-25 | Orthalign Inc | Devices and methods for knee arthroplasty |
US10945801B2 (en) | 2012-05-22 | 2021-03-16 | Mako Surgical Corp. | Soft tissue cutting instrument and method of use |
US11864839B2 (en) | 2012-06-21 | 2024-01-09 | Globus Medical Inc. | Methods of adjusting a virtual implant and related surgical navigation systems |
EP2874550B1 (en) | 2012-07-23 | 2017-09-27 | Orthosoft, Inc. | Patient-specific instrumentation for implant revision surgery |
US9649160B2 (en) | 2012-08-14 | 2017-05-16 | OrthAlign, Inc. | Hip replacement navigation system and method |
US11020189B2 (en) | 2012-10-02 | 2021-06-01 | Radlink, Inc. | System and method for component positioning by registering a 3D patient model to an intra-operative image |
US9114014B2 (en) | 2012-11-07 | 2015-08-25 | Scott Kelley | Methods and devices for a surgical hip replacement procedure |
US9351782B2 (en) | 2012-11-09 | 2016-05-31 | Orthosensor Inc. | Medical device motion and orientation tracking system |
US10398449B2 (en) | 2012-12-21 | 2019-09-03 | Mako Surgical Corp. | Systems and methods for haptic control of a surgical tool |
US9031284B2 (en) | 2012-12-28 | 2015-05-12 | William Bradley Spath | Implant identification system and method |
US9125702B2 (en) | 2013-02-28 | 2015-09-08 | Biomet Manufacturing, Llc | Acetabular drill pin |
US9220572B2 (en) | 2013-03-14 | 2015-12-29 | Biomet Manufacturing, Llc | Method for implanting a hip prosthesis and related system |
US9247998B2 (en) | 2013-03-15 | 2016-02-02 | Intellijoint Surgical Inc. | System and method for intra-operative leg position measurement |
AU2014232933A1 (en) | 2013-03-15 | 2015-10-29 | Arthromeda, Inc. | Systems and methods for providing alignment in total knee arthroplasty |
US10405910B2 (en) | 2013-03-15 | 2019-09-10 | Think Surgical, Inc. | Systems and processes for revision total joint arthroplasty |
EP2967813A4 (en) | 2013-03-15 | 2016-11-09 | Conformis Inc | Kinematic and parameterized modeling for patient-adapted implants, tools, and surgical procedures |
US9585768B2 (en) | 2013-03-15 | 2017-03-07 | DePuy Synthes Products, Inc. | Acetabular cup prosthesis alignment system and method |
EP3041424A4 (en) | 2013-09-05 | 2017-05-10 | Smith & Nephew, Inc. | Patient-matched acetabular augment with alignment guide |
WO2015054745A1 (en) | 2013-10-14 | 2015-04-23 | Silesco Pty Ltd | Alignment apparatus for use in hip arthroplasty |
WO2015065969A1 (en) | 2013-10-28 | 2015-05-07 | Stryker Corporation | Implant design using heterogeneous bone properties and probabilistic tools to determine optimal geometries for fixation features |
EP3925574A1 (en) | 2013-11-08 | 2021-12-22 | Imascap | Pre-operatively planned adaptive glenoid implants and method for planning its design |
US20150142372A1 (en) | 2013-11-19 | 2015-05-21 | Polaris Surgical, LLC | Prosthetic placement tool and associated methods |
CN104720877A (en) * | 2013-12-18 | 2015-06-24 | 王旭东 | Application of digitization technology to oral approach mandibular condylar lesion surgical excision |
US10478318B2 (en) | 2013-12-29 | 2019-11-19 | Kambiz Behzadi | Prosthesis installation systems and methods |
KR102470649B1 (en) | 2013-12-31 | 2022-11-23 | 마코 서지컬 코포레이션 | Systems and methods for generating customized haptic boundaries |
FR3017227B1 (en) | 2014-02-04 | 2017-06-09 | Stephane Naudi | IMPLANT DATA MANAGEMENT DEVICE, SYSTEM COMPRISING SAID DEVICE AND USE THEREOF. |
US10758198B2 (en) | 2014-02-25 | 2020-09-01 | DePuy Synthes Products, Inc. | Systems and methods for intra-operative image analysis |
US10433914B2 (en) | 2014-02-25 | 2019-10-08 | JointPoint, Inc. | Systems and methods for intra-operative image analysis |
AU2015227303B2 (en) | 2014-03-05 | 2019-11-21 | Blue Belt Technologies, Inc. | Computer-aided prosthesis alignment |
WO2016007492A1 (en) | 2014-07-07 | 2016-01-14 | Smith & Nephew, Inc. | Alignment precision |
AU2015337755B2 (en) | 2014-10-29 | 2019-07-25 | Intellijoint Surgical Inc. | Systems, methods and devices for anatomical registration and surgical localization |
EP3215042A4 (en) | 2014-11-06 | 2018-06-20 | Orthosoft Inc. | Instrument navigation in computer-assisted hip surgery |
US10537388B2 (en) * | 2014-12-01 | 2020-01-21 | Blue Belt Technologies, Inc. | Systems and methods for planning and performing image free implant revision surgery |
CN106999245B (en) * | 2014-12-08 | 2020-11-10 | 思想外科有限公司 | Planning, digitizing, and registration based implants for use in total joint arthroplasty |
US10932866B1 (en) | 2014-12-08 | 2021-03-02 | Think Surgical, Inc. | Implant based planning, digitizing, and registration for total joint arthroplasty |
US9955984B2 (en) | 2015-01-12 | 2018-05-01 | Biomet Manufacturing, Llc | Augmented glenoid and method for preparation |
CN107205782B (en) | 2015-02-02 | 2020-08-11 | 奥尔索夫特无限责任公司 | Method and apparatus for cup implant using inertial sensors |
US10363149B2 (en) | 2015-02-20 | 2019-07-30 | OrthAlign, Inc. | Hip replacement navigation system and method |
US11666447B1 (en) | 2015-03-05 | 2023-06-06 | Taq Ortho, LLC | Bone implant augment and offset device |
US10321904B2 (en) | 2015-03-24 | 2019-06-18 | Omnilife Science Inc. | Orthopedic joint distraction device |
EP3274912B1 (en) | 2015-03-26 | 2022-05-11 | Biomet Manufacturing, LLC | System for planning and performing arthroplasty procedures using motion-capture data |
US10499996B2 (en) | 2015-03-26 | 2019-12-10 | Universidade De Coimbra | Methods and systems for computer-aided surgery using intra-operative video acquired by a free moving camera |
CA2983650C (en) | 2015-04-24 | 2021-03-16 | Biomet Manufacturing, Llc | Patient-specific augmented glenoid systems and methods |
US11241287B2 (en) | 2015-08-05 | 2022-02-08 | Friedrich Boettner | Fluoroscopy-based measurement and processing system and method |
US10092361B2 (en) | 2015-09-11 | 2018-10-09 | AOD Holdings, LLC | Intraoperative systems and methods for determining and providing for display a virtual image overlaid onto a visual image of a bone |
US10716628B2 (en) | 2015-10-29 | 2020-07-21 | Intellijoint Surgical Inc. | Systems, methods and devices for calculating hip center of rotation, adjusting parameters of joint replacement for pelvic tilt and calculating leg length and offset |
US10321961B2 (en) | 2015-11-05 | 2019-06-18 | Howmedica Osteonics Corp. | Patient specific implantation method for range of motion hip impingement |
EP3373815A4 (en) | 2015-11-13 | 2019-07-17 | Stryker European Holdings I, LLC | Adaptive positioning technology |
CN108348305A (en) | 2015-11-16 | 2018-07-31 | Think Surgical, Inc. | Method for confirming registration of tracked bones
JP6934861B2 (en) | 2015-11-24 | 2021-09-15 | Think Surgical, Inc. | Active robotic pin placement in total knee arthroplasty
GB201521501D0 (en) | 2015-12-07 | 2016-01-20 | Depuy Ireland | Apparatus and method for aligning an acetabular cup |
US10991070B2 (en) | 2015-12-18 | 2021-04-27 | OrthoGrid Systems, Inc | Method of providing surgical guidance |
US11386556B2 (en) | 2015-12-18 | 2022-07-12 | Orthogrid Systems Holdings, Llc | Deformed grid based intra-operative system and method of use |
US10433921B2 (en) | 2015-12-28 | 2019-10-08 | Mako Surgical Corp. | Apparatus and methods for robot assisted bone treatment |
US20170215967A1 (en) | 2016-02-03 | 2017-08-03 | William B. Spath | Implant recommendation system and method |
AU2017227791B2 (en) | 2016-03-02 | 2022-05-26 | Think Surgical, Inc. | Automated arthroplasty planning |
CA3016604A1 (en) | 2016-03-12 | 2017-09-21 | Philipp K. Lang | Devices and methods for surgery |
MX2018011544A (en) | 2016-03-23 | 2019-06-24 | Canary Medical Inc | Implantable reporting processor for an alert implant. |
JP7011326B2 (en) | 2016-05-11 | 2022-01-26 | アースロメダ、 インコーポレイテッド | Patient-specific prosthesis alignment |
CN112932602A (en) | 2016-05-18 | 2021-06-11 | DePuy Ireland Unlimited Company | System and method for preparing a patient's femur in an orthopaedic joint replacement procedure
US10357315B2 (en) | 2016-05-27 | 2019-07-23 | Mako Surgical Corp. | Preoperative planning and associated intraoperative registration for a surgical system |
CA3024840A1 (en) | 2016-05-27 | 2017-11-30 | Mako Surgical Corp. | Preoperative planning and associated intraoperative registration for a surgical system |
NL2016867B1 (en) | 2016-05-31 | 2017-12-11 | Umc Utrecht Holding Bv | Implant, fitting plate and method for manufacturing an implant and fitting plate |
US10806541B2 (en) | 2016-06-08 | 2020-10-20 | Adam Ross | Scannable optical identifier for use with implantable medical devices |
US11229489B2 (en) | 2016-06-16 | 2022-01-25 | Zimmer, Inc. | Soft tissue balancing in articular surgery |
EP3471646B1 (en) | 2016-06-17 | 2023-07-05 | Zimmer, Inc. | System for intraoperative surgical planning |
WO2018013848A1 (en) * | 2016-07-15 | 2018-01-18 | Mako Surgical Corp. | Systems for a robotic-assisted revision procedure |
US20190175283A1 (en) | 2016-08-10 | 2019-06-13 | Think Surgical, Inc. | Pinless femoral tracking |
US11071596B2 (en) | 2016-08-16 | 2021-07-27 | Insight Medical Systems, Inc. | Systems and methods for sensory augmentation in medical procedures |
US10398514B2 (en) | 2016-08-16 | 2019-09-03 | Insight Medical Systems, Inc. | Systems and methods for sensory augmentation in medical procedures |
CA3034071A1 (en) | 2016-08-30 | 2018-03-08 | Mako Surgical Corp. | Systems and methods for intra-operative pelvic registration |
WO2018085417A1 (en) | 2016-11-02 | 2018-05-11 | Zimmer, Inc. | Device for sensing implant location and impingement |
EP3537970A4 (en) | 2016-11-14 | 2020-03-18 | Navbit Holdings Pty Limited | Alignment apparatus for use in surgery |
EP3554425B1 (en) | 2016-12-14 | 2024-03-13 | Zimmer, Inc. | Shoulder arthroplasty trial assembly comprising sensors |
WO2018125834A1 (en) | 2016-12-26 | 2018-07-05 | Marino James F | Surgical navigation using a guide for instrumentation positioning |
US10772685B2 (en) | 2017-01-16 | 2020-09-15 | Think Surgical, Inc. | System and method for bone re-registration and marker installation |
FR3062297B1 (en) | 2017-02-01 | 2022-07-15 | Laurent Cazal | Method and device for assisting the placement of a prosthesis, particularly of the hip, by a surgeon following different surgical protocols
US20210161681A1 (en) | 2017-03-02 | 2021-06-03 | Optimotion Implants LLC | Universal femoral trial system and methods |
US10695183B2 (en) | 2017-03-14 | 2020-06-30 | Charles L. Nelson | Augments and methods of implanting augments |
EP3595554A4 (en) | 2017-03-14 | 2021-01-06 | OrthAlign, Inc. | Hip replacement navigation systems and methods |
JP7344122B2 (en) | 2017-03-14 | 2023-09-13 | オースアライン・インコーポレイテッド | Systems and methods for measuring and balancing soft tissue |
EP3627997A1 (en) | 2017-04-30 | 2020-04-01 | Hafez, Mahmoud Alm El Din | A method and device for patient specific instruments for one stage and two stages of revision knee arthroplasty using constraint and hinged knee implant |
US20180344465A1 (en) | 2017-05-31 | 2018-12-06 | Zimmer, Inc. | Customizable augments and methods for acetabular implants |
WO2018231775A1 (en) | 2017-06-12 | 2018-12-20 | Think Surgical, Inc. | Intramedullary cutting device for revision hip arthroplasty |
US10729558B2 (en) | 2017-08-18 | 2020-08-04 | Zimmer, Inc. | Methods and systems for patient-specific acetabular implants |
WO2019046579A1 (en) | 2017-08-31 | 2019-03-07 | Smith & Nephew, Inc. | Intraoperative implant augmentation |
US10751186B2 (en) | 2017-09-12 | 2020-08-25 | Zimmer, Inc. | Methods for attaching acetabular augments together or to acetabular shells |
US11737893B2 (en) | 2017-10-06 | 2023-08-29 | Intellijoint Surgical Inc. | System and method for preoperative planning for total hip arthroplasty |
US11413095B2 (en) | 2017-11-03 | 2022-08-16 | Intellijoint Surgical Inc. | System and method for surgical planning |
US11432945B2 (en) | 2017-11-07 | 2022-09-06 | Howmedica Osteonics Corp. | Robotic system for shoulder arthroplasty using stemless implant components |
US11173048B2 (en) | 2017-11-07 | 2021-11-16 | Howmedica Osteonics Corp. | Robotic system for shoulder arthroplasty using stemless implant components |
US11241285B2 (en) | 2017-11-07 | 2022-02-08 | Mako Surgical Corp. | Robotic system for shoulder arthroplasty using stemless implant components |
EP3706646A1 (en) | 2017-11-10 | 2020-09-16 | Smith & Nephew, Inc. | Orthopedic systems, components, and methods |
EP3709904A1 (en) | 2017-11-13 | 2020-09-23 | Hafez, Mahmoud Alm El Din | A patient-specific guide for repairing the pelvic bone defects depending on bone quality in fixing artificial hip joint surgeries |
US20200281656A1 (en) | 2017-11-30 | 2020-09-10 | Think Surgical, Inc. | System and method for installing bone hardware outside an end-effector's tool path
EP3498197A3 (en) | 2017-12-12 | 2019-10-16 | Orthosoft, Inc. | Patient-specific instrumentation for implant revision surgery |
US11351007B1 (en) | 2018-01-22 | 2022-06-07 | CAIRA Surgical | Surgical systems with intra-operative 3D scanners and surgical methods using the same |
WO2019148154A1 (en) | 2018-01-29 | 2019-08-01 | Lang Philipp K | Augmented reality guidance for orthopedic and other surgical procedures |
US20210052327A1 (en) | 2018-02-13 | 2021-02-25 | Think Surgical, Inc. | Bone registration in two-stage orthopedic revision procedures |
GB2572594A (en) | 2018-04-04 | 2019-10-09 | Corin Ltd | Implant alignment system |
GB2573014A (en) | 2018-04-20 | 2019-10-23 | Corin Ltd | Surgical-tool angular measurement device |
EP3572048B1 (en) | 2018-05-25 | 2021-02-17 | Howmedica Osteonics Corp. | Variable thickness femoral augments |
EP3574862B1 (en) | 2018-06-01 | 2021-08-18 | Mako Surgical Corporation | Systems for adaptive planning and control of a surgical tool |
US20210220054A1 (en) | 2018-06-13 | 2021-07-22 | New York Society for the Relief of the Ruptured and Crippled, maintaining the Hospital for Special Surgery | Evaluation of instability using imaging and modeling following arthroplasty
EP3810015A1 (en) | 2018-06-19 | 2021-04-28 | Tornier, Inc. | Mixed-reality surgical system with physical markers for registration of virtual models |
BR112020021556A2 (en) | 2018-06-20 | 2021-03-02 | Techmah Medical Llc | methods and devices for knee surgery with inertial sensors |
US20200074631A1 (en) | 2018-09-04 | 2020-03-05 | The Board Of Regents, The University Of Texas System | Systems And Methods For Identifying Implanted Medical Devices |
US20200077924A1 (en) | 2018-09-07 | 2020-03-12 | Intellijoint Surgical Inc. | System and method to register anatomy without a probe |
EP3852645A4 (en) | 2018-09-12 | 2022-08-24 | Orthogrid Systems, SAS | An artificial intelligence intra-operative surgical guidance system and method of use |
CN113016038A (en) | 2018-10-12 | 2021-06-22 | 索尼集团公司 | Haptic obstruction to avoid collisions with robotic surgical equipment |
US11684489B2 (en) | 2018-10-29 | 2023-06-27 | Mako Surgical Corp. | Robotic system for ankle arthroplasty |
WO2020102886A1 (en) | 2018-11-19 | 2020-05-28 | Intellijoint Surgical Inc. | System and method for pre-planning a procedure |
WO2020123928A1 (en) | 2018-12-14 | 2020-06-18 | Mako Surgical Corp. | Systems and methods for preoperative planning and postoperative analysis of surgical procedures |
US11452566B2 (en) | 2018-12-21 | 2022-09-27 | Intellijoint Surgical Inc. | Pre-operative planning for reorientation surgery: surface-model-free approach using simulated x-rays |
JP7407442B2 (en) | 2019-01-15 | 2024-01-04 | インテリジョイント サージカル インク. | Systems, devices, and computer-implemented methods for bone reorientation |
EP3920825A1 (en) | 2019-02-05 | 2021-12-15 | Smith&Nephew, Inc. | Algorithm-based optimization, tool and selectable simulation data for total hip arthroplasty |
US20220160440A1 (en) | 2019-03-11 | 2022-05-26 | Smith & Nephew, Inc. | Surgical assistive robot arm |
IT201900005350A1 (en) | 2019-04-08 | 2020-10-08 | Medacta Int Sa | Computer-implemented method for verifying the correct alignment of a hip prosthesis and system for implementing this verification
WO2020231656A2 (en) | 2019-05-13 | 2020-11-19 | Tornier, Inc. | Patient-matched orthopedic implant |
US20220257145A1 (en) | 2019-05-15 | 2022-08-18 | Intellijoint Surgical Inc. | Systems and methods for computer assisted femoral surgery |
EP3986290A4 (en) | 2019-06-21 | 2023-07-19 | Manish Shah | A jig for guiding placement of the glenoid component of the implant in shoulder replacement surgery
US20210038328A1 (en) | 2019-08-09 | 2021-02-11 | Olivier Boisvert | Revision robotics |
AU2020331936B2 (en) | 2019-08-16 | 2023-10-26 | Howmedica Osteonics Corp. | Pre-operative planning of surgical revision procedures for orthopedic joints |
WO2021041155A1 (en) | 2019-08-29 | 2021-03-04 | Mako Surgical Corp. | Robotic surgery system for augmented hip arthroplasty procedures |
EP4025150A4 (en) | 2019-09-03 | 2022-10-12 | Bodner, Russell, J. | Methods and systems for targeted alignment and sagittal plane positioning during hip replacement surgery |
JP2022551585A (en) | 2019-10-01 | 2022-12-12 | マコ サージカル コーポレーション | Surgical system and method for guiding a robotic manipulator |
AU2020357073A1 (en) | 2019-10-03 | 2022-04-07 | Smith & Nephew Asia Pacific Pte. Limited | Registration of intramedullary canal during revision total knee arthroplasty |
AU2020377135A1 (en) | 2019-10-28 | 2022-05-19 | Waldemar Link Gmbh & Co. Kg | System and method for computer-aided surgical navigation implementing 3D scans |
US20210128249A1 (en) | 2019-10-30 | 2021-05-06 | Medtech Sa | Tracker device for computer-assisted surgery |
US11607233B2 (en) | 2019-12-10 | 2023-03-21 | Zimmer, Inc. | Acetabular guide |
AU2021224529B2 (en) | 2020-02-18 | 2024-02-15 | Howmedica Osteonics Corp. | Computer-implemented surgical planning based on bone loss during orthopedic revision surgery |
CN111345895B (en) | 2020-03-13 | 2021-08-20 | Beijing Tinavi Medical Technologies Co., Ltd. | Total knee replacement surgery robot auxiliary system, control method and electronic equipment
US20210327065A1 (en) | 2020-04-18 | 2021-10-21 | Mark B. Wright | Prosthesis scanning and identification system and method |
US11730601B2 (en) | 2020-04-23 | 2023-08-22 | DePuy Synthes Products, Inc. | Customized patient-specific 3D printed positioning augment for orthopaedic surgical implant |
US20230146371A1 (en) | 2020-04-29 | 2023-05-11 | Howmedica Osteonics Corp. | Mixed-reality humeral-head sizing and placement |
EP4146111A1 (en) | 2020-05-04 | 2023-03-15 | Howmedica Osteonics Corp. | Surgical system for revision orthopedic surgical procedures |
US20230200826A1 (en) | 2020-05-25 | 2023-06-29 | Orthopaedic Innovations Pty Ltd | A surgical method |
AU2021322848A1 (en) | 2020-08-07 | 2023-03-09 | Intellijoint Surgical Inc. | Systems and methods for procedure planning using prehabilitation input |
WO2022031564A1 (en) | 2020-08-07 | 2022-02-10 | Surgical Theater, Inc. | System and method for augmented reality spine surgery |
US11730603B2 (en) | 2020-09-03 | 2023-08-22 | Globus Medical, Inc. | Systems and methods for knee arthroplasty |
CA3191645A1 (en) | 2020-09-04 | 2022-03-10 | George GRAMMATOPOULOS | Hip arthroplasty planning method |
US20220125518A1 (en) | 2020-10-28 | 2022-04-28 | Waldemar Link Gmbh & Co. Kg | Tool for inserting an implant and method of using same |
US20220175400A1 (en) | 2020-12-03 | 2022-06-09 | Zimmer, Inc. | Tensor device for revision knee arthroplasty |
WO2022132319A1 (en) | 2020-12-15 | 2022-06-23 | Mako Surgical Corp. | Dynamic gap capture and flexion widget |
CN112641510B (en) | 2020-12-18 | 2021-08-17 | Beijing Changmugu Medical Technology Co., Ltd. | Joint replacement surgical robot navigation positioning system and method
CN112641511B (en) | 2020-12-18 | 2021-09-10 | Beijing Changmugu Medical Technology Co., Ltd. | Joint replacement surgery navigation system and method
US20220202494A1 (en) | 2020-12-31 | 2022-06-30 | Depuy Ireland Unlimited Company | Apparatus, system, and method for determining a position of a hip prosthesis in a bone of a patient |
EP4291126A1 (en) | 2021-02-11 | 2023-12-20 | Smith & Nephew, Inc. | Methods and systems for planning and performing implant surgery |
CN112971981B (en) | 2021-03-02 | 2022-02-08 | Beijing Changmugu Medical Technology Co., Ltd. | Deep learning-based total hip joint image processing method and equipment
US11944549B2 (en) | 2021-03-31 | 2024-04-02 | DePuy Synthes Products, Inc. | 3D printed monoblock orthopaedic surgical implant with customized patient-specific augment |
US20230010852A1 (en) | 2021-07-12 | 2023-01-12 | Simplex Designs, Llc | System and Method for Revision Hip Surgery |
US20230013210A1 (en) | 2021-07-19 | 2023-01-19 | Orthosoft Ulc | Robotic revision knee arthroplasty virtual reconstruction system |
US11786284B2 (en) | 2021-07-30 | 2023-10-17 | Arthrex, Inc. | Orthopaedic systems and methods for defect indication |
US20230056596A1 (en) | 2021-08-12 | 2023-02-23 | Smith & Nephew, Inc. | System and method for implant surface matching for joint revision surgery |
US20230068971A1 (en) | 2021-09-01 | 2023-03-02 | Quentin Derouault | Fluoroscopic robotic prosthetic implant system and methods |
WO2023059931A1 (en) | 2021-10-08 | 2023-04-13 | Think Surgical, Inc. | Surgical system and method for forming less than all bone cut surfaces for implant placement |
US20230116074A1 (en) | 2021-10-13 | 2023-04-13 | DePuy Synthes Products, Inc. | Combination drill guide and depth gauge surgical instrument for implanting an acetabular cup component and associated surgical method |
AU2022263590A1 (en) | 2021-11-05 | 2023-05-25 | Mako Surgical Corp. | Assessment of soft tissue tension in hip procedures |
US20230190494A1 (en) | 2021-12-21 | 2023-06-22 | Depuy Ireland Unlimited Company | Orthopaedic surgical system for installing a knee cone augment and method of using the same |
US20230190139A1 (en) | 2021-12-21 | 2023-06-22 | Stryker Corporation | Systems and methods for image-based analysis of anatomical features |
US20230240759A1 (en) | 2022-01-31 | 2023-08-03 | Smith & Nephew, Inc. | Modular and depth-sensing surgical handpiece |
TW202402246A (en) | 2022-03-02 | 2024-01-16 | Fu Jen Catholic University | Surgical navigation system and method thereof
WO2023183644A1 (en) | 2022-03-25 | 2023-09-28 | Murphy Stephen B | Systems and methods for planning screw lengths and guiding screw trajectories during surgery |
CN114431957B (en) | 2022-04-12 | 2022-07-29 | Beijing Changmugu Medical Technology Co., Ltd. | Deep-learning-based preoperative planning system for revision surgery after total knee replacement
DE102022111284A1 (en) | 2022-05-06 | 2023-11-09 | Aesculap Ag | Implant assistance procedure and implant assistance system for optimized use or joint replacement |
US11612503B1 (en) | 2022-06-07 | 2023-03-28 | Little Engine, LLC | Joint soft tissue evaluation method |
US11602443B1 (en) | 2022-06-07 | 2023-03-14 | Little Engine, LLC | Knee evaluation and arthroplasty method |
2017
- 2017-07-13 WO PCT/US2017/041989 patent/WO2018013848A1/en unknown
- 2017-07-13 EP EP17745226.5A patent/EP3484398A1/en active Pending
- 2017-07-13 CN CN202110589543.1A patent/CN113303907A/en active Pending
- 2017-07-13 CN CN201780056264.0A patent/CN109688963B/en active Active
- 2017-07-13 JP JP2019501933A patent/JP7123031B2/en active Active
- 2017-07-13 AU AU2017295728A patent/AU2017295728B2/en active Active
- 2017-07-13 CA CA3030831A patent/CA3030831A1/en not_active Abandoned
- 2017-07-13 KR KR1020197004649A patent/KR20190031281A/en not_active Application Discontinuation
- 2017-07-13 KR KR1020227039554A patent/KR20220157512A/en not_active Application Discontinuation
- 2017-07-13 US US15/649,416 patent/US10716630B2/en active Active
2020
- 2020-07-08 US US16/923,192 patent/US11484368B2/en active Active
- 2020-12-28 US US17/134,774 patent/US11857270B2/en active Active
2021
- 2021-03-31 AU AU2021202000A patent/AU2021202000B2/en active Active
- 2021-07-28 US US17/387,643 patent/US11944392B2/en active Active
2022
- 2022-08-09 JP JP2022126758A patent/JP2022160618A/en active Pending
- 2022-09-30 AU AU2022241606A patent/AU2022241606B2/en active Active
2023
- 2023-09-26 US US18/372,815 patent/US20240008934A1/en active Pending
- 2023-12-14 AU AU2023282267A patent/AU2023282267A1/en active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104869918A (en) * | 2012-12-21 | 2015-08-26 | Mako Surgical Corp. | Methods and systems for planning and performing osteotomy |
Also Published As
Publication number | Publication date |
---|---|
AU2017295728A1 (en) | 2019-02-07 |
US20180014891A1 (en) | 2018-01-18 |
WO2018013848A1 (en) | 2018-01-18 |
JP2022160618A (en) | 2022-10-19 |
US11857270B2 (en) | 2024-01-02 |
AU2022241606A1 (en) | 2022-10-27 |
US20240008934A1 (en) | 2024-01-11 |
AU2017295728B2 (en) | 2021-03-25 |
AU2023282267A1 (en) | 2024-01-18 |
AU2021202000B2 (en) | 2022-09-15 |
US20210113278A1 (en) | 2021-04-22 |
JP2019523049A (en) | 2019-08-22 |
US20210353367A1 (en) | 2021-11-18 |
US11944392B2 (en) | 2024-04-02 |
KR20190031281A (en) | 2019-03-25 |
EP3484398A1 (en) | 2019-05-22 |
CN109688963B (en) | 2021-08-06 |
CA3030831A1 (en) | 2018-01-18 |
JP7123031B2 (en) | 2022-08-22 |
US20200337784A1 (en) | 2020-10-29 |
AU2022241606B2 (en) | 2023-11-23 |
AU2021202000A1 (en) | 2021-04-29 |
US10716630B2 (en) | 2020-07-21 |
US11484368B2 (en) | 2022-11-01 |
CN109688963A (en) | 2019-04-26 |
KR20220157512A (en) | 2022-11-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109688963B (en) | System for robot-assisted correction of programs | |
JP2019523049A5 (en) | ||
US11937831B2 (en) | Systems and methods for preparing a proximal tibia | |
KR102335667B1 (en) | Systems and methods for generating customized haptic boundaries |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||