US20230355317A1 - Method for confirming registration of tracked bones - Google Patents

Method for confirming registration of tracked bones

Info

Publication number
US20230355317A1
Authority
US
United States
Prior art keywords
bone
location
bone model
model
image
Prior art date
Legal status
Pending
Application number
US18/221,913
Inventor
Joel Zuhars
Daniel P. BONNY
Saleh TABANDEH
Current Assignee
Think Surgical Inc
Original Assignee
Think Surgical Inc
Priority date
Filing date
Publication date
Priority claimed from PCT/US2016/062008 (WO2017087371A1)
Application filed by Think Surgical Inc filed Critical Think Surgical Inc
Priority to US18/221,913
Assigned to THINK SURGICAL, INC. Assignors: ZUHARS, JOEL; BONNY, DANIEL P.; TABANDEH, SALEH
Publication of US20230355317A1


Classifications

    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2017/00725: Calibration or performance testing
    • A61B 2034/104: Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2057: Details of tracking cameras
    • A61B 2034/2068: Tracking using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2090/061: Measuring instruments for measuring dimensions, e.g. length
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery
    • A61B 90/30: Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 90/361: Image-producing devices, e.g. surgical cameras

Definitions

  • a process is provided to confirm the registration of bones involved in a joint replacement procedure.
  • the process includes the use of pre-operative planning software to generate a 3-D model of the patient’s bony anatomy from a computed tomography (CT) or magnetic resonance imaging (MRI) image dataset of the patient.
  • A set of 3-D computer aided design (CAD) models of the manufacturer’s prosthesis are pre-loaded in the software, allowing the user to place the components of a desired prosthesis on the 3-D model of the bony anatomy to designate the best fit, position and orientation of the implant relative to the bone.
  • This is accomplished with commercially available systems such as the RIO® Interactive Orthopedic System manufactured by Stryker Mako (Ft. Lauderdale, Fla.).
  • FIG. 1 depicts a 3-D model of a knee joint including a coronal and sagittal view of the distal femur 12 and proximal tibia 14 .
  • the 3-D virtual models may be displayed on a monitor 10 to facilitate pre-operative planning or to monitor the motion of the tracked bones intra-operatively.
  • a computer simulation may be run to optimize the multiple variables of positional and rotational degrees of freedom to achieve an optimized artificial joint geometry.
  • a finite element analysis is readily applied based on these variables to achieve a balanced force weighting on opposing sides of the joint through a preselected swing angle for the joint. It is appreciated that patient weight, height, and arm swing counterbalance are additional factors that can be built into the optimization routine to predict optimal joint arthroplasty positioning.
  • An example of analyzing the positional and geometric rotational degrees of freedom using a computer simulated model is described in A.C. Godest et al., “Simulation of a knee joint replacement during a gait cycle using explicit finite element analysis” Journal of Biomechanics 35 (2002) 267-275.
  • An inventive process schematic is shown generally at 20 (FIG. 2).
  • The three-dimensional (3-D) models of the operative bones involved in a joint replacement procedure are generated, as physical or virtual models, at step 22 and made available to the surgeon during the procedure.
  • the virtual models may be part of an overall surgical plan that further includes instructions for a computer-assisted surgical device, the final implant position, or the results of any such computer simulations generated for the optimal position of the bones or implants at step 24 .
  • A tracking device, such as a tracking array or a mechanical tracking probe, is attached to each operative bone to allow six degree-of-freedom (6-DOF) tracking during the procedure at step 26.
  • the bones may be tracked by a tracking system as previously described.
  • the 3D models of each operative bone are then registered to the patient’s actual operative bone at step 28 .
  • Systems and methods of optical or mechanical tracking and registration are described in U.S. Pat. No. 6,033,415, which is incorporated herein by reference in its entirety. Systems and processes for pre-operative planning and precise bone removal are also known to the art and include those detailed in WO 2015006721 A1.
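  • As a minimal illustrative sketch (not taken from the cited references; the nearest-vertex matching, the fixed iteration count, and all function names are assumptions made for brevity), a point-to-surface style registration of digitized bone-surface points to the 3-D bone model might be computed along the following lines in Python:

    import numpy as np

    def best_fit_rigid(src, dst):
        # Kabsch/SVD solve for the rigid transform (R, t) minimizing ||R @ src + t - dst||
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, dst_c - R @ src_c

    def register_points_to_model(points, model_vertices, iterations=30):
        # Iterative-closest-point style loop: pair each digitized point with its
        # nearest model vertex, solve for the rigid transform, apply, and repeat.
        R_total, t_total = np.eye(3), np.zeros(3)
        current = points.copy()
        for _ in range(iterations):
            d = np.linalg.norm(current[:, None, :] - model_vertices[None, :, :], axis=2)
            matches = model_vertices[d.argmin(axis=1)]
            R, t = best_fit_rigid(current, matches)
            current = current @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
        d = np.linalg.norm(current[:, None, :] - model_vertices[None, :, :], axis=2)
        rms = float(np.sqrt(np.mean(d.min(axis=1) ** 2)))
        return R_total, t_total, rms  # transform into the model frame + residual
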
  • the surgeon moves a tracked bone having a tracking device associated therewith, and a display shows a virtual representation of the motion in real-time.
  • a surgeon can observe the position of the bones involved in the joint arthroplasty in a way that is impossible from viewing the actual bones.
  • This attribute of the present invention is novel and becomes increasingly helpful as the bones tagged with a tracking device approach the optimal positioning.
  • The display or jig includes feedback indicators, such as markers, a blinking screen, arrows, or highlighted areas on the virtual bone models, that may be used to indicate or guide the surgeon to an optimal position and orientation of a given bone based on the pre-surgical plan and any simulations associated therewith. The surgeon may then assess the kinematics of the operative bones in the position and orientation defined by the pre-surgical plan prior to making any bone cuts.
  • The surgeon or system determines if the movements of the tracked bones correspond to the movements of the 3-D models at step 32. If the tracked bones are moving in correspondence with the 3-D models, the surgeon can confirm that the registration is accurate at step 34. If the tracked bones do not move in correspondence with the 3-D models, or the motion is unexpected, then the registration is likely inaccurate. In that case, the surgeon re-registers the bone. This is described in more detail below.
  • the surgeon may verify or confirm the accuracy of the registration by making sure that the tracked femur and tibia do not collide by viewing the 3-D models on a display 40 (e.g. monitor, television); or that the femoral head stays approximately in the center of the acetabulum during articulation of the femur relative to the pelvis, each tracked, by viewing the 3-D models on the display 40 during articulation.
  • This provides an intuitive visual process for confirming the accuracy of the bone registration.
  • If the displayed motion appears incorrect, the registration is likely not accurate and the surgeon can re-register the bone before proceeding with the procedure. If the registration appears accurate, the surgeon may confirm or verify the registration by way of an input mechanism (e.g. mouse, keyboard, joystick, pendant, touchscreen display, microphone) in communication with the tracking system or computer-assisted device.
  • a prompt 48 may be displayed on the display 40 for the surgeon to select whether the registration is confirmed or not.
  • the surgeon may also confirm the registration to the system or device by providing a particular signal to the tracking system.
  • For example, the surgeon can place a tracked digitizer probe in a specific location relative to the 3-D bone model, or manipulate the bones in a particular pattern or shape, allowing the surgeon to quickly confirm the registration without having to use an input mechanism.
  • the surgeon can monitor and confirm the registration throughout an entire procedure by moving the tracked bones and visualizing the corresponding motion of the 3-D virtual models on the display.
  • articulating the joints throughout flexion/extension are normal steps of the procedure. Therefore, confirming the registration throughout these normal procedural steps is quick, efficient, and does not require any additional steps outside of a normal TKA.
  • viewing such models devoid of skin, connective tissue, fat, and blood provides insights not available to a surgeon through actual inspection of the joint before or during surgery.
  • collision detection may be implemented with a computer program or through other types of algorithms that provide a warning to a surgeon or other medical personnel if the 3D virtual models collide during the articulation of the tracked bones.
  • A collision detection algorithm that can be modified by one of skill in the art to warn a surgeon of inaccurate registration is described in Madera-Ramírez, Francisco, “An Introduction to the Collision Detection Algorithms,” Abstraction and Application Magazine 5 (2014).
  • Other algorithms can also be implemented to provide a warning or alert the surgeon that the corresponding motion of the 3-D virtual models is outside of specified thresholds. For example, the algorithm may monitor the relative rotation of the femur with respect to the center of the acetabulum, and an alert may be generated if that motion exceeds a specified threshold.
  • an alert may be generated if the bone models move apart beyond a threshold limit.
  • In a TKA, if the closest point between the tibia and either the medial or lateral condyle region of the distal femur becomes significantly greater than the expected total cartilage thickness or other specified distance threshold, the registration is likely off or the tracking arrays may have moved relative to the bone, and an alert is generated.
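  • A hedged sketch of such a distance-threshold check (the threshold values, the point-cloud representation, and the function name are illustrative assumptions, not values from this disclosure):

    import numpy as np

    def check_joint_gap(femur_pts, tibia_pts, collision_mm=0.5, max_gap_mm=8.0):
        # Minimum distance between the tracked femur and tibia model point clouds.
        # A mesh-based signed distance would detect true interpenetration; the
        # nearest-point gap keeps the sketch short.
        d = np.linalg.norm(femur_pts[:, None, :] - tibia_pts[None, :, :], axis=2)
        gap = float(d.min())
        if gap < collision_mm:
            return gap, "ALERT: bone models overlap - check registration and tracking arrays"
        if gap > max_gap_mm:
            return gap, "ALERT: joint gap exceeds expected cartilage thickness"
        return gap, "gap within expected range"
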
  • Registration of tracked bones may also be confirmed by observing the full extension position of a subject’s limb and comparing the limb extension to the virtual full extension position to make sure that the achieved extension is correct.
  • A measurement of the articulation of the joint prior to making any surgical or altering cuts may be saved for later comparison with the post-treatment or post-operative articulation, for example to measure alignment after the trial implants are in place, to see the differences, and possibly to recommend corrections.
  • a surgeon can obtain joint positions that retain pre-cut bone positions or controlled modification of dysfunctional pre-cut geometries.
  • a pre- and post-operative comparison may be used as a way to recommend adjustments for the subject’s tibia internal-external rotation.
  • Surgical errors during bone resection may lead to abnormal joint motion after trial implantation, and specific embodiments of the invention are used to visualize the abnormal joint motion, where the operative bones, with trial implants, are tracked during motion, and 3-D models of the pre-operative bones are displayed rather than the post-operative bones. Any abnormal motion, such as collisions between the bones, would indicate changes in joint kinematics and may be used to direct the surgeon’s choice of intraoperative correction.
  • the computer-assisted device or tracking system may recommend to the surgeon via a prompt on a display, that 2 mm of the medial aspect of the tibia should be resected to obtain the pre-cut articulating motion of the knee.
  • FIG. 7 generally outlines another process 90 using the physical articulation of the tracked bones to provide bone modification recommendations and/or further aid a surgeon in obtaining a pre-operatively planned result.
  • the generation of the 3-D bone models (step 92 ), and the planning of the placement of the virtual implants relative to the bone models (step 94 ) with or without the aid of the computer simulations (step 96 ), are all accomplished as described above.
  • the virtual motion of the 3-D bone models with the virtual implants is simulated.
  • the surgeon may further adjust the planned position of the virtual implants to achieve a desired virtual motion that the surgeon prefers the actual bone motion to mimic post-operatively.
  • the desired virtual motion is then saved at step 98 for use intra-operatively.
  • the surgeon executes the procedure and modifies the bone according to the plan as described above.
  • trial components are placed in the joint and the surgeon physically articulates the tracked bones at step 102 .
  • the saved virtual motion is then compared with the actual articulation of the tracked bones at step 104 .
  • The saved virtual motion of the 3-D bone models with the virtual implants is overlaid on the 3-D bone models registered to the actual bones.
  • One of the virtual bones, or a portion of one of the virtual bones, is mapped to the corresponding 3-D bone model (or a portion thereof) registered to the actual bone.
  • the femoral head and neck of the pre-operative virtual model associated with saved virtual motion is mapped to the femoral head and neck of the virtual model registered to the bone. Therefore, the surgeon can observe how the actual motion of the non-mapped bone corresponds to the saved virtual motion of the non-mapped bone.
  • the mapped bone may be the femur, where the surgeon can observe how the actual tibia moves in relation to the actual femur and compare this to how the virtual tibia moved relative to the virtual femur during planning.
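  • One way such a comparison could be computed, sketched in Python (homogeneous 4x4 poses; the sampling of the planned motion and all names are assumptions for illustration):

    import numpy as np

    def relative_pose(world_T_femur, world_T_tibia):
        # Pose of the tibia expressed in the femur frame (4x4 homogeneous matrices),
        # so the comparison is independent of how the whole limb is held.
        return np.linalg.inv(world_T_femur) @ world_T_tibia

    def deviation_from_plan(rel_actual, planned_rel_poses):
        # Compare the current femur-to-tibia pose against samples of the saved
        # virtual motion; report the smallest translational deviation and the
        # corresponding rotational deviation.
        best = None
        for rel_plan in planned_rel_poses:
            dT = np.linalg.inv(rel_plan) @ rel_actual
            trans_mm = float(np.linalg.norm(dT[:3, 3]))
            cos_a = np.clip((np.trace(dT[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
            rot_deg = float(np.degrees(np.arccos(cos_a)))
            if best is None or trans_mm < best[0]:
                best = (trans_mm, rot_deg)
        return best  # (mm, degrees) to the nearest planned articulation sample
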
  • the surgeon may have the ability to speed-up and slow-down the virtual motion, or stop the virtual motion at particular articulation points, to aid in the comparison.
  • the surgeon can complete the procedure (step 106 ) by inserting the final implants and closing the surgical site. If the motions are dissimilar, the computer-assisted system may provide recommendations (step 108 ) to re-modify the bone such that the actual bone motion corresponds with the saved virtual motion.
  • the system 50 may include a high definition (HD) camera 52 mounted on a surgical light 54 that has an integrated tracking system with optical receivers 56 .
  • The HD camera 52 may be calibrated to a tracking system using one or more marker light emitting diodes (LEDs) 58 fitted on the bone B prior to registration. Then, during a post-registration assessment, a 3-D reconstructed picture of the bone model from the HD camera’s perspective, or an outline of the bone model 60 from that perspective, can be superimposed on the video 62 from the camera 52, and this should visually correspond to the image of the bone B in the video 62.
  • If the registration were compromised, the superimposed bone model 60 would shift away from being aligned with the bone B in the video 62. This gives the surgeon an easily checked, real-time indication of whether the registration is still fundamentally accurate.
  • The visible light camera 52 may be, for example, an HD video camera, a charge-coupled device (CCD) camera, an active-pixel sensor (APS) camera, a complementary MOS (CMOS) camera, or another photosensor, and the tracking system may include tracking receivers 56 such as optical detectors, infrared cameras, electromagnetic sensors, acoustic sensors, or radio-frequency sensors.
  • the tracking system coordinate frame and the visible light camera coordinate frame are calibrated with respect to one another.
  • This may be accomplished by: (i) determining the location of an object in the field-of-view (FoV) of the visible light camera 52 ; (ii) determining the location of the same object in the FoV of the tracking receivers 56 ; and (iii) calculating the transformation matrix between the coordinate frame of the tracking system and the coordinate frame of the visible light camera using the determined locations of the object in (i) and (ii).
  • The object may be, for example: (a) one or more LEDs 58 affixed to the bone B; (b) a dedicated calibration object; or (c) an LED associated with a tool used during the procedure (e.g., a surgical device, a digitizer, etc.).
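  • A sketch of step (iii), assuming paired 3-D locations of the same object are available in both frames (e.g., from a stereo- or depth-capable visible light camera); the SVD solve and the names used are illustrative assumptions:

    import numpy as np

    def calibrate_tracker_to_camera(pts_tracker, pts_camera):
        # Rigid transform taking tracking-system coordinates into visible-light
        # camera coordinates, from N corresponding 3-D points (e.g., the LEDs 58)
        # measured in each coordinate frame.
        ct, cc = pts_tracker.mean(axis=0), pts_camera.mean(axis=0)
        H = (pts_tracker - ct).T @ (pts_camera - cc)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # avoid a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = cc - R @ ct
        return T                            # camera_T_tracker (4x4)
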
  • the tracking system can track the location of the 3-D bone model (registered to the bone) in the tracking system coordinate frame.
  • the tracking system may track the location of the 3-D bone model by: i) tracking the location of a tracking array affixed to the bone; and ii) applying a registration transformation matrix that defines the location of the 3-D bone model registered to the bone with respect to the tracking array coordinate frame. Then using the transformation matrix from the calibration, the 3-D bone model, or an outline of the bone model 60 , may be displayed on the images captured by the visible light camera 52 at that tracked/registered location.
  • The tracked location of the 3-D bone model, or the outline of the bone model 60, in the coordinate frame of the tracking system is transformed (using the calibration transformation matrix or a best-fit model) to the same location in the coordinate frame of the visible light camera 52 for display on the images captured by the visible light camera 52.
  • The transformation matrix can be determined in Cartesian, spherical, or cylindrical coordinates, as the computational complexity can be reduced by selection of a suitable coordinate system. S. Guo et al., J Biomed Opt. 2015 Jan; 20(1): 016018 details one such methodology.
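  • The transform chain and the final projection onto the 2-D image might look as follows (the pinhole intrinsics K and all matrix names are assumptions made for this sketch):

    import numpy as np

    def overlay_pixels(model_vertices, tracker_T_array, array_T_model, camera_T_tracker, K):
        # Compose: bone model -> tracking array -> tracking system -> camera,
        # then project with the camera intrinsics to get pixel coordinates for
        # drawing the model (or its outline 60) over the live video 62.
        camera_T_model = camera_T_tracker @ tracker_T_array @ array_T_model
        v = np.hstack([model_vertices, np.ones((len(model_vertices), 1))])
        cam = (camera_T_model @ v.T).T[:, :3]       # vertices in camera coordinates
        px = (K @ cam.T).T
        return px[:, :2] / px[:, 2:3]               # perspective divide -> pixels
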
  • If the registration is accurate, the 3-D bone model, or the outline of the bone model 60, will match and/or overlap with the location of the actual bone in the images (e.g., video 62) captured by the visible light camera 52, as shown on the display 61 in FIG. 4. If the registration is not accurate, a user can quickly view the registration errors by comparing the displayed location of the actual bone in the images to the displayed location of the 3-D bone model in the same images. The images may be updated in real-time (e.g., a video feed) to account for any movement of the bone while evaluating the registration accuracy. In the event the registration is not accurate, the 3-D bone model may be re-registered to the bone.
  • A user may monitor the accuracy of the registration using this technique throughout the surgical procedure, where the view of the displayed location of the actual bone relative to the displayed location of the 3-D bone model is at least one of: always displayed on the display 61; displayed at one or more timepoints (e.g., checkpoints) during the surgical procedure; or easily accessible via a window, tab, or other software tool available on the display or via a computer input mechanism (e.g., mouse, keyboard, touchscreen, pendant, controller).
  • The registration accuracy may be further evaluated, confirmed, and/or verified by tracing the edges of the soft tissue “ST” (e.g., cartilage; skin, fat, and/or muscle of the incision that provides access to the bone) to bone interface with a digitizer 120 to generate a 3-D contour, displaying the 3-D contour on the images captured by the visible light camera 52, and comparing the displayed 3-D contour 65 with the displayed locations of the soft tissues as captured by the visible light camera 52.
  • a digitizer 120 may include a digitizer tip 122 for designating the location of points in physical space, and a tracking array 124 having three or more fiducial markers to permit a tracking system to track the location of the digitizer 120 (and the tip 122 ) in space.
  • a user wielding the digitizer 120 may trace the edges of the soft tissue “ST” to bone interface on the bone B.
  • the tracking system records, or digitizes, the location of the digitizer tip 122 while the user traces the edges of the soft tissue “ST” to bone interface with the digitizer to generate a 3-D contour. While the digitizer tip 122 is depicted as a pointed tip, it is appreciated that a blunt tip is operative herein to assure registration of an upper extent of a target surface, as opposed to variable penetration into a soft or otherwise porous bone or target tissue.
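  • Illustratively (the tip-offset calibration and the names below are assumptions, not specified in the disclosure), each recorded contour point is the tracked array pose applied to the digitizer’s tip offset:

    import numpy as np

    def tip_position(tracker_T_array, tip_offset):
        # Digitizer tip in tracking-system coordinates: tracked 4x4 pose of the
        # digitizer's array applied to the calibrated tip offset (array frame).
        return (tracker_T_array @ np.append(tip_offset, 1.0))[:3]

    def digitize_contour(array_poses, tip_offset):
        # Accumulate tip positions while the user traces the soft tissue to bone
        # interface; the result is the 3-D contour in the tracker frame.
        return np.array([tip_position(T, tip_offset) for T in array_poses])
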
  • the 3-D contour represents the edges of the soft tissue “ST” to bone interface in the tracking system coordinate frame. The location of the digitized 3-D contour is then projected onto the location of the registered 3-D bone model.
  • The digitized 3-D contour, the registered bone model, and the projected 3-D contour on the registered bone model may now be displayed on the 2-D images captured by the visible light camera at their corresponding locations using a calibration transformation matrix between the tracking system coordinate frame and the visible light camera coordinate frame.
  • the registration accuracy may be visually evaluated by the user by comparing: i) the displayed location of the digitized 3-D contour (shown on the display 61 as dotted lines 65 ); and ii) the displayed location of the 3-D contour projected onto the registered 3-D bone model.
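  • A minimal sketch of the projection and of a numeric discrepancy measure (nearest-vertex projection is an assumption; a true closest-point-on-mesh query would be more precise):

    import numpy as np

    def project_onto_model(contour_pts, model_vertices):
        # Nearest-vertex projection of each digitized contour point onto the
        # registered bone model.
        d = np.linalg.norm(contour_pts[:, None, :] - model_vertices[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        return model_vertices[idx], d[np.arange(len(contour_pts)), idx]

    def registration_discrepancy(contour_pts, model_vertices):
        # How far the digitized soft tissue to bone contour sits from the
        # registered model surface; large values suggest re-checking the
        # registration or the tracking array fixation.
        _, dist = project_onto_model(contour_pts, model_vertices)
        return float(dist.mean()), float(dist.max())
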
  • the calibration accuracy between the tracking system coordinate frame and visible light camera coordinate frame may also be evaluated by the user by comparing: i) the displayed location of the digitized 3-D contour; and ii) the actual location of the soft tissues “ST” to bone interface as captured by the visible light camera 52 .
  • the images may be updated in real-time (e.g., a video feed) to account for movement of the bone while evaluating the calibration accuracy.
  • the generation of a soft tissue 3-D contour may further be used to account for situations where there are soft tissues present on the actual bone, but the 3-D bone model lacks soft tissue information. Since the 3-D bone model is conventionally generated using CT scans, the 3-D bone model lacks soft tissue information (i.e., the 3-D bone model does not include any information about the covering/surrounding cartilage). Therefore, the displayed location of the 3-D bone model, or an outline of the 3-D bone model 60 (as shown in FIG. 4 ), may not quite match with the displayed location of the actual bone due to the locations or thicknesses of the covering/surrounding soft tissues.
  • the tracking system, visible light camera 52 , and display 61 may all be in communication with each other via a wired or wireless connection.
  • the tracking system, visible light camera 52 , and display 61 may further be in communication with (or include) a computing system comprising one or more computers, each having a processor, memory, and software for performing the functions and operations described herein.
  • a computing system may include image processing software for performing various functions.
  • the image processing software may segment the images captured by the visible light camera 52 , such as the areas of soft tissues “ST”.
  • The image processing software may use the locations of the 3-D contours to assist in the segmentation because the 3-D contour should closely match the shape of the anatomy, which may enable the segmentation/matching to be fully automated.
  • With the soft tissues “ST” segmented from the images captured by the visible light camera 52, the bone without the soft tissues may be visualized to provide the user with a measurement of the registration error.
  • the 3-D bone model may not include any information about the covering/surrounding soft tissues.
  • With the soft tissues segmented out of the images, the displayed location of the actual bone can be compared to the displayed location of the 3-D bone model (also without soft tissue information). If the registration is accurate, the displayed location of the actual bone (with the soft tissues segmented out of the image) will match with the displayed location of the 3-D bone model. If not, the user may view the registration error between the displayed location of the bone (without soft tissue information) and the displayed location of the 3-D bone model (also without soft tissue information).
  • the tracking system coordinate frame and the visible light camera coordinate frame may be calibrated with respect to one another using image processing techniques.
  • the digitizer 120 may be segmented from the images captured by the visible light camera 52 using known image processing techniques.
  • the digitizer 120 may be segmented from the image using the known/expected shape of the digitizer 120 to determine the location of the digitizer 120 in 2-D in real-time.
  • a transformation matrix may then be calculated between the tracking system coordinate frame and the visible light camera coordinate frame using: i) the determined location of the digitizer 120 segmented in the visible light camera coordinate frame; and ii) the tracked location of the digitizer as determined by the tracking system.
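  • Such a solve is a standard 2-D/3-D pose-estimation (PnP) problem; a hedged sketch using OpenCV follows (the feature extraction, the names, and the use of solvePnP are assumptions for illustration, not the disclosure’s prescribed method):

    import numpy as np
    import cv2

    def calibrate_from_segmented_digitizer(img_pts_2d, tracker_pts_3d, K, dist_coeffs=None):
        # img_pts_2d     : Nx2 pixel locations of digitizer features segmented
        #                  from the visible-light images.
        # tracker_pts_3d : Nx3 locations of the same features reported by the
        #                  tracking system (N >= 4).
        # K              : 3x3 camera intrinsics.
        dist_coeffs = np.zeros(5) if dist_coeffs is None else dist_coeffs
        ok, rvec, tvec = cv2.solvePnP(
            tracker_pts_3d.astype(np.float64),
            img_pts_2d.astype(np.float64),
            K.astype(np.float64),
            dist_coeffs,
        )
        if not ok:
            raise RuntimeError("PnP solve failed - collect more correspondences")
        R, _ = cv2.Rodrigues(rvec)
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = tvec.ravel()
        return T                             # camera_T_tracker (4x4)
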
  • The advantages of this technique include: no need for a full calibration with separate or specialized calibration objects; greater robustness; and less maintenance over time, which reduces risk from a safety perspective.
  • an LED 74 on the tracking array 70 is used to illuminate a particular landmark or manually made mark 76 on the bone B.
  • The tracking array 70 shown here includes passive fiducial markers 72 and an LED 74 capable of highlighting an arbitrary spot on the bone B. For example, once the tracking array 70 is attached to the bone B, this LED 74 illuminates a small spot in an arbitrary location on the bone B; the surgeon would then mark that spot 76 with a marking device, illustratively including a purple pen, and then proceed to do the registration. If the tracking array 70 moves relative to the bone B during registration or the procedure, the highlighted spot will almost certainly move away from the marked purple spot 76, indicating a registration error.
  • a single fiducial marker 82 is mounted on the bone B, in relation to a tracking array 80 attached to the bone to provide a redundant reference for monitoring any relative motion between the fiducial marker 82 and the tracking array 80 .
  • After registration, if the tracking array 80 were to move or bend, the tracking system would see the geometry change between the tracking array markers 84 and the single fiducial marker 82. An alarm or prompt may be triggered to alert the surgeon that the tracking array 80 has moved relative to the bone, indicating the registration is no longer accurate.
  • the fiducial marker 82 may be a single LED powered with a small embedded battery, which would most likely be a disposable battery.
  • Alternatively, two full markers may be attached to a bone at a given location, with a main marker and a smaller, less accurate marker, to check that the relative transform between the markers remains stable.
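  • A sketch of such a stability check between a main tracked marker and a redundant check marker (the threshold values and names are illustrative assumptions):

    import numpy as np

    def relative_transform(tracker_T_main, tracker_T_check):
        # Pose of the check marker expressed in the main marker's frame.
        return np.linalg.inv(tracker_T_main) @ tracker_T_check

    def array_moved(rel_at_registration, rel_now, max_mm=1.0, max_deg=1.0):
        # Compare the current relative transform with the one captured at
        # registration; drift beyond the thresholds suggests a marker has moved
        # relative to the bone and the registration should be re-checked.
        dT = np.linalg.inv(rel_at_registration) @ rel_now
        trans_mm = float(np.linalg.norm(dT[:3, 3]))
        cos_a = np.clip((np.trace(dT[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
        rot_deg = float(np.degrees(np.arccos(cos_a)))
        return (trans_mm > max_mm) or (rot_deg > max_deg), trans_mm, rot_deg
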
  • a digitizer may be used as needed to recheck a particular landmark or manufactured mark, such as a purple pen mark, if any question on the registration arises.

Abstract

A system for confirming registration of a bone involved in a joint replacement procedure is provided. The system includes a three-dimensional (3-D) bone model of the bone registered to a bone location, a visible light camera, and a display. The visible light camera captures images of the bone in the camera’s field-of-view (FOV). The display is configured to display the captured images of the FOV and an image of the 3-D bone model, where the displayed location of the image of the 3-D bone model corresponds to the location of the 3-D bone model registered to the bone location. The registration may be confirmed if the displayed location of the image of the bone as captured by the visible light camera corresponds to the displayed location of the 3-D bone model.

Description

    RELATED APPLICATION DATA
  • This application is a continuation-in-part of U.S. Pat. Application No. 17/148,612, filed Jan. 14, 2021, which is a continuation of U.S. Pat. Application No. 15/769,040, filed Apr. 17, 2018, now U.S. Pat. No. 10,905,496, issued Feb. 2, 2021, which is a U.S. National Phase of International Pat. Application No. PCT/US2016/062008, filed Nov. 15, 2016, which claims priority to U.S. Provisional Application No. 62/255,618, filed Nov. 16, 2015, the entire disclosure of which is hereby incorporated by reference as if set forth fully herein.
  • FIELD OF THE INVENTION
  • The present invention generally relates to the field of computer assisted orthopaedic surgery and in particular, to a new and useful process and system for confirming registration of tracked bones in surgery.
  • BACKGROUND OF THE INVENTION
  • Total joint replacement (TJR) (also called primary total joint arthroplasty) is a surgical procedure in which the articulating surfaces of a joint are replaced with prosthetic components, or implants. TJR, especially for hips, knees, shoulders, and ankles restores functionality while greatly reducing pain associated with osteoarthritis in patients. The benefits of TJR are tempered by complications associated with replacement misfit. Less than perfect fit of a replacement joint places unbalanced forces on the implant that can lead to premature implant wear and discomfort. When such wear becomes extreme, revision surgery is required.
  • TJR typically involves the removal of the articulating cartilage surface of the joint including a varying amount of bone depending on the joint and the replacement implant being used. This cartilage and bone is then replaced with a synthetic, typically metal and/or plastic, implant that is used to create a new joint surface. The position, orientation and compliance of the prosthetics implanted into the joint are critical factors that have a significant effect on the clinical outcome of the patient. Therefore, computer assisted surgical devices are gaining popularity as a tool to pre-operatively plan and precisely execute the plan to ensure an accurate final position and orientation of the prosthetics within the patient’s bone that can improve long term clinical outcomes and increase the survival rate of the prosthesis. In general, the computer assisted surgical systems include two components, an interactive pre-operative planning software program and a computer assisted surgical device that utilizes the pre-operative data from the software to assist the surgeon in precisely executing the procedure.
  • The conventional interactive pre-operative planning software generates a three-dimensional (3-D) model of the patient’s bony anatomy from a computed tomography (CT) or magnetic resonance imaging (MRI) image dataset of the patient. A set of 3-D computer aided design (CAD) models of the manufacturer’s prosthesis are pre-loaded in the software that allows the user to place the components of a desired prosthesis to the 3-D model of the boney anatomy to designate the best fit, position and orientation of the implant to the bone. The user can then save this pre-operative planning data to an electronic medium that is loaded and read by a surgical device to assist the surgeon intra-operatively in executing the plan.
  • Additionally, one of the main goals of computer-assisted surgery is to define a patient specific plan and precisely execute the procedure, in a timely manner on a patient. The accuracy of the cut volume for a given implant is critical and errors can accumulate based on registration error, cutter manufacturing tolerances and implant manufacturing tolerances. Registration techniques well known in the art such as point to surface registration can align the coordinate frames of a patient’s bone to the coordinate frames of a 3-D model of a patient’s bone and to the coordinate frame of the surgical device.
  • The registration of the location of the bone intra-operatively within the workspace of a surgical robot serves to determine the precise location and orientation of the bone within the workspace of the robot. In some embodiments, this may be accomplished by probing radiopaque fiducial markers placed into or on the bone that were installed prior to pre-operative imaging. A fiducial marker is appreciated to be a material with an opacity that is different than that of surrounding subject tissue or a reference point capable of detection by an external source (e.g. optical cameras, x-rays, radio frequency). Examples of fiducial markers include a radiopaque pin, an active device such as a radio frequency identification (RFID) tag or light emitting diode (LED), a passive retro-reflective sphere, or a combination thereof. In still other inventive embodiments, a registration guide is applied that fits on the bone, or a surface matching algorithm is used, or any other method to determine the orientation of the subject’s operative bone. The usage of such techniques is further detailed in: PCT/IB2013/002311, entitled SYSTEM AND METHOD FOR REGISTRATION IN ORTHOPAEDIC APPLICATIONS; S. Cohan, “ROBODOC achieves pinless registration,” The Industrial Robot, 2001, 28(5), p. 381; and P. J. Besl, “A Method for Registration of 3-D Shapes,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992, 14, pp. 239-256.
  • Once the registration is complete, it is imperative that the registration is verified and remains accurate throughout the entire procedure. If a tracking system monitors the POSE of the registered bone via tracking arrays fixed thereto, any relative movement between the tracking array and the bone negates the accuracy of the registration. If the bone is not re-registered, the procedure cannot be executed according to the plan. Often, the tracking array is bumped unintentionally and there is no method for signaling or alerting the surgeon that the registration is no longer accurate. Or, the fixed tracking array may drift over time, accumulating errors in the registration. Typically it is not until the surgeon notices that the computer-assisted device is off-target, or observes an erroneous reading on the device, that the compromised registration becomes apparent.
  • Additionally, there is no intuitive visual process for verifying the accuracy of the registration. In certain registration procedures, a final verification step may be implemented which involves digitizing several additional points on the bone after the registration is complete. The additional points confirm and verify the POSE of the bone. A prompt may then indicate that the registration was successful. However, this verification process can increase the surgical time.
  • Finally, after the TJR is complete, the surgeon may articulate the joint to ensure the proper range of motion and kinematics are achieved. It may be desirable to restore the joint to a similar joint motion, kinematics, or articulation as the pre-cut anatomy so the soft tissues are well balanced, and there is decreased stress on the surrounding muscles and other anatomical structures. Having a process to compare the articulation of the now modified joints to the pre-articulating motion may be highly beneficial for assessing not only the outcome of the procedure, but to also provide a mechanism for recommending intraoperative modifications for the joint to achieve the pre-articulating motion.
  • Thus, there exists a need for a system and process to verify and monitor the accuracy of bone registration prior to and during a computer-assisted surgical procedure. There further exists a need for a process to articulate a joint prior to making any cuts, so as to compare the articulation after the trial implants are in place to see the differences, and possibly recommend corrections so as to obtain a similar articulation as the pre-cut anatomy. There further exists a need to rely on this comparison as a way to recommend adjustments to the tibial internal-external rotation in knee replacement TJR.
  • SUMMARY OF THE INVENTION
  • A system for confirming registration of a bone is provided. The system includes a three-dimensional (3-D) bone model of the bone registered to a bone location, a visible light camera, and a display. The visible light camera captures images of the bone in the camera’s field-of-view (FOV). The display is configured to display the captured images of the FOV and an image of the 3-D bone model, where the displayed location of the image of the 3-D bone model corresponds to the location of the 3-D bone model registered to the bone location. The registration may be confirmed if the displayed location of the image of the bone as captured by the visible light camera corresponds to the displayed location of the 3-D bone model.
  • A system for confirming registration of a bone is also provided. The system includes a three-dimensional (3-D) bone model registered to a bone location. A digitizer having a digitizer tip and a tracking system for tracking movement of the digitizer. A visible light camera for capturing images of a bone in the field-of-view (FOV) of the camera. A computer with a processor configured to record positions of the digitizer tip as the digitizer is moved about the bone, and project the recorded positions onto the 3-D bone model registered at the bone location. A display is configured to display captured images of the FOV, an image of the recorded positions, and an image of the recorded positions projected onto the 3-D bone model. The registration may be confirmed if the displayed location of the image of the recorded positions projected onto the 3-D bone model corresponds to the displayed location of the image of the recorded positions.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention is further detailed with respect to the following drawings. These figures are not intended to limit the scope of the present invention but rather illustrate certain attributes thereof.
  • FIG. 1 illustrates a 3-D model of the operative bones of a knee joint replacement procedure;
  • FIG. 2 illustrates a process flowchart according to the present invention;
  • FIG. 3 illustrates a collision between 3-D models of the operative bones while a surgeon moves the actual bone;
  • FIG. 4 illustrates a system for the verification of bone registration using a high definition (HD) camera mounted on a surgical light and including an integrated tracking system;
  • FIGS. 5A and 5B illustrate a tracking array with an LED that highlights a relative position on the operative bone to monitor registration accuracy;
  • FIG. 6 illustrates a single fiducial marker mounted on the bone relative to a tracking array attached to the bone to monitor registration accuracy.
  • FIG. 7 illustrates a process flowchart for comparing a virtual motion to a physically tracked bone motion according to embodiments of the invention.
  • FIG. 8 depicts a digitizer tracing the edges of a soft tissue to bone interface for display on a captured image from a visible light camera for confirming, verifying, or evaluating the registration accuracy in accordance with embodiments of the invention.
  • DESCRIPTION OF THE INVENTION
  • The present invention has utility as a system and process for confirming the registration of tracked bones prior to and during a computer-assisted surgical procedure. Embodiments of the inventive process and system confirm registration of a subject’s bones by articulating the physical tracked bones relative to 3-D displayed models of the tracked bones. The articulation of the physically tracked bones and corresponding motion of the 3-D displayed models may be further used to adjust the bone orientation or provide intraoperative joint modification recommendations for improved artificial joint positioning, as compared to conventional techniques.
  • The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention. The invention described herein illustratively uses total knee arthroplasty as an example. Although total knee arthroplasty is one procedure that can benefit from the disclosed embodiments, other surgical procedures can illustratively include surgery to the hip joint, spine, shoulder joint, elbow joint, ankle joint, jaw, a tumor site, joints of the hand or foot, and other appropriate surgical sites.
  • As used herein, a fiducial marker refers to a point of reference capable of detection. Examples of a fiducial marker may include: an active transmitter, such as a light emitting diode (LED) or electromagnetic emitter; a passive reflector, such as a plastic sphere with a retro-reflective film; a distinct pattern or sequence of shapes, lines or other characters; acoustic emitters or reflectors; magnetic emitters or reflectors; radiopaque markers; and the like or any combinations thereof. A tracking array is an arrangement of a plurality of fiducial markers in/on a rigid body of any geometric shape, where each tracking array has a unique geometry of fiducial markers or a unique blinking frequency if active LEDs are used to distinguish between each of the tracked objects.
  • Disclosed herein is the use of a tracking system. Optical tracking systems generally include at least two receivers to detect one or more fiducial markers in three-dimensional space. The receivers are in communication with at least one processor for processing the receiver output. The processing determines the position and orientation (pose) of a tracking array using various algorithms such as time-of-flight or triangulation. The receiver may detect the location of a fiducial marker through a variety of mechanisms including, but not limited to, visible light, electromagnetic radiation, and infrared, as well as any shape, pattern, line, sequence or character recognition. It should be appreciated that other tracking systems known in the art may be used to track objects using radio frequency, magnetics, accelerometers, gyroscopes, acoustic energy or signals, and mechanical linkages. Examples of tracking systems to determine the pose of an object are described in U.S. Pat. Nos. 5,282,770, 6,061,644, and 7,302,288. An example of a mechanical tracking system having mechanical linkages is described in U.S. Pat. No. 6,322,567.
  • Also referenced herein are computer-assisted devices which are to be considered synonymous with computer-aided surgical system, robotic surgical systems, navigation assisted surgical system, image-guided surgical systems and the like. The computer-assisted device may be, for example, a 2-6 degree of freedom hand-held surgical system, a serial chain manipulator system, a parallel robotic system, or a master-slave robotic system, as described in U.S. Pat. Nos. 5,086,401, 7,206,626, 8,876,830, and 8,961,536, U.S. Pat. App. No. 2013/0060278 and U.S. Prov. App. 62/054,009. Such commercial systems illustratively include for example the NavioPFS™ Robotic Resurfacing System (Blue Belt Technologies), the RIO® Robotic System (Mako Surgical Corp.), and the TSolution One™ Surgical System (Think Surgical). It should be appreciated that navigated or tracked instruments may also be used with the subject matter disclosed herein.
  • A process is provided to confirm the registration of bones involved in a joint replacement procedure. The process includes the use of pre-operative planning software to generate a 3-D model of the patient's bony anatomy from a computed tomography (CT) or magnetic resonance imaging (MRI) image dataset of the patient. A set of 3-D computer aided design (CAD) models of the manufacturer's prostheses is pre-loaded in the software, which allows the user to place the components of a desired prosthesis on the 3-D model of the bony anatomy to designate the best fit, position, and orientation of the implant to the bone. This is accomplished with commercially available systems such as the RIO® Interactive Orthopedic System manufactured by Stryker Mako (Ft. Lauderdale, FL) and the TSolution One™ Surgical System manufactured by THINK Surgical, Inc. (Fremont, CA). An exemplary 3-D modeling of a joint that will be subject to replacement is illustrated in FIG. 1. FIG. 1 depicts a 3-D model of a knee joint including a coronal and sagittal view of the distal femur 12 and proximal tibia 14. The 3-D virtual models may be displayed on a monitor 10 to facilitate pre-operative planning or to monitor the motion of the tracked bones intra-operatively.
  • A computer simulation may be run to optimize the multiple variables of positional and rotational degrees of freedom to achieve an optimized artificial joint geometry. A finite element analysis is readily applied based on these variables to achieve a balanced force weighting on opposing sides of the joint through a preselected swing angle for the joint. It is appreciated that patient weight, height, and arm swing counterbalance are additional factors that can be built into the optimization routine to predict optimal joint arthroplasty positioning. An example of analyzing the positional and geometric rotational degrees of freedom using a computer simulated model is described in A.C. Godest et al., “Simulation of a knee joint replacement during a gait cycle using explicit finite element analysis” Journal of Biomechanics 35 (2002) 267-275.
  • With reference to FIG. 2, an inventive process schematic is shown generally at 20. The three-dimensional (3D) models of operative bones involved in a joint replacement procedure are generated, as physical or virtual models, at step 22 and are made available to the surgeon during the procedure. The virtual models may be part of an overall surgical plan that further includes instructions for a computer-assisted surgical device, the final implant position, or the results of any such computer simulations generated for the optimal position of the bones or implants at step 24.
  • A tracking device such as a tracking array or a mechanical tracking probe is attached to each operative bone to allow 6-degree-of-freedom (DOF) tracking during the procedure at step 26. The bones may be tracked by a tracking system as previously described. The 3D models of each operative bone are then registered to the patient's actual operative bone at step 28. Systems and methods of optical or mechanical tracking and registration are described in U.S. Pat. No. 6,033,415, which is incorporated herein by reference in its entirety. Systems and processes for pre-operative planning and precise bone removal are also known in the art and include those detailed in WO 2015006721 A1.
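  • As a non-limiting illustration of one way such a registration can be computed (a minimal sketch, not necessarily the method of the incorporated references), the following Python fragment performs paired-point rigid registration of a 3-D bone model to digitized bone landmarks using a singular value decomposition; the landmark values and function names are illustrative assumptions.

import numpy as np

def rigid_register(model_pts, bone_pts):
    """Return rotation R and translation t that map model points onto bone points."""
    P = np.asarray(model_pts, dtype=float)
    Q = np.asarray(bone_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)                 # centroids
    H = (P - cp).T @ (Q - cq)                               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical landmarks: model-frame points and the same points digitized on the bone
model = np.array([[0, 0, 0], [40, 0, 0], [0, 25, 0], [0, 0, 15]], dtype=float)
bone = model + np.array([12.0, -3.0, 30.0])                 # pure translation, for illustration only
R, t = rigid_register(model, bone)
rms = np.sqrt(((model @ R.T + t - bone) ** 2).sum(axis=1).mean())
print("RMS registration error (mm):", round(rms, 3))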
  • At step 30, the surgeon moves a tracked bone having a tracking device associated therewith, and a display shows a virtual representation of the motion in real-time. As a result, a surgeon can observe the position of the bones involved in the joint arthroplasty in a way that is impossible from viewing the actual bones. This is a novel attribute of the present invention that becomes increasingly helpful as the bones tagged with a tracking device approach the optimal positioning. In some inventive embodiments, the display or jig includes feedback indicators such as markers, a blinking screen, arrows, or highlighted areas on the virtual bone models, that may be used to indicate or guide the surgeon to an optimal position and orientation of a given bone based on the pre-surgical plan and any simulations associated therewith. The surgeon may then assess the kinematics of the operative bones in the position and orientation defined by the pre-surgical plan prior to making any bone cuts.
  • In a particular embodiment, the surgeon or system determines if the movements of the tracked bones correspond to the movements of the 3-D models at step 32. If the tracked bones are moving in correspondence with the 3-D models, then the surgeon can confirm that the registration is accurate at step 34. If the tracked bones do not move in correspondence with the 3-D models, or the motion is unexpected, then the registration is likely not accurate. In that case, the surgeon re-registers the bone. This is described in more detail below.
  • In an illustrative example of a specific application of the invention, with respect to FIG. 3, the surgeon may verify or confirm the accuracy of the registration by making sure that the tracked femur and tibia do not collide by viewing the 3-D models on a display 40 (e.g. monitor, television); or that the femoral head stays approximately in the center of the acetabulum during articulation of the femur relative to the pelvis, each tracked, by viewing the 3-D models on the display 40 during articulation. This provides an intuitive visual process for confirming the accuracy of the bone registration. If, for example, the 3D virtual model of the femur 42 appears to impinge or collide with the 3D virtual model of the tibia 44 throughout articulation as seen in the bounding circle 46, or the femoral head rotates outside the center of the acetabulum, then the registration is likely not accurate and the surgeon can re-register the bone before proceeding with the procedure. If the registration appears accurate, the surgeon may confirm or verify the registration by way of an input mechanism (e.g. mouse, keyboard, joystick, pendant, touchscreen display, microphone) in communication with the tracking system or computer-assisted device. A prompt 48 may be displayed on the display 40 for the surgeon to select whether the registration is confirmed or not. The surgeon may also confirm the registration to the system or device by providing a particular signal to the tracking system. For example, the surgeon can place a tracked digitizer probe in a specific location relative to the 3-D bone model, or manipulate the bones in a particular pattern or shape. This allows the surgeon to quickly confirm the registration without having to use an input mechanism.
  • It should be appreciated that the surgeon can monitor and confirm the registration throughout an entire procedure by moving the tracked bones and visualizing the corresponding motion of the 3-D virtual models on the display. In certain applications, such as TKA, articulating the joint throughout flexion/extension is a normal step of the procedure. Therefore, confirming the registration throughout these normal procedural steps is quick, efficient, and does not require any additional steps outside of a normal TKA. Additionally, viewing such models devoid of skin, connective tissue, fat, and blood provides insights not available to a surgeon through actual inspection of the joint before or during surgery.
  • In an inventive embodiment, collision detection may be implemented with a computer program or through other types of algorithms that provide a warning to a surgeon or other medical personnel if the 3D virtual models collide during the articulation of the tracked bones. One example of a collision detection algorithm that can be modified by one of skill in the art to warn a surgeon of inaccurate registration is described in Madera-Ramírez, Francisco. “An Introduction to the Collision Detection Algorithms.” Abstraction and Application Magazine 5 (2014). Other algorithms can also be implemented to provide a warning or alert the surgeon that the corresponding motion of the 3-D virtual models is outside of specified thresholds. For example, the algorithm may monitor the relative rotation of the femur with respect to the center of the acetabulum. If the femoral head does not rotate within ±2 mm of the natural center of the acetabulum, then an alert may be generated. In another example, an alert may be generated if the bone models move apart beyond a threshold limit. In TKA, if the closest point between the tibia and either the medial or lateral condyle region of the distal femur becomes significantly greater than the expected total cartilage thickness or other specified distance threshold, the registration is likely off or the tracking arrays may have moved relative to the bone, and an alert is generated.
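  • By way of a minimal, hedged sketch (the thresholds, point sets, and function names below are illustrative assumptions, not the claimed algorithm), such alerts can be generated by monitoring simple geometric quantities computed from the tracked 3-D models:

import numpy as np

HEAD_CENTER_TOL_MM = 2.0   # assumed tolerance on femoral head drift from the acetabular center
MAX_JOINT_GAP_MM = 8.0     # assumed cartilage-thickness-plus-margin limit for TKA

def hip_center_alert(head_center, acetabulum_center):
    """True if the tracked femoral head has drifted beyond tolerance."""
    drift = np.linalg.norm(np.asarray(head_center) - np.asarray(acetabulum_center))
    return drift > HEAD_CENTER_TOL_MM, drift

def knee_gap_alert(femur_condyle_pts, tibia_plateau_pts):
    """True if the closest femur-to-tibia distance exceeds the expected gap."""
    femur = np.asarray(femur_condyle_pts, dtype=float)[:, None, :]
    tibia = np.asarray(tibia_plateau_pts, dtype=float)[None, :, :]
    closest = np.linalg.norm(femur - tibia, axis=2).min()
    return closest > MAX_JOINT_GAP_MM, closest

alert, drift = hip_center_alert([1.2, 0.5, -0.8], [0.0, 0.0, 0.0])
if alert:
    print(f"ALERT: femoral head {drift:.1f} mm from acetabular center; consider re-registration")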
  • In certain inventive embodiments, registration of tracked bones may also be confirmed by observing the full extension position of a subject's limb and comparing the limb extension to the virtual full extension position to make sure that the achieved extension is correct. Furthermore, as a comparison control, a measurement of the articulation of the joint prior to making any surgical or altering cuts may be saved for later comparison with the post-treatment or post-operative articulation, measured after the trial implants are in place, to identify differences and possibly recommend corrections. As a result, a surgeon can obtain joint positions that retain pre-cut bone positions or controlled modification of dysfunctional pre-cut geometries. In a specific inventive embodiment, a pre- and post-operative comparison may be used as a way to recommend adjustments for the subject's tibial internal-external rotation.
  • Furthermore, surgical errors during bone resection may lead to abnormal joint motion after trial implantation, and specific embodiments of the invention are used to visualize the abnormal joint motion, where the operative bones, with trial implants, are tracked during motion, and 3-D models of the pre-operative bones are displayed rather than the post-operative bones. Any abnormal motion, such as collisions between the bones, would indicate changes in joint kinematics and may be used to direct the surgeon's choice of intraoperative correction. For example, in TKA, if a collision detection algorithm detects ±2 mm of collision in varus-valgus rotation between the medial femoral condyle and the medial aspect of the tibia, the computer-assisted device or tracking system may recommend to the surgeon, via a prompt on a display, that 2 mm of the medial aspect of the tibia should be resected to obtain the pre-cut articulating motion of the knee. These recommendations can be extended to other alignment characteristics such as the tibial slope, internal-external rotation of the femoral or tibial component, varus-valgus rotation, mechanical axis alignment, kinematic alignment, ligament balancing, soft tissue balancing, flexion/extension gap, and any combinations thereof to achieve the pre-cut anatomical articulation.
  • FIG. 7 generally outlines another process 90 using the physical articulation of the tracked bones to provide bone modification recommendations and/or further aid a surgeon in obtaining a pre-operatively planned result. The generation of the 3-D bone models (step 92), and the planning of the placement of the virtual implants relative to the bone models (step 94) with or without the aid of the computer simulations (step 96), are all accomplished as described above. Next, the virtual motion of the 3-D bone models with the virtual implants is simulated. The surgeon may further adjust the planned position of the virtual implants to achieve a desired virtual motion that the surgeon prefers the actual bone motion to mimic post-operatively. The desired virtual motion is then saved at step 98 for use intra-operatively. At step 100, the surgeon executes the procedure and modifies the bone according to the plan as described above. After modifying the bone, trial components are placed in the joint and the surgeon physically articulates the tracked bones at step 102. The saved virtual motion is then compared with the actual articulation of the tracked bones at step 104. In a specific embodiment, the saved virtual motion of the 3-D bone models with the virtual implants is overlaid on the 3-D bone models registered to the actual bones. To ensure the saved virtual motion corresponds with the physical motion, one of the virtual bones, or a portion thereof, is mapped to the corresponding 3-D bone model, or portion thereof, registered to the actual bone. For example, the femoral head and neck of the pre-operative virtual model associated with the saved virtual motion is mapped to the femoral head and neck of the virtual model registered to the bone. Therefore, the surgeon can observe how the actual motion of the non-mapped bone corresponds to the saved virtual motion of the non-mapped bone. For TKA, the mapped bone may be the femur, where the surgeon can observe how the actual tibia moves in relation to the actual femur and compare this to how the virtual tibia moved relative to the virtual femur during planning. The surgeon may have the ability to speed up and slow down the virtual motion, or stop the virtual motion at particular articulation points, to aid in the comparison. If the motions are similar, the surgeon can complete the procedure (step 106) by inserting the final implants and closing the surgical site. If the motions are dissimilar, the computer-assisted system may provide recommendations (step 108) to re-modify the bone such that the actual bone motion corresponds with the saved virtual motion.
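  • A minimal sketch of how the saved virtual motion might be compared with the tracked articulation, under the assumption that both are available as sequences of 4x4 homogeneous femur and tibia poses with the femur as the mapped bone; the tolerances and names are illustrative assumptions, not the disclosed process:

import numpy as np

def relative_pose(T_world_femur, T_world_tibia):
    """Express the tibia pose in the femur frame (the femur acts as the mapped bone)."""
    return np.linalg.inv(T_world_femur) @ T_world_tibia

def pose_difference(T_a, T_b):
    """Translation (mm) and rotation (deg) between two relative poses."""
    D = np.linalg.inv(T_a) @ T_b
    trans = float(np.linalg.norm(D[:3, 3]))
    cos_angle = np.clip((np.trace(D[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return trans, float(np.degrees(np.arccos(cos_angle)))

def flag_dissimilar_samples(planned, tracked, tol_mm=2.0, tol_deg=3.0):
    """Return articulation samples where actual motion departs from the saved virtual motion."""
    flagged = []
    for i, ((pf, pt), (tf, tt)) in enumerate(zip(planned, tracked)):
        d_mm, d_deg = pose_difference(relative_pose(pf, pt), relative_pose(tf, tt))
        if d_mm > tol_mm or d_deg > tol_deg:
            flagged.append((i, d_mm, d_deg))
    return flagged

# Trivial usage: identical planned and tracked motion flags nothing
I = np.eye(4)
print(flag_dissimilar_samples([(I, I)], [(I, I)]))   # -> []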
  • In FIG. 4, a particular embodiment of a system is shown generally at 50 for facilitating embodiments of the aforementioned processes including the confirmation, verification and monitoring of bone registration. The system 50 may include a high definition (HD) camera 52 mounted on a surgical light 54 that has an integrated tracking system with optical receivers 56. The HD camera 52 may be calibrated to a tracking system using one or more marker light emitting diodes (LEDs) 58 fitted on the bone B prior to registration, and then during a post-registration assessment, a 3D reconstructed picture of the bone model from the HD camera's perspective, or an outline of the bone model 60 from that perspective, could be superimposed on the video 62 from the camera 52, and this should visually correspond to the image of the bone B in the video 62. If, for example, the tracking array 64 on the bone B should move, or be bent during the procedure, the superimposed bone model 60 would shift away from being aligned with the bone B in the video 62. This would give the surgeon an easily checked indication in real-time of whether the registration is still fundamentally accurate.
  • An example of using a visible light camera 52 (e.g., an HD video camera, charged-coupled device (CCD) camera, active-pixel sensor (APS) camera, complementary MOS (CMOS) camera, and other photosensors) with a tracking system having tracking receivers 56 (e.g., optical detectors, infrared cameras, electromagnetic sensors, acoustic sensors, radio-frequency sensors) for confirming, verifying, or monitoring the registration may include the following. First, the tracking system coordinate frame and the visible light camera coordinate frame are calibrated with respect to one another. This may be accomplished by: (i) determining the location of an object in the field-of-view (FoV) of the visible light camera 52; (ii) determining the location of the same object in the FoV of the tracking receivers 56; and (iii) calculating the transformation matrix between the coordinate frame of the tracking system and the coordinate frame of the visible light camera using the determined locations of the object in (i) and (ii). The object may be, for example: (a) one or more LEDs 58 affixed to the bone B; (b) a dedicated calibration object; or (c) an LED associated with a tool used during the procedure (e.g., a surgical device, a digitizer, etc.). After the 3-D bone model is registered to the bone, the tracking system can track the location of the 3-D bone model (registered to the bone) in the tracking system coordinate frame. The tracking system may track the location of the 3-D bone model by: i) tracking the location of a tracking array affixed to the bone; and ii) applying a registration transformation matrix that defines the location of the 3-D bone model registered to the bone with respect to the tracking array coordinate frame. Then, using the transformation matrix from the calibration, the 3-D bone model, or an outline of the bone model 60, may be displayed on the images captured by the visible light camera 52 at that tracked/registered location. In other words, the tracked location of the 3-D bone model, or the outline of the bone model 60, in the coordinate frame of the tracking system is transformed (using the calibration transformation matrix or a best fit model) to the same location in the coordinate frame of the visible light camera 52 for display on the images captured by the visible light camera 52. It is appreciated that the transformation matrix can be determined in Cartesian, spherical, or cylindrical coordinates, as the computational complexity can be reduced by selection of a suitable coordinate system. S. Guo et al., J Biomed Opt. 2015 Jan; 20(1): 016018 details one such methodology. If the registration is accurate, the 3-D bone model, or the outline of the bone model 60, will match and/or overlap with the location of the actual bone in the images (e.g., video 62) captured by the visible light camera 52 as shown on the display 61 in FIG. 4. If the registration is not accurate, a user can quickly view the registration errors by comparing the displayed location of the actual bone in the images to the displayed location of the 3-D bone model in the same images. The images may be updated in real-time (e.g., a video feed) to account for any movement of the bone while evaluating the registration accuracy. In the event the registration is not accurate, the 3-D bone model may be re-registered to the bone.
A user may monitor the accuracy of the registration using this technique throughout the surgical procedure, where the view of the displayed location of the actual bone relative to the displayed location of the 3-D bone model is at least one of: always displayed on the display 61; displayed at one or more timepoints (e.g., checkpoints) during the surgical procedure; or easily accessible via a window, tab, or other software tool available on the display or via a computer input mechanism (e.g., mouse, keyboard, touchscreen, pendant, controller).
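  • A minimal sketch of the overlay step described above, assuming a pinhole camera model, a calibration transform T_cam_track (tracking frame to camera frame), and a registration transform T_track_model (model frame to tracking frame); the intrinsic parameters and names are illustrative assumptions rather than the system's actual implementation. If the registration is accurate, the returned pixel locations should trace the bone as it appears in the camera image; a visible offset suggests registration or calibration error.

import numpy as np

K = np.array([[1400.0, 0.0, 960.0],      # assumed HD camera intrinsics (pixels)
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])

def overlay_pixels(model_vertices, T_track_model, T_cam_track, K=K):
    """Map registered bone-model vertices to pixel coordinates for display overlay."""
    v = np.asarray(model_vertices, dtype=float)
    v_h = np.c_[v, np.ones(len(v))]                  # homogeneous model-frame coordinates
    v_track = (T_track_model @ v_h.T).T              # model frame -> tracking frame (registration)
    v_cam = (T_cam_track @ v_track.T).T[:, :3]       # tracking frame -> camera frame (calibration)
    px = (K @ v_cam.T).T
    return px[:, :2] / px[:, 2:3]                    # perspective divide -> (u, v) pixels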
  • With reference to FIG. 8, the registration accuracy may be further evaluated, confirmed, and/or verified by tracing the edges of the soft tissue "ST" (e.g., cartilage; skin, fat, and/or muscle of the incision that provides access to the bone) to bone interface with a digitizer 120 to generate a 3-D contour, displaying the 3-D contour on the images captured by the visible light camera 52, and comparing the displayed 3-D contour 65 with the displayed locations of the soft tissues as captured by the visible light camera 52. A digitizer 120 may include a digitizer tip 122 for designating the location of points in physical space, and a tracking array 124 having three or more fiducial markers to permit a tracking system to track the location of the digitizer 120 (and the tip 122) in space. A user wielding the digitizer 120 may trace the edges of the soft tissue "ST" to bone interface on the bone B. The tracking system records, or digitizes, the location of the digitizer tip 122 while the user traces the edges of the soft tissue "ST" to bone interface with the digitizer to generate a 3-D contour. While the digitizer tip 122 is depicted as a pointed tip, it is appreciated that a blunt tip is operative herein to assure registration of an upper extent of a target surface, as opposed to variable penetration into a soft or otherwise porous bone or target tissue. The 3-D contour represents the edges of the soft tissue "ST" to bone interface in the tracking system coordinate frame. The location of the digitized 3-D contour is then projected onto the location of the registered 3-D bone model. The digitized 3-D contour, the registered bone model, and the projected 3-D contour on the registered bone model may now be displayed on the 2-D images captured by the visible light camera at their corresponding locations using a calibration transformation matrix between the tracking system coordinate frame and the visible light camera coordinate frame. The registration accuracy may be visually evaluated by the user by comparing: i) the displayed location of the digitized 3-D contour (shown on the display 61 as dotted lines 65); and ii) the displayed location of the 3-D contour projected onto the registered 3-D bone model. The calibration accuracy between the tracking system coordinate frame and visible light camera coordinate frame may also be evaluated by the user by comparing: i) the displayed location of the digitized 3-D contour; and ii) the actual location of the soft tissue "ST" to bone interface as captured by the visible light camera 52. The images may be updated in real-time (e.g., a video feed) to account for movement of the bone while evaluating the calibration accuracy.
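  • A minimal sketch, under the assumption that the registered bone model is available as a vertex cloud in the tracking frame, of projecting the digitized contour points onto the model by nearest-vertex lookup; the names are illustrative, and this is only one possible projection scheme, not necessarily the system's. Both the digitized contour and its projection can then be mapped into camera pixels with the same overlay transform shown earlier.

import numpy as np

def project_contour_onto_model(contour_pts_track, model_vertices_track):
    """Return the nearest model point for each digitized contour point and the residual gaps."""
    contour = np.asarray(contour_pts_track, dtype=float)
    verts = np.asarray(model_vertices_track, dtype=float)
    dists = np.linalg.norm(contour[:, None, :] - verts[None, :, :], axis=2)  # (C, V) distances
    nearest = verts[dists.argmin(axis=1)]          # projected contour on the registered model
    gaps = dists.min(axis=1)                       # large gaps hint at registration error
    return nearest, gaps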
  • In a particular embodiment, the generation of a soft tissue 3-D contour may further be used to account for situations where there are soft tissues present on the actual bone, but the 3-D bone model lacks soft tissue information. Since the 3-D bone model is conventionally generated using CT scans, the 3-D bone model lacks soft tissue information (i.e., the 3-D bone model does not include any information about the covering/surrounding cartilage). Therefore, the displayed location of the 3-D bone model, or an outline of the 3-D bone model 60 (as shown in FIG. 4), may not quite match the displayed location of the actual bone due to the locations or thicknesses of the covering/surrounding soft tissues. By displaying: i) the location of the digitized 3-D contours of the soft tissue "ST" to bone interface (shown as the displayed 3-D contour 65); ii) the location of the actual bone as captured by the visible light camera 52 (shown as bone 63); and iii) the registered/tracked location of the 3-D bone model, or outline of the 3-D bone model 60 (as shown in FIG. 4), all on the display 61, the user may view areas where the displayed location of the 3-D bone model, or outline 60, does not exactly align with the displayed location of the actual bone due to the locations or thicknesses of the soft tissues, where the locations of the soft tissues are outlined by the displayed 3-D contour 65 (here, the displayed 3-D contour 65 is the digitized 3-D contour). The frame in FIG. 8 denotes the boundaries of the display 61. Therefore, the registration accuracy may still be confirmed when the displayed location of the 3-D bone model substantially aligns with the displayed location of the actual bone but is misaligned by a thickness (e.g., an estimated or measured thickness of the cartilage) of the soft tissues at the soft tissue locations as outlined by the displayed 3-D contour 65.
  • It should be appreciated that the tracking system, visible light camera 52, and display 61 may all be in communication with each other via a wired or wireless connection. The tracking system, visible light camera 52, and display 61 may further be in communication with (or include) a computing system comprising one or more computers, each having a processor, memory, and software for performing the functions and operations described herein.
  • In a specific embodiment, still with reference to FIG. 8, a computing system may include image processing software for performing various functions. In one embodiment, the image processing software may segment the images captured by the visible light camera 52, such as the areas of soft tissues "ST". The image processing software may use the locations of the 3-D contours to assist in the segmentation because the 3-D contour should closely match the shape of the anatomy, which may enable the segmentation/matching to be fully automated. With the soft tissues "ST" segmented from the images captured by the visible light camera 52, the bone without the soft tissues may be visualized to provide the user with a measurement of the registration error. As described above, the 3-D bone model may not include any information about the covering/surrounding soft tissues. By segmenting the locations of the soft tissues in the images captured by the visible light camera, the displayed location of the actual bone (without the soft tissue) can be compared to the displayed location of the 3-D bone model (also without soft tissue information). If the registration is accurate, the displayed location of the actual bone (with the soft tissues segmented out of the image) will match with the displayed location of the 3-D bone model. If not, the user may view the registration error between the displayed location of the bone (without soft tissue information) and the displayed location of the 3-D bone model (also without soft tissue information).
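  • A minimal sketch, assuming OpenCV is available and that the digitized 3-D contour has already been mapped into 2-D pixel coordinates, of using the contour as a polygon to mask the soft tissue out of the captured image so the remaining bone pixels can be compared to the displayed 3-D bone model; the names and the polygon source are illustrative assumptions, not the disclosed segmentation software.

import numpy as np
import cv2

def remove_soft_tissue(frame_bgr, contour_px):
    """Mask out the soft-tissue region bounded by the projected contour polygon."""
    mask = np.zeros(frame_bgr.shape[:2], dtype=np.uint8)
    poly = np.asarray(contour_px, dtype=np.int32).reshape(-1, 1, 2)
    cv2.fillPoly(mask, [poly], 255)                 # region enclosed by the contour
    bone_only = frame_bgr.copy()
    bone_only[mask == 255] = 0                      # suppress soft-tissue pixels
    return bone_only, mask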
  • In a particular embodiment, the tracking system coordinate frame and the visible light camera coordinate frame may be calibrated with respect to one another using image processing techniques. The digitizer 120, for example, may be segmented from the images captured by the visible light camera 52 using known image processing techniques. In particular, the digitizer 120 may be segmented from the image using the known/expected shape of the digitizer 120 to determine the location of the digitizer 120 in 2-D in real-time. A transformation matrix may then be calculated between the tracking system coordinate frame and the visible light camera coordinate frame using: i) the determined location of the digitizer 120 segmented in the visible light camera coordinate frame; and ii) the tracked location of the digitizer as determined by the tracking system. The advantages of this technique include: no need for a full calibration with separate or specialized calibration objects; greater robustness; and less maintenance over time, which reduces risk from a safety perspective.
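  • One possible realization of this calibration, sketched under the assumption that tracked 3-D positions of digitizer features (in the tracking frame) and their segmented 2-D pixel locations are both available, is a perspective-n-point solve with OpenCV; the intrinsics and point handling are illustrative assumptions, not the claimed method.

import numpy as np
import cv2

def calibrate_tracking_to_camera(pts3d_track, pts2d_px, K, dist_coeffs=None):
    """Estimate a 4x4 transform from the tracking-system frame to the camera frame."""
    obj = np.asarray(pts3d_track, dtype=np.float64).reshape(-1, 3)
    img = np.asarray(pts2d_px, dtype=np.float64).reshape(-1, 2)
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, dist_coeffs)   # needs >= 4 non-degenerate points
    if not ok:
        raise RuntimeError("solvePnP failed; collect additional digitizer poses")
    R, _ = cv2.Rodrigues(rvec)                                # rotation vector -> matrix
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T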
  • In a specific inventive embodiment, with respect to FIGS. 5A and 5B, an LED 74 on the tracking array 70 is used to illuminate a particular landmark or manually made mark 76 on the bone B. The tracking array 70 shown here includes passive fiducial markers 72 and an LED 74 capable of highlighting an arbitrary spot on the bone B. For example, once the tracking array 70 is attached to the bone B, the LED 74 illuminates a small spot at an arbitrary location on the bone B; the surgeon then marks that spot 76 with a marking device, illustratively including a purple pen, and proceeds with the registration. If the tracking array 70 moves relative to the bone B during registration or the procedure, the highlighted spot will almost certainly move away from the marked purple spot 76, indicating a registration error. In a specific inventive embodiment, a distance measurement device, illustratively including a laser or possibly an LED mechanism 74, may be attached to the tracking array 70. The distance measurement device may be pointed at the bone and would detect a change in distance if the tracking array moved relative to the bone.
  • In an inventive embodiment, with respect to FIG. 6, a single fiducial marker 82 is mounted on the bone B, in relation to a tracking array 80 attached to the bone, to provide a redundant reference for monitoring any relative motion between the fiducial marker 82 and the tracking array 80. After registration, if the tracking array 80 were to move or bend, the tracking system would see the geometry change between the tracking array markers 84 and the single fiducial marker 82. An alarm or prompt may be triggered to alert the surgeon that the tracking array 80 has moved relative to the bone, indicating the registration is no longer accurate. In an embodiment, the fiducial marker 82 may be a single LED powered with a small embedded battery, which would most likely be a disposable battery.
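  • A minimal sketch (the tolerance and names are illustrative assumptions) of detecting such relative motion by expressing the single bone-mounted fiducial in the tracking array's coordinate frame and comparing it to a baseline recorded at registration:

import numpy as np

DRIFT_TOL_MM = 1.0   # assumed tolerance for array-to-bone motion

def fiducial_drift_mm(T_track_array, fiducial_track, baseline_in_array):
    """Drift of the bone-mounted fiducial expressed in the tracking-array frame."""
    p_h = np.append(np.asarray(fiducial_track, dtype=float), 1.0)
    current = (np.linalg.inv(T_track_array) @ p_h)[:3]
    return float(np.linalg.norm(current - np.asarray(baseline_in_array, dtype=float)))

# Hypothetical intra-operative check:
# if fiducial_drift_mm(T_track_array, fiducial_track, baseline) > DRIFT_TOL_MM:
#     warn("tracking array moved relative to the bone; re-register")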
  • In specific inventive embodiments, two full markers are attached to a bone at a given location, with a main marker and a smaller, less accurate marker, to check that the relative transform between the markers remains stable. Furthermore, a digitizer may be used as needed to recheck a particular landmark or manufactured mark, such as a purple pen mark, if any question on the registration arises.
  • While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the described embodiments in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope as set forth in the appended claims and the legal equivalents thereof.
  • The foregoing description is illustrative of particular embodiments of the invention, but is not meant to be a limitation upon the practice thereof. The following claims, including all equivalents thereof, are intended to define the scope of the invention.

Claims (19)

1. A system for confirming registration of a bone, comprising:
a three-dimensional (3-D) bone model registered to a bone location;
a visible light camera for capturing images of a field-of-view (FOV), wherein the bone is in the FOV; and
a display for displaying the captured images of the FOV and an image of the 3-D bone model, or an outline of the 3-D bone model, wherein the displayed location of the image of the 3-D bone model, or the outline of the 3-D bone model, corresponds to the location of the 3-D bone model registered to the bone location.
2. The system of claim 1 further comprising a computer input for receiving confirmation of correct registration of the 3-D bone model to the bone location when the displayed location of the image of the 3-D bone model, or the outline of the 3-D bone model, corresponds to the location of the bone in the captured images of the FOV.
3. The system of claim 2 wherein the computer input comprises at least one of: a mouse; a keyboard; a joystick; a pendant; a touchscreen display; or a microphone.
4. The system of claim 1 wherein the display further displays a prompt to confirm or not confirm correct registration of the 3-D bone model to the bone location.
5. The system of claim 1 further comprising a tracking system for tracking movement of the bone.
6. The system of claim 5 further comprising a computer comprising a processor configured to determine a calibration between a coordinate frame of the tracking system and a coordinate frame of the visible light camera.
7. The system of claim 6 wherein the location of the 3-D bone model registered to the bone location is determined in the coordinate frame of the tracking system.
8. The system of claim 7 wherein the location of the 3-D bone model registered to the bone location is determined in the coordinate frame of the visible light camera using: i) the calibration; and ii) the determined location of the 3-D bone model registered to the bone location in the coordinate frame of the tracking system.
9. The system of claim 5 wherein the display updates the displayed location of the image of the 3-D bone model, or the outline of the 3-D bone model, according to the tracked movements of the bone.
10. The system of claim 1 wherein the visible light camera is a video camera for capturing images of the bone in real-time, and the display displays the captured images of the bone in real-time.
11. A system for confirming registration of a bone, comprising:
a three-dimensional (3-D) bone model registered to a bone location;
a digitizer comprising a digitizer tip;
a tracking system for tracking movement of the digitizer;
a visible light camera for capturing images of a field-of-view (FOV), wherein the bone is in the FOV;
a computer comprising a processor configured to:
record positions of the digitizer tip as the digitizer is moved about the bone; and
project the recorded positions onto the 3-D bone model registered at the bone location; and
a display for displaying captured images of the FOV, an image of the recorded positions, and an image of the recorded positions projected onto the 3-D bone model.
12. The system of claim 11 wherein the recorded positions form a three-dimensional (3-D) contour and the display displays an image of the 3-D contour.
13. The system of claim 12 wherein the 3-D contour represents edges of a soft tissue to bone interface.
14. The system of claim 11 wherein the computer is further configured to segment, in the captured image of the bone, the soft tissue covering or surrounding the bone.
15. The system of claim 14 wherein the computer is further configured to remove the segmented soft tissue from the captured image.
16. The system of claim 14 wherein the computer is configured to segment the soft tissue using the recorded positions of the digitizer tip.
17. The system of claim 11 wherein the visible light camera is a video camera for capturing images of the bone in real-time, and the display is configured to display the captured images of the bone in real time.
18. The system of claim 11 wherein the display is further configured to display an image of the 3-D bone model, or an outline of the 3-D bone model, wherein the displayed location of the image of the 3-D bone model, or an outline of the 3-D bone model, corresponds to the location of the 3-D bone model registered to the bone location.
19. The system of claim 11 further comprising a computer input for receiving confirmation of correct registration of the 3-D bone model to the bone location when the displayed location of the image of the recorded positions projected onto the 3-D bone model corresponds to the displayed location of the image of the recorded positions.

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201562255618P 2015-11-16 2015-11-16
PCT/US2016/062008 WO2017087371A1 (en) 2015-11-16 2016-11-15 Method for confirming registration of tracked bones
US201815769040A 2018-04-17 2018-04-17
US17/148,612 US11717353B2 (en) 2015-11-16 2021-01-14 Method for confirming registration of tracked bones
US18/221,913 US20230355317A1 (en) 2015-11-16 2023-07-14 Method for confirming registration of tracked bones

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/148,612 Continuation-In-Part US11717353B2 (en) 2015-11-16 2021-01-14 Method for confirming registration of tracked bones


