US20200352529A1 - Systems and methods for intra-operative image analysis - Google Patents


Info

Publication number
US20200352529A1
Authority
US
United States
Prior art keywords
image
registered
bone
intraoperative
preoperative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/938,912
Inventor
Noah D. Wollowick
Andrew Cooper
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DePuy Synthes Products Inc
Original Assignee
DePuy Synthes Products Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=53881124&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20200352529(A1). “Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by DePuy Synthes Products Inc
Priority to US16/938,912
Publication of US20200352529A1
Legal status: Abandoned

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/505Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of bone
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12Arrangements for detecting or locating foreign bodies
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108Computer aided selection or customisation of medical implants or cutting guides
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • A61B6/463Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10116X-ray image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30008Bone
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30052Implant; Prosthesis

Definitions

  • the invention relates to analysis of images of features within a patient and more particularly to accurately analyzing such images during surgery.
  • Orthopaedic surgeons have the option of utilizing computer-assisted navigation systems to provide intraoperative surgical guidance.
  • computer navigation can provide data on functional parameters such as leg length and offset changes during hip arthroplasty.
  • the purported benefits of computer navigation include reduction of outliers and adverse outcomes related to intraoperative positioning of surgical hardware.
  • the technique also requires consistent patient positioning in the preoperative and intraoperative images, including positioning of the femur relative to the pelvis. Maintaining femoral position while performing hip arthroplasty can pose a significant and often unrealistic challenge to a surgeon who is focused on performing a procedure. The high risk of inaccurate interpretation using this technique has limited its utility in guiding surgical decision making.
  • An object of the present invention is to quantify restoration of orthopaedic functionality at a surgical site within a patient, even during a surgical procedure.
  • Another object of the present invention is to provide image analysis and feedback information to enable better fracture reduction and/or optimal implant selection during the surgery.
  • Yet another object of the present invention is to capture and preserve a digital record of patient results for data collection and quality improvements in surgical procedures.
  • a still further object of the present invention is to improve the outcome of bone repositioning, fracture repair, and/or fixation within a patient.
  • a landmark identification module is capable of receiving the reference and intraoperative images and generates at least one reference landmark point on at least one anatomical feature on the articulating bone in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image.
  • An image comparison module is capable of identifying (i) an estimation of at least the first center of rotation of the implant in at least one of the reference image and the intraoperative image and (ii) the longitudinal axis of the articulating bone in each of the reference image and intraoperative image.
  • An analysis module is capable of utilizing differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image to analyze at least one of offset and length differential.
  • At least one of the image comparison module, the landmark identification module and the analysis module identifies at least one stationary point on the skeletal bone in each of the reference image and intraoperative image, and at least one of the image comparison module, the landmark identification module and the analysis module aligns the reference image and intraoperative image according to at least the stationary point in each image.
  • aligning includes overlaying one of the reference image and intraoperative image on the other of the reference image and intraoperative image.
  • This invention also features a system including a memory, a user interface having a display capable of providing at least visual guidance to a user of the system, and a processor, with the processor executing a program performing the steps of acquiring (i) at least one digitized reference image including one of a preoperative image of a surgical site with skeletal and articulating bones and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least one digitized intraoperative image of the site after an implant has been affixed to the articulating bone.
  • the processor receives the reference and intraoperative images and generates at least one reference landmark point on at least one anatomical feature on the articulating bone in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image.
  • the processor identifies (i) an estimation of at least the first center of rotation of the implant in at least one of the reference image and the intraoperative image and (ii) the longitudinal axis of the articulating bone in each of the reference image and intraoperative image.
  • One or more differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image are utilized to analyze at least one of offset and length differential.
  • the method includes identifying (i) an estimation of at least the first center of rotation of the implant in at least one of the reference image and the intraoperative image and (ii) the longitudinal axis of the articulating bone in each of the reference image and intraoperative image.
  • One or more differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image are utilized to analyze at least one of offset and length differential.
  • aligning includes overlaying one of the reference image and intraoperative image on the other of the reference image and intraoperative image.
  • the pelvis of the patient is selected as the skeletal bone and a femur is selected as the articulating bone
  • the skeletal component of the implant is an acetabular cup and the articulating bone component includes a femoral stem having a shoulder and pivotally connectable to the acetabular cup to establish the first center of rotation for the implant.
  • the landmark point on the articulating bone is identified to have a known location relative to the greater trochanter on the femur of the patient.
  • FIG. 2 is a schematic diagram illustrating how multiple types of user interfaces can be networked via a cloud-based system with data and/or software located on a remote server;
  • FIG. 3 is a Flowchart G showing technique flow for both contralateral and ipsilateral analysis
  • FIG. 4 is a Flowchart W of several functions performed for hip analysis
  • FIG. 7 is an image similar to FIG. 6 with a line drawn across the pelvic bone intersecting selected anatomical features
  • FIG. 8 is a schematic screen view of two images, the left-hand image representing a pre-operative view similar to FIG. 6 and the right-hand image representing an intra-operative view with a circle placed around the acetabular component of an implant to enable rescaling of that image;
  • FIG. 10 is a schematic screen view similar to FIG. 9 with a reference line drawn on the intra-operative femur in the right-hand view;
  • FIG. 11 is an image similar to FIGS. 7 and 10 with a line drawn across the obturator foramen in both pre- and intra-operative views;
  • FIG. 12 is an overlay image showing the right-hand, intra-operative image of FIG. 11 superimposed and aligned with the left-hand, pre-operative image;
  • FIG. 14 is an overlay image showing the right-hand, intra-operative image of FIG. 13 superimposed and aligned with the left-hand, pre-operative image utilizing triangular stable bases;
  • FIG. 15 is a schematic combined block diagram and flow chart of an identification guidance module utilized according to aspects of the present invention.
  • FIG. 16 is an image of a trial implant in a hip with the acetabular component transected by a stationary base line and with two error analysis triangles;
  • FIG. 19 is a schematic screen view of a preoperative image and an intraoperative image positioned side by side with digital annotations marking anatomic landmarks and stationary points on the images;
  • FIG. 20 is a schematic screen view of the preoperative image and intraoperative image of FIG. 19 overlaid according to pelvic anatomy with generated femoral landmark points and error analysis according to another aspect of the present invention
  • FIG. 21 is a schematic diagram showing generation of a corrected landmark point and analysis of offset and length differential according to the present invention.
  • FIG. 22 is a schematic screen view of a preoperative image and an intraoperative image positioned side by side with a grid and digital annotations to mark anatomic landmarks and other features on the images according to certain aspects of the present invention.
  • FIG. 23 is a schematic view similar to FIG. 22 after the preoperative image has been aligned with the intraoperative image.
  • This invention may be accomplished by a system and/or method that acquire (i) at least one reference image including one of a preoperative image of a surgical site with skeletal and articulating bones and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least one intraoperative image of the site after an implant has been affixed to the articulating bone.
  • the reference and intraoperative images are received and at least one reference landmark point is generated on at least one anatomical feature on the articulating bone, such as on the greater trochanter of a femur, in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image.
  • any change in positioning of the femur in the two images, relative to the pelvis, would adversely affect calculations in previous approaches to this technique. Maintaining femoral position while performing hip arthroplasty can pose a significant and often unrealistic challenge to a surgeon who is focused on performing a surgical procedure.
  • Various approaches for the ‘Image Overlay’ technique according to the present invention can correct for deviations in femoral positioning between preoperative and intraoperative images by mathematically correcting for any deviation in femoral position in at least one of the visual output and calculation output of offset and leg length.
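  • As an illustrative sketch only (the patent does not disclose source code), the mathematical correction described above can be modeled by rotating the intraoperative femoral landmark about the implant's center of rotation until the femoral axis matches its preoperative angle, then reading the offset and leg length changes from the horizontal and vertical components of the residual displacement. The function names and the assumption of a common, pre-scaled pelvic frame are hypothetical:

```python
import math

def rotate_about(p, center, angle):
    """Rotate point p about center by angle (radians)."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    c, s = math.cos(angle), math.sin(angle)
    return (center[0] + c * dx - s * dy, center[1] + s * dx + c * dy)

def offset_and_length_change(pre_landmark, post_landmark, cor,
                             pre_axis_angle, post_axis_angle):
    """Rotate the intraoperative landmark about the center of
    rotation (cor) so the femoral axis angle matches its preoperative
    value, then decompose the remaining displacement into offset
    (horizontal) and leg length (vertical) components."""
    corrected = rotate_about(post_landmark, cor,
                             pre_axis_angle - post_axis_angle)
    return (corrected[0] - pre_landmark[0],   # offset change
            corrected[1] - pre_landmark[1])   # leg length change
```

When the femoral axis angles are identical the correction is the identity and the result reduces to the plain landmark displacement.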
  • Presently preferred techniques, both with and without image overlay, are described in more detail below in relation to FIGS. 17-23 .
  • the present Image Overlay technique can analyze how “similar” these images are to give the user feedback as to how accurate the results are, that is, to provide a confidence interval.
  • the images (the “intraop” intra-operative image and a “preop” pre-operative image, for example) preferably are scaled similarly and rotated similarly, at least relative to each other.
  • stationary base, also referred to herein as a “stable base”, means a collection of two or more points, which may be depicted as a line or other geometric shape, drawn on each of two or more images that includes at least one anatomical feature that is present in the two or more images of a region of a patient. For example, different images of a pelvic girdle PG of a patient, FIG. 1, typically show one or both obturator foramina OF and a central pubic symphysis PS, which the present inventors have recognized as suitable reference points or features for use as part of a stationary base according to the present invention.
  • Other useful anatomical features include femoral neck FN and lesser trochanter LT, shown on right femur F R , and femoral head FH and greater trochanter GT shown on left femur F L , for example.
  • Femoral head FH engages the left acetabulum of the pelvic girdle PG. Also shown in FIG.
  • non-linear stationary bases may utilize additional identifiable points to establish such non-linear bases.
  • At least one identifiable anatomic “landmark”, “stationary point” or “error point”, or a set of landmarks, stationary points or error points, is selected to be separate from the stationary base; the one or more landmarks, stationary points or error points are utilized in certain constructions to analyze the accuracy of the overlay process.
  • This additional anatomic feature preferably is part of the stationary anatomy being anatomically compared.
  • the inferior portion of the ischial tuberosity IT can be identified as an additional stationary point or error point.
  • This anatomic feature, in conjunction with the stationary base will depict any differences or errors in pelvic anatomy or the overlay which will enable the physician to validate, or to have more confidence in, the output of the present system.
  • “trial hip prosthetic” is utilized herein to designate an initial implant selected by a surgeon as a first medical device to insert at the surgical site, which is either the right side or the left side of a patient's hip in certain constructions.
  • the trial prosthetic is selected based on initial digital templating similar to the procedure described in the parent application.
  • “digital representation” or “digital annotation” as utilized herein includes a digital line having at least two points, e.g. a line representing a longitudinal axis or a diameter of an implant or a bone, or a digital circle or other geometric shape which can be aligned with an implant or a bone intraoperatively and then placed in a corresponding location in a preoperative image, or vice versa.
  • FIGS. 2-16 herein correspond to FIGS. 4B, 7-16, 52-54 and 70 , respectively, in the parent application.
  • FIG. 2 herein is a schematic diagram of system 141 according to the present invention illustrating how multiple types of user interfaces in mobile computing devices 143 , 145 , 147 and 149 , as well as laptop 151 and personal computer 153 , can be networked via a cloud 109 with a remote server 155 connected through web services. Data and/or software typically are located on the server 155 and/or storage media 157 .
  • Flowchart G shows technique flow for both contralateral and ipsilateral analysis. This technique is commenced, step 340 , and either contralateral or ipsilateral analysis is selected, step 342 .
  • contralateral analysis the contralateral hip image is captured, step 344 , and the image is flipped, step 346 .
  • the preoperative ipsilateral hip image is opened, step 348 .
  • Flowchart W is applied, step 350 .
  • Flowchart W, FIG. 4 after being activated by step 350 , FIG. 3 , guides a user to identify a femoral landmark such as the greater trochanter in step 370 , FIG. 4 , and then the femoral axis is identified, step 372 , which corresponds to the longitudinal axis of the femur in that image. These steps are illustrated in FIGS. 5 and 6 , below. A line is then drawn across the bony pelvis, step 374 , as shown in FIG. 7 .
  • the technique proceeds to capturing an operative hip image, step 352 , FIG. 3 , and identifying an acetabular component, step 354 , such as shown in FIG. 8 below.
  • Acetabular components are also shown in and discussed relative to FIGS. 9 and 10 below.
  • the image is scaled by entering the size of the acetabular component, step 356 , and Flowchart W, FIG. 4 , is then applied to the operative hip, step 358 .
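  • The rescaling in step 356 can be sketched as a simple millimetres-per-pixel conversion; the function names and diameter values below are illustrative, not taken from the patent:

```python
def scale_factor_from_cup(cup_diameter_mm, cup_diameter_px):
    """Millimetres per pixel, derived from the known size of the
    acetabular component circled in the intraoperative image."""
    return cup_diameter_mm / cup_diameter_px

def to_mm(distance_px, mm_per_px):
    """Convert a pixel measurement (e.g. a leg length difference)
    into millimetres using the derived scale factor."""
    return distance_px * mm_per_px
```

For example, a 54 mm cup measured at 270 pixels yields 0.2 mm per pixel.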
  • the operative and comparative hip images are scaled by a stationary base generated by selecting at least two reference points on the bony pelvis, step 360 , such as shown in FIG. 11 .
  • the scaled images are then overlaid in step 362 using the bony pelvis points, such as the overlaid lines 386 and 412 shown in FIG. 12 .
  • Differences in offset and leg length are calculated, step 364 , and the technique is terminated, step 366 .
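  • The overlay of step 362 can be sketched, under the assumption that exactly two bony pelvis points are marked in each image, as a similarity transform (scale, rotation and translation) mapping one stationary base onto the other; complex arithmetic keeps the hypothetical sketch compact:

```python
def two_point_alignment(src, dst):
    """Return a function mapping points of one image into the other,
    given the two stationary-base points src marked in the first
    image and the corresponding points dst in the second."""
    s0, s1 = complex(*src[0]), complex(*src[1])
    d0, d1 = complex(*dst[0]), complex(*dst[1])
    # The ratio of the two segments encodes scale and rotation at once.
    m = (d1 - d0) / (s1 - s0)
    t = d0 - m * s0  # translation
    def apply(p):
        q = m * complex(*p) + t
        return (q.real, q.imag)
    return apply
```

By construction the two marked points map exactly onto their counterparts, and every other point follows the same scale and rotation.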
  • FIG. 5 herein is an image 376 of the right side of a patient's hip prior to an operation and showing a marker 378 , bracketed by reference squares 377 and 379 , placed by a user as guided by the system, or placed automatically via image recognition, on the greater trochanter as a landmark or reference point.
  • FIG. 6 is an image 376 ′ similar to FIG. 5 .
  • FIG. 12 is an overlay image showing the right-hand, intra-operative, PostOp image 390 ′′′ of FIG. 11 superimposed and aligned with the left-hand, pre-operative PreOp image 376 ′′.
  • soft button icons for selectively changing PreOp image 376 ′′ and/or PostOp image 390 ′′′ are provided at the lower left-hand portion of the screen.
  • PostOp typically indicates post-insertion of a trial prosthesis during the surgical procedure, and is preferably intra-operative.
  • the PostOp image can also be taken and analysis conducted after a “final” prosthesis is implanted.
  • PreOp designates an image preferably taken before any surgical incision is made at the surgical site. In some situations, the image is taken at an earlier time, such as a prior visit to the medical facility and, in other situations, especially in emergency rooms and other critical care situations, the “PreOp” image is taken at the beginning of the surgical procedure.
  • a ball marker BM, FIG. 5, is shown but not utilized for alignment because ball markers can move relative to the patient's anatomy.
  • PreOp and PostOp icons are provided in certain screen views to adjust viewing features such as contrast and transparency.
  • at least one icon enables rotation in one construction and, in another construction, “swaps” the images so that the underlying image becomes the overlying image, as discussed in more detail below.
  • Additional icons and reference elements are provided in some constructions, such as described in the parent application.
  • One or more of these “virtual” items can be removed or added to a screen view by a user as desired by highlighting, touching or clicking the “soft keys” or “soft buttons” represented by the icons.
  • one or more of the icons serves as a toggle to provide “on-off” activation or de-activation of that feature.
  • Characters or other indicia can be utilized to designate image number and other identifying information. Other useful information can be shown such as Abduction Angle, Offset Changes and Leg Length Changes, as discussed in more detail below.
  • Optional user adjustment can be made by touching movement control icon 527 , FIG. 12 , also referred to as a “rotation handle”.
  • more than two points are generated for the stationary base for each image, such as illustrated in FIG. 13 for a preoperative image 1200 and a postoperative image 1201 , and in FIG. 14 for a combined overlay image 1298 of the preoperative image 1200 and the postoperative image 1201 of FIG. 13 .
  • Similar locations on the pelvis in each image are selected to generate the points utilized to establish a stationary base for each image.
  • image 1200 for example, a first point 1202 is generated on an upper corner of the obturator foramen or at the pelvic tear drop, a second point 1204 is generated at the top or superior portion of the pubic symphysis, and a third point 1206 is generated at the lowest or inferior point on the ischial tuberosity.
  • Overlay image 1298, FIG. 14, shows the three points 1202, 1204 and 1206 of preop image 1200, forming the visible preop stationary base triangle 1216, which is positioned relative to the corresponding three points 1203, 1205 and 1207 of postop image 1201, forming a visible postop stationary base triangle 1311 overlaid relative to triangle 1216 in FIG. 14.
  • a ‘best fit overlay’ can be created using these points by identifying the centroid of the polygon created by these points, and rotating the sets of points relative to one another to minimize the summation of distances between each of the related points.
  • scaling of the two images may be performed by these same set of points or, alternatively, a separate set of two or more points may be utilized to scale the two images relative to each other.
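  • One conventional realization of such a ‘best fit overlay’ is a 2-D Procrustes-style fit: center both point sets on their centroids, then choose the rotation that minimizes the summed squared distances between related points (a least-squares variant of the summed-distance criterion described above; the implementation is a sketch, not the patent's own):

```python
import cmath

def best_fit_overlay(preop_pts, postop_pts):
    """2-D Procrustes-style fit: returns a function mapping preop
    points into the postop frame, plus the rotation angle used."""
    a = [complex(*p) for p in preop_pts]
    b = [complex(*p) for p in postop_pts]
    ca, cb = sum(a) / len(a), sum(b) / len(b)  # centroids
    # The optimal rotation is the phase of the cross-correlation
    # of the centered point sets.
    corr = sum((p - ca).conjugate() * (q - cb) for p, q in zip(a, b))
    rot = cmath.exp(1j * cmath.phase(corr))
    def apply(p):
        q = rot * (complex(*p) - ca) + cb
        return (q.real, q.imag)
    return apply, cmath.phase(corr)
```

With three or more stationary-base points this yields a unique rigid alignment; scaling, as noted above, can be handled by the same or a separate point set.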
  • Clicking on a PreOp soft-button icon 1300 and a PostOp icon 1301 enable a user to alter positioning of images 1200 and 1201 , respectively, within image 1298 in a toggle-switch-type manner to selectively activate or de-activate manipulation of the selected feature.
  • One or more points of a stationary base may be shared with points establishing a scaling line.
  • at least one landmark is selected that is spaced from the stationary base points to increase accuracy of overlaying and/or comparing images.
  • one screen view displays Offset and Leg Length Changes with “Leg Length: −0.2 mm”, “Offset: 21.8 mm” and “Confidence Score: 8.1”.
  • a confidence ratio that describes the quality of fit can be created by comparing the overlay area of the two triangles relative to the size of the overall polygon formed by the two triangles, including the non-overlapping areas of each triangle. Abduction angle and anteversion calculations are described in the parent application in relation to FIGS. 55-59 .
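  • The confidence ratio described above can be sketched as the overlap area of the two stationary-base triangles divided by the area of their union (overlap plus both non-overlapping parts). The helper below is a hypothetical implementation using Sutherland-Hodgman clipping and assumes both triangles are supplied with counter-clockwise vertex order:

```python
def shoelace_area(poly):
    """Unsigned area of a polygon given as (x, y) vertices."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1] -
            poly[(i + 1) % n][0] * poly[i][1] for i in range(n))
    return abs(s) / 2.0

def clip(subject, clipper):
    """Sutherland-Hodgman clip of a polygon against a convex,
    counter-clockwise clipper (triangles are convex)."""
    def inside(p, a, b):
        return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0]) >= 0
    def intersect(p, q, a, b):
        # Intersection of segment pq with the infinite clip edge ab.
        x1, y1, x2, y2 = p[0], p[1], q[0], q[1]
        x3, y3, x4, y4 = a[0], a[1], b[0], b[1]
        den = (x1-x2)*(y3-y4) - (y1-y2)*(x3-x4)
        t = ((x1-x3)*(y3-y4) - (y1-y3)*(x3-x4)) / den
        return (x1 + t*(x2-x1), y1 + t*(y2-y1))
    out = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        src, out = out, []
        for j in range(len(src)):
            p, q = src[j], src[(j + 1) % len(src)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    out.append(intersect(p, q, a, b))
                out.append(q)
            elif inside(p, a, b):
                out.append(intersect(p, q, a, b))
        if not out:
            return []
    return out

def confidence_ratio(tri1, tri2):
    """Overlap area of the two triangles divided by the area of the
    overall polygon they form (overlapping plus non-overlapping)."""
    overlap = clip(tri1, tri2)
    ov = shoelace_area(overlap) if overlap else 0.0
    union = shoelace_area(tri1) + shoelace_area(tri2) - ov
    return ov / union if union else 0.0
```

Identical triangles give a ratio of 1.0, and the ratio falls toward 0 as the preop and postop stationary bases diverge.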
  • each image may be scaled by a ball marker or other scaling device, known magnification ratios of a radiographic device, or direct measurements of anatomical points (such as a direct measurement, via callipers, of the extracted femoral head, which can be used to scale the preoperative image).
  • Alternative constructions may also replace the ‘stationary base’ with various other techniques that could be used to scale and align the preoperative and intraoperative images relative to one another.
  • One example of such a construction would involve overlaying two images and displaying them with some transparency so that they can both be viewed on top of one another. The user would then be prompted to rotate the images and change their sizing so that the pelvic anatomy in the two images is overlaid as closely as possible.
  • a guidance system is provided to adjust the viewing area of one image on a screen to track actions made by a user to another image on the screen, such as to focus or zoom in on selected landmarks in each image.
  • This feature is also referred to as an automatic ‘centering’ function: as a user moves a cursor to ‘mark’ a feature on one image, such as placing a point for a landmark or a stationary base on an intraoperative image, the other image on the screen is centered by the system to focus on identical points of interest so that both images on the screen are focused on the same anatomical site.
  • FIG. 15 is a schematic combined block diagram and flow chart of an identification guidance module 1400 utilized in one construction to assist a user to select landmarks when comparing a post- or intra-operative results image, box 1402 , with a reference image, box 1404 .
  • the module is initiated with a Start 1401 and terminates with an End 1418 .
  • the module 1400 locates all landmarks “1” on the pre-operative reference image, box 1408 , and calculates the visible area “v” within the pre-operative image in which to scale, such as by using
  • the identical landmark on the pre-operative image is located and its center-point “c” is determined, box 1410 .
  • the identical landmark on the pre-operative image is highlighted in one construction to increase its visual distinctiveness, box 1414 .
  • the pre-operative image is centered, box 1410 , and scaled, box 1412 , such as by utilizing the following Equations 2 and 3, respectively:
  • the user manipulates one or more visual landmarks in the results image, box 1416 , as desired and/or as appropriate.
  • the user manually ends the guidance activities, box 1418 and, in other constructions, the system automatically discontinues the guidance algorithm.
  • image recognition capabilities provide “automatic”, system-generated matching and alignment, with a reduced need for user input.
  • image recognition provides automatic detection of selected items including: the spherical ball marker frequently utilized in preoperative digital templating; the acetabular cup in digital templates and in trial prosthetics; and the Cobb Angle line, also referred to as abduction angle.
  • FIG. 16 is an overlay image 2000 of a preoperative hip image 2001 and an intraoperative hip image 2003 having a trial implant 2002 in a hip with the acetabular component 2004 transected by stationary base lines 2006 and 2007 extending between a first point 2008 on the obturator foramen OF and a second point 2010 on the anterior inferior iliac spine AIIS of the ilium. Also shown are two error analysis triangles 2020 (solid lines) and 2030 (dashed lines). Circles 2022 and 2032 in this construction represent a landmark point on the greater trochanter in images 2001 and 2003, respectively.
  • Image 2000 is a representation of preoperative and intraoperative hip images 2001 and 2003 overlaid according to stationary base lines 2006 and 2007 , respectively.
  • visual and/or audible user instructions are sequentially generated by the system to guide the user such as “Draw line along Pubic Symphysis”.
  • Guidance for surgery utilizing other types of implants, and for other surgical procedures, including partial or total knee or shoulder replacements and foot surgery as well as wrist surgery, will occur to those skilled in the art after reading this disclosure.
  • other types of medical imaging using energy other than visible light, such as ultrasound, may be utilized according to the present invention instead of actual X-rays.
  • when a computer interface tool, such as a stylus or light pen, is provided to the user in a sterile condition, the user can remain within a sterile field of surgery while operating a computing device programmed according to the present invention.
  • “vector” is utilized herein with the standard meaning of a Euclidean vector having an initial point or “origin” and a terminal point, representing magnitude and direction between the origin and the terminal point.
  • the system then positions an acetabular component template or representative digital annotation, such as a digital line or digital circle, in the preop image by replicating this vector.
  • Hip- and femur-related constructions of the present system and method calculate intraoperative changes in offset and leg length using a reference image, also referred to as a “preop image”, and an intraoperative image, also referred to as a “postop image” or an “intraop image”.
  • one construction of the system requires two consistently scaled images that are overlaid and aligned according to the stationary anatomic region (such as the pelvis), the generation of at least one landmark point on the non-stationary, articulating anatomic region (such as the femur) in both images, a mechanism to identify the difference in femoral angle of the femur relative to the pelvis between the images, a mathematical correction module that adjusts for differences in the articulating femur in each image relative to the stationary pelvis and, finally, a calculation module that uses this input to calculate intraoperative changes in offset and leg length.
  • femoral angle refers to the orientation of the longitudinal axis of the femur relative to the pelvis; a “difference in femoral angle” is described in more detail below in relation to FIG. 21 .
  • the system may optionally include an error analysis module that identifies and analyzes potential error in the system.
  • an ‘Image Overlay’ process begins in some constructions by acquiring (i) at least one of a preoperative ipsilateral or an inverted contralateral image (“preop image” or “reference image”), and (ii) an intraoperative image (“intraop image”).
  • the system generates at least one landmark point on the non-stationary femur in both images (such as identification of a consistent point on the greater trochanter in both images), generally performed with user guidance.
  • the system will generate at least one error point on the pelvis in both images to provide error analysis. If the images have not been previously scaled and aligned, the system will scale and align them using one of a plurality of techniques. One of the images is then overlaid according to the pelvic anatomy in both images.
  • the system identifies points that can be used to analyze possible error in the images relative to each other.
  • the system additionally performs a series of steps to calculate any deviation in alignment of the non-stationary femur relative to the pelvic anatomy between the preop and intraop images.
  • the system then creates an overlay of the preop and intraop image, taking into consideration and correcting for the effect of any difference in femoral angles between the two images as the system compares the relative position of the generated femoral landmark points.
  • the system analyzes the difference between the landmark points, including a correction for femoral alignment differences, and uses this data to calculate intraoperative change in offset and leg length.
  • Implementations that operate on a mobile device may also acquire the images in steps 3000 and 3002 by prompting the user to take a picture of the images using the device camera. If an inverted contralateral image is used as a ‘preop’ image, the contralateral image may be acquired and then inverted within the software, or otherwise it may be flipped in another system and then input to image capture module 3030 .
  • Screen view 3050, FIG. 19, shows preoperative image 3052 and intraoperative image 3070, referred to by labels 3053 and 3071 as “PreOp” and “PostOp” images, respectively.
  • Landmark Identification Module 3034 identifies at least one point on the femoral anatomy in both the preop and intraop images.
  • Landmark Correction Module 3038 and Calculation Module 3040 can be considered as components of an Analysis Module 3037, shown in dashed lines.
  • a point in each image will be placed on the greater trochanter, a particularly useful landmark point because it is easily identifiable and because the anatomy is relatively insensitive to deviations in image acquisition.
  • the point may be placed on the lesser trochanter or another identifiable femoral landmark.
  • consistent point placement on the lesser trochanter is more susceptible to error originating from deviations in image acquisition angle based on its 3-dimensional anatomy.
  • the user is either prompted to identify the point on the femoral anatomy, or otherwise the system auto-identifies the point or set of points using image recognition or other technology and then allows the user to modify the point placement.
  • FIG. 5 is an image 376 of the right side of a patient's hip prior to an operation and showing a marker 378 , bracketed by reference squares 377 and 379 , placed by a user as guided by the system, or placed automatically via image recognition, on the greater trochanter as a landmark or reference point, such as indicated in Landmark Identification Module 3034 , FIG. 18 .
  • Reference squares 377 and 379 enable the user to position the marker 378 on touch-screen devices, such as an iPad, without the user's fingers obscuring the position of the marker 378 .
  • reference landmark point 3054 and intraoperative landmark point 3074 are placed on the greater trochanter of the femur Fp in PreOp image 3052 and of femur Fi in PostOp image 3070 , respectively.
  • Also shown in PreOp image 3052 are a femoral axis line 3055 and a pelvic reference line 3056, tear drop point 3057, pubic symphysis point 3058, and ischial tuberosity point 3059.
  • Also shown in PostOp image 3070, FIG. 19, are the acetabular cup AC and femoral stem FS of an implant I, a femoral axis line 3075 and a pelvic reference line 3076, tear drop point 3077, pubic symphysis point 3078, and ischial tuberosity point 3079.
  • a circle 3080 has been drawn around acetabular cup AC as described in more detail below.
  • In step 3006, FIG. 17, the Landmark Identification Module 3034, FIG. 18, asks via User Interface UI, shown in phantom as box 3035, whether the user wants to include error analysis in the system output. If yes, Module 3034 prompts the user, in Step 3008, to identify a set of anatomic points on the stationary pelvis in both the preop and intraop images. While a minimum of only one point is required to provide error analysis in the system, the system preferably generates at least three points on the pelvis, such as points 3057, 3058 and 3059 in PreOp image 3052, FIG. 19, and points 3077, 3078 and 3079 in PostOp image 3070.
  • the user positions each point on the pelvis in some constructions but, in preferred constructions, automated algorithms of a system according to the present invention initially place the points in appropriate positions on the pelvic anatomy. If pelvic reference lines, as described in more detail below, are used to align and scale the preop and intraop images, the points selected for error analysis should be independent of the points used to create the pelvic reference lines. Ideal points will also be identifiable, such as a discernible point on the pelvic teardrop, ischial tuberosity and pubic symphysis.
  • In Step 3010, the Landmark Identification Module 3034, FIG. 18, identifies the approximate femoral center of rotation in the intraop image; this center of rotation information assists correction for deviations in femoral positioning between the preop and intraop images.
  • The Landmark Identification Module 3034 identifies this point by placing a digital circle so that it overlays the boundary of the acetabular component, as shown by digital circle 392 in FIG. 9 and by circle 3080 in FIG. 19. The system then identifies the midpoint of the circle, which approximates the center of rotation of the acetabular component and functions as the intraoperative femoral center of rotation.
  • the system may auto-detect the location of the digital circle by using image recognition to auto-detect the acetabular component in the intraoperative image, and then allow the user, via User Interface UI, box 3035 , to adjust the size and position of the digital circle using navigation handles connected to the circle, such as navigation handle 527 , FIG. 12 , and by navigation handle 3099 , FIG. 20 .
  • the user estimates the approximate center of rotation by drawing or positioning a circle around the femoral head in the preoperative image, and utilizing the center of that circle as an estimate of the center of rotation.
  • the PreOp image 3052 shows three error points 3057 , 3058 and 3059 positioned on the base of the pelvic teardrop, the superior point on the pubic symphysis, and the inferior point on the ischial tuberosity, respectively.
  • points 3077 , 3078 and 3079 are positioned on corresponding points in PostOp image 3070 . These corresponding points will be used for error analysis in constructions that include error analysis as part of the system.
  • Digital circle 3080 has been positioned around the acetabular cup AC of implant I, with a center-point represented by the crosshair 3081 that identifies the midpoint of the circle. This midpoint identifies the approximate femoral center of rotation after implant insertion.
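The disclosure has the user, or image recognition, position the digital circle whose midpoint serves as the center of rotation. One way such a circle could be estimated automatically from points sampled on the acetabular cup boundary is a least-squares (Kåsa) circle fit; this sketch is an assumption about one possible implementation, not the claimed method:

```python
# Hypothetical helper: fit a circle to boundary points of the acetabular
# component by linear least squares (Kasa fit), returning center and radius.
# The circle's midpoint approximates the femoral center of rotation.
import math

def fit_circle(points):
    """Least-squares circle through [(x, y), ...]; returns (cx, cy, radius)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Work in centered coordinates for numerical stability.
    u = [p[0] - mx for p in points]
    v = [p[1] - my for p in points]
    suu = sum(a * a for a in u); svv = sum(b * b for b in v)
    suv = sum(a * b for a, b in zip(u, v))
    suuu = sum(a ** 3 for a in u); svvv = sum(b ** 3 for b in v)
    suvv = sum(a * b * b for a, b in zip(u, v))
    svuu = sum(b * a * a for a, b in zip(u, v))
    # Solve the 2x2 normal equations for the center in centered coordinates.
    rhs1 = (suuu + suvv) / 2.0
    rhs2 = (svvv + svuu) / 2.0
    det = suu * svv - suv * suv
    uc = (rhs1 * svv - rhs2 * suv) / det
    vc = (suu * rhs2 - suv * rhs1) / det
    r = math.sqrt(uc * uc + vc * vc + (suu + svv) / n)
    return (uc + mx, vc + my, r)
```

A fitted center could then seed the digital circle, with the navigation handles described above allowing the user to refine its size and position.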
  • the system generates a digital line in the preop image to identify the femoral axis, and the system provides the ability to adjust the line location so that it can identify the angle of the femur in the preop image. Then, the system generates a digital line in the intraop image to identify the femoral axis in the intraop image, again allowing for user adjustment. Preferred constructions of this system will attempt to auto-identify the femoral axis in this step using image recognition and known data, and place the digital lines accordingly. The system then provides the functionality for the user to further manipulate these lines.
  • FIG. 6 is an image 376 ′ similar to FIG. 5 showing a reference line 380 , bracketed by reference squares 381 , 382 , 383 and 384 , drawn on the preop image to represent the longitudinal axis of the femur.
  • Reference squares 381, 382, 383 and 384 can be manipulated to reposition the femoral axis line.
  • FIG. 10 is a schematic screen view with a reference line 406 drawn on the intra-operative femur in the right-hand view 390 ′′, guided by reference squares 407 , 408 , 409 and 410 .
  • Reference squares 407, 408, 409 and 410 can be manipulated to reposition the femoral axis line.
  • FIG. 19 again shows the positioned digital lines 3055 and 3075 , placed in Step 3012 , FIG. 17 , that identify the femoral axis in the PreOp and PostOp images 3052 and 3070 .
  • the Image Capture Module 3030 , FIG. 18 determines whether the preop and intraop images have been pre-scaled and aligned according to pelvic anatomy. Consistent scaling and alignment may be previously performed in this construction using a variety of approaches. For example, a software system residing on a digital fluoroscopy system may have been used to align and scale the images prior to image acquisition by this system. Alternatively, the images may already be scaled and aligned because the surgeon took images with the patient and radiographic system in identical position with a known magnification ratio.
  • the system can scale, or align, or scale and align the images in optional step 3016 . Consistent scale and alignment in this step is accomplished by the optional Image Scaling and Alignment Module 3032 , FIG. 18 , shown in dashed lines, which may accomplish these operations in various ways.
  • One method to accomplish consistent scaling and alignment is by using stationary bases (i.e. pelvic reference lines), along with identification and scaling of the acetabular cup in the intraop image, as visually illustrated in FIG. 11 .
  • a line is drawn connecting two identical landmarks on the pelvis in both the preop and intraop images.
  • Stationary base line 386 in FIG. 15 connects, in the preop image, a point on the anterior superior iliac spine to the inferior point on the pubic symphysis.
  • Stationary base line 412 in FIG. 11 connects the identical two pelvic landmarks in the intraop image.
  • the system can use these two lines to rotate the images so that the overlay lines are aligned at the same angle relative to the software screen.
  • the images can additionally be scaled relative to one another so that the pixel lengths of the stationary base lines in the two images are equivalent.
  • absolute scaling of the images can be achieved by scaling at least one image according to an object of known dimension.
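The stationary-base alignment and scaling just described reduces to computing one rotation and one scale factor from the two base lines. The sketch below assumes simple 2D pixel coordinates; all names are illustrative:

```python
# Minimal sketch of stationary-base alignment: rotate the intraop image so
# its base line matches the preop base line's angle, and scale it so the
# pixel lengths of the two base lines are equivalent.
import math

def base_line_transform(preop_line, intraop_line):
    """Each line is ((x1, y1), (x2, y2)). Returns (scale, rotation_radians)
    mapping the intraop base line onto the preop base line."""
    (px1, py1), (px2, py2) = preop_line
    (ix1, iy1), (ix2, iy2) = intraop_line
    pre_len = math.hypot(px2 - px1, py2 - py1)
    intra_len = math.hypot(ix2 - ix1, iy2 - iy1)
    scale = pre_len / intra_len                      # equalize pixel lengths
    rotation = (math.atan2(py2 - py1, px2 - px1)
                - math.atan2(iy2 - iy1, ix2 - ix1))  # equalize line angles
    return scale, rotation

def apply_transform(point, pivot, scale, rotation):
    """Scale and rotate `point` about `pivot` (e.g. a shared base-line endpoint)."""
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    cos_r, sin_r = math.cos(rotation), math.sin(rotation)
    return (pivot[0] + scale * (dx * cos_r - dy * sin_r),
            pivot[1] + scale * (dx * sin_r + dy * cos_r))
```

Applying the returned transform to every pixel (and every digital annotation) of the intraop image brings its stationary base into registration with the preop base; absolute scaling by an object of known dimension can then be layered on top.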
  • FIG. 8 depicts the digital circle 392 that has been generated around acetabular component 394 .
  • the digital circle may be either generated using image recognition to identify the acetabular component, positioned by the user, or initially system-generated in an approximate location and then positioned by the user.
  • the size of this component is known because the surgeon has placed it in the patient's acetabulum. Therefore, the known size of the component, such as “50” mm, can be entered into the box following text “Size of Acetabular Component” located at the top of the intraop screen 390.
  • the system uses this information to generate absolute scaling in the intraop image. Additionally, the preop image can be scaled in absolute measurements, according to this generated circle, once the preop image is scaled so that the pelvic reference lines in both images are of equivalent length in pixels.
  • FIG. 19 depicts the pelvic reference lines 3056 and 3076 that have been generated on identical points on the preop and intraop images 3052 and 3070 of the pelvis, allowing the system to align and scale the images according to the input.
  • Alternative constructions may apply absolute scaling to other objects of known size in either the preop or intraop image. For example, scaling can be applied according to the preop image by drawing a digital line across the diameter of the femoral head in the preop image, and entering the size in absolute terms. This absolute measurement is known during surgery because the surgeon traditionally extracts the femoral head and measures its size, using calipers, during hip arthroplasty.
  • the output of the scaling and alignment performed in step 3016 , FIG. 17 is used to generate an overlay in step 3018 , and therefore may be represented visually by depicting the updated scaling and alignment visually on the software screen, or otherwise may exclusively be calculated by the system to create the overlay in step 3018 .
  • The Image Comparison Module 3036, FIG. 18, superimposes the preop and intraop images by aligning pelvic anatomy, with the images displayed with some transparency so that both can be visualized in the overlay, such as illustrated in FIG. 20.
  • the overlaid images will contain the identified femoral landmarks (generally placed on the greater trochanter) generated in step 3008 so that location differences between the two points can be visualized.
  • the system will maintain the location of the generated greater trochanter points and the femoral axis lines, relative to the preop and intraop images, as the images are manipulated to create the image overlay.
  • the Image Comparison Module 3036 can align the images according to pelvic anatomy in a variety of ways in this step. In a preferred construction, the system will have previously guided the user in identifying at least two consistent points on the pelvic anatomy in both images. The Image Comparison Module 3036 then superimposes the images so that the stationary base lines are positioned identically. In other words, the images are scaled, aligned and superimposed according to the stationary bases drawn across consistent points on the pelvis in each image. The Image Comparison Module will move and scale all digital annotations in tandem with the underlying image so that they remain affixed to the underlying image. This includes positioning of the femoral and pelvic landmark annotations, the identified center of rotation of the femur, pelvic reference lines, the femoral axis lines, and any other annotations used in various constructions.
  • the system uses an image recognition technique to auto-identify the pelvic anatomy and overlay the images based on the image recognition; the user is then presented with the option to manually manipulate the resulting overlay.
  • the user will be guided to manually position the images so that the pelvic anatomy matches.
  • the system in this method will provide the user with the ability to manipulate both the position of each of the images as well as adjust the magnification so that the pelvic anatomy can be superimposed on the overlay.
  • Alternative systems will rely on hardware implementations and stationary cameras to obviate the need for a digital line, image recognition, or user manipulation whatsoever to create the overlay. In these instances, the external system may provide a known magnification ratio and the consistent patient positioning that would be required to create the image overlay without the use of pelvic reference lines or similar technique.
  • In Step 3020, the Landmark Correction Module 3038, FIG. 18, calculates any existing difference between the preop and intraop femoral axis angles.
  • the terms “femoral angle” and “femoral axis angle” refer to the orientation of the longitudinal axis of the femur. If, for example, the preop and intraop femoral axis lines generated in step 3012 vary by eight degrees, the difference calculated in step 3020 will be eight degrees.
  • In Step 3022, FIG. 17, the Landmark Correction Module 3038, FIG. 18, uses data gathered in previous steps to generate an additional “corrected” or “phantom” landmark point that accounts for differences in femoral position between the preop and intraop images.
  • a corrected landmark point 3082 is shown in FIG. 20 , positioned along circle 3083 from intraoperative landmark point 3074 ′, which is similar to corrected landmark point 3116 , FIG. 21 , along circle 3124 as described in more detail below.
  • To generate the corrected landmark point, the module first calculates angle_femur, which is the angular difference between the longitudinal axes of the femur in the preoperative and intraoperative images, respectively, also referred to as the preop and intraop femoral axis lines in the overlay.
  • This technique is shown schematically in FIG. 21 for angle θ, arrow 3108, between longitudinal axis lines 3104 (“L1”) and 3106 (“L2”).
  • the system incorporates this with the femoral or acetabular center of rotation 3102 (“R1”), (X_origin, Y_origin), in the intraop image, previously identified in step 3010, FIG. 17, in the following Equations 4 and 5:
  • EQ. 4: X_phantom = (X_troch − X_origin)*cosine(angle_femur) − (Y_troch − Y_origin)*sine(angle_femur) + X_origin
  • EQ. 5: Y_phantom = (X_troch − X_origin)*sine(angle_femur) + (Y_troch − Y_origin)*cosine(angle_femur) + Y_origin
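Equations 4 and 5 amount to a standard two-dimensional rotation of the trochanter landmark about the identified center of rotation by the measured femoral angle difference. A minimal sketch (function and parameter names are illustrative):

```python
# Rotate the intraoperative greater trochanter point about the center of
# rotation by the femoral axis angle difference, yielding the corrected
# "phantom" landmark point per Equations 4 and 5.
import math

def phantom_point(troch, origin, angle_femur_deg):
    """troch, origin: (x, y); angle_femur_deg: femoral angle difference in degrees."""
    a = math.radians(angle_femur_deg)
    dx, dy = troch[0] - origin[0], troch[1] - origin[1]
    x_phantom = dx * math.cos(a) - dy * math.sin(a) + origin[0]   # EQ. 4
    y_phantom = dx * math.sin(a) + dy * math.cos(a) + origin[1]   # EQ. 5
    return (x_phantom, y_phantom)
```

With a zero angle difference the phantom point equals the original trochanter point, so the correction vanishes when the femur is positioned identically in both images.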
  • a vector “v”, line 3118 is extended from the preoperative landmark point 3112 (“p 2 ”) to corrected landmark point 3116 .
  • Right triangle “legs” 3120 and 3122 are utilized to estimate offset and leg length, respectively.
  • Leg 3122 is generally parallel to preoperative femoral axis 3104 in this construction.
  • the Acetabular circle 3100 (“c 1 ”) assists in locating center of rotation 3102 .
  • Radius lines 3130 and 3132 are also separated by angle θ, arrow 3114.
  • FIG. 20 is an “overlay” screen view 3050 ′ of the intraop image 3070 , FIG. 19 , superimposed as PostOp image 3070 ′ on the preoperative image 3052 as PreOp image 3052 ′.
  • the two stationary base lines 3056 and 3076 of FIG. 19 are aligned exactly one on top of the other, represented as a single stationary base line 3056 ′, 3076 ′.
  • First error correction triangle 3084 is shown connecting intraoperative error point 3077′ on the pelvic teardrop, point 3078′ on the pubic symphysis and point 3079′ on the ischial tuberosity, and a similar error correction triangle 3085 connects points 3057′, 3058′ and 3059′, representing points 3057, 3058 and 3059 of preoperative image 3052, FIG. 19.
  • Details window 3090 lists “Leg Length: ⁇ 0.4 mm”, “Offset: ⁇ 3.8 mm” and “Confidence Score: 5.4” as described in more detail below.
  • the Calculation Module 3040 calculates the change in leg length and offset by analyzing the vector between the greater trochanter point in the preop image and the calculated phantom point in the intraop image, such as illustrated in FIG. 21.
  • leg length the system calculates the distance between these two points along the femoral axis identified from the preop image, as identified by line 3122 in FIG. 21 .
  • offset the system calculates the distance between the two points along the axis that is perpendicular to the femoral axis from the preop image, as identified by line 3120 .
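The leg length and offset components are the projections of the landmark displacement vector onto the preop femoral axis and onto the axis perpendicular to it. A sketch under assumed 2D coordinates (names are illustrative):

```python
# Decompose the vector from the preop greater trochanter point to the
# corrected phantom point into a component along the preop femoral axis
# (leg length, cf. line 3122) and a component perpendicular to that axis
# (offset, cf. line 3120).
import math

def leg_length_and_offset(preop_point, phantom_pt, femoral_axis_dir):
    """femoral_axis_dir: (dx, dy) direction of the preop femoral axis line.
    Returns (leg_length, offset) in the same units as the input points."""
    vx = phantom_pt[0] - preop_point[0]
    vy = phantom_pt[1] - preop_point[1]
    ax, ay = femoral_axis_dir
    norm = math.hypot(ax, ay)
    ux, uy = ax / norm, ay / norm            # unit vector along the femoral axis
    leg_length = vx * ux + vy * uy           # projection along the axis
    offset = vx * -uy + vy * ux              # projection perpendicular to the axis
    return (leg_length, offset)
```

Sign conventions (lengthening vs. shortening, medial vs. lateral offset) would follow from the chosen axis direction and image orientation.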
  • a specific example of these calculations is given in Details window 3090 , FIG. 20 .
  • the “Confidence Score” listed in box 3090 relates to the two error triangles 3084 and 3085 as follows.
  • the three points comprising each triangle enable the user to easily visualize any differences in pelvic anatomy in the overlay which may exist even after scaling and alignment.
  • when the stationary bases are completely matched one on top of the other, such as illustrated by single stationary base line 3056′, 3076′, the amount of deviation between the two error triangles 3084, 3085 can be visually inspected to appreciate potential error in the system, such as error caused by one or more of parallax, differences in imaging vantage point of the three-dimensional skeletal anatomy, and/or point placement within the system.
  • the system provides a weighted “confidence score”, ranging from 0.0 to 10.0 in this construction.
  • the system finds the difference in an absolute scale between each of two corresponding points in the preop and postop images as overlaid.
  • error in certain point pairs is assigned a weighting that is greater or lesser than for other error point pairs.
  • identifying a consistent point on the ischial tuberosity may be difficult between images, so that particular point pair (labeled 3059′ and 3079′ in FIG. 20) can be weighted less, such as by “discounting” it by fifty percent.
  • the weighted sum of numerical error among the error point pairs is converted to a single confidence score, such as “5.4” shown in display window 3090 .
  • the weighting is not necessarily linear. Further, a cut-off value can be provided beyond which the error is deemed to be too great to provide useful analysis; in one construction, the system then recommends that the user obtain an alternative intraoperative image to compare with the preoperative image, or with a contralateral image, to analyze according to the present invention.
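One possible realization of the weighted confidence score follows. The per-pair distances, weights, the linear error-to-score mapping, and the cutoff value below are illustrative assumptions; the disclosure notes only that the weighting need not be linear and that a cutoff can be provided:

```python
# Sketch of the weighted confidence score: each error point pair contributes
# its overlaid distance, scaled by a per-pair weight (e.g. 0.5 to discount
# the ischial tuberosity pair by fifty percent); the weighted sum maps onto
# the construction's 0.0-10.0 scale, with a cutoff beyond which the system
# would recommend obtaining an alternative intraoperative image.
import math

def confidence_score(point_pairs, weights, zero_score_error=10.0, cutoff=15.0):
    """point_pairs: [((x1, y1), (x2, y2)), ...] preop/postop pairs on an
    absolute scale; weights: per-pair weighting factors.
    Returns a score from 0.0 to 10.0, or None when error exceeds the cutoff."""
    weighted_error = 0.0
    for ((x1, y1), (x2, y2)), w in zip(point_pairs, weights):
        weighted_error += w * math.hypot(x2 - x1, y2 - y1)
    if weighted_error > cutoff:
        return None                     # error too great for useful analysis
    # Linear map: zero error -> 10.0, zero_score_error or more -> 0.0.
    frac = weighted_error / zero_score_error
    return round(max(0.0, 10.0 * (1.0 - frac)), 1)
```

A nonlinear mapping (for example, penalizing large single-pair errors more steeply) could be substituted without changing the interface.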
  • the femoral angle can be analyzed by creating an image cut-out of one femur and superimposing it on top of the other at the original angle.
  • the cut-out and underlying image may also be connected by the known femoral landmark, such as the greater trochanter, and be made to be immutable at that single landmark point.
  • at least one of the system and user may adjust the image cut-out so that the femoral bone precisely overlays the femoral bone in the superimposed image by pivoting about that landmark point.
  • the system may accomplish this using image recognition or other automated algorithm that identifies the femoral bone or related femoral landmarks such as the greater trochanter landmark previously identified.
  • the user may match the femoral bones by adjusting the superimposed image of the femur so that it matches the femur in the underlying image.
  • the system may attempt to initially match the femoral bones and then provide the user the option to reposition the femur to improve the position.
  • the system will calculate the deviation in angle between the two femurs by calculating the angle through which the cut-out was adjusted, providing similar information.
  • reference (preop) and intraop images are compared via a grid-type X-Y coordinate system without utilizing femoral angles, such as for preoperative images 3202 , 3202 ′ and intraoperative images 3242 , 3242 ′ in screen views 3200 and 3200 ′ illustrated in FIGS. 22-23 , respectively.
  • the reference and intraoperative images are not actually digitally overlaid one on top of the other in this construction; instead, preop image 3202, FIG. 22, is overlaid with, or otherwise associated with, a grid 3204 having a Y-axis 3205 and an X-axis 3206 with units “100, 200, . . . ”.
  • intraop image 3242 is associated with a grid 3244 having a Y-axis 3245 and an X-axis 3246.
  • preop image 3202′, FIG. 23, is associated with a grid 3204′ having a Y-axis 3205′ and an X-axis 3206′.
  • intraop image 3242′ is associated with a grid 3244′ having a Y-axis 3245′ and an X-axis 3246′.
  • Preop image 3202 includes femur Fp with landmark point 3208 on the greater trochanter, and stationary base 3210 and error triangle 3212 on the pelvis.
  • Intraop image 3242 includes femur Fi with implant I having femoral stem FS and acetabular cup AC.
  • Intraoperative landmark point 3248 has been placed on the greater trochanter.
  • Stationary base 3250 and error triangle 3253 have been placed on the pelvis.
  • Preop image 3202 ′ includes femur Fp′ with landmark point 3208 ′ on the greater trochanter, and stationary base 3210 ′ and error triangle 3212 ′ on the pelvis.
  • Intraop image 3242 ′ includes femur Fi′ with implant I′ having femoral stem FS' and acetabular cup AC′.
  • Intraoperative landmark point 3248 ′ is on the greater trochanter.
  • Stationary base 3250 ′ and error triangle 3253 ′ have been placed on the pelvis.
  • After a user activates a “Proceed To Analysis” icon 3260, FIG. 22, the system aligns preop image 3202′, FIG. 23, with intraop image 3242′.
  • preop image 3202 ′ has been “tilted” or rotated counter-clockwise relative to the initial position of preop image 3202 in FIG. 22 to represent alignment achieved using stationary base 3210 ′ and 3250 ′.
  • a difference in position of one of the landmark points is determined, such as the shift of preop landmark point 3208 , FIG. 22 to the aligned position of preop landmark point 3208 ′, FIG. 23 .
  • intraoperative landmark point 3248 ′ is in the same grid location as intraoperative landmark point 3248 , FIG. 22 .
  • a vector can then be calculated from intraop landmark point 3248 ′ to corrected point 3208 ′ using calculations similar to that described above in relation to FIG. 21 .
  • a “Details” window 3270 graphically shows the change in position of initial preop landmark point 3208 to corrected landmark point 3208 ′.
  • An additional alternative construction will identify an estimated center of rotation in the preop image instead of the intraop image, using a similar digital circle placed around the femoral head, or similar technique to annotate the estimate center of rotation.


Abstract

A system and method that acquire (i) at least a reference image including one of a preoperative image of a surgical site with skeletal and articulating bones and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least an intraoperative image of the site after an implant has been affixed to the articulating bone. The system generates at least one reference landmark point on at least one anatomical feature on the articulating bone in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image. The reference and intraoperative images are compared, and differences between the orientation of the articulating bone in the two images are utilized to analyze at least one of offset and length differential.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 14/995,057, entitled “Systems and Methods for Intra-Operative Image Analysis,” filed Jan. 13, 2016, which is a continuation-in-part application of U.S. patent application Ser. No. 14/630,300, filed Feb. 24, 2015, which claims priority to: (a) U.S. Provisional Application No. 61/944,520, filed Feb. 25, 2014, (b) U.S. Provisional Application No. 61/948,534, filed Mar. 5, 2014, (c) U.S. Provisional Application No. 61/980,659, filed Apr. 17, 2014, (d) U.S. Provisional Application No. 62/016,483, filed Jun. 24, 2014, (e) U.S. Provisional Application No. 62/051,238, filed Sep. 16, 2014, (f) U.S. Provisional Application No. 62/080,953, filed Nov. 17, 2014, and (g) U.S. Provisional Application No. 62/105,183, filed Jan. 19, 2015. The present application is also related to U.S. patent application Ser. No. 14/974,225, filed Dec. 18, 2015, by the present inventors, which issued as U.S. Pat. No. 10,433,914 on Oct. 8, 2019. The entire contents of each of the above applications are incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The invention relates to analysis of images of features within a patient and more particularly to accurately analyzing such images during surgery.
  • BACKGROUND OF THE INVENTION
  • Orthopaedic surgeons have the option of utilizing computer-assisted navigation systems to provide intraoperative surgical guidance. For example, computer navigation can provide data on functional parameters such as leg length and offset changes during hip arthroplasty. The purported benefits of computer navigation include reduction of outliers and adverse outcomes related to intraoperative positioning of surgical hardware.
  • Despite obvious clinical benefit, these systems have had limited adoption due to their expense, the learning curve and training requirements for surgeons and, for some systems, the additional procedure and time associated with hardware insertion into the patient. Surgeons who do not use these systems are limited to traditional techniques that are generally based on visual analysis and surgeon experience. However, these techniques are inconsistent, often leading to outliers in functional parameters that may affect patient satisfaction and implant longevity.
  • Details of one such technique, specifically used in a minimally invasive hip arthroplasty technique referred to as the direct anterior approach, are mentioned in the description of a total hip arthroplasty surgery, by Matta et al. in “Single-incision Anterior Approach for Total Hip Arthroplasty on an Orthopaedic Table”, Clinical Orthopaedics and Related Research 441, pp. 115-124 (2005). The intra-operative technique described by Matta et al. is time-consuming and has a high risk of inaccuracy due to differences in rotation, magnification and/or scaling of various images, because the technique relies upon acquiring a preoperative and intraoperative image that are scaled and positioned equivalently. The technique also requires consistent patient positioning in the preoperative and intraoperative images, including positioning of the femur relative to the pelvis. Maintaining femoral position while performing hip arthroplasty can pose a significant and often unrealistic challenge to a surgeon who is focused on performing a procedure. The high risk of inaccurate interpretation using this technique has limited its utility in guiding surgical decision making.
  • What appears to be a software implementation of this technique is described by Penenberg et al. in U.S. Patent Publication No. 2014/0378828, which is a continuation-in-part application of U.S. Pat. No. 8,831,324 by Penenberg. While the use of a computer system may facilitate some aspects of this technique, the underlying challenges to the technique are consistent with the challenges to Matta's approach, and limit the system's potential utility.
  • The challenge of accounting for differences in femoral positioning, ever-present in existing non-invasive guidance technologies for hip arthroplasty, could be solved by developing a system and method that corrects for deviations between preoperative and intraoperative femoral positioning.
  • It is therefore desirable to have a non-invasive system and method that provides intraoperative guidance and data by correcting for deviations in femoral positioning between preoperative and intraoperative images.
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to quantify restoration of orthopaedic functionality at a surgical site within a patient, even during a surgical procedure.
  • Another object of the present invention is to provide image analysis and feedback information to enable better fracture reduction and/or optimal implant selection during the surgery.
  • Yet another object of the present invention is to capture and preserve a digital record of patient results for data collection and quality improvements in surgical procedures.
  • A still further object of the present invention is to improve the outcome of bone repositioning, fracture repair, and/or fixation within a patient.
  • This invention results from the realization that postoperative change in offset and leg length can be accurately estimated during surgery by overlaying or otherwise comparing preoperative and intraoperative images that have been consistently scaled based on pelvic anatomy, generating consistent femoral landmarks in each image, and calculating the vector difference between femoral landmarks after correcting for possible differences in femoral positioning between the two images relative to the pelvis.
  • This invention features a system to analyze images at a surgical site within a patient, the surgical site including at least one skeletal bone such as a pelvis and at least one articulating bone such as a femur that has a longitudinal axis and articulates with the skeletal bone at a joint. In one embodiment, the system includes an image capture module capable of acquiring (i) at least one reference image including one of a preoperative image of the surgical site and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least an intraoperative image of the site after an implant has been affixed to the articulating bone. A landmark identification module is capable of receiving the reference and intraoperative images and generates at least one reference landmark point on at least one anatomical feature on the articulating bone in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image. An image comparison module is capable of identifying (i) an estimation of at least the first center of rotation of the implant in at least one of the reference image and the intraoperative image and (ii) the longitudinal axis of the articulating bone in each of the reference image and intraoperative image. An analysis module is capable of utilizing differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image to analyze at least one of offset and length differential.
  • In some embodiments, the first and second images are provided by the image capture module to the landmark identification module in a digitized format. In certain embodiments, the analysis module calculates a difference angle between the longitudinal axis of the femur in the reference image relative to the longitudinal axis of the femur in the intraoperative image and then estimates a corrected landmark point, such as a corrected intraoperative landmark point, based on that difference angle. In one embodiment, the analysis module estimates the corrected intraoperative landmark point by calculating a first radius between the estimated center of rotation and the intraoperative landmark and then selecting the corrected intraoperative landmark point at a second radius spaced at the difference angle from the first radius. In certain embodiments, the analysis module calculates length differential by estimating distance from the reference landmark point to the corrected intraoperative landmark point in a direction parallel to the longitudinal axis of the femur in the reference image, and/or calculates offset by estimating distance from the reference landmark point to the corrected intraoperative landmark in a direction perpendicular to the longitudinal axis of the femur in the reference image.
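  • The corrected-landmark computation described above can be sketched as follows, assuming 2-D image coordinates, a difference angle in radians, and a unit vector along the reference femoral axis; the names and sign conventions are illustrative only, not from the specification.

```python
import math

def correct_landmark(center, landmark, diff_angle):
    """Rotate the intraoperative landmark about the estimated center of
    rotation by the femoral difference angle; the radius is preserved,
    selecting the corrected point at the same distance from the center."""
    dx, dy = landmark[0] - center[0], landmark[1] - center[1]
    return (center[0] + dx * math.cos(diff_angle) - dy * math.sin(diff_angle),
            center[1] + dx * math.sin(diff_angle) + dy * math.cos(diff_angle))

def offset_and_length(ref_landmark, corrected, axis_unit):
    """Project the displacement from the reference landmark to the corrected
    landmark onto the reference femoral axis: the parallel component is the
    length differential, the perpendicular component is the offset."""
    vx, vy = corrected[0] - ref_landmark[0], corrected[1] - ref_landmark[1]
    length = vx * axis_unit[0] + vy * axis_unit[1]      # parallel to axis
    offset = vx * (-axis_unit[1]) + vy * axis_unit[0]   # perpendicular (sign by convention)
    return offset, length
```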
  • In certain embodiments, at least one of the image capture module, the landmark identification module and the image comparison module identifies at least one stationary point on the skeletal bone in each of the reference image and intraoperative image, and at least one of the image capture module, the landmark identification module and the image comparison module aligns the reference image and intraoperative image according to at least the stationary point in each image. In one embodiment, aligning includes overlaying one of the reference image and intraoperative image on the other of the reference image and intraoperative image.
  • In some embodiments, the reference image and the intraoperative image are at least one of aligned and scaled relative to each other prior to the analysis module analyzing offset and length differential. In one embodiment, at least two stationary points are generated on the skeletal bone in the reference image to establish a reference stationary base and at least two stationary points are generated on the skeletal bone in the intraoperative image to establish an intraoperative stationary base, and at least one of the image capture module, the landmark identification module and the image comparison module utilizes the reference and intraoperative stationary bases to accomplish at least one of image alignment and image scaling. In another embodiment, at least one of the image capture module, the landmark identification module and the image comparison module provides at least relative scaling of one of the reference and intraoperative images to match the scaling of the other of the reference and intraoperative images.
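  • Relative scaling from two-point stationary bases reduces to a ratio of base lengths. A sketch, with illustrative names not drawn from the specification:

```python
import math

def base_length(p1, p2):
    """Length of a two-point stationary base, in pixels."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def relative_scale(reference_base, intraop_base):
    """Factor by which the intraoperative image must be scaled so that its
    stationary base matches the length of the reference image's base."""
    return base_length(*reference_base) / base_length(*intraop_base)
```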
  • This invention also features a system including a memory, a user interface having a display capable of providing at least visual guidance to a user of the system, and a processor, with the processor executing a program performing the steps of acquiring (i) at least one digitized reference image including one of a preoperative image of a surgical site with skeletal and articulating bones and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least one digitized intraoperative image of the site after an implant has been affixed to the articulating bone. The processor receives the reference and intraoperative images and generates at least one reference landmark point on at least one anatomical feature on the articulating bone in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image. The processor identifies (i) an estimation of at least the first center of rotation of the implant in at least one of the reference image and the intraoperative image and (ii) the longitudinal axis of the articulating bone in each of the reference image and intraoperative image. One or more differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image are utilized to analyze at least one of offset and length differential.
  • This invention further features a method including acquiring (i) at least one reference image including one of a preoperative image of a surgical site with skeletal and articulating bones and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least one intraoperative image of the site after an implant has been affixed to the articulating bone. The method further includes receiving the reference and intraoperative images and generating at least one reference landmark point on at least one anatomical feature on the articulating bone in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image. The method includes identifying (i) an estimation of at least the first center of rotation of the implant in at least one of the reference image and the intraoperative image and (ii) the longitudinal axis of the articulating bone in each of the reference image and intraoperative image. One or more differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image are utilized to analyze at least one of offset and length differential.
  • In some embodiments, aligning includes overlaying one of the reference image and intraoperative image on the other of the reference image and intraoperative image. In certain embodiments, the pelvis of the patient is selected as the skeletal bone and a femur is selected as the articulating bone, and the skeletal component of the implant is an acetabular cup and the articulating bone component includes a femoral stem having a shoulder and pivotally connectable to the acetabular cup to establish the first center of rotation for the implant. The landmark point on the articulating bone is identified to have a known location relative to the greater trochanter on the femur of the patient.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In what follows, preferred embodiments of the invention are explained in more detail with reference to the drawings, in which:
  • FIG. 1 is a schematic image of a frontal, X-ray-type view of a pelvic girdle of a patient illustrating various known anatomical features;
  • FIG. 2 is a schematic diagram illustrating how multiple types of user interfaces can be networked via a cloud-based system with data and/or software located on a remote server;
  • FIG. 3 is a Flowchart G showing technique flow for both contralateral and ipsilateral analysis;
  • FIG. 4 is a Flowchart W of several functions performed for hip analysis;
  • FIG. 5 is an image of the right side of a patient's hip prior to an operation and showing a marker placed on the greater trochanter as a landmark or reference point;
  • FIG. 6 is an image similar to FIG. 5 showing a reference line, drawn on (i) the pre-operative, ipsilateral femur or (ii) the contra-lateral femur, to represent the longitudinal axis of the femur;
  • FIG. 7 is an image similar to FIG. 6 with a line drawn across the pelvic bone intersecting selected anatomical features;
  • FIG. 8 is a schematic screen view of two images, the left-hand image representing a pre-operative view similar to FIG. 6 and the right-hand image representing an intra-operative view with a circle placed around the acetabular component of an implant to enable rescaling of that image;
  • FIG. 9 is a schematic screen view similar to FIG. 8 indicating marking of the greater trochanter of the right-hand, intra-operative image as a femoral landmark;
  • FIG. 10 is a schematic screen view similar to FIG. 9 with a reference line drawn on the intra-operative femur in the right-hand view;
  • FIG. 11 is an image similar to FIGS. 7 and 10 with a line drawn across the obturator foramen in both pre- and intra-operative views;
  • FIG. 12 is an overlay image showing the right-hand, intra-operative image of FIG. 11 superimposed and aligned with the left-hand, pre-operative image;
  • FIG. 13 is an image similar to FIG. 11 with points marking the lowest point on the ischial tuberosity and points marking the obturator foramen and top of the pubic symphysis in both pre- and intra-operative views;
  • FIG. 14 is an overlay image showing the right-hand, intra-operative image of FIG. 13 superimposed and aligned with the left-hand, pre-operative image utilizing triangular stable bases;
  • FIG. 15 is a schematic combined block diagram and flow chart of an identification guidance module utilized according to aspects of the present invention;
  • FIG. 16 is an image of a trial implant in a hip with the acetabular component transected by a stationary base line and with two error analysis triangles;
  • FIG. 17 is a flowchart showing the use of an ‘Image Overlay’ technique to calculate a postoperative change in offset and leg length according to an aspect of the present invention;
  • FIG. 18 is a schematic diagram of an Image Analysis System according to the present invention;
  • FIG. 19 is a schematic screen view of a preoperative image and an intraoperative image positioned side by side with digital annotations marking anatomic landmarks and stationary points on the images;
  • FIG. 20 is a schematic screen view of the preoperative image and intraoperative image of FIG. 19 overlaid according to pelvic anatomy with generated femoral landmark points and error analysis according to another aspect of the present invention;
  • FIG. 21 is a schematic diagram showing generation of a corrected landmark point and analysis of offset and length differential according to the present invention;
  • FIG. 22 is a schematic screen view of a preoperative image and an intraoperative image positioned side by side with a grid and digital annotations to mark anatomic landmarks and other features on the images according to certain aspects of the present invention; and
  • FIG. 23 is a schematic view similar to FIG. 22 after the preoperative image has been aligned with the intraoperative image.
  • DETAILED DESCRIPTION OF THE PRESENTLY PREFERRED EMBODIMENTS
  • This invention may be accomplished by a system and/or method that acquire (i) at least one reference image including one of a preoperative image of a surgical site with skeletal and articulating bones and a contralateral image on an opposite side of the patient from the surgical site, and (ii) at least one intraoperative image of the site after an implant has been affixed to the articulating bone. The reference and intraoperative images are received and at least one reference landmark point is generated on at least one anatomical feature on the articulating bone, such as on the greater trochanter of a femur, in the reference image and at least one intraoperative landmark point on that anatomical feature in the intraoperative image. At least the first center of rotation of the implant is estimated in at least one of the reference image and the intraoperative image, and the longitudinal axis of the articulating bone is identified in each of the reference image and intraoperative image. One or more differences between the orientation of the articulating bone in the reference image relative to the orientation of the articulating bone in the intraoperative image are utilized to analyze at least one of offset and length differential.
  • Broadly, some techniques according to the present invention, referred to by the present inventors as “Image Overlay”, place one image over another image during analysis to generate a combined overlapped image. Previous approaches for the ‘Image Overlay’ technique made use of a pelvic reference line having two or more points to scale and align a preoperative image and an intraoperative image. The pelvic reference line having two or more points is also referred to as a “stationary base” as defined in U.S. patent application Ser. No. 14/630,300 filed 24 Feb. 2015, sometimes referred to herein as “parent application”, now US Publication No. 2015/0238271.
  • Alternative approaches for ‘Image Overlay’ technique according to the present invention obviate the need for the pelvic reference line or other stationary base. In some constructions, these alternatives instead rely upon certain image acquisition techniques, certain image manipulation techniques, certain known imaging information, and/or direct user manipulation to create consistent scale and alignment between (i) at least one of a preoperative image and an inverted contralateral image and (ii) an intraoperative image.
  • Additionally, any change in positioning of the femur in the two images, relative to the pelvis, would adversely affect calculations in previous approaches of this technique. Maintaining femoral position while performing hip arthroplasty can pose a significant and often unrealistic challenge to a surgeon who is focused on performing a surgical procedure. Various approaches for the ‘Image Overlay’ technique according to the present invention can correct for deviations in femoral positioning between preoperative and intraoperative images by mathematically correcting for any deviation in femoral position in at least one of the visual output and calculation output of offset and leg length. Presently preferred techniques, both with and without image overlay, are described in more detail below in relation to FIGS. 17-23.
  • In general, accurate analysis of two images of a patient is directly related not only to how similar the two images are, but also to how similarly they are scaled and aligned, including rotation and translation. Using conventional techniques, a user would have to manually adjust the images and/or retake multiple images to achieve this goal, something that would be difficult to do reliably and accurately. Utilizing two or more points as a stationary base according to the present invention in each image enables accurate analysis of the two images. Furthermore, the present Image Overlay technique can analyze how “similar” these images are to give the user feedback as to how accurate the results are, that is, to provide a confidence interval. To obtain useful information, the images (the “intraop” intra-operative image and the “preop” pre-operative image, for example) preferably are scaled similarly and rotated similarly, at least relative to each other.
  • For some constructions of image analysis according to the present invention, preferably at least one stationary base and at least one anatomical landmark are selected, at least for scaling and alignment of the images. The term “stationary base”, also referred to herein as a “stable base”, means a collection of two or more points, which may be depicted as a line or other geometric shape, drawn on each of two or more images that includes at least one anatomical feature that is present in the two or more images of a region of a patient. For example, different images of a pelvic girdle PG of a patient, FIG. 1, typically show one or both obturator foramen OF and a central pubic symphysis PS, which the present inventors have recognized as suitable reference points or features for use as part of a stationary base according to the present invention. Other useful anatomical features, especially to serve as landmarks utilized according to the present invention, include femoral neck FN and lesser trochanter LT, shown on right femur FR, and femoral head FH and greater trochanter GT shown on left femur FL, for example. Femoral head FH engages the left acetabulum of the pelvic girdle PG. Also shown in FIG. 1 are ischial tuberosities IT at the bottom of the ischium, a “tear drop” TD relating to a bony ridge along the floor of the acetabular fossa, and the anterior superior iliac spine ASIS and the anterior inferior iliac spine AIIS of the ilium.
  • In general, a longer stationary base is preferred over a shorter stationary base, because the longer base, especially if it is a line, will contain more pixels in images thereof and will increase accuracy of overlays and scaling according to the present invention. However, the further the stationary base is from the area of anatomical interest, the greater the risk of parallax-induced error. For example, if the area of interest is the hip joint, then the ideal stationary base will be near the hip. In some procedures involving hip surgery, for example, a stationary base line begins at the pubic symphysis PS, touches or intersects at least a portion of an obturator foramen OF, and extends to (i) the “tear drop” TD, or (ii) the anterior inferior iliac spine AIIS. Of course, only two points are needed to define a line, so only two reliable anatomical features, or two locations on a single anatomical feature, are needed to establish a stationary base utilized according to the present invention. More complex, non-linear stationary bases may utilize additional identifiable points to establish such non-linear bases.
  • Additionally, at least one identifiable anatomic “landmark”, “stationary point” or “error point”, or a set of landmarks, stationary points or error points, is selected to be separate from the stationary base; the one or more landmarks, stationary points or error points are utilized in certain constructions to analyze the accuracy of the overlay process. This additional anatomic feature preferably is part of the stationary anatomy being anatomically compared. For example, the inferior portion of the ischial tuberosity IT can be identified as an additional stationary point or error point. This anatomic feature, in conjunction with the stationary base, will depict any differences or errors in pelvic anatomy or the overlay, which will enable the physician to validate, or to have more confidence in, the output of the present system. As generally utilized herein: (i) a “stationary point” refers to a point on a relatively stationary bone such as on the pelvis; (ii) a “landmark point” is located on an articulating bone such as a femur; (iii) an “error point” is preferably on the pelvis and spaced from other points; and (iv) a “fixed point” is located on an implant, such as the shoulder of a femoral stem prosthesis.
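  • The validating role of error points suggests a simple residual metric, such as the root-mean-square distance between corresponding error points after the overlay is aligned. The following sketch is our own illustration; the patent does not prescribe a particular metric.

```python
import math

def overlay_error(reference_points, aligned_points):
    """RMS distance, in pixels, between corresponding error points after
    overlay; a small residual supports confidence in the alignment."""
    squared = [(a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
               for a, b in zip(reference_points, aligned_points)]
    return math.sqrt(sum(squared) / len(squared))
```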
  • The term “trial hip prosthetic” is utilized herein to designate an initial implant selected by a surgeon as a first medical device to insert at the surgical site, which is either the right side or the left side of a patient's hip in certain constructions. In some techniques, the trial prosthetic is selected based on initial digital templating similar to the procedure described in the parent application.
  • The term “digital representation” or “digital annotation” as utilized herein includes a digital line having at least two points, e.g. a line representing a longitudinal axis or a diameter of an implant or a bone, or a digital circle or other geometric shape which can be aligned with an implant or a bone intraoperatively and then placed in a corresponding location in a preoperative image, or vice versa.
  • FIGS. 2-16 herein correspond to FIGS. 4B, 7-16, 52-54 and 70, respectively, in the parent application. FIG. 2 herein is a schematic diagram of system 141 according to the present invention illustrating how multiple types of user interfaces in mobile computing devices 143, 145, 147 and 149, as well as laptop 151 and personal computer 153, can be networked via a cloud 109 with a remote server 155 connected through web services. Data and/or software typically are located on the server 155 and/or storage media 157.
  • Software to accomplish the techniques described herein is located on a single computing device in some constructions and, in other constructions such as system 141, FIG. 2, is distributed among a server 155 and one or more user interface devices which are preferably portable or mobile. In some techniques a digitized X-ray image of the hip region of a patient along a frontal or anterior-to-posterior viewing angle is utilized for a screen view on a display and, in other techniques, a digital photograph “secondary” image of a “primary” X-ray image of the hip region of a patient along a frontal or anterior-to-posterior viewing angle is utilized for the screen view. In one construction, the screen view is shown on a computer monitor and, in another construction, is shown on the screen or viewing region of a tablet or other mobile computing device.
  • Flowchart G, FIG. 3, shows technique flow for both contralateral and ipsilateral analysis. This technique is commenced, step 340, and either contralateral or ipsilateral analysis is selected, step 342. For contralateral analysis, the contralateral hip image is captured, step 344, and the image is flipped, step 346. For ipsilateral analysis, the preoperative ipsilateral hip image is opened, step 348. For both types of analysis, Flowchart W is applied, step 350.
  • Flowchart W, FIG. 4, after being activated by step 350, FIG. 3, guides a user to identify a femoral landmark such as the greater trochanter in step 370, FIG. 4, and then the femoral axis is identified, step 372, which corresponds to the longitudinal axis of the femur in that image. These steps are illustrated in FIGS. 5 and 6, below. A line is then drawn across the bony pelvis, step 374, as shown in FIG. 7.
  • The technique proceeds to capturing an operative hip image, step 352, FIG. 3, and identifying an acetabular component, step 354, such as shown in FIG. 8 below. Acetabular components are also shown in and discussed relative to FIGS. 9 and 10 below. The image is scaled by entering the size of the acetabular component, step 356, and Flowchart W, FIG. 4, is then applied to the operative hip, step 358. The operative and comparative hip images are scaled by a stationary base generated by selecting at least two reference points on the bony pelvis, step 360, such as shown in FIG. 11. The scaled images are then overlaid in step 362 using the bony pelvis points, such as the overlaid lines 386 and 412 shown in FIG. 12. Differences in offset and leg length are calculated, step 364, and the technique is terminated, step 366.
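  • Scaling by the entered acetabular component size, step 356, amounts to deriving a millimeters-per-pixel factor from the diameter of the digital circle fitted to the component. A sketch with illustrative names of our own:

```python
def mm_per_pixel(cup_diameter_mm, cup_diameter_px):
    """Image scale derived from an acetabular component of known size,
    such as the value entered by the user in step 356."""
    return cup_diameter_mm / cup_diameter_px

def pixels_to_mm(distance_px, scale_mm_per_px):
    """Convert a measured pixel distance (e.g. a leg-length change)
    to millimeters using the derived scale."""
    return distance_px * scale_mm_per_px
```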
  • One currently preferred implementation of the JointPoint IntraOp™ Anterior system, which provides the basis for intraoperative analysis of the anterior approach to hip surgery, is illustrated in relation to FIGS. 9-22 in the parent application; FIGS. 9-16 are described herein as FIGS. 5-12. FIG. 5 herein is an image 376 of the right side of a patient's hip prior to an operation and showing a marker 378, bracketed by reference squares 377 and 379, placed by a user as guided by the system, or placed automatically via image recognition, on the greater trochanter as a landmark or reference point. FIG. 6 is an image 376′ similar to FIG. 5 showing a reference line 380, bracketed by reference squares 381, 382, 383 and 384, drawn on (i) the pre-operative, ipsilateral femur or (ii) the contra-lateral femur, to represent the longitudinal axis of the femur. FIG. 7 is an image 376″ similar to FIG. 6 with a line 386, defined by two end-points, which is drawn across the pelvic bone intersecting selected anatomical features.
  • FIG. 8 is a schematic screen view of two images, the left-hand image 376′ representing a pre-operative view similar to FIG. 6 and the right-hand image 390 representing an intra-operative view with a circle 392 placed around the acetabular component 394 of an implant 398 to enable rescaling of that image. In some constructions, circle 392 is placed by an image recognition program and then manually adjusted by a user as desired. Reference square 398 designates implant 398 to the user. FIG. 9 is a schematic screen view similar to FIG. 8 indicating marking of the greater trochanter of the right-hand, intra-operative image 390′ as a femoral landmark 400, guided by reference squares 402 and 404. FIG. 10 is a schematic screen view similar to FIG. 9 with a reference line 406 drawn on the intra-operative femur in the right-hand view 390″, guided by reference squares 407, 408, 409 and 410.
  • FIG. 11 is an image similar to FIGS. 7 and 10 with a line 386, 412 drawn across the obturator foramen in both pre- and intra-operative views 376″ and 390′″, respectively. Reference squares 413, 414, 415 and 416 guide the user while drawing reference line 412.
  • FIG. 12 is an overlay image showing the right-hand, intra-operative, PostOp image 390′″ of FIG. 11 superimposed and aligned with the left-hand, pre-operative PreOp image 376″. In this construction, soft button icons for selectively changing PreOp image 376″ and/or PostOp image 390′″ are provided at the lower left-hand portion of the screen.
  • Note that “PostOp” as utilized herein typically indicates post-insertion of a trial prosthesis during the surgical procedure, and is preferably intra-operative. The PostOp image can also be taken and analysis conducted after a “final” prosthesis is implanted. “PreOp” designates an image preferably taken before any surgical incision is made at the surgical site. In some situations, the image is taken at an earlier time, such as a prior visit to the medical facility and, in other situations, especially in emergency rooms and other critical care situations, the “PreOp” image is taken at the beginning of the surgical procedure. A ball marker BM, FIG. 5, is shown but not utilized for alignment because ball markers can move relative to the patient's anatomy. Further PreOp and PostOp icons are provided in certain screen views to adjust viewing features such as contrast and transparency. Preferably, at least one icon enables rotation in one construction and, in another construction, “swaps” the images so that the underlying image becomes the overlying image, as discussed in more detail below.
  • Additional icons and reference elements are provided in some constructions, such as described in the parent application. One or more of these “virtual” items can be removed or added to a screen view by a user as desired by highlighting, touching or clicking the “soft keys” or “soft buttons” represented by the icons. In certain embodiments, one or more of the icons serves as a toggle to provide “on-off” activation or de-activation of that feature. Characters or other indicia can be utilized to designate image number and other identifying information. Other useful information can be shown such as Abduction Angle, Offset Changes and Leg Length Changes, as discussed in more detail below. Optional user adjustment can be made by touching movement control icon 527, FIG. 12, also referred to as a “rotation handle”.
  • In certain constructions, image recognition capabilities provide “automatic”, system-generated matching and alignment, with a reduced need for user input. Currently utilized image recognition provides automatic detection of selected items including: the spherical ball marker frequently utilized in preoperative digital templating; the acetabular cup in digital templates and in trial prosthetics; and the Cobb Angle line, also referred to as abduction angle.
  • In another construction, more than two points are generated for the stationary base for each image, such as illustrated in FIG. 13 for a preoperative image 1200 and a postoperative image 1201, and in FIG. 14 for a combined overlay image 1298 of the preoperative image 1200 and the postoperative image 1201 of FIG. 13. Similar locations on the pelvis in each image are selected to generate the points utilized to establish a stationary base for each image. In image 1200, for example, a first point 1202 is generated on an upper corner of the obturator foramen or at the pelvic tear drop, a second point 1204 is generated at the top or superior portion of the pubic symphysis, and a third point 1206 is generated at the lowest or inferior point on the ischial tuberosity. Lines 1208, 1210 and 1212 are drawn connecting those points to generate a visible stationary base triangle 1216 on image 1200. Also shown is a point 1214 on the greater trochanter. In postoperative image 1201, first and second points 1203 and 1205 correspond with first and second points 1202 and 1204 in image 1200. A third point 1207 is shown in image 1201 between reference squares 1209 and 1211 in the process of a user selecting the lowest point on the ischial tuberosity to correspond with third point 1206 in image 1200. The user is prompted by “Mark lowest point on Ischial Tuberosity” in the upper portion of image 1201. Also shown is a circle 1213 around the acetabular component and a point 1215 on the greater trochanter.
  • Establishing at least three points is especially useful for determining rotational differences between images. Overlay image 1298, FIG. 14, shows the three points 1202, 1204 and 1206 of preop image 1200, forming the visible preop stationary base triangle 1216, which is positioned relative to the corresponding three points 1203, 1205 and 1207 of postop image 1201, forming a visible postop stationary base triangle 1311 overlaid relative to triangle 1216 in FIG. 14. A ‘best fit overlay’ can be created from these points by identifying the centroid of the polygon they define, and rotating the sets of points relative to one another to minimize the summation of distances between each pair of related points. In this construction, scaling of the two images may be performed using this same set of points or, alternatively, a separate set of two or more points may be utilized to scale the two images relative to each other. Clicking on a PreOp soft-button icon 1300 or a PostOp icon 1301 enables a user to alter positioning of images 1200 and 1201, respectively, within image 1298 in a toggle-switch-type manner to selectively activate or de-activate manipulation of the selected feature. One or more points of a stationary base may be shared with points establishing a scaling line. Preferably, at least one landmark is selected that is spaced from the stationary base points to increase accuracy of overlaying and/or comparing images.
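  • The ‘best fit overlay’ can be sketched as follows. This is a hypothetical illustration, not the patented implementation: it uses the closed-form two-dimensional Procrustes (Kabsch) solution, which minimizes the summation of *squared* distances between related points after translating both point sets so their centroids coincide; the function name is an assumption.

```python
import math

# Hypothetical sketch of a 'best fit overlay': centre both point sets on
# their centroids, then solve in closed form for the rotation minimizing
# the sum of squared distances between corresponding points (2-D
# Procrustes/Kabsch), one reasonable reading of the text above.

def best_fit_rotation(preop_pts, postop_pts):
    """Return (angle_radians, preop_centroid, postop_centroid); applying
    the rotation to the centred preop points best aligns them with the
    centred postop points."""
    n = len(preop_pts)
    cax = sum(p[0] for p in preop_pts) / n
    cay = sum(p[1] for p in preop_pts) / n
    cbx = sum(p[0] for p in postop_pts) / n
    cby = sum(p[1] for p in postop_pts) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(preop_pts, postop_pts):
        ax, ay, bx, by = ax - cax, ay - cay, bx - cbx, by - cby
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    # Optimal least-squares rotation angle.
    angle = math.atan2(sxy - syx, sxx + syy)
    return angle, (cax, cay), (cbx, cby)
```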
  • Also illustrated in FIG. 14 are “Offset and Leg Length Changes” with “Leg Length: −0.2 mm”, “Offset: 21.8 mm” and “Confidence Score: 8.1”. A confidence ratio that describes the quality of fit can be created by comparing the overlay area of the two triangles relative to the size of the overall polygon formed by the two triangles, including the non-overlapping areas of each triangle. Abduction angle and anteversion calculations are described in the parent application in relation to FIGS. 55-59.
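  • The confidence ratio comparing the overlap of the two stationary base triangles to the overall area they cover can be sketched as follows. This is a hypothetical illustration assuming counter-clockwise vertex winding and the stated function names: because triangles are convex, their shared region can be computed with Sutherland-Hodgman clipping and areas with the shoelace formula.

```python
# Hypothetical sketch of the confidence ratio: area shared by the two
# stationary-base triangles divided by the area of the overall polygon
# they form (overlap plus the non-overlapping parts of each triangle).

def shoelace_area(poly):
    s = 0.0
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def _intersect(p1, p2, q1, q2):
    # Intersection of line p1-p2 with line q1-q2.
    x1, y1 = p1; x2, y2 = p2; x3, y3 = q1; x4, y4 = q2
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def clip(subject, clipper):
    """Sutherland-Hodgman: intersect convex 'subject' with convex
    'clipper', both given counter-clockwise."""
    out = subject
    for (cx1, cy1), (cx2, cy2) in zip(clipper, clipper[1:] + clipper[:1]):
        inp, out = out, []
        def inside(p):
            return (cx2 - cx1) * (p[1] - cy1) - (cy2 - cy1) * (p[0] - cx1) >= 0
        for s, e in zip(inp, inp[1:] + inp[:1]):
            if inside(e):
                if not inside(s):
                    out.append(_intersect(s, e, (cx1, cy1), (cx2, cy2)))
                out.append(e)
            elif inside(s):
                out.append(_intersect(s, e, (cx1, cy1), (cx2, cy2)))
        if not out:
            return []
    return out

def confidence_ratio(tri_a, tri_b):
    overlap = shoelace_area(clip(tri_a, tri_b))
    union = shoelace_area(tri_a) + shoelace_area(tri_b) - overlap
    return overlap / union if union else 0.0
```

A score such as the “Confidence Score: 8.1” shown in FIG. 14 would presumably be a further normalization of this ratio; the mapping is not specified in the text.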
  • Alternative constructions may apply absolute scaling to the preoperative and intraoperative images directly, without the need for a stationary base. For example, each image may be scaled by a ball marker or other scaling device, known magnification ratios of a radiographic device, or direct measurements of anatomical points (such as a direct measurement, via calipers, of the extracted femoral head, which can be used to scale the preoperative image).
  • Alternative constructions may also replace the ‘stationary base’ with various other techniques that could be used to scale and align the preoperative and intraoperative images relative to one another. One example of such a construction would involve overlaying two images and displaying them with some transparency so that they could both be viewed on top of one another. The user would then be prompted to rotate the images and adjust their sizing so that the pelvic anatomy in the two images is overlaid as closely as possible.
  • In some constructions, a guidance system is provided to adjust the viewing area of one image on a screen to track actions made by a user to another image on the screen, such as to focus or zoom in on selected landmarks in each image. This feature is also referred to as an automatic ‘centering’ function: as a user moves a cursor to ‘mark’ a feature on one image, such as placing a point for a landmark or a stationary base on an intraoperative image, the other image on the screen is centered by the system to focus on identical points of interest so that both images on the screen are focused on the same anatomical site. FIG. 15 is a schematic combined block diagram and flow chart of an identification guidance module 1400 utilized in one construction to assist a user to select landmarks when comparing a post- or intra-operative results image, box 1402, with a reference image, box 1404. The module is initiated with a Start 1401 and terminates with an End 1418. When a visual landmark is added to a post-operative image, box 1406, the module 1400 locates all landmarks “l” on the pre-operative reference image, box 1408, and calculates the visible area “v” within the pre-operative image in which to scale, such as by using
  • Equation 1:

  • v = [max_x(l) − min_x(l), max_y(l) − min_y(l)]  EQ. 1
  • The identical landmark on the pre-operative image is located and its center-point “c” is determined, box 1410. The identical landmark on the pre-operative image is highlighted in one construction to increase its visual distinctiveness, box 1414. The pre-operative image is centered, box 1410, and scaled, box 1412, such as by utilizing the following Equations 2 and 3, respectively:

  • Center = c − (v)(0.5)  EQ. 2

  • Scale = i/v  EQ. 3
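  • Equations 1-3 can be sketched as follows. This is a hypothetical illustration: the symbol "i" is assumed to be the viewport size in pixels, and the min() in the scale step (to preserve aspect ratio when applying the component-wise ratio i/v) is an added assumption, as are the function names.

```python
# Hypothetical sketch of Equations 1-3: the viewable extent "v" is the
# bounding box of the landmark set "l" (EQ. 1); the view is centred on
# the matched landmark's centre-point "c" (EQ. 2) and scaled so "v"
# fills the viewport "i" (EQ. 3, here assumed to be pixel dimensions).

def visible_area(landmarks):                      # EQ. 1
    xs = [p[0] for p in landmarks]
    ys = [p[1] for p in landmarks]
    return (max(xs) - min(xs), max(ys) - min(ys))

def view_center(c, v):                            # EQ. 2
    return (c[0] - 0.5 * v[0], c[1] - 0.5 * v[1])

def view_scale(i, v):                             # EQ. 3
    # Assumed: take the smaller component-wise ratio to keep aspect ratio.
    return min(i[0] / v[0], i[1] / v[1])
```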
  • The user manipulates one or more visual landmarks in the results image, box 1416, as desired and/or as appropriate. In some constructions, the user manually ends the guidance activities, box 1418 and, in other constructions, the system automatically discontinues the guidance algorithm.
  • FIG. 16 is an overlay image 2000 of a preoperative hip image 2001 and an intraoperative hip image 2003 having a trial implant 2002 in a hip with the acetabular component 2004 transected by stationary base lines 2006 and 2007 extending between a first point 2008 on the obturator foramen OF and a second point 2010 on the anterior inferior iliac spine AIIS of the ilium. Also shown are two error analysis triangles 2020 (solid lines) and 2030 (dashed lines). Circles 2022 and 2032 in this construction represent a landmark point on the greater trochanter in images 2001 and 2003, respectively. Image 2000 is a representation of preoperative and intraoperative hip images 2001 and 2003 overlaid according to stationary base lines 2006 and 2007, respectively. Three identical pelvic points 2024, 2026, 2028 and 2034, 2036, 2038 in images 2001 and 2003, respectively, have been identified, with a system such as system 200, FIGS. 4C-4F in the parent application, generating triangles 2020 and 2030 for each image as represented by FIG. 16. The triangles 2020 and 2030 can be visually compared to analyze the error in the anatomic area containing the stationary bases which, in this case, is the pelvis. A numerical confidence score or other normalized numeric error analysis value may also be calculated and displayed in the system by calculating the distance between points, comparing them to the length of the triangle vectors, and then normalizing the data, possibly using a log or other such nonlinear algorithm. The visual display and/or numerical confidence score provides efficacy analysis in this construction. In other words, error analysis and correction is provided in some constructions for at least one image, such as providing a confidence score or other normalized numeric error analysis, and/or a visual representation of at least one error value or error factor, such as relative alignment of one or more geometric shapes, e.g. triangles, or symbols in two or more images.
  • In some constructions of the various alternative systems and techniques according to the present invention, visual and/or audible user instructions are sequentially generated by the system to guide the user such as “Draw line along Pubic Symphysis”. Guidance for surgery utilizing other types of implants, and for other surgical procedures, including partial or total knee or shoulder replacements and foot surgery as well as wrist surgery, will occur to those skilled in the art after reading this disclosure. Also, other types of medical imaging using energy other than visible light, such as ultrasound, may be utilized according to the present invention instead of actual X-rays. Moreover, if a computer interface tool, such as a stylus or light pen, is provided to the user in a sterile condition, then the user can remain within a sterile field of surgery while operating a computing device programmed according to the present invention.
  • The term “vector” is utilized herein with the standard meaning of a Euclidean vector having an initial point or “origin” and a terminal point, representing magnitude and direction between the origin and the terminal point. The system then positions an acetabular component template or representative digital annotation, such as a digital line or digital circle, in the preop image by replicating this vector.
  • Hip- and femur-related constructions of the present system and method calculate intraoperative changes in offset and leg length using a reference image, also referred to as a “preop image”, and an intraoperative image, also referred to as a “postop image” or an “intraop image”. To accomplish this, one construction of the system requires two consistently scaled images that are overlaid and aligned according to the stationary anatomic region (such as the pelvis), the generation of at least one landmark point on the non-stationary, articulating anatomic region (such as the femur) in both images, a mechanism to identify the difference in femoral angle of the femur relative to the pelvis between the images, a mathematical correction module that adjusts for differences in the articulating femur in each image relative to the stationary pelvis and, finally, a calculation module that uses this input to calculate intraoperative changes in offset and leg length. As utilized herein, the term “femoral angle” refers to the orientation of the longitudinal axis of the femur relative to the pelvis; a “difference in femoral angle” is described in more detail below in relation to FIG. 21. The system may optionally include an error analysis module that identifies and analyzes potential error in the system.
  • As described in more detail below in relation to FIGS. 17-23, an ‘Image Overlay’ process according to the present invention begins in some constructions by acquiring (i) at least one of a preoperative ipsilateral or an inverted contralateral image (“preop image” or “reference image”), and (ii) an intraoperative image (“intraop image”). The system generates at least one landmark point on the non-stationary femur in both images (such as identification of a consistent point on the greater trochanter in both images), generally performed with user guidance. Optionally, the system will generate at least one error point on the pelvis in both images to provide error analysis. If the images have not been previously scaled and aligned, the system will scale and align them using one of a plurality of techniques. One of the images is then overlaid according to the pelvic anatomy in both images.
  • In some constructions, the system identifies points that can be used to analyze possible error in the images relative to each other. The system additionally performs a series of steps to calculate any deviation in alignment of the non-stationary femur relative to the pelvic anatomy between the preop and intraop images. The system then creates an overlay of the preop and intraop image, taking into consideration and correcting for the effect of any difference in femoral angles between the two images as the system compares the relative position of the generated femoral landmark points. Finally, the system analyzes the difference between the landmark points, including a correction for femoral alignment differences, and uses this data to calculate intraoperative change in offset and leg length.
  • In one construction, the process begins in the flowchart OA in FIG. 17 by acquiring, step 3000, either a selected preoperative ipsilateral image, or a selected inverted contralateral image. Whichever image is selected is referred to herein as a “first, reference image” or “preop image”. The process continues with acquisition of the intraop hip image, step 3002. Image acquisition in steps 3000 and 3002 is performed by the Image Capture module 3030, also referred to as an Image Selection Module, of overlay analysis system 3028, FIG. 18. Acquisition of these images can be performed in a variety of ways, such as a direct connection to a c-arm fluoroscopy unit, file upload, or similar techniques. Implementations that operate on a mobile device such as an iPad, or other platforms that similarly integrate a camera device, may also acquire the images in steps 3000 and 3002 by prompting the user to take a picture of the images using the device camera. If an inverted contralateral image is used as a ‘preop’ image, the contralateral image may be acquired and then inverted within the software, or otherwise it may be flipped in another system and then input to image capture module 3030. Screen view 3050, FIG. 19, shows preoperative image 3052 and intraoperative image 3070, referred to by labels 3053 and 3071 as “PreOp” and “PostOp” images, respectively.
  • The method continues in step 3004, FIG. 17, with Landmark Identification Module 3034, FIG. 18, identifying at least one point on the femoral anatomy in both the preop and intraop images. Landmark Correction Module 3038 and Calculation Module 3040 can be considered as components of an Analysis Module 3037, shown in dashed lines. In a preferred construction, a point in each image will be placed on the greater trochanter, a particularly useful landmark point because it is easily identifiable and because the anatomy is relatively insensitive to deviations in image acquisition. Alternatively, the point may be placed on the lesser trochanter or another identifiable femoral landmark. However, consistent point placement on the lesser trochanter is more susceptible to error originating from deviations in image acquisition angle because of its 3-dimensional anatomy. In various constructions, the user is either prompted to identify the point on the femoral anatomy, or otherwise the system auto-identifies the point or set of points using image recognition or other technology and then allows the user to modify the point placement.
  • FIG. 5, described above, is an image 376 of the right side of a patient's hip prior to an operation and showing a marker 378, bracketed by reference squares 377 and 379, placed by a user as guided by the system, or placed automatically via image recognition, on the greater trochanter as a landmark or reference point, such as indicated in Landmark Identification Module 3034, FIG. 18. Reference squares 377 and 379 enable the user to position the marker 378 on touch-screen devices, such as an iPad, without the user's fingers obscuring the position of the marker 378.
  • In a similar manner, reference landmark point 3054 and intraoperative landmark point 3074, FIG. 19, are placed on the greater trochanter of the femur Fp in PreOp image 3052 and of femur Fi in PostOp image 3070, respectively. Also shown in PreOp image 3052 are a femoral axis line 3055 and a pelvic reference line 3056, tear drop point 3057, pubic symphysis point 3058, and ischial tuberosity point 3059.
  • Further shown in PostOp image 3070, FIG. 19, are acetabular cup AC and femoral stem FS of an implant I, a femoral axis line 3075 and a pelvic reference line 3076, tear drop point 3077, pubic symphysis point 3078, and ischial tuberosity point 3079. A circle 3080 has been drawn around acetabular cup AC as described in more detail below.
  • In step 3006, FIG. 17, the Landmark Identification Module 3034, FIG. 18 asks via User Interface UI, shown in phantom as box 3035, whether the user wants to include error analysis in the system output. If yes, Module 3034 prompts the user, in Step 3008, to identify a set of anatomic points on the stationary pelvis in both the preop and intraop images. While a minimum of only one point is required to provide error analysis in the system, the system preferably generates at least three points on the pelvis, such as points 3057, 3058 and 3059 in PreOp image 3052, FIG. 19, and points 3077, 3078 and 3079 in PostOp image 3070. The user positions each point on the pelvis in some constructions but, in preferred constructions, automated algorithms of a system according to the present invention initially place the points in appropriate positions on the pelvic anatomy. If pelvic reference lines, as described in more detail below, are used to align and scale the preop and intraop images, the points selected for error analysis should be independent of the points used to create the pelvic reference lines. Ideal points will also be identifiable, such as a discernible point on the pelvic teardrop, ischial tuberosity and pubic symphysis.
  • In Step 3010, the Landmark Identification Module 3034, FIG. 18, identifies the approximate femoral center of rotation in the intraop image; this center of rotation information assists correction for deviations in femoral positioning between the preop and intraop images. In a preferred construction, Landmark Identification Module 3034 identifies this point by placing a digital circle so that it overlays the boundary of the acetabular component, as shown by digital circle 392 in FIG. 8 and by circle 3080 in FIG. 19. The system then identifies the midpoint of the circle, which approximates the center of rotation of the acetabular component and functions as the intraoperative femoral center of rotation.
  • Various constructions will accomplish step 3010 in different ways. In a preferred construction, the system may auto-detect the location of the digital circle by using image recognition to auto-detect the acetabular component in the intraoperative image, and then allow the user, via User Interface UI, box 3035, to adjust the size and position of the digital circle using navigation handles connected to the circle, such as navigation handle 527, FIG. 12, and by navigation handle 3099, FIG. 20. In another construction, the user estimates the approximate center of rotation by drawing or positioning a circle around the femoral head in the preoperative image, and utilizing the center of that circle as an estimate of the center of rotation.
  • As shown in FIG. 19, the PreOp image 3052 shows three error points 3057, 3058 and 3059 positioned on the base of the pelvic teardrop, the superior point on the pubic symphysis, and the inferior point on the ischial tuberosity, respectively. Similarly, points 3077, 3078 and 3079 are positioned on corresponding points in PostOp image 3070. These corresponding points will be used for error analysis in constructions that include error analysis as part of the system. Digital circle 3080 has been positioned around the acetabular cup AC of implant I, with a center-point represented by the crosshair 3081 that identifies the midpoint of the circle. This midpoint identifies the approximate femoral center of rotation after implant insertion.
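  • One way the midpoint of the digital circle could be derived, sketched hypothetically below: fit a circle exactly through three points marked on the rim of the acetabular component, and take the circumcentre as the approximate femoral center of rotation. The three-point approach and the function name are assumptions for illustration, not the patented method.

```python
# Hypothetical sketch: the circle through three non-collinear rim points
# gives a centre (crosshair 3081 in FIG. 19 marks such a midpoint) that
# approximates the femoral centre of rotation.

def circle_through(p1, p2, p3):
    """Return (cx, cy, r) of the circle through three non-collinear points."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    r = ((ux - ax)**2 + (uy - ay)**2) ** 0.5
    return ux, uy, r
```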
  • In Step 3012, FIG. 17, the system begins the process of analyzing the difference in the femoral axis angles, relative to the pelvis, between the preop and intraop images. In a preferred construction, the system accomplishes this by generating digital lines to identify the longitudinal axis of the femurs in both images, such as femoral axis lines 3055 and 3075, FIG. 19, and calculating any angle difference between them as described in more detail below in relation to FIG. 21. Landmark Identification Module 3034, FIG. 18 guides the user to generate a line that identifies the longitudinal axis of the femur in both the preop and intraop images. First, the system generates a digital line in the preop image to identify the femoral axis, and the system provides the ability to adjust the line location so that it can identify the angle of the femur in the preop image. Then, the system generates a digital line in the intraop image to identify the femoral axis in the intraop image, again allowing for user adjustment. Preferred constructions of this system will attempt to auto-identify the femoral axis in this step using image recognition and known data, and place the digital lines accordingly. The system then provides the functionality for the user to further manipulate these lines.
  • FIG. 6, described above, is an image 376′ similar to FIG. 5 showing a reference line 380, bracketed by reference squares 381, 382, 383 and 384, drawn on the preop image to represent the longitudinal axis of the femur. Reference squares 381, 382, 383 and 384 can be manipulated to reposition the femoral axis line. FIG. 10, described above, is a schematic screen view with a reference line 406 drawn on the intra-operative femur in the right-hand view 390″, guided by reference squares 407, 408, 409 and 410. Reference squares 407, 408, 409 and 410 can be manipulated to reposition the femoral axis line. FIG. 19 again shows the positioned digital lines 3055 and 3075, placed in Step 3012, FIG. 17, that identify the femoral axis in the PreOp and PostOp images 3052 and 3070.
  • In step 3014, FIG. 17, the Image Capture Module 3030, FIG. 18 determines whether the preop and intraop images have been pre-scaled and aligned according to pelvic anatomy. Consistent scaling and alignment may be previously performed in this construction using a variety of approaches. For example, a software system residing on a digital fluoroscopy system may have been used to align and scale the images prior to image acquisition by this system. Alternatively, the images may already be scaled and aligned because the surgeon took images with the patient and radiographic system in identical position with a known magnification ratio.
  • If the images have not been either scaled or aligned, the system can scale, or align, or scale and align the images in optional step 3016. Consistent scale and alignment in this step is accomplished by the optional Image Scaling and Alignment Module 3032, FIG. 18, shown in dashed lines, which may accomplish these operations in various ways.
  • One method to accomplish consistent scaling and alignment is by using stationary bases (i.e. pelvic reference lines), along with identification and scaling of the acetabular cup in the intraop image, as visually illustrated in FIG. 11. In this approach, a line is drawn connecting two identical landmarks on the pelvis in both the preop and intraop images. Stationary base line 386 in FIG. 11 connects, in the preop image, a point on the anterior superior iliac spine to the inferior point on the pubic symphysis. Stationary base line 412 in FIG. 11 connects the identical two pelvic landmarks in the intraop image. The system can use these two lines to rotate the images so that the overlay lines are aligned at the same angle relative to the software screen. The images can additionally be scaled relative to one another so that the pixel distances between the stationary base lines in the two images are equivalent. Finally, absolute scaling of the images can be achieved by scaling at least one image according to an object of known dimension.
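  • The rotation and relative scale implied by the two stationary base lines can be sketched as follows; a hypothetical illustration, with the function name and the convention (transforming the intraop image to match the preop image) assumed for clarity.

```python
import math

# Hypothetical sketch: rotate one image so both stationary base lines sit
# at the same screen angle, then scale it so the lines have equal pixel
# length.

def align_by_base_lines(preop_line, intraop_line):
    """Each line is ((x1, y1), (x2, y2)) in its own image's pixels.
    Returns (rotation_radians, scale_factor) to apply to the intraop
    image so its base line matches the preop image's base line."""
    (px1, py1), (px2, py2) = preop_line
    (ix1, iy1), (ix2, iy2) = intraop_line
    pre_angle = math.atan2(py2 - py1, px2 - px1)
    intra_angle = math.atan2(iy2 - iy1, ix2 - ix1)
    pre_len = math.hypot(px2 - px1, py2 - py1)
    intra_len = math.hypot(ix2 - ix1, iy2 - iy1)
    return pre_angle - intra_angle, pre_len / intra_len
```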
  • FIG. 8 depicts the digital circle 392 that has been generated around acetabular component 394. The digital circle may be either generated using image recognition to identify the acetabular component, positioned by the user, or initially system-generated in an approximate location and then positioned by the user. The size of this component is known because the surgeon has placed it in the patient's acetabulum. Therefore, the known size of the component, such as “50” mm, can be entered into the box following text “Size of Acetabular Component” located at the top of the intraop screen 390. The system uses this information to generate absolute scaling in the intraop image. Additionally, the preop image can be scaled in absolute measurements, according to this generated circle, once the preop image is scaled so that the pelvic reference lines in both images are of equivalent length in pixels.
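  • The absolute scaling step reduces to a simple ratio, sketched hypothetically below (function name assumed): the entered cup diameter in millimetres divided by the digital circle's diameter in pixels gives the intraop image's millimetres-per-pixel factor, which can then be propagated to the preop image once the pelvic reference lines have been equalised in pixel length.

```python
# Hypothetical sketch of absolute scaling from the known acetabular
# component size entered by the user.

def mm_per_pixel(component_diameter_mm, circle_diameter_px):
    return component_diameter_mm / circle_diameter_px

# Example: a 50 mm cup whose digital circle spans 200 pixels.
ratio = mm_per_pixel(50.0, 200.0)   # 0.25 mm per pixel
```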
  • FIG. 19 depicts the pelvic reference lines 3056 and 3076 that have been generated on identical points on the preop and intraop images 3052 and 3070 of the pelvis, allowing the system to align and scale the images according to the input. Alternative constructions may apply absolute scaling to other objects of known size in either the preop or intraop image. For example, scaling can be applied according to the preop image by drawing a digital line across diameter of the femoral head in the preop image, and entering the size in absolute terms. This absolute measurement is known during surgery because the surgeon traditionally extracts the femoral head and measures its size, using calipers, during hip arthroplasty.
  • The output of the scaling and alignment performed in step 3016, FIG. 17, is used to generate an overlay in step 3018, and therefore may be represented by depicting the updated scaling and alignment visually on the software screen, or otherwise may exclusively be calculated by the system to create the overlay in step 3018. In this construction of Step 3018, the Image Comparison Module 3036, FIG. 18 superimposes the preop and intraop images by aligning pelvic anatomy, with the images displayed with some transparency so that both can be visualized in the overlay, such as illustrated in FIG. 20. In a preferred construction the overlaid images will contain the identified femoral landmarks (generally placed on the greater trochanter) generated in step 3004 so that location differences between the two points can be visualized. The system will maintain the location of the generated greater trochanter points and the femoral axis lines, relative to the preop and intraop images, as the images are manipulated to create the image overlay.
  • The Image Comparison Module 3036 can align the images according to pelvic anatomy in a variety of ways in this step. In a preferred construction, the system will have previously guided the user in identifying at least two consistent points on the pelvic anatomy in both images. The Image Comparison Module 3036 then superimposes the images so that the stationary base lines are positioned identically. In other words, the images are scaled, aligned and superimposed according to the stationary bases drawn across consistent points on the pelvis in each image. The Image Comparison Module will move and scale all digital annotations in tandem with the underlying image so that they remain affixed to the underlying image. This includes positioning of the femoral and pelvic landmark annotations, the identified center of rotation of the femur, pelvic reference lines, the femoral axis lines, and any other annotations used in various constructions.
  • Alternative constructions obviate the need for pelvic reference lines. In one alternative construction, the system uses an image recognition technique to auto-identify the pelvic anatomy and overlays the images based on that recognition; the user is then presented with the option to manually manipulate the resulting overlay. In another alternative, the user is guided to manually position the images so that the pelvic anatomy matches. In this method, the system provides the user with the ability to manipulate the position of each image as well as adjust the magnification so that the pelvic anatomy can be superimposed in the overlay. Alternative systems rely on hardware implementations and stationary cameras to obviate the need for a digital line, image recognition, or user manipulation altogether. In these instances, the external system may provide a known magnification ratio and the consistent patient positioning required to create the image overlay without the use of pelvic reference lines or a similar technique.
  • Differences between the preop and intraop positioning of the femur, relative to the pelvis, create a challenge in comparing the relative location of a femoral landmark such as the greater trochanter, because a change in leg position alters the vector between the two femoral landmarks in the overlay. In Step 3020, FIG. 17, the Landmark Correction Module 3038, FIG. 18, calculates any existing difference between the preop and intraop femoral axis angles. The terms “femoral angle” and “femoral axis angle” refer to the orientation of the longitudinal axis of the femur. If, for example, the preop and intraop femoral axis lines generated in step 3012 vary by eight degrees, the difference calculated in step 3020 will be eight degrees.
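The femoral axis angle difference of step 3020 can be sketched as the signed angle between the two axis lines, each given by a pair of endpoints. This is a minimal sketch with assumed function names and an assumed endpoint representation:

```python
import math

def femoral_axis_angle(p_proximal, p_distal):
    """Orientation of a femoral axis line, in degrees."""
    return math.degrees(math.atan2(p_distal[1] - p_proximal[1],
                                   p_distal[0] - p_proximal[0]))

def axis_angle_difference(preop_axis, intraop_axis):
    """Signed difference between the intraop and preop femoral axis
    angles, normalized into [-180, 180) degrees."""
    d = femoral_axis_angle(*intraop_axis) - femoral_axis_angle(*preop_axis)
    return (d + 180.0) % 360.0 - 180.0
```

For axis lines that vary by eight degrees, this function returns 8.0 (or -8.0, depending on the direction of rotation).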
  • In Step 3022, FIG. 17, Landmark Correction Module 3038, FIG. 18 uses data gathered in previous steps to generate an additional “corrected” or “phantom” landmark point that accounts for differences in femoral position between the preop and intraop images. A corrected landmark point 3082 is shown in FIG. 20, positioned along circle 3083 from intraoperative landmark point 3074′, which is similar to corrected landmark point 3116, FIG. 21, along circle 3124 as described in more detail below.
  • To generate the corrected landmark point, the module first calculates angle_femur, the angular difference between the longitudinal axes of the femur in the preoperative and intraoperative images, also referred to as the preop and intraop femoral axis lines in the overlay. This technique is shown schematically in FIG. 21 for angle α, arrow 3108, between longitudinal axis lines 3104 (“L1”) and 3106 (“L2”). The system combines this with the femoral or acetabular center of rotation 3102 (“R1”), (X_origin, Y_origin), in the intraop image, previously identified in step 3010, FIG. 17, and the greater trochanter point 3110 (“p1”), (X_troch, Y_troch), in the intraop image. The system uses the following formulas to calculate the corrected landmark “phantom” point 3116 (“p3”), (X_phantom, Y_phantom), in Equations 4 and 5:

  • X_phantom = (X_troch − X_origin)·cos(angle_femur) − (Y_troch − Y_origin)·sin(angle_femur) + X_origin  (EQ. 4)

  • Y_phantom = (X_troch − X_origin)·sin(angle_femur) + (Y_troch − Y_origin)·cos(angle_femur) + Y_origin  (EQ. 5)
  • A vector “v”, line 3118, is extended from the preoperative landmark point 3112 (“p2”) to corrected landmark point 3116. Right triangle “legs” 3120 and 3122 are utilized to estimate offset and leg length, respectively. Leg 3122 is generally parallel to preoperative femoral axis 3104 in this construction. The acetabular circle 3100 (“c1”) assists in locating center of rotation 3102. Also shown in FIG. 21 are radius lines 3130 and 3132, which are also separated by angle α, arrow 3114.
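The phantom-point computation of Equations 4 and 5 is a standard planar rotation of the intraop trochanter point about the center of rotation. A minimal sketch, with assumed names and an angle given in degrees:

```python
import math

def phantom_point(troch, origin, angle_femur_deg):
    """Rotate the intraop greater trochanter point (X_troch, Y_troch)
    about the center of rotation (X_origin, Y_origin) by angle_femur,
    per EQ. 4 and EQ. 5, yielding (X_phantom, Y_phantom)."""
    a = math.radians(angle_femur_deg)
    dx = troch[0] - origin[0]
    dy = troch[1] - origin[1]
    x_phantom = dx * math.cos(a) - dy * math.sin(a) + origin[0]
    y_phantom = dx * math.sin(a) + dy * math.cos(a) + origin[1]
    return (x_phantom, y_phantom)
```

For example, a trochanter point one unit to the right of the center of rotation, rotated 90 degrees, lands one unit above it; for the small angles typical of leg repositioning, the point moves a short arc along circle 3124.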
  • As mentioned above, FIG. 20 is an “overlay” screen view 3050′ of the intraop image 3070, FIG. 19, superimposed as PostOp image 3070′ on the preoperative image 3052 as PreOp image 3052′. The two stationary base lines 3056 and 3076 of FIG. 19 are aligned exactly one on top of the other, represented as a single stationary base line 3056′, 3076′. First error correction triangle 3084 is shown connecting intraoperative error point 3077′ on the pelvic teardrop, point 3078′ on the ischial tuberosity and point 3079′ on the pubic symphysis, and a similar error correction triangle 3085 connects points 3057′, 3058′ and 3059′, representing points 3057, 3058 and 3059 of preoperative image 3052, FIG. 19. Details window 3090 lists “Leg Length: −0.4 mm”, “Offset: −3.8 mm” and “Confidence Score: 5.4” as described in more detail below.
  • Finally, in Step 3018, FIG. 17, the Calculation Module 3040, FIG. 18, calculates the change in leg length and offset by analyzing the vector between the greater trochanter point in the preop image and the calculated phantom point in the intraop image, such as illustrated in FIG. 21. To calculate leg length, the system calculates the distance between these two points along the femoral axis identified from the preop image, identified by line 3122 in FIG. 21. To calculate offset, the system calculates the distance between the two points along the axis perpendicular to the femoral axis from the preop image, identified by line 3120. A specific example of these calculations is given in Details window 3090, FIG. 20.
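The leg length and offset calculation above amounts to decomposing vector "v" into its components along and perpendicular to the preop femoral axis (the right-triangle legs 3122 and 3120). A sketch under assumed names; the sign convention chosen here is illustrative:

```python
import math

def leg_length_and_offset(preop_troch, phantom, preop_axis_dir):
    """Decompose the vector from the preop trochanter point to the
    phantom point into a component along the preop femoral axis
    (leg length, leg 3122) and a perpendicular component (offset,
    leg 3120). preop_axis_dir is any vector along the preop axis."""
    vx = phantom[0] - preop_troch[0]
    vy = phantom[1] - preop_troch[1]
    ax, ay = preop_axis_dir
    norm = math.hypot(ax, ay)
    ux, uy = ax / norm, ay / norm       # unit vector along femoral axis
    leg_length = vx * ux + vy * uy      # projection onto the axis
    offset = vx * (-uy) + vy * ux       # projection onto perpendicular
    return leg_length, offset
```

With a vertical femoral axis, a vector of (3, 4) between the points would decompose into 4 units of leg length change and 3 units of offset change (sign depending on the chosen perpendicular direction).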
  • The “Confidence Score” listed in box 3090 relates to the two error triangles 3084 and 3085 as follows. The three points comprising each triangle enable the user to easily visualize any differences in pelvic anatomy in the overlay which may exist even after scaling and alignment. Although the stationary bases are completely matched one on top of the other, as illustrated by single stationary base line 3056′, 3076′, the amount of deviation between the two error triangles 3084, 3085 can be visually inspected to appreciate potential error in the system, such as that caused by one or more of parallax, differences in imaging vantage point of the three-dimensional skeletal anatomy, and/or point placement within the system.
  • As an additional, optional step to quantify the differences between the placement of the two error triangles, the system provides a weighted “confidence score”, ranging from 0.0 to 10.0 in this construction. In one implementation, the system finds the absolute difference between each pair of corresponding points in the preop and postop images as overlaid. In some constructions, the error in certain point pairs is assigned a weighting that is greater or lesser than that of other error point pairs. As one example, identifying a consistent point on the ischial tuberosity may be difficult between images, so that particular point pair (labeled 3059′ and 3079′ in FIG. 20) can be weighted less, such as by “discounting” it by fifty percent. Finally, the weighted sum of numerical error among the error point pairs is converted to a single confidence score, such as the “5.4” shown in display window 3090. The weighting is not necessarily linear. Further, a cut-off value can be provided beyond which the error is deemed too great to provide useful analysis; in one construction, the system then recommends that the user obtain an alternative intraoperative image to compare with the preoperative image, or with a contralateral image, to analyze according to the present invention.
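One possible form of the weighted confidence score is sketched below. The linear mapping onto the 0.0-10.0 scale, the cut-off behavior, and all parameter names are illustrative assumptions; the patent leaves the exact (possibly non-linear) weighting open:

```python
import math

def confidence_score(point_pairs, weights, max_error=10.0):
    """Weighted mean of per-pair distances between corresponding error
    points in the overlaid images, mapped linearly onto 0.0-10.0.
    point_pairs: [((x_pre, y_pre), (x_intra, y_intra)), ...]
    weights: one weight per pair, e.g. 0.5 to discount a hard-to-place
    pair such as the ischial tuberosity points by fifty percent."""
    total = 0.0
    for ((xa, ya), (xb, yb)), w in zip(point_pairs, weights):
        total += w * math.hypot(xb - xa, yb - ya)
    avg = total / sum(weights)
    # 0 error -> 10.0 (best); error >= max_error -> 0.0 (worst);
    # a caller could treat scores below a cut-off as "retake the image"
    return max(0.0, 10.0 - 10.0 * min(avg, max_error) / max_error)
```

Perfectly coincident triangles would score 10.0 under these assumptions, while a weighted mean error of half the cut-off would score 5.0.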
  • Alternative constructions of this system and method use different methods to determine the deviation between femoral angles in the preop and intraop images. For example, in one construction, the femoral angle can be analyzed by creating an image cut-out of one femur and superimposing it on top of the other at the original angle. The cut-out and underlying image may also be connected at the known femoral landmark, such as the greater trochanter, and fixed at that single landmark point. Then, at least one of the system and the user may adjust the image cut-out so that the femoral bone precisely overlays the femoral bone in the superimposed image by pivoting about that landmark point. The system may accomplish this using image recognition or another automated algorithm that identifies the femoral bone or related femoral landmarks, such as the greater trochanter landmark previously identified. Alternatively, the user may match the femoral bones by adjusting the superimposed image of the femur so that it matches the femur in the underlying image. The system may attempt to initially match the femoral bones and then provide the user the option to reposition the femur to improve the match. Finally, the system calculates the deviation in angle between the two femurs by calculating the angle by which the cut-out was adjusted, providing similar information.
  • In yet another construction, reference (preop) and intraop images are compared via a grid-type X-Y coordinate system without utilizing femoral angles, such as for preoperative images 3202, 3202′ and intraoperative images 3242, 3242′ in screen views 3200 and 3200′ illustrated in FIGS. 22-23, respectively. The reference and intraoperative images are not actually digitally overlaid one on top of the other in this construction; instead, preop image 3202, FIG. 22, is overlaid with, or otherwise associated with, a grid 3204 having a Y-axis 3205 and an X-axis 3306 with units “100, 200, . . . 500” as shown, with the origin in the upper left-hand corner of grid 3204. In a similar manner, intraop image 3242 is associated with a grid 3244 having a Y-axis 3245 and an X-axis 3346, preop image 3202′, FIG. 23, is associated with a grid 3204′ having a Y-axis 3205′ and an X-axis 3306′, and intraop image 3242′ is associated with a grid 3244′ having a Y-axis 3245′ and an X-axis 3346′.
  • Preop image 3202, FIG. 22, includes femur Fp with landmark point 3208 on the greater trochanter, and stationary base 3210 and error triangle 3212 on the pelvis. Intraop image 3242 includes femur Fi with implant I having femoral stem FS and acetabular cup AC. Intraoperative landmark point 3248 has been placed on the greater trochanter. Stationary base 3250 and error triangle 3253 have been placed on the pelvis.
  • Preop image 3202′, FIG. 23, includes femur Fp′ with landmark point 3208′ on the greater trochanter, and stationary base 3210′ and error triangle 3212′ on the pelvis. Intraop image 3242′ includes femur Fi′ with implant I′ having femoral stem FS' and acetabular cup AC′. Intraoperative landmark point 3248′ is on the greater trochanter. Stationary base 3250′ and error triangle 3253′ have been placed on the pelvis.
  • After a user activates a “Proceed To Analysis” icon 3260, FIG. 22, the system aligns preop image 3202′, FIG. 23, with intraop image 3242′. In this example, preop image 3202′ has been “tilted” or rotated counter-clockwise relative to the initial position of preop image 3202 in FIG. 22 to represent alignment achieved using stationary bases 3210′ and 3250′. After preop image 3202′ and intraop image 3242′ have been aligned relative to each other, a difference in position of one of the landmark points is determined, such as the shift of preop landmark point 3208, FIG. 22, to the aligned position of preop landmark point 3208′, FIG. 23. In this example, intraoperative landmark point 3248′ is in the same grid location as intraoperative landmark point 3248, FIG. 22. A vector can then be calculated from intraop landmark point 3248′ to corrected point 3208′ using calculations similar to those described above in relation to FIG. 21. In this construction, a “Details” window 3270 graphically shows the change in position of initial preop landmark point 3208 to corrected landmark point 3208′.
  • Other alternative constructions will change the order of various steps, including the generation of various digital landmarks. An additional alternative construction identifies an estimated center of rotation in the preop image instead of the intraop image, using a similar digital circle placed around the femoral head, or a similar technique to annotate the estimated center of rotation.
  • Although specific features of the present invention are shown in some drawings and not in others, this is for convenience only, as each feature may be combined with any or all of the other features in accordance with the invention. While there have been shown, described, and pointed out fundamental novel features of the invention as applied to one or more preferred embodiments thereof, it will be understood that various omissions, substitutions, and changes in the form and details of the devices illustrated, and in their operation, may be made by those skilled in the art without departing from the spirit and scope of the invention. For example, it is expressly intended that all combinations of those elements and/or steps that perform substantially the same function, in substantially the same way, to achieve the same results be within the scope of the invention. Substitutions of elements from one described embodiment to another are also fully intended and contemplated.
  • It is also to be understood that the drawings are not necessarily drawn to scale, but that they are merely conceptual in nature. Other embodiments will occur to those skilled in the art and are within the scope of the present disclosure.

Claims (20)

What is claimed is:
1. A processor-implemented method of estimating one or more post-operative parameters during a surgical procedure, the method comprising:
registering a preoperative image and an intraoperative image based on an anatomical feature, wherein the anatomical feature is associated with an articulating bone capable of articulating relative to a first bone, wherein the preoperative image comprises a depiction of the first bone and the articulating bone, and the intraoperative image comprises a depiction of at least one portion of the articulating bone, a surgical implant attached to the articulating bone, and the first bone;
determining, based on a first center of rotation of the articulating bone in the registered intraoperative image, an angle of rotation to align a longitudinal axis of the articulating bone in the registered preoperative image with a corresponding longitudinal axis of the articulating bone in the registered intraoperative image;
determining, based on the angle of rotation, a corrected location of the anatomical feature in the registered intraoperative image;
estimating the one or more post-operative parameters based on the corrected location of the anatomical feature in the registered intraoperative image; and
displaying the one or more estimated post-operative parameters.
2. The method of claim 1, wherein the surgical procedure is one of:
a hip arthroplasty; or
a knee replacement; or
a shoulder replacement; or
a wrist replacement; or
a foot replacement.
3. The method of claim 2, wherein the surgical procedure is the hip arthroplasty, the first bone is a pelvic bone, the articulating bone is a femur, and the one or more estimated post-operative parameters comprise at least one of a leg length, or an offset, or a combination thereof.
4. The method of claim 1, wherein the first center of rotation of the articulating bone in the registered intraoperative image is determined based on one of:
a second center of rotation, wherein the second center of rotation is associated with the surgical implant in the registered intraoperative image, or
a third center of rotation, wherein the third center of rotation is associated with a digital representation of the surgical implant in the registered intraoperative image.
5. The method of claim 1, wherein the preoperative image is either an ipsilateral image or a flipped contralateral image.
6. The method of claim 5, wherein the preoperative image is the flipped contralateral image and determining the first center of rotation of the articulating bone in the registered intraoperative image comprises:
determining the first center of rotation of the articulating bone based on the flipped contralateral image.
7. The method of claim 1, further comprising:
performing, prior to the estimation of the one or more post-operative parameters, at least one of:
scaling at least one of the registered preoperative image or the registered intraoperative image, or
aligning the registered preoperative image and the registered intraoperative image; or
a combination thereof.
8. The method of claim 7, wherein scaling at least one of the preoperative image or the intraoperative image comprises one of:
scaling the registered preoperative image to match an intraoperative image scale; or
scaling the registered intraoperative image to match a preoperative image scale; or
rescaling the registered intraoperative image and the registered preoperative image to a consistent scale.
9. The method of claim 8, wherein the consistent scale to effect the rescaling is determined based on one or more of:
known parameters associated with a radiographic device used to capture the preoperative image and the intraoperative image, or
known dimensions of a marker appearing in the preoperative image and in the intraoperative image, or
direct anatomical measurements, or
known dimensions of the surgical implant.
10. The method of claim 7, wherein the scaling and the aligning are based on stationary base lines in the registered preoperative image and corresponding stationary base lines in the registered intraoperative image, wherein each stationary base line connects feature points on the first bone in the registered preoperative image, and each corresponding stationary base line connects corresponding feature points on the first bone in the registered intraoperative image.
11. An apparatus to estimate one or more post-operative parameters during a surgical procedure, wherein the apparatus comprises:
a memory to store a preoperative image and an intraoperative image,
a display, and
a processor coupled to the memory and the display, wherein the processor is configured to:
register the preoperative image and the intraoperative image based on an anatomical feature, wherein the anatomical feature is associated with an articulating bone capable of articulating relative to a first bone, wherein the preoperative image comprises a depiction of the first bone and the articulating bone, and the intraoperative image comprises a depiction of at least one portion of the articulating bone, a surgical implant attached to the articulating bone, and the first bone;
determine, based on a first center of rotation of the articulating bone in the registered intraoperative image, an angle of rotation to align a longitudinal axis of the articulating bone in the registered preoperative image with a corresponding longitudinal axis of the articulating bone in the registered intraoperative image;
determine, based on the angle of rotation, a corrected location of the anatomical feature in the registered intraoperative image;
estimate the one or more post-operative parameters based on the corrected location of the anatomical feature in the registered intraoperative image; and
display the one or more estimated post-operative parameters on the display.
12. The apparatus of claim 11, wherein the surgical procedure is a hip arthroplasty, the first bone is a pelvic bone, the articulating bone is a femur, and the one or more estimated post-operative parameters comprise at least one of a leg length, or an offset or a combination thereof.
13. The apparatus of claim 11, wherein the first center of rotation of the articulating bone in the registered intraoperative image is determined based on one of:
a second center of rotation, wherein the second center of rotation is associated with the surgical implant in the registered intraoperative image, or
a third center of rotation, wherein the third center of rotation is associated with a digital representation of the surgical implant in the registered intraoperative image.
14. The apparatus of claim 11, wherein the processor is configured to perform, prior to the estimation of the one or more post-operative parameters, at least one of:
scaling at least one of the registered preoperative image or the registered intraoperative image, or
aligning the registered preoperative image and the registered intraoperative image; or
a combination thereof.
15. The apparatus of claim 14, wherein to perform scaling of at least one of the preoperative image or the intraoperative image, the processor is configured to:
scale the registered preoperative image to match an intraoperative image scale; or
scale the registered intraoperative image to match a preoperative image scale; or
rescale the registered intraoperative image and the registered preoperative image to a consistent scale.
16. The apparatus of claim 15, wherein the consistent scale to effect the rescaling is determined based on one or more of:
known parameters associated with a radiographic device used to capture the preoperative image and the intraoperative image, or
known dimensions of a marker appearing in the preoperative image and in the intraoperative image, or
direct anatomical measurements, or
known dimensions of the surgical implant.
17. The apparatus of claim 14, wherein the scaling and the aligning are based on stationary base lines in the registered preoperative image and corresponding stationary base lines in the registered intraoperative image, wherein each stationary base line connects feature points on the first bone in the registered preoperative image, and each corresponding stationary base line connects corresponding feature points on the first bone in the registered intraoperative image.
18. A non-transitory computer-readable medium comprising instructions to configure a processor to:
register a preoperative image and an intraoperative image based on an anatomical feature, wherein the anatomical feature is associated with an articulating bone capable of articulating relative to a first bone, wherein the preoperative image comprises a depiction of the first bone and the articulating bone, and the intraoperative image comprises a depiction of at least one portion of the articulating bone, a surgical implant attached to the articulating bone, and the first bone;
determine, based on a first center of rotation of the articulating bone in the registered intraoperative image, an angle of rotation to align a longitudinal axis of the articulating bone in the registered preoperative image with a corresponding longitudinal axis of the articulating bone in the registered intraoperative image;
determine, based on the angle of rotation, a corrected location of the anatomical feature in the registered intraoperative image;
estimate one or more post-operative parameters based on the corrected location of the anatomical feature in the registered intraoperative image; and
display the one or more estimated post-operative parameters.
19. The computer-readable medium of claim 18, wherein the first bone is a pelvic bone, the articulating bone is a femur, and the one or more estimated post-operative parameters comprise at least one of a leg length, or an offset or a combination thereof.
20. The computer-readable medium of claim 18, wherein to estimate the one or more post-operative parameters, the instructions configure the processor to perform, prior to the estimation of the one or more post-operative parameters, at least one of:
scaling at least one of the registered preoperative image or the registered intraoperative image, or
aligning the registered preoperative image and the registered intraoperative image; or
a combination thereof.
US16/938,912 2014-02-25 2020-07-24 Systems and methods for intra-operative image analysis Abandoned US20200352529A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/938,912 US20200352529A1 (en) 2014-02-25 2020-07-24 Systems and methods for intra-operative image analysis

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201461944520P 2014-02-25 2014-02-25
US201461948534P 2014-03-05 2014-03-05
US201461980659P 2014-04-17 2014-04-17
US201462016483P 2014-06-24 2014-06-24
US201462051238P 2014-09-16 2014-09-16
US201462080953P 2014-11-17 2014-11-17
US201562105183P 2015-01-19 2015-01-19
US14/630,300 US10758198B2 (en) 2014-02-25 2015-02-24 Systems and methods for intra-operative image analysis
US14/995,057 US10765384B2 (en) 2014-02-25 2016-01-13 Systems and methods for intra-operative image analysis
US16/938,912 US20200352529A1 (en) 2014-02-25 2020-07-24 Systems and methods for intra-operative image analysis

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/995,057 Continuation US10765384B2 (en) 2014-02-25 2016-01-13 Systems and methods for intra-operative image analysis

Publications (1)

Publication Number Publication Date
US20200352529A1 true US20200352529A1 (en) 2020-11-12

Family

ID=53881124

Family Applications (5)

Application Number Title Priority Date Filing Date
US14/630,300 Active 2037-06-03 US10758198B2 (en) 2014-02-25 2015-02-24 Systems and methods for intra-operative image analysis
US14/995,057 Active 2035-11-17 US10765384B2 (en) 2014-02-25 2016-01-13 Systems and methods for intra-operative image analysis
US16/690,392 Active 2036-02-07 US11534127B2 (en) 2014-02-25 2019-11-21 Systems and methods for intra-operative image analysis
US16/938,912 Abandoned US20200352529A1 (en) 2014-02-25 2020-07-24 Systems and methods for intra-operative image analysis
US17/396,656 Pending US20210361252A1 (en) 2014-02-25 2021-08-07 Systems and methods for intra-operative image analysis

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US14/630,300 Active 2037-06-03 US10758198B2 (en) 2014-02-25 2015-02-24 Systems and methods for intra-operative image analysis
US14/995,057 Active 2035-11-17 US10765384B2 (en) 2014-02-25 2016-01-13 Systems and methods for intra-operative image analysis
US16/690,392 Active 2036-02-07 US11534127B2 (en) 2014-02-25 2019-11-21 Systems and methods for intra-operative image analysis

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/396,656 Pending US20210361252A1 (en) 2014-02-25 2021-08-07 Systems and methods for intra-operative image analysis

Country Status (6)

Country Link
US (5) US10758198B2 (en)
EP (3) EP3449861B1 (en)
JP (3) JP6685580B2 (en)
AU (1) AU2015223078B2 (en)
ES (2) ES2704691T5 (en)
WO (1) WO2015130848A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10973590B2 (en) * 2018-09-12 2021-04-13 OrthoGrid Systems, Inc Artificial intelligence intra-operative surgical guidance system and method of use
WO2022249190A1 (en) * 2021-05-26 2022-12-01 Beyeonics Surgical Ltd. System and method for verification of conversion of locations between coordinate systems
WO2022261548A1 (en) * 2021-06-11 2022-12-15 AccuJoint, Inc Adjustment system and method for patient position intraoperatively using radiographic measurements
US11534127B2 (en) 2014-02-25 2022-12-27 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11540794B2 (en) 2018-09-12 2023-01-03 Orthogrid Systesm Holdings, LLC Artificial intelligence intra-operative surgical guidance system and method of use
US11642174B2 (en) 2014-02-25 2023-05-09 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
EP4235686A1 (en) * 2022-02-23 2023-08-30 Koninklijke Philips N.V. Region of interest indication for live medical images
US11887306B2 (en) 2021-08-11 2024-01-30 DePuy Synthes Products, Inc. System and method for intraoperatively determining image alignment
US11925420B2 (en) 2015-08-05 2024-03-12 Accupredict, Inc. Adjustment system and method for patient position intraoperatively using radiographic measurements
US11999065B2 (en) 2020-10-30 2024-06-04 Mako Surgical Corp. Robotic surgical system with motorized movement to a starting pose for a registration or calibration routine

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9204937B2 (en) 2013-02-19 2015-12-08 Stryker Trauma Gmbh Software for use with deformity correction
JP6038733B2 (en) * 2013-06-18 2016-12-07 浜松ホトニクス株式会社 Manufacturing method of radiation detection unit
CN110123448A (en) 2013-10-09 2019-08-16 纽文思公司 The method for being designed in art during vertebra program of performing the operation and evaluating spine malformation correction
CA2853012A1 (en) * 2014-05-30 2015-11-30 Derek Cooke Joint surgery triage tool
EP3157425A4 (en) 2014-06-17 2017-11-15 Nuvasive, Inc. Systems and methods for planning, performing, and assessing spinal correction during surgery
US20160262800A1 (en) 2015-02-13 2016-09-15 Nuvasive, Inc. Systems and methods for planning, performing, and assessing spinal correction during surgery
US10406054B1 (en) 2015-02-18 2019-09-10 Nuvasive, Inc. Systems and methods for facilitating surgical procedures
WO2017024202A1 (en) * 2015-08-05 2017-02-09 New York Society For The Ruptured And Crippled Maintaining The Hospital For Special Surgery A fluoroscopy-based technique to measure intraoperative cup anteversion
JP6713185B2 (en) * 2015-10-15 2020-06-24 株式会社日立ハイテク Inspection apparatus and inspection method using template matching
US20170119316A1 (en) * 2015-10-30 2017-05-04 Orthosensor Inc Orthopedic measurement and tracking system
BR112018012090A2 (en) * 2015-12-14 2018-11-27 Nuvasive Inc 3d visualization during surgery with reduced radiation exposure
US10991070B2 (en) 2015-12-18 2021-04-27 OrthoGrid Systems, Inc Method of providing surgical guidance
US11386556B2 (en) * 2015-12-18 2022-07-12 Orthogrid Systems Holdings, Llc Deformed grid based intra-operative system and method of use
ES2902078T3 (en) * 2015-12-18 2022-03-24 Depuy Synthes Products Inc Systems and methods for intraoperative image analysis
EP3402409B1 (en) * 2016-01-13 2024-02-28 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
WO2017127838A1 (en) 2016-01-22 2017-07-27 Nuvasive, Inc. Systems and methods for facilitating spine surgery
US10463433B2 (en) 2016-03-02 2019-11-05 Nuvasive, Inc. Systems and methods for spinal correction surgical planning
US10182871B2 (en) 2016-05-22 2019-01-22 JointPoint, Inc. Systems and methods for intra-operative image acquisition and calibration
US10251705B2 (en) * 2016-06-02 2019-04-09 Stryker European Holdings I, Llc Software for use with deformity correction
EP3471646B1 (en) 2016-06-17 2023-07-05 Zimmer, Inc. System for intraoperative surgical planning
WO2018013848A1 (en) 2016-07-15 2018-01-18 Mako Surgical Corp. Systems for a robotic-assisted revision procedure
US10925674B2 (en) * 2016-07-18 2021-02-23 Stryker European Operations Holdings Llc Surgical site displacement tracking
US10748319B1 (en) * 2016-09-19 2020-08-18 Radlink, Inc. Composite radiographic image that corrects effects of parallax distortion
US10643360B2 (en) * 2017-02-10 2020-05-05 Arizona Board Of Regents On Behalf Of Arizona State University Real-time medical image visualization systems and related methods
US10319108B2 (en) 2017-02-14 2019-06-11 Jx Imaging Arts, Llc System and method for machine vision object orientation measurement
KR102618956B1 (en) 2017-03-14 2023-12-27 스티븐 비. 머피 Systems and methods for determining leg length change during hip surgery
JP2020511239A (en) * 2017-03-17 2020-04-16 インテリジョイント サージカル インク. System and method for augmented reality display in navigation surgery
EP3609424A1 (en) * 2017-04-14 2020-02-19 Stryker Corporation Surgical systems and methods for facilitating ad-hoc intraoperative planning of surgical procedures
EP3682418A4 (en) * 2017-09-15 2021-06-09 Mirus LLC Systems and methods for measurement of anatomic alignment
WO2019079521A1 (en) * 2017-10-17 2019-04-25 Friedrich Boettner Fluoroscopy-based measurement and processing system and method
WO2019102473A1 (en) 2017-11-22 2019-05-31 Mazor Robotics Ltd. A method for verifying hard tissue location using implant imaging
GB2572594A (en) * 2018-04-04 2019-10-09 Corin Ltd Implant alignment system
US20210100627A1 (en) * 2018-04-25 2021-04-08 Intuitive Surgical Operations, Inc. Systems and methods related to elongate devices
US11227385B2 (en) * 2018-08-08 2022-01-18 Loyola University Chicago Methods of classifying and/or determining orientations of objects using two-dimensional images
JP7336309B2 (en) * 2018-08-19 2023-08-31 Chang Gung Memorial Hospital, Linkou Medical image analysis methods, systems and models
EP3640767A1 (en) * 2018-10-17 2020-04-22 Siemens Schweiz AG Method for determining at least one area in at least one input model for at least one element to be placed
CN111179180B (en) * 2018-11-13 2023-06-27 Genesys Logic Inc. Image correction method and device
LU101009B1 (en) * 2018-11-26 2020-05-26 Metamorphosis Gmbh Artificial-intelligence-based determination of relative positions of objects in medical images
US10872690B2 (en) 2018-11-28 2020-12-22 General Electric Company System and method for remote visualization of medical images
EP3893793A4 (en) 2018-12-14 2022-08-31 MAKO Surgical Corp. Systems and methods for preoperative planning and postoperative analysis of surgical procedures
US11176683B2 (en) * 2019-02-13 2021-11-16 Sectra Ab Automated implant movement analysis systems and related methods
WO2020227832A1 (en) * 2019-05-15 2020-11-19 Intellijoint Surgical Inc. Systems and methods for computer assisted femoral surgery
TWI753412B (en) * 2020-04-27 2022-01-21 Chang Gung Memorial Hospital, Linkou A method for generating a model for automatically locating an anchor point, a skeletal state analysis method, and an electronic system
US11107586B1 (en) * 2020-06-24 2021-08-31 Cuptimize, Inc. System and method for analyzing acetabular cup position
US11670013B2 (en) 2020-06-26 2023-06-06 Jigar Patel Methods, systems, and computing platforms for photograph overlaying utilizing anatomic body mapping
JP7472845B2 (en) 2020-07-21 2024-04-23 Shimadzu Corporation X-ray imaging device and image processing method
CN112641510B (en) * 2020-12-18 2021-08-17 Beijing Changmugu Medical Technology Co., Ltd. Joint replacement surgical robot navigation positioning system and method
JP2022181902A (en) * 2021-05-27 2022-12-08 Hokkaido University Image diagnostic device for osteoarthropathy, and operation method and program of image diagnostic device
CN116712171B (en) * 2023-08-11 2023-11-03 Beijing Weizhuo Zhiyuan Medical Technology Development Co., Ltd. Intertrochanteric fracture navigation method, device and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150257846A1 (en) * 2013-02-18 2015-09-17 Orthogrid Systems, Inc. Alignment plate apparatus and system and method of use

Family Cites Families (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1018287A (en) 1911-10-04 1912-02-20 Samuel M Work Fruit-picker.
DE4304571A1 (en) 1993-02-16 1994-08-18 Mdc Med Diagnostic Computing Procedures for planning and controlling a surgical procedure
US6205411B1 (en) 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
US6994424B2 (en) 1998-10-16 2006-02-07 Silverbrook Research Pty Ltd Printhead assembly incorporating an array of printhead chips on an ink distribution structure
US6614453B1 (en) 2000-05-05 2003-09-02 Koninklijke Philips Electronics, N.V. Method and apparatus for medical image display for surgical tool planning and navigation in clinical environments
EP1188421B1 (en) 2000-09-18 2005-03-23 Fuji Photo Film Co., Ltd. Artificial bone template selection, display and storage system and recording medium
CA2334495A1 (en) 2001-02-06 2002-08-06 Surgical Navigation Specialists, Inc. Computer-aided positioning method and system
JP2003271749A (en) 2002-03-18 2003-09-26 Fuji Photo Film Co Ltd Surgical operation assistance system
WO2004001569A2 (en) 2002-06-21 2003-12-31 Cedara Software Corp. Computer assisted system and method for minimal invasive hip, uni knee and total knee replacement
JP4146696B2 (en) 2002-09-19 2008-09-10 Kyocera Corp Artificial hip joint range of motion measurement method
GB2393625B (en) * 2002-09-26 2004-08-18 Internet Tech Ltd Orthopaedic surgery planning
US9308002B2 (en) 2002-11-07 2016-04-12 Crescent H Trust Precise hip component positioning for hip replacement surgery
US7542791B2 (en) 2003-01-30 2009-06-02 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
CA2523727A1 (en) 2003-04-28 2005-01-06 Bracco Imaging Spa Surgical navigation imaging system
US8484001B2 (en) 2003-08-26 2013-07-09 Voyant Health Ltd. Pre-operative medical planning system and method for use thereof
JP4401741B2 (en) 2003-10-28 2010-01-20 Canon Inc Image display device, image display method and program thereof
JP2005185767A (en) 2003-12-26 2005-07-14 Kobe Steel Ltd Artificial joint member selection support device and artificial joint member selection support program
US8007448B2 (en) 2004-10-08 2011-08-30 Stryker Leibinger Gmbh & Co. Kg. System and method for performing arthroplasty of a joint and tracking a plumb line plane
GB0504172D0 (en) 2005-03-01 2005-04-06 King S College London Surgical planning
DE102005012708A1 (en) 2005-03-11 2006-09-21 Eberhard-Karls-Universität Tübingen Method for determining body orientations in space based on two x-ray images
US20100249790A1 (en) 2009-03-26 2010-09-30 Martin Roche System and method for soft tissue tensioning in extension and flexion
GB0507243D0 (en) * 2005-04-09 2005-05-18 Depuy Int Ltd Acetabular cup positioning
EP1908023A1 (en) 2005-06-02 2008-04-09 Depuy International Limited Surgical system and method
WO2006128301A1 (en) 2005-06-02 2006-12-07 Orthosoft Inc. Leg alignment for surgical parameter measurement in hip replacement surgery
US20070015999A1 (en) 2005-07-15 2007-01-18 Heldreth Mark A System and method for providing orthopaedic surgical information to a surgeon
JP2009503634A (en) 2005-07-22 2009-01-29 セダラ ソフトウェア コーポレーション Implant inventory management system and method using digital image planning
US20070066917A1 (en) 2005-09-20 2007-03-22 Hodorek Robert A Method for simulating prosthetic implant selection and placement
US20070078678A1 (en) 2005-09-30 2007-04-05 Disilvestro Mark R System and method for performing a computer assisted orthopaedic surgical procedure
JP2007151742A (en) 2005-12-02 2007-06-21 Canon Inc Information processor, its method and program
US8635082B2 (en) 2006-05-25 2014-01-21 DePuy Synthes Products, LLC Method and system for managing inventories of orthopaedic implants
US20080021299A1 (en) 2006-07-18 2008-01-24 Meulink Steven L Method for selecting modular implant components
US8090166B2 (en) 2006-09-21 2012-01-03 Surgix Ltd. Medical image analysis
US7769222B2 (en) * 2006-10-27 2010-08-03 Mitutoyo Corporation Arc tool user interface
US20080120262A1 (en) 2006-11-16 2008-05-22 Koninklijke Philips Electronics, N.V. What-if planning for medical procedures
US20080161680A1 (en) 2006-12-29 2008-07-03 General Electric Company System and method for surgical navigation of motion preservation prosthesis
JP2009136384A (en) * 2007-12-04 2009-06-25 Fujifilm Corp Implant selection supporting device and program
US8617171B2 (en) * 2007-12-18 2013-12-31 Otismed Corporation Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide
US8737700B2 (en) * 2007-12-18 2014-05-27 Otismed Corporation Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide
US8160345B2 (en) 2008-04-30 2012-04-17 Otismed Corporation System and method for image segmentation in generating computer models of a joint to undergo arthroplasty
ES2528294T3 (en) 2008-02-25 2015-02-06 Smith & Nephew, Inc. Method and system for mapping a femoral head for alignment of acetabular prostheses
JP5472893B2 (en) * 2008-04-17 2014-04-16 Sharp Corp Information processing system, information processing apparatus, information processing method, and range determination program
US8249318B2 (en) 2008-09-26 2012-08-21 OsteoWare, Inc. Method for identifying implanted reconstructive prosthetic devices
US8160326B2 (en) * 2008-10-08 2012-04-17 Fujifilm Medical Systems Usa, Inc. Method and system for surgical modeling
CN102300512B (en) * 2008-12-01 2016-01-20 Mazor Robotics Ltd. Robot-guided oblique spinal stabilization
US8337397B2 (en) 2009-03-26 2012-12-25 Intuitive Surgical Operations, Inc. Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
US9554868B2 (en) 2009-08-07 2017-01-31 DePuy Synthes Products, Inc. Method and apparatus for reducing malalignment of fractured bone fragments
US8311791B1 (en) 2009-10-19 2012-11-13 Surgical Theater LLC Method and system for simulating surgical procedures
WO2011134083A1 (en) 2010-04-28 2011-11-03 Ryerson University System and methods for intraoperative guidance feedback
US8842893B2 (en) 2010-04-30 2014-09-23 Medtronic Navigation, Inc. Method and apparatus for image-based navigation
US8932299B2 (en) * 2010-06-18 2015-01-13 Howmedica Osteonics Corp. Patient-specific total hip arthroplasty
US8675939B2 (en) 2010-07-13 2014-03-18 Stryker Leibinger Gmbh & Co. Kg Registration of anatomical data sets
US9717508B2 (en) * 2010-10-29 2017-08-01 The Cleveland Clinic Foundation System of preoperative planning and provision of patient-specific surgical aids
CA2821670A1 (en) 2010-12-17 2012-06-21 Avenir Medical Inc. Method and system for aligning a prosthesis during surgery
WO2012100825A1 (en) 2011-01-26 2012-08-02 Brainlab Ag Method for planning the positioning of an implant
US8917290B2 (en) 2011-01-31 2014-12-23 Biomet Manufacturing, Llc Digital image templating
US8736679B2 (en) 2011-02-02 2014-05-27 The Boeing Company Avionic display testing system
WO2013025927A2 (en) 2011-08-17 2013-02-21 New York Society For The Ruptured And Crippled Maintaining The Hospital For Special Surgery Method for orienting an acetabular cup and instruments for use therewith
US9167989B2 (en) 2011-09-16 2015-10-27 Mako Surgical Corp. Systems and methods for measuring parameters in joint replacement surgery
KR101846552B1 (en) 2011-11-30 2018-04-09 LG Display Co., Ltd. System and method for inspecting misalignment between display panel and film patterned retarder
US9064332B2 (en) * 2012-01-12 2015-06-23 Siemens Medical Solutions Usa, Inc. Fused-image visualization for surgery evaluation
US8926454B2 (en) 2012-05-07 2015-01-06 Karsten Manufacturing Corporation Fitting systems for golf equipment using camera image for measurement of individual, and related methods
US9545233B2 (en) 2012-05-22 2017-01-17 Mazor Robotics Ltd. On-site verification of implant positioning
CA2878861A1 (en) 2012-07-12 2014-01-16 Ao Technology Ag Method for generating a graphical 3d computer model of at least one anatomical structure in a selectable pre-, intra-, or postoperative status
EP4218647A1 (en) 2012-08-08 2023-08-02 Ortoma AB System for computer assisted surgery
US20140073907A1 (en) 2012-09-12 2014-03-13 Convergent Life Sciences, Inc. System and method for image guided medical procedures
KR20140028221A (en) 2012-08-28 2014-03-10 Samsung Electronics Co., Ltd. Method and apparatus for setting electronic blackboard system
US8831324B2 (en) 2012-10-02 2014-09-09 Brad L. Penenberg Surgical method and workflow
US20140378828A1 (en) 2012-10-02 2014-12-25 Brad L. Penenberg Hip arthroplasty method and workflow
US20140303938A1 (en) * 2013-04-05 2014-10-09 Biomet Manufacturing Corp. Integrated orthopedic planning and management process
US10052060B2 (en) * 2013-10-31 2018-08-21 Andrew B. Lytle System and method for adjusting alignment of a body part with an imaging apparatus
US10758198B2 (en) 2014-02-25 2020-09-01 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US10433914B2 (en) 2014-02-25 2019-10-08 JointPoint, Inc. Systems and methods for intra-operative image analysis
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US9955056B2 (en) 2015-03-16 2018-04-24 Qualcomm Incorporated Real time calibration for multi-camera wireless device
US10733914B2 (en) 2015-09-30 2020-08-04 Steven N. Kruchko Systems and methods for labeling
US10182871B2 (en) 2016-05-22 2019-01-22 JointPoint, Inc. Systems and methods for intra-operative image acquisition and calibration

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11642174B2 (en) 2014-02-25 2023-05-09 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11534127B2 (en) 2014-02-25 2022-12-27 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11925420B2 (en) 2015-08-05 2024-03-12 Accupredict, Inc. Adjustment system and method for patient position intraoperatively using radiographic measurements
US10973590B2 (en) * 2018-09-12 2021-04-13 OrthoGrid Systems, Inc Artificial intelligence intra-operative surgical guidance system and method of use
US11540794B2 (en) 2018-09-12 2023-01-03 Orthogrid Systems Holdings, LLC Artificial intelligence intra-operative surgical guidance system and method of use
US11589928B2 (en) 2018-09-12 2023-02-28 Orthogrid Systems Holdings, Llc Artificial intelligence intra-operative surgical guidance system and method of use
US11883219B2 (en) 2018-09-12 2024-01-30 Orthogrid Systems Holdings, Llc Artificial intelligence intra-operative surgical guidance system and method of use
US11937888B2 (en) 2018-09-12 2024-03-26 Orthogrid Systems Holdings, LLC Artificial intelligence intra-operative surgical guidance system
US11999065B2 (en) 2020-10-30 2024-06-04 Mako Surgical Corp. Robotic surgical system with motorized movement to a starting pose for a registration or calibration routine
WO2022249190A1 (en) * 2021-05-26 2022-12-01 Beyeonics Surgical Ltd. System and method for verification of conversion of locations between coordinate systems
WO2022261548A1 (en) * 2021-06-11 2022-12-15 AccuJoint, Inc Adjustment system and method for patient position intraoperatively using radiographic measurements
US11887306B2 (en) 2021-08-11 2024-01-30 DePuy Synthes Products, Inc. System and method for intraoperatively determining image alignment
EP4235686A1 (en) * 2022-02-23 2023-08-30 Koninklijke Philips N.V. Region of interest indication for live medical images
WO2023161047A1 (en) * 2022-02-23 2023-08-31 Koninklijke Philips N.V. Region of interest indication for live medical images

Also Published As

Publication number Publication date
JP2020075109A (en) 2020-05-21
ES2909140T3 (en) 2022-05-05
EP3449861B1 (en) 2022-02-16
US20160128654A1 (en) 2016-05-12
JP2017515613A (en) 2017-06-15
US11534127B2 (en) 2022-12-27
JP6919106B2 (en) 2021-08-18
EP3113710A4 (en) 2017-10-18
ES2704691T3 (en) 2019-03-19
AU2015223078B2 (en) 2019-04-04
EP3449861A1 (en) 2019-03-06
JP6685580B2 (en) 2020-04-22
US10758198B2 (en) 2020-09-01
EP3875052A1 (en) 2021-09-08
US10765384B2 (en) 2020-09-08
US20150238271A1 (en) 2015-08-27
WO2015130848A1 (en) 2015-09-03
US20210361252A1 (en) 2021-11-25
ES2704691T5 (en) 2022-10-27
JP7203148B2 (en) 2023-01-12
EP3113710B1 (en) 2018-10-10
US20200100751A1 (en) 2020-04-02
JP2021151490A (en) 2021-09-30
EP3113710A1 (en) 2017-01-11
EP3113710B2 (en) 2022-06-22
AU2015223078A1 (en) 2016-09-08

Similar Documents

Publication Publication Date Title
US20200352529A1 (en) Systems and methods for intra-operative image analysis
US11642174B2 (en) Systems and methods for intra-operative image analysis
US20230277331A1 (en) Method and Apparatus for Implant Size Determination
CN108701375B (en) System and method for intra-operative image analysis
AU2022200996B2 (en) Systems and methods for intra-operative image analysis
EP4134033A1 (en) System and method for intraoperatively determining image alignment

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION