US20200082526A1 - Methods of classifying and/or determining orientations of objects using two-dimensional images

Info

Publication number
US20200082526A1
Authority
US
United States
Prior art keywords: dimensional, orientation, area, objects, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/535,566
Other versions
US11227385B2
Inventor
Michael Patrick Murphy
Cameron James Killen
Karen Wu
Current Assignee
Loyola University Chicago
Original Assignee
Loyola University Chicago
Priority date
Filing date
Publication date
Application filed by Loyola University Chicago
Priority to US16/535,566
Assigned to LOYOLA UNIVERSITY CHICAGO. Assignors: KILLEN, Cameron James; MURPHY, Michael Patrick; WU, Karen
Publication of US20200082526A1
Application granted
Publication of US11227385B2
Status: Active
Anticipated expiration

Classifications

    • G06T7/0012 Biomedical image inspection
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/487 Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • A61B6/505 Apparatus or devices for radiation diagnosis specially adapted for specific clinical applications for diagnosis of bone
    • A61B6/5217 Devices using data or image processing specially adapted for radiation diagnosis involving extracting a diagnostic or physiological parameter from medical diagnostic data
    • G16H50/30 ICT specially adapted for medical diagnosis for calculating health indices; for individual health risk assessment
    • G06T2207/10116 X-ray image
    • G06T2207/10124 Digitally reconstructed radiograph [DRR]
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30052 Implant; Prosthesis

Definitions

  • The present invention generally relates to methods for measuring orientations of objects that may be present or placed in a body or other object, as a nonlimiting example, a prosthesis or prosthetic implant in a subject, all of which will hereinafter simply be referred to as "implant(s)" for convenience.
  • The invention particularly relates to methods of obtaining information regarding an implant that involve analyzing two-dimensional images thereof to determine the orientation of the implant within a subject, optionally various characteristics of the implant (as nonlimiting examples, type, size, material, make, and/or model), and optionally the orientation and characteristics of objects in the vicinity of the implant and visible in the same image, in order to determine the orientation of the implant relative to nearby objects.
  • FIGS. 4A, 4B, and 4C contain images that schematically represent a total hip replacement implant composed of components including an acetabular component (“cup”), a plastic liner, a femoral head, and a femoral stem.
  • Metal objects are radiopaque in x-ray images, as readily seen in the x-ray image of the total hip replacement implant in FIG. 4D.
  • Radiopaque images of implanted acetabular cups have been utilized to assess their orientation.
  • Implant models have unique designs and thus unique radiographic signatures on an x-ray image. These characteristic features can be used to identify the implant size, type, material, make, and model within a radiopaque image.
  • FIG. 5 represents the current convention where acetabular cup orientation is described by its inclination and anteversion angles (RI, RA) from, respectively, the sagittal and coronal planes of a human body.
  • Although CT three-dimensional reconstruction is considered the most accurate radiographic assessment of acetabular cup orientation, plain radiographs are frequently employed in its place due to the additional radiation exposure and cost to the patient associated with CT scans. Since plain radiographs produce a 2D image, measuring implant orientation that is not orthogonal or parallel to the plane of the image (i.e., anteversion and inclination) can be difficult.
  • Although various methods exist for measuring acetabular cup orientation from radiographs, significant variability has been reported in the accuracy of these methods. One contribution to the reported variability may be the dependence these methods have on radiographically obscured points.
  • Several conventional methods rely on accurate visualization and definition of an elliptical opening of the acetabular cup on radiographs.
  • In practice, this opening is masked by the radiopacity of the acetabular cup and further hidden by the femoral head. Furthermore, the clarity of the opening is variable, given its dependence on acetabular cup material, contour, and thickness. These methods have reported standard deviations of up to 10.8°, spans upwards of 20° relative to CT measurements, inter-observer reliability as low as 0.747, and intra-observer reliability as low as 0.746.
  • A critical part of pre-operative planning for the orthopaedic surgeon is identification of the failed implant, which generally involves identifying its type, size, material, make, and/or model. This step is critical to the surgeon in deciding what materials are necessary for the surgery. Studies have reported cases where surgeons were unable to pre-operatively identify the implant, necessitating that at least twice the number of implants be brought to the operating room and resulting in increased surgical time, increased healthcare cost, and higher surgical complexity.
  • The present invention provides methods suitable for accurately and precisely measuring orientations of objects, as nonlimiting examples, implants, and optionally one or more objects in the vicinity of an implant, utilizing two-dimensional radiographs, and optionally also identifying such objects through characteristics thereof that are identifiable in a two-dimensional radiograph.
  • One aspect is a method for determining a three-dimensional orientation of an object, and optionally other characteristics of the object, based on its area projected onto a two-dimensional image and its known or measured geometry.
  • Another aspect is a method for determining a three-dimensional orientation of a first object that is partially obscured by a second object, based on a two-dimensional image of the first and second objects.
  • The method involves measuring a combined two-dimensional area of the first and second objects as displayed in the two-dimensional image.
  • The combined two-dimensional area is then used to determine the three-dimensional orientation of the first object by accounting for a measured or estimated two-dimensional area of overlap between the first and second objects, a measured or estimated two-dimensional area of the first object, and a measured or estimated two-dimensional area of the second object.
  • This area of overlap may be estimated with known areas and measurements of the first and second objects and measured, known, or assumed orientations of the first and second objects.
  • Two-dimensional areas may be estimated for first and second objects having known or measured geometries and shapes.
  • A particular but nonlimiting application of the method described above is to determine the three-dimensional orientation of an acetabular cup of a hip implant based on its two-dimensional area displayed in a two-dimensional radiographic image, wherein the acetabular cup is partially obscured by the femoral head of the hip implant in the two-dimensional radiographic image.
  • Other aspects include methods capable of accurately and precisely identifying characteristics of one or more objects that are identifiable in a two-dimensional radiographic image, as nonlimiting examples, the type, size, material, make, and/or model of an implant, and optionally the orientation and characteristics of one or more objects in the vicinity of the implant and visible in the same two-dimensional radiographic image to determine a relative orientation of the implant to nearby objects.
  • Such a method comprises image processing techniques that identify an implant in question or components thereof (as nonlimiting examples, a total hip arthroplasty, femoral stem, acetabular cup), then identify features of the implant, and subsequently use the features to predict characteristics of the implant, for example, the type, size, material, make, and/or model of the implant.
  • The method may be further applied to structures in the vicinity of the implant, for example, characterizing structures and tissues in the vicinity of the implant and visible in the same two-dimensional radiographic image.
  • Such tissues and structures include, but are not limited to, the pelvis and femur, whose identities and orientations can be utilized to determine the orientation of the implant relative to these structures.
  • Another aspect is a method for determining the three-dimensional orientation of an acetabular cup of a hip prosthesis that is partially obscured by a femoral head of the hip prosthesis, based on a two-dimensional radiographic image of the acetabular cup and femoral head after the hip prosthesis has been implanted in a subject.
  • The method includes measuring a first dimension equal to a diameter of the acetabular cup, measuring a second dimension equal to a maximum distance between a posterolateral edge of an opening of the acetabular cup and a point on an exterior surface of the acetabular cup along an axis thereof, and determining the orientation of the acetabular cup based on the measured first and second dimensions.
  • Still other aspects include methods that apply any of the aforementioned methods to a convolutional neural network, which as used herein refers to a type of artificial intelligence where a program is able to learn a relationship between many inputs and their corresponding outputs.
  • A convolutional neural network can be employed as a means of automating an otherwise manual process of accurately and precisely identifying characteristics of objects that are identifiable in a two-dimensional radiographic image.
  • Technical effects of the methods described above include the ability to measure the orientation of an implant (and/or components thereof) in two-dimensional radiographic images in a manner that is significantly more accurate and precise than conventional methods that utilize two-dimensional radiographic images, and/or the ability to identify characteristics of an implant, such as its type, size, material, make, and/or model.
  • FIGS. 1A and 1B are two plain radiographs representing aspects of measurement methods in accordance with certain nonlimiting embodiments of the invention.
  • FIG. 1A shows dimensions (TL, D2) for calculations used in an "orthogonal method" of the invention
  • FIG. 1B shows a combined two-dimensional area for calculations used in an “area method” of the invention.
  • FIG. 2 is a graph representing measured anteversion obtained by the above-noted area ("Area") and orthogonal ("Orthogonal") methods as well as obtained by "Widmer" and "Lewinnek" methods (discussed below), all of which are plotted against CT three-dimensional reconstruction measurements ("CT measurements"). Measurements were averaged between multiple users. Pearson correlation coefficient (r) and coefficient of determination (r²) were calculated for each method against CT three-dimensional reconstruction measurements.
  • FIG. 3 is a graph representing Whisker and box plots for the area, orthogonal, Widmer, and Lewinnek methods as compared to CT three-dimensional reconstruction measurements (“CT”).
  • FIGS. 4A, 4B, 4C, and 4D are images depicting a conventional prosthesis used for a total hip replacement.
  • FIGS. 4A and 4B are, respectively, exploded and assembly views schematically representing the various components of the implant
  • FIG. 4C schematically represents the implant as implanted within a human subject
  • FIG. 4D shows a radiograph of a subject's hip having the implant implanted therein after total hip arthroplasty (THA).
  • FIG. 5 represents the current convention by which the three-dimensional orientation of an acetabular cup can be described by its inclination and anteversion angles (RI, RA) from, respectively, the sagittal and coronal planes of the human body.
  • FIG. 6 contains a table that presents data obtained during investigations leading to the present invention and includes inter-observer and intra-observer reliability for CT three-dimensional reconstruction measurements (CT) and the Widmer, Lewinnek, orthogonal, and area methods.
  • Although methods disclosed herein could be used to measure the orientations of various implants (for example, femoral, knee, or shoulder replacements) or other objects captured in various types of two-dimensional images (for example, radiographic images or photographs), for convenience the methods will be discussed hereinafter in reference to the orientation of an acetabular cup component of a hip prosthesis after total hip arthroplasty (THA) based on a two-dimensional radiographic image (for example, a fluoroscopic or plain radiographic image).
  • Methods disclosed herein can be extended to identify characteristics of an object, as nonlimiting examples, the type, size, material, make, and/or model of an implanted prosthesis. Such methods can also be extended to determine the orientations and characteristics of objects that are visible in the same image as an object of interest, for example, bones and other objects in the vicinity of an implanted prosthesis, to determine the relative orientation of the object of interest to other objects in its vicinity.
  • Two embodiments described in detail herein will be referred to as the "orthogonal method" and the "area method." Both methods are configured to minimize measurement assumptions by the user, allowing for a more accurate and precise result relative to conventional methods. It is believed that these methods provide a more focal standard deviation relative to conventional methods of analyzing two-dimensional radiographs, while allowing for less radiation exposure relative to a CT (computed tomography) scan performed during standard CT three-dimensional reconstruction.
  • The orthogonal method uses the following equation to identify the acetabular cup orientation angle, where D2 and D3 are the dimensions represented in FIG. 1A as D2 and TL, respectively.
  • This equation is similarly subject to identification of the edge of the anterolateral ellipse border, which is frequently overshadowed by the femoral head of the implant.
  • However, the orthogonal method requires identification of only one of the two edges of the ellipse.
  • An advantage of this method is that the above-noted equation utilizes features readily identifiable in a plain two-dimensional radiographic image, namely, the diameter of the acetabular cup (D2 in FIG. 1A) and the superior edge of the acetabular cup (used to determine TL in FIG. 1A). It is unnecessary to identify the anteromedial cup margin, thereby limiting measurement of poorly defined margins to the posterolateral cup border.
  • The area method determines the three-dimensional orientation of the acetabular cup based on a measurement of its radiopaque area captured in a two-dimensional radiographic image, specifically the combined area of the acetabular cup and femoral head, or the area of the acetabular cup alone.
  • This method thus determines the orientation of an implant not only from the measurement of its two-dimensional area as displayed on a two-dimensional image, but also from the measurement of an object that partially obscures it. Since the method relies on area measurements in a two-dimensional image, it requires that the radiographic image be calibrated before measurements are taken. For the purposes of the investigations described below, measurements were calibrated with the known diameter of a femoral head, though calibration may also be performed with a calibration ball, a cup diameter, or any other known dimension of an object that is visible in the two-dimensional image.
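The calibration step described above amounts to deriving a millimeters-per-pixel scale from an object of known size. The following is a minimal sketch; the function names and the 28 mm femoral head diameter are illustrative assumptions, not values prescribed by the specification.

```python
def mm_per_pixel(known_diameter_mm: float, measured_diameter_px: float) -> float:
    """Image scale derived from an object of known size (e.g., the femoral head)."""
    return known_diameter_mm / measured_diameter_px

def area_px_to_mm2(area_px: float, scale_mm_per_px: float) -> float:
    """Convert a pixel-count area to square millimeters (the scale applies twice)."""
    return area_px * scale_mm_per_px ** 2

# Hypothetical example: a 28 mm femoral head spans 140 pixels on the radiograph.
scale = mm_per_pixel(28.0, 140.0)               # 0.2 mm per pixel
cup_area_mm2 = area_px_to_mm2(25000.0, scale)   # 25,000 px ≈ 1000 mm^2
```

Any known dimension visible in the image (calibration ball, cup diameter) can substitute for the femoral head diameter in the same calculation.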
  • FIG. 1B represents a radiograph wherein a border defined by the combination of the acetabular cup and femoral head is outlined, and the combined two-dimensional area of the acetabular cup and femoral head enclosed therein is highlighted.
  • This highlighted area is equal to a two-dimensional area of the acetabular cup and a two-dimensional area of the femoral head less the two-dimensional area of overlap between these two components.
  • The areas of the two individual components may be determined from known measurements of the implant or measured directly on the radiopaque prosthesis in the radiograph. By subtracting the highlighted area from the sum of the two components' individual areas, one can determine the two-dimensional area of overlap between these two components.
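The area bookkeeping above reduces to a single subtraction. This sketch assumes the individual areas are already known or measured; all names and values are illustrative:

```python
def overlap_area(cup_area: float, head_area: float, combined_area: float) -> float:
    """Two-dimensional overlap between cup and head in the projected image.

    combined_area is the highlighted region enclosed by the outline of both
    components, so overlap = cup_area + head_area - combined_area.
    """
    overlap = cup_area + head_area - combined_area
    if overlap < 0:
        raise ValueError("combined area cannot exceed the sum of the parts")
    return overlap

# Hypothetical, already-calibrated areas in mm^2:
print(overlap_area(cup_area=1000.0, head_area=615.0, combined_area=1200.0))  # 415.0
```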
  • This value relates to the anteversion angle (RA, measured from the coronal plane of the human body) as follows:
  • In this relationship, T and AoD are the thickness and outer diameter, respectively, of the acetabular cup, and FD is the diameter of the femoral head.
  • In practice, the area method may be implemented by a system configured to analyze a radiograph, having the border of the combined components outlined by a user, to determine the combined two-dimensional area within the border and simultaneously measure other values that may be necessary for determining the anteversion angle, such as the femoral head diameter, femoral head truncation (i.e., where the femoral head sphere is cut), acetabular cup outer diameter, and acetabular cup inner diameter. It is foreseeable and within the scope of the invention that the system may be configured to identify the border of the combined components itself.
  • Nonlimiting embodiments of the invention will now be described in reference to experimental investigations leading up to the invention. These investigations were intended to evaluate the accuracy and precision of the orthogonal and area methods against other known plain radiographic methods, and against a CT three-dimensional reconstruction measurement.
  • The conventional methods used in these investigations are described in Lewinnek et al., "Dislocations after total hip-replacement arthroplasties," The Journal of Bone and Joint Surgery, American volume, 1978;60(2):217-220 (referred to herein as the "Lewinnek method"), and Widmer K., "A simplified method to determine acetabular cup anteversion from plain radiographs," The Journal of Arthroplasty, 2004;19(3):387-390 (referred to herein as the "Widmer method").
  • The orthogonal, area, Widmer, and Lewinnek methods were each used to determine the acetabular cup anteversion angle on 160 anteroposterior (AP) pelvis radiographs of 160 total hip replacements (130 patients) performed between January 2012 and December 2015.
  • Twenty-one CT scans were included, allowing for assessment via three-dimensional reconstruction.
  • Anteversion angles were measured relative to the functional coronal plane.
  • Each of the two-dimensional methods was compared to the CT measurements, with a positive value representing an overestimation on planar radiography compared to the CT-based measurement. The methods were performed on each radiograph by multiple users.
  • Whisker and box plots were used to represent the difference between each method and the CT-based measurement. Using Pearson's regression and the coefficient of determination, the linearity of each measurement was determined with increasing anteversion according to the CT-based measurement. Inter- and intra-observer reliability was represented with intraclass correlation coefficients (ICC) and a 95% confidence interval (CI); a two-way random-effects intraclass correlation model with absolute agreement was used for this calculation. An ICC of one represents perfect reliability, while zero represents no relationship. The resulting inter-observer reliability for each method is disclosed in the table shown in FIG. 6. There were no acetabular cups with retroversion according to CT and cross-table radiograph assessment.
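The per-method comparison statistics used above (mean difference, standard deviation, and Pearson correlation against CT) follow the standard formulas; the sketch below illustrates them with invented sample values, not data from the investigations.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    """Sample standard deviation (n - 1 denominator)."""
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Invented anteversion measurements (degrees): CT reference vs. one 2D method.
ct     = [10.0, 15.0, 20.0, 25.0, 30.0]
method = [10.5, 14.2, 20.8, 24.6, 30.4]
diffs  = [m - c for m, c in zip(method, ct)]
print(mean(diffs), sd(diffs), pearson_r(ct, method) ** 2)  # bias, SD, r^2
```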
  • The mean differences from CT-based measurements were −0.2° (standard deviation (SD), 2.7°), −1.2° (SD, 4.5°), −2.3° (SD, 4.8°), and 6.9° (SD, 4.5°) for the area, orthogonal, Lewinnek, and Widmer methods, respectively (FIG. 3).
  • The Lewinnek, orthogonal, and area methods resulted in statistically lower recorded anteversion than the Widmer method (p < 0.0001).
  • The coefficients of determination (r²) of the methods with respect to the CT-based measurement were 0.915, 0.811, 0.774, and 0.733 for the area, Widmer, orthogonal, and Lewinnek methods, respectively (FIG. 2).
  • The variability of measurements between observers averaged 0.37°, 3.11°, 3.17°, and 3.80° for the area, Lewinnek, orthogonal, and Widmer methods, respectively.
  • The area method provided the greatest accuracy and precision, within 1° of CT three-dimensional reconstruction over the range of 10° to 30° of measured radiographic anteversion.
  • The area method was also determined to be the most reliable radiographic measurement method tested (intra- and inter-observer reliability of 0.998 and 0.992, respectively). Furthermore, this method most closely mirrored the anteversion measured from the CT three-dimensional reconstruction (coefficient of determination (r²) of 0.915).
  • In a further investigation, the above methods using two-dimensional images were applied to 2909 unique AP radiographs of the pelvis, each with a corresponding prosthesis orientation determined from the radiograph using the area method described above.
  • A convolutional neural network was then trained with each radiograph as the input and the corresponding implant orientation from the area method (obtained by hand) as the output. The purpose was to create an automated method that takes any radiograph similar to those it was trained on as an input and outputs the orientation of the implant.
  • After initial training, the convolutional neural network was able to achieve an accuracy within 4.23° ± 4.26° of hand measurements.
  • The use of a convolutional neural network also allowed speedy calculation, taking only 0.0211 seconds to produce results from one image.
  • The accuracy of the convolutional neural network may be further improved with Bayesian optimization, whereby the parameters used to train the network are themselves optimized. This should enable greater accuracy and precision than that obtained by the procedure described above.
  • The structure of the convolutional neural network may likewise be adjusted for optimization.
  • The above methods were further applied to 2909 unique AP radiographs of the pelvis, with a corresponding convolutional neural network trained to classify each pixel in an image as pelvis, femur, prosthesis femoral stem component, or prosthesis acetabular component.
  • This information was applied to the 2909 images and used to train the convolutional neural network.
  • The preliminary results offered a pixel classification accuracy of 98%, meaning every pixel in a provided image would be classified as pelvis, femur, prosthetic femoral stem, or prosthetic acetabulum with 98% accuracy. This may also be further improved through Bayesian optimization and adjustments to the structure of the convolutional neural network.
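The 98% figure above is an ordinary per-pixel accuracy over a segmentation label map. A minimal sketch of the metric follows; the class encoding and the flattened label arrays are invented for illustration.

```python
# Classes (illustrative encoding): 0 = pelvis, 1 = femur,
# 2 = prosthetic femoral stem, 3 = prosthetic acetabular component.
def pixel_accuracy(predicted, ground_truth):
    """Fraction of pixels whose predicted class matches the ground truth."""
    assert len(predicted) == len(ground_truth)
    correct = sum(p == g for p, g in zip(predicted, ground_truth))
    return correct / len(ground_truth)

pred  = [0, 0, 1, 2, 3, 3, 1, 0, 2, 3]   # flattened predicted label map
truth = [0, 0, 1, 2, 3, 1, 1, 0, 2, 3]   # flattened ground-truth label map
print(pixel_accuracy(pred, truth))  # 0.9
```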
  • The above methods are believed to be applicable to a similar number of CT or MRI scans of the pelvis, which would allow one of ordinary skill in the art to identify the true orientation of a component of interest relative to any other structure appearing in a two-dimensional image.
  • In the investigations, the orientation of the acetabular cup was determined relative to a patient's anterior pelvic plane, although methods disclosed herein are not limited thereto. Cup orientations obtained from several thousand images can be used to train a convolutional neural network, thereby providing a means of automating a program that determines the three-dimensional orientation of the acetabular cup relative to nearby structures using a two-dimensional radiograph as its input.
  • Aspects of the present disclosure include applying measurements of an implant orientation to train a convolutional neural network.
  • The use of a convolutional neural network enables a computer to automate the described methods, thereby measuring the orientation of an implant (or other object) from a two-dimensional image based on the training data provided. Briefly, it does so by optimizing a relationship between its inputs and corresponding outputs.
  • Such a method may be applied to numerous images as a means to train the convolutional neural network.
  • Additional aspects of the present disclosure involve identifying structures and tissues in the vicinity of an implant and visible with the implant in a two-dimensional image as a means to ascertain an orientation of the implant relative to nearby structures/tissues.
  • A convolutional neural network as described above may be employed to identify the nearby structures in the two-dimensional images discussed above.
  • Data obtained from three-dimensional (3D) images, for example, computed tomography (CT) or magnetic resonance imaging (MRI) scans, were used to accurately identify the true orientation of an object of interest relative to nearby structures of interest.
  • The convolutional neural network was then trained on the original two-dimensional image with the true orientation obtained from the three-dimensional image. In essence, doing so enabled the convolutional neural network to identify the structures in the two-dimensional image that may determine the relative three-dimensional orientation.
  • A nonlimiting example of an application for the above is to identify the orientation of the acetabular component relative to the functional coronal plane of the pelvis.
  • The functional coronal plane is the plane defined by the right and left anterior superior iliac spines (ASIS) and the pubic symphysis.
  • ASIS anterior superior iliac spine
  • the 2D radiographs may be used as the input to the convolutional neural network, while the 3D orientation will be the output, thereby presenting a method to teach a program to automatically determine the 3D orientation of an object of interest relative to nearby structures of interest from a 2D image.
  • Convolutional neural networks are believed, at this time, to be a suitable version of artificial intelligence and deep learning for use in the methods described above due to their application to a matrix.
  • the matrix is the image (a matrix of red, green, and blue values).
  • a convolutional neural network can be taught to identify characteristics of an object of interest in an input 2D image, as nonlimiting examples, the make, model, type, material, and size of an implant in the input 2D image.
  • the operative notes of 1594 total hip arthroplasty procedures that contained mention of one of eight commercially available femoral stems were identified.
  • a convolutional neural network was developed from 1410 AP Hip radiographs, after which the neural network was tested on a subsequent 706 AP Hip radiographs. The neural network was then run on an iPhone 6 to evaluate its potential use in app design.
  • the neural network achieved 100.00% accuracy on the 1410 learning radiographs, and 95.15% accuracy in classifying femoral stem constructs when tested on the novel 706 radiographs.
  • the neural network also displayed the probability (confidence) of the femoral stem classification for any input radiograph, and on the basis of the general model alone was able to achieve percent confidence ranging from 91.31% to 99.97% for seven different models.
  • the neural network averaged a runtime of 1.03±0.05 seconds for an iPhone 6 to calculate results from a given radiograph. From this, it was concluded that a relatively simple convolutional neural network is capable of achieving high accuracy in identifying implant designs, and can run on a personal device to offer additional benefits to a learning resident or attending surgeon.
  • another application is to use radiographs as inputs and dual-energy X-ray absorptiometry (DXA, or DEXA) scan results as outputs, thereby presenting an option to create a convolutional neural network that uses radiographs to predict DEXA scan results as a means of predicting osteopenia or osteoporosis.


Abstract

Methods are provided for classifying and measuring orientations of objects, as nonlimiting examples implants, utilizing two-dimensional radiographs. One such method determines a three-dimensional orientation of an object based on its area projected onto a two-dimensional image and known or measured geometry. Another such method provides an automated solution to computationally determine the orientation and characterizing features of an implant based on two-dimensional radiographs. Orientations and characteristics of one or more objects in the vicinity of an object of interest may also be determined.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 62/715,891, filed Aug. 8, 2018, and U.S. Provisional Application No. 62/818,929, filed Mar. 15, 2019. The contents of these prior applications are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • The present invention generally relates to methods for measuring orientations of objects that may be present or placed in a body or other object, as a nonlimiting example, a prosthesis or prosthetic implant in a subject, all of which will hereinafter simply be referred to as “implant(s)” as a matter of convenience. The invention particularly relates to methods of obtaining information regarding an implant that involve analyzing two-dimensional images thereof to determine the orientation of the implant within a subject, optionally various characteristics of the implant, as nonlimiting examples, type, size, material, make, and/or model, and optionally the orientation and characteristics of objects in the vicinity of the implant and visible in the same image to determine a relative orientation of the implant to nearby objects.
  • FIGS. 4A, 4B, and 4C contain images that schematically represent a total hip replacement implant composed of components including an acetabular component (“cup”), a plastic liner, a femoral head, and a femoral stem. Mal-position of the acetabular cup after total hip arthroplasty (THA) is important to assess due to the associated risk of impingement, dislocation, and accelerated wear. Metal objects are radiopaque in x-ray images, as readily seen in the x-ray image of the total hip replacement implant in FIG. 4D. As such, radiopaque images of implanted acetabular cups have been utilized to assess their orientation. Implant models have unique designs and thus unique radiographic signatures on an x-ray image. These characteristic features can be used to identify the implant size, type, material, make, and model within a radiopaque image.
  • Both computed tomography (CT) scans and plain two-dimensional (2D) radiographs (also referred to herein as radiographic images) have been used to measure acetabular cup orientation. FIG. 5 represents the current convention where acetabular cup orientation is described by its inclination and anteversion angles (RI, RA) from, respectively, the sagittal and coronal planes of a human body.
  • While CT three-dimensional reconstruction is considered to be the most accurate radiographic assessment of acetabular cup orientation, plain radiographs are frequently employed in its place due to the additional radiation exposure and cost to the patient associated with CT scans. Since plain radiographs produce a 2D image, measuring implant orientation that is not orthogonal or parallel to the plane of the image (i.e., anteversion and inclination) can be difficult. While various methods exist for measuring acetabular cup orientation from radiographs, significant variability has been reported on the accuracy of these methods. One contribution to the variability reported may be the dependency these methods have on radiographically obscured points. Several conventional methods rely on accurate visualization and definition of an elliptical opening of the acetabular cup on radiographs. However, this opening is masked by the radiopacity of the acetabular cup and further hidden by the femoral head. Furthermore, the clarity of the opening is variable given its dependency on acetabular cup material, contour, and thickness. These methods have reported a standard deviation up to 10.8°, span upwards of 20° relative to CT measurements, inter-observer reliability as low as 0.747, and intra-observer reliability as low as 0.746.
  • In cases where a subject requires the replacement of a failed implant, a critical part of pre-operative planning for the orthopaedic surgeon involves the identification of the failed implant, which generally involves identifying its type, size, material, make, and/or model. This step is critical to the surgeon in deciding what materials are necessary for the surgery. Studies have reported cases where surgeons were unable to pre-operatively identify the implant, necessitating that at least twice the number of implants be brought to the operating room, and resulting in increased surgical time, increased healthcare cost, and higher complexity of the surgery.
  • In view of the above, it can be appreciated that it would be desirable if alternative methods were available for measuring orientations of one or more components of an implant and/or identifying implants, and which may be capable of at least partly overcoming or avoiding problems, shortcomings or disadvantages noted above as being associated with existing methods.
  • BRIEF DESCRIPTION OF THE INVENTION
  • The present invention provides methods suitable for accurately and precisely measuring orientations of objects, as nonlimiting examples, implants, and optionally one or more objects in the vicinity of an implant, utilizing two-dimensional radiographs, and optionally also identifying such objects through characteristics thereof that are identifiable in a two-dimensional radiograph.
  • According to one aspect of the invention, a method is provided for determining a three-dimensional orientation of an object, and optionally other characteristics of the object, based on its area projected onto a two-dimensional image and known or measured geometry.
  • According to another aspect of the invention, a method is provided for determining a three-dimensional orientation of a first object that is partially obscured by a second object based on a two-dimensional image of the first and second objects. The method involves measuring a combined two-dimensional area of the first and second objects as displayed in the two-dimensional image. The combined two-dimensional area is then used to determine the three-dimensional orientation of the first object by accounting for a measured or estimated two-dimensional area of overlap between the first and second objects, a measured or estimated two-dimensional area of the first object, and a measured or estimated two-dimensional area of the second object. This area of overlap may be estimated with known areas and measurements of the first and second objects and measured, known, or assumed orientations of the first and second objects. Two-dimensional areas may be estimated for first and second objects having known or measured geometries and shapes.
  • A particular but nonlimiting application of the method described above is to employ the method to determine the three-dimensional orientation of an acetabular cup of a hip implant based on its two-dimensional area displayed in a two-dimensional radiographic image, wherein the acetabular cup is partially obscured by the femoral head of the hip implant in the two-dimensional radiographic image.
  • Other aspects include methods capable of accurately and precisely identifying characteristics of one or more objects that are identifiable in a two-dimensional radiographic image, as nonlimiting examples, the type, size, material, make, and/or model of an implant, and optionally the orientation and characteristics of one or more objects in the vicinity of the implant and visible in the same two-dimensional radiographic image to determine a relative orientation of the implant to nearby objects. Such a method comprises image processing techniques that identify an implant in question or components thereof (as nonlimiting examples, a total hip arthroplasty, femoral stem, acetabular cup), then identify features of the implant, and subsequently use the features to predict characteristics of the implant, for example, the type, size, material, make, and/or model of the implant. The method may be further applied to structures in the vicinity of the implant, for example, characterizing structures and tissues in the vicinity of the implant and visible in the same two-dimensional radiographic image. Such tissues and structures include, but are not limited to, the pelvis and femur, whose identities and orientations can be utilized to determine the orientation of the implant relative to these structures.
  • According to another aspect of the invention, a method is provided for determining the three-dimensional orientation of an acetabular cup of a hip prosthesis that is partially obscured by a femoral head of the hip prosthesis based on a two-dimensional radiographic image of the acetabular cup and the femoral head after the hip prosthesis has been implanted in a subject. The method includes measuring a first dimension equal to a diameter of the acetabular cup, measuring a second dimension equal to a maximum distance between a posterolateral edge of an opening of the acetabular cup and a point on an exterior surface of the acetabular cup along an axis thereof, and determining the orientation of the acetabular cup based on the measured first and second dimensions.
  • Still other aspects include methods that apply any of the aforementioned methods to a convolutional neural network, which as used herein refers to a type of artificial intelligence where a program is able to learn a relationship between many inputs and their corresponding outputs. In the context of the aforementioned methods, a convolutional neural network can be employed as a means of automating an otherwise manual process of accurately and precisely identifying characteristics of objects that are identifiable in a two-dimensional radiographic image.
  • Depending on the particular application or purpose of the method employed, technical effects of the methods described above can include the ability to measure the orientation of an implant (and/or components thereof) in two-dimensional radiographic images in a manner that is significantly more accurate and precise than conventional methods that utilize two-dimensional radiographic images, and/or the ability to identify characteristics of an implant, such as its type, size, material, make, and/or model.
  • Other aspects and advantages of this invention will be appreciated from the following detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A and 1B are two plain radiographs representing aspects of measurement methods in accordance with certain nonlimiting embodiments of the invention. FIG. 1A shows dimensions (TL, D2) for calculations used in an “orthogonal method” of the invention, and FIG. 1B shows a combined two-dimensional area for calculations used in an “area method” of the invention.
  • FIG. 2 is a graph representing measured anteversion obtained by the above-noted area (“Area”) and orthogonal (“Orthogonal”) methods as well as obtained by “Widmer” and “Lewinnek” methods (discussed below), all of which are plotted against CT three-dimensional reconstruction measurements (“CT measurements”). Measurements were averaged between multiple users. Pearson correlation coefficient (r) and coefficient of determination (r2) were calculated for each method against CT three-dimensional reconstruction measurements.
  • FIG. 3 is a graph representing Whisker and box plots for the area, orthogonal, Widmer, and Lewinnek methods as compared to CT three-dimensional reconstruction measurements (“CT”). The horizontal line in each box represents the median value, the outer regions of each box represent the interquartile range, the outside lines associated with each box represent the exclusive range, and the outside points represent outliers.
  • FIGS. 4A, 4B, 4C, and 4D are images depicting a conventional prosthesis used for a total hip replacement. FIGS. 4A and 4B are, respectively, exploded and assembly views schematically representing the various components of the implant, FIG. 4C schematically represents the implant as implanted within a human subject, and FIG. 4D shows a radiograph of a subject's hip having the implant implanted therein after total hip arthroplasty (THA).
  • FIG. 5 represents the current convention by which the three-dimensional orientation of an acetabular cup can be described by its inclination and anteversion angles (RI, RA) from, respectively, the sagittal and coronal planes of the human body.
  • FIG. 6 contains a table that presents data obtained during investigations leading to the present invention and includes inter-observer and intra-observer reliability for CT three-dimensional reconstruction measurements (CT) and the Widmer, Lewinnek, orthogonal, and area methods.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Disclosed herein are methods suitable for ascertaining the three-dimensional (3D) orientation of an object, a particular but nonlimiting example of which is an implant surgically placed in a subject, using measurements obtained from a two-dimensional image. Although it is foreseeable and within the scope of the invention that methods disclosed herein could be used to measure the orientation of various implants (for example, femoral, knee, or shoulder replacements) or other objects captured in various types of two-dimensional images (for example, radiographic images or photographs), for convenience the methods will be discussed hereinafter in reference to the orientation of an acetabular cup component of a hip prosthesis after total hip arthroplasty (THA) based on a two-dimensional radiographic image (for example, a fluoroscopic or plain radiographic image).
  • Methods disclosed herein can be extended to identify characteristics of an object, as nonlimiting examples, the type, size, material, make, and/or model of an implanted prosthesis. Such methods can also be extended to determine the orientations and characteristics of objects that are visible in the same image as an object of interest, for example, bones and other objects in the vicinity of an implanted prosthesis, to determine the relative orientation of the object of interest to other objects in its vicinity.
  • Two embodiments described in detail herein will be referred to as the “orthogonal method” and the “area method.” Both of these methods are configured to minimize measurement assumptions by the user to allow for a more accurate and precise result relative to conventional methods. It is believed that these methods provide a more focal standard deviation relative to conventional methods of analyzing two-dimensional radiographs while allowing for less radiation exposure relative to a CT (computed tomography) scan performed during standard CT three-dimensional reconstruction.
  • The orthogonal method uses the following equation to identify the acetabular cup orientation angle.
  • $\theta = \arcsin\!\left(\dfrac{2\,D3}{D2} - 1\right)$
  • with D2 and D3 being dimensions represented in FIG. 1A as D2 and TL, respectively. Like many other anteversion equations, this equation is similarly subject to the identification of the edge of the anterolateral ellipse border that is frequently overshadowed by the femoral head of the implant. However, the orthogonal method only requires the identification of one of the two edges of the ellipse. An advantage of this method is that the above-noted equation utilizes dimensions readily identifiable in a plain two-dimensional radiographic image, namely, the diameter of the acetabular cup (D2 in FIG. 1A) and the superior edge of the acetabular cup (to determine TL in FIG. 1A). It is unnecessary to identify the anteromedial cup margin, thereby limiting measurement of poorly defined margins to the posterolateral cup border.
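In code, the orthogonal-method calculation reduces to a few lines. The following Python sketch assumes D2 and D3 (the dimension labeled TL in FIG. 1A) have already been measured in the same units; the function name is illustrative:

```python
import math

def orthogonal_anteversion(d2, d3):
    """Estimate planar anteversion (degrees) from the two dimensions
    described above: d2 is the acetabular cup diameter and d3 is the
    distance labeled TL in FIG. 1A, both in the same units."""
    ratio = 2.0 * d3 / d2 - 1.0
    # Clamp to the valid arcsine domain to guard against small
    # measurement errors pushing the ratio just past +/-1.
    ratio = max(-1.0, min(1.0, ratio))
    return math.degrees(math.asin(ratio))

# Example: a 54 mm cup with TL measured at 36 mm gives about 19.5 degrees.
angle = orthogonal_anteversion(54.0, 36.0)
```

The clamp reflects a practical concern rather than the equation itself: hand-measured dimensions can make the argument fall slightly outside [−1, 1], where the arcsine is undefined.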
  • The area method determines the three-dimensional orientation of the acetabular cup based on a measurement of its radiopaque area captured in a two-dimensional radiographic image, specifically the combined area of the acetabular cup and femoral head, or the area of the acetabular cup alone. As such, this method determines the orientation of an implant not only based on the measurement of its two-dimensional area as displayed on a two-dimensional image, but also based on the measurement of an object that partially obscures it. Since the method relies on area measurements in a two-dimensional image, it requires the radiographic image to be calibrated for the measurements to be taken. For the purposes of investigations described below, measurements were calibrated with the known diameter of a femoral head, though this may also be performed with a calibration ball, a cup diameter, or any other known dimension of an object that is visible in the two-dimensional image.
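The calibration step can be sketched as follows, assuming the femoral head's true diameter is known and its diameter in pixels has been measured on the radiograph (function names are illustrative):

```python
def calibration_scale(known_diameter_mm, measured_diameter_px):
    """Millimetres per pixel, derived from any object of known size
    visible in the radiograph (here, the femoral head diameter)."""
    return known_diameter_mm / measured_diameter_px

def pixels_to_mm2(area_px, scale_mm_per_px):
    """Convert a pixel-count area to mm^2; areas scale with the
    square of the linear calibration factor."""
    return area_px * scale_mm_per_px ** 2

# Example: a 32 mm femoral head spanning 160 px gives 0.2 mm/px,
# so an outlined region of 10000 px corresponds to 400 mm^2.
scale = calibration_scale(32.0, 160.0)
area_mm2 = pixels_to_mm2(10000.0, scale)
```

The squared scale factor is the essential point: a linear calibration error of a few percent roughly doubles when propagated into an area measurement.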
  • FIG. 1B represents a radiograph wherein a border defined by the combination of the acetabular cup and femoral head is outlined, and the combined two-dimensional area of the acetabular cup and femoral head enclosed therein is highlighted. This highlighted area is equal to a two-dimensional area of the acetabular cup and a two-dimensional area of the femoral head less the two-dimensional area of overlap between these two components. The areas of the two individual components may be determined based on known measurements of the implant or may be measured directly on the radiopaque prosthesis of the radiograph. By measuring the highlighted area and subtracting it from the combined two-dimensional area of the two components, one can determine the two-dimensional area of overlap between these two components. This value relates to the anteversion angle (RA, measured from the coronal plane of the human body) as follows:
  • With the shorthand $h = \frac{AoD}{2}\left(\sin\theta + 1\right) - T$, the relationship can be written as:

$$\text{Area of overlap} = \frac{\pi\,FD^{2}\left[\sin^{-1}\!\left(\frac{2h}{FD} - 1\right) + 90\right]}{2\cdot 360} + \frac{1}{2}\left(h - \tfrac{1}{2}FD - \tfrac{1}{2}FD\right)^{2}\tan\!\left(\sin^{-1}\!\left(\frac{2h}{FD} - 1\right)\right) + \frac{AoD}{2}\sin\theta\left[\frac{\pi}{180}\tan^{-1}\!\left(\frac{2\left(h + T - \frac{AoD}{2}\right)}{2h - \tfrac{1}{2}FD - \tfrac{1}{2}FD}\right) + 1\right] - \frac{h - \tfrac{1}{2}FD - \tfrac{1}{2}FD}{2\left(h + T - \frac{AoD}{2}\right)}$$
  • where θ represents the planar anteversion, T and AoD are the thickness and outer diameter, respectively, of the acetabular cup, and FD is the diameter of the femoral head.
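The bookkeeping described above — recovering the overlap area by subtraction, then inverting the relationship between overlap area and anteversion — can be sketched numerically. In this sketch, `overlap_model` is a placeholder for the full geometric relationship given in the preceding paragraph (or any monotonic substitute); the function names and the assumption of monotonicity over the search interval are illustrative, not part of the patent's disclosure:

```python
def overlap_from_combined(area_cup, area_head, area_combined):
    """The outlined combined area equals the sum of the two component
    areas less their overlap, so the overlap falls out by subtraction."""
    return area_cup + area_head - area_combined

def solve_anteversion(measured_overlap, overlap_model,
                      lo=0.0, hi=60.0, tol=1e-6):
    """Invert overlap_model(theta_degrees) -> overlap area by bisection,
    assuming the model increases monotonically in theta on [lo, hi]."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if overlap_model(mid) < measured_overlap:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

A root-finding step of this kind is needed because the disclosed equation gives overlap area as a function of anteversion, whereas the measurement supplies the overlap area and asks for the angle.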
  • According to a preferred aspect of the invention, the area method is implemented by a system configured to analyze a radiograph, having the border of the combined components outlined by a user, to determine the combined two-dimensional area within the border and simultaneously measure other values that may be necessary for determining the anteversion angle, such as the femoral head diameter, femoral head truncation (i.e., where the femoral head sphere is cut), acetabular cup outer diameter, and the acetabular cup inner diameter. It is foreseeable and within the scope of the invention that the system may be configured to identify the border of the combined components itself.
  • Nonlimiting embodiments of the invention will now be described in reference to experimental investigations leading up to the invention. These investigations were intended to evaluate the accuracy and precision of the orthogonal and area methods against other known plain radiographic methods, and against a CT three-dimensional reconstruction measurement. The conventional methods used in these investigations are described in the publications Lewinnek et al., Dislocations after total hip-replacement arthroplasties, The Journal of Bone and Joint Surgery, American volume, 1978; 60(2):217-220 (referred to herein as the “Lewinnek method”) and Widmer K., A simplified method to determine acetabular cup anteversion from plain radiographs, The Journal of arthroplasty, 2004; 19(3):387-390 (referred to herein as the “Widmer method”).
  • In particular, the orthogonal, area, Widmer, and Lewinnek methods were each used to determine the acetabular cup anteversion angle on 160 anteroposterior (AP) pelvis radiographs for 160 total hip replacements (130 patients) performed between January 2012 and December 2015. In addition, twenty-one CT scans were included, allowing for assessment via three-dimensional reconstruction. For the CT three-dimensional reconstructions, anteversion angles were measured relative to the functional coronal plane. Each of the two-dimensional methods was compared to the CT measurements, with a positive value representing an overestimation on planar radiography compared to CT-based measurement. The methods were performed on each radiograph by multiple users.
  • Whisker and box plots were used to represent the difference between each method and the CT-based measurement. Using Pearson's regression and the coefficient of determination, the linearity of each measurement was determined with increasing anteversion according to CT-based measurement. The inter- and intra-observer reliability was represented with intraclass correlation coefficients (ICC) and a 95% confidence interval (CI). The two-way random-effects intraclass correlation model with absolute agreement was used for this calculation. An ICC of one represented perfect reliability, while zero represented no relationship. The resulting inter-observer reliability for each method is disclosed in a table shown in FIG. 6. There were no acetabular cups with retroversion according to CT and cross-table radiograph assessment.
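For reference, the reliability statistic described here — the single-measure, absolute-agreement, two-way random-effects intraclass correlation coefficient, commonly written ICC(2,1) — can be sketched in plain numpy. This assumes a complete subjects-by-raters matrix with no missing ratings:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure, for an (n subjects x k raters) array of ratings."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means
    # Mean squares from the two-way ANOVA decomposition.
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)  # subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)  # raters
    resid = x - row_means[:, None] - col_means[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

With identical raters and varying subjects the formula returns 1 (perfect reliability); a constant offset between raters pulls the value below 1, because absolute agreement penalizes systematic rater bias.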
  • The mean differences from CT-based measurements were −0.2° (standard deviation (SD), 2.7°), −1.2° (SD, 4.5°), −2.3° (SD, 4.8°), and 6.9° (SD, 4.5°) for the area, orthogonal, Lewinnek, and Widmer methods, respectively (FIG. 3). The Lewinnek, orthogonal, and area methods resulted in statistically lower recorded anteversion than the Widmer method (p<0.0001). The Lewinnek method resulted in a statistically lower anteversion measurement than the orthogonal (−1.6°; p<0.001) and area methods (−1.2°; p=0.004). The orthogonal method was not statistically different from the area method (−0.9°; p=0.293). The coefficients of determination (r2) of the methods with respect to CT-based measurement were 0.915, 0.811, 0.774, and 0.733 for the area, Widmer, orthogonal, and Lewinnek methods, respectively (FIG. 2). The variability of measurements between observers averaged 0.37°, 3.11°, 3.17°, and 3.80° for the area, Lewinnek, orthogonal, and Widmer methods. The area method provided the greatest accuracy and precision, within 1° of CT three-dimensional reconstruction within the range of 10° to 30° of measured radiographic anteversion.
  • In the above described investigations, the area method was determined to be the most reliable radiographic measurement method tested (intra- and inter-observer reliability at 0.998 and 0.992, respectively). Furthermore, this method most closely mirrored the anteversion measured from the CT three-dimensional reconstruction (coefficients of determination (r2) of 0.915).
  • It is believed that the standard deviation of 2.7° observed for the area method was largely attributable to pelvis positioning in the radiographs. It has been reported that pelvis orientation and posture significantly affect measured anteversion, with 1° of pelvic tilt changing anteversion by about 0.7 to 0.8°. Given the intended application of these methods to a diverse clinical setting, no radiographs were excluded from the investigations despite variability in posture and quality. When controlling for pelvic orientation by having users separately assess the same radiograph, variability averaged 0.37°, with inter-observer reliability of 0.992. Thus, these investigations indicated that the area method is precise, but that measured anteversion remains dependent on pelvic orientation. This conclusion was supported by an adjunct study in which radiographic measurements of 160 acetabular cup orientations were compared to anteversion measured by an iPhone accelerometer. In this experiment, the coefficient of determination improved from 0.915 to 0.997, equivalent to a Pearson correlation coefficient of 0.999.
  • In further investigations, the application of artificial neural networks was evaluated to determine their ability to both automate the described measurement technique performed on an implant and identify the implant, thus offering a timely and accurate computational solution to a common problem. One of the most advanced forms of artificial intelligence to date is known as “deep learning” and includes artificial neural networks. When plotted out, an artificial neural network's architecture appears much like neurons and dendrites, intended to mirror human thought. In the context of the ability to identify an implant, the use of a convolutional neural network would also be able to provide a percent confidence that an implant has been accurately classified, offering additional valuable information to the user.
  • For purposes of investigating artificial neural networks, the above methods using two-dimensional images were applied to 2909 unique AP radiographs of the pelvis, each with a corresponding prosthesis orientation determined from the radiograph using the area method described above. A convolutional neural network was then trained with a given input of each radiograph and a corresponding output of the implant orientation from the area method (obtained by hand). The purpose of this was to create a convolutional neural network (an automated method) that may take as input any radiograph similar to those trained on and produce the orientation of the implant as its output. The convolutional neural network, after an initial training, was able to achieve an accuracy of within 4.23°±4.26° of hand measurements. The use of a convolutional neural network also allowed for rapid calculation, taking only 0.0211 seconds to obtain results from one image. The accuracy of the convolutional neural network may be further improved with Bayesian optimization, whereby the parameters used to train the convolutional neural network are themselves optimized. This should enable greater accuracy and precision than that obtained by the procedure described above. Similarly, the structure of the convolutional neural network may be further adjusted for optimization.
  • The above methods were further applied to the same 2909 unique AP radiographs of the pelvis, and a corresponding convolutional neural network was trained to classify each pixel in an image as the pelvis, femur, prosthetic femoral stem component, or prosthetic acetabular component. This information was applied to the 2909 images and used to train the convolutional neural network. The preliminary results offered a pixel classification accuracy of 98%, meaning each pixel in a provided image was classified as either pelvis, femur, prosthetic femoral stem, or prosthetic acetabulum with 98% accuracy. This may also be further improved with Bayesian optimization and adjustments to the structure of the convolutional neural network.
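The reported per-pixel accuracy can be computed from predicted and reference label masks with simple array arithmetic, as sketched below. The integer label encoding is an assumption for illustration only; the actual encoding used in the study is not specified:

```python
import numpy as np

# Illustrative integer labels for each structure (assumed, not from
# the disclosure).
LABELS = {"pelvis": 0, "femur": 1, "femoral_stem": 2, "acetabulum": 3}

def pixel_accuracy(predicted, reference):
    """Fraction of pixels whose predicted label matches the reference
    segmentation; both arrays must share one shape."""
    predicted = np.asarray(predicted)
    reference = np.asarray(reference)
    return float(np.mean(predicted == reference))

def per_class_accuracy(predicted, reference, label):
    """Accuracy restricted to pixels that truly belong to `label`,
    which guards against a dominant background class inflating the
    overall figure."""
    mask = np.asarray(reference) == label
    return float(np.mean(np.asarray(predicted)[mask] == label))
```

A per-class breakdown is worth reporting alongside the overall 98% figure, since small structures contribute few pixels to the average.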
  • The above methods are believed to be applicable to a similar number of CT or MRI scans of the pelvis. Doing so would allow one of ordinary skill in the art to identify the true orientation of a component of interest relative to any other structure appearing in a two-dimensional image. In the example above, the orientation of the acetabular cup was determined relative to a patient's anterior pelvic plane, although methods disclosed herein are not limited thereto. Cup orientation obtained with several thousand images can be used to train a convolutional neural network, thus providing a means for automating a program to determine the three-dimensional orientation of the acetabular cup relative to nearby structures using a two-dimensional radiograph as its input.
  • In view of the above, aspects of the present disclosure include applying measurements of an implant orientation to train a convolutional neural network. The use of a convolutional neural network enables a computer to automate the described methods, thereby measuring the orientation of an implant (or other object) from a two-dimensional image based on the training data provided. Briefly, it does so by optimizing a relationship between its inputs and corresponding outputs. Such a method may be applied to numerous images as a means to train the convolutional neural network.
  • Additional aspects of the present disclosure involve identifying structures and tissues that are in the vicinity of an implant and visible with it in a two-dimensional image, as a means to ascertain the orientation of the implant relative to those nearby structures/tissues. For example, a convolutional neural network as described above may be employed to identify the nearby structures in the two-dimensional images discussed above. In a particular example, data obtained from three-dimensional (3D) images, for example a computed tomography (CT) or magnetic resonance imaging (MRI) scan, were used to accurately identify the true orientation of an object of interest relative to nearby structures of interest. The convolutional neural network was then trained on the original two-dimensional image with the true orientation obtained from the three-dimensional image. In essence, doing so enabled the convolutional neural network to identify the structures in the two-dimensional image from which the relative three-dimensional orientation may be determined.
  • A nonlimiting example of an application for the above is to identify the orientation of the acetabular component relative to the functional coronal plane of the pelvis. The functional coronal plane is the plane defined by the right and left anterior superior iliac spines (ASIS) and the pubic symphysis. Once this orientation has been accurately determined for numerous patients from three-dimensional imaging (e.g., CT or MRI), the patients' corresponding 2D radiographs can be obtained. The 2D radiographs may be used as the inputs to the convolutional neural network, with the 3D orientations as the outputs, thereby presenting a method to teach a program to automatically determine the 3D orientation of an object of interest relative to nearby structures of interest from a 2D image.
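The ground-truth step in the example above, referencing a cup axis to the functional coronal plane defined by the two ASIS landmarks and the pubic symphysis, reduces to elementary geometry: three 3D points define the plane, and the angle between the cup axis and that plane is measured. The landmark coordinates and cup axis below are invented for illustration.

```python
import numpy as np

# Geometric sketch of referencing an axis to the functional coronal plane
# (right ASIS, left ASIS, pubic symphysis). All coordinates are hypothetical.

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3D landmark points."""
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

def angle_to_plane_deg(axis, normal):
    """Angle (degrees) between a direction vector and the plane."""
    axis = axis / np.linalg.norm(axis)
    # The angle to the plane is the complement of the angle to its normal.
    return np.degrees(np.arcsin(abs(axis @ normal)))

right_asis = np.array([ 120.0, 0.0,   0.0])  # illustrative coordinates (mm)
left_asis  = np.array([-120.0, 0.0,   0.0])
pubis      = np.array([   0.0, 0.0, -90.0])

n = plane_normal(right_asis, left_asis, pubis)
cup_axis = np.array([0.0, 1.0, 0.0])         # axis pointing straight anterior
angle = angle_to_plane_deg(cup_axis, n)      # 90 deg: perpendicular to plane
```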
  • Convolutional neural networks are believed, at this time, to be a suitable form of artificial intelligence and deep learning for use in the methods described above because they operate on a matrix; in this case, the matrix is the image (a matrix of red, green, and blue values). These techniques may also be applied with other available forms of artificial intelligence that are similarly applicable to images.
  • Given how readily convolutional neural networks can be applied to images, a convolutional neural network can be taught to identify characteristics of an object of interest in an input 2D image, as nonlimiting examples, the make, model, type, material, and size of an implant in the input 2D image. As an example, the operative notes of 1594 total hip arthroplasty procedures were identified that contained mention of one of eight commercially available femoral stems. From these, a convolutional neural network was developed from 1410 AP hip radiographs, after which the neural network was tested on a subsequent 706 AP hip radiographs. The neural network was then run on an iPhone 6 to evaluate its potential use in app design. The neural network achieved 100.00% accuracy on the 1410 learning radiographs, and achieved 95.15% accuracy in classifying femoral stem constructs when tested on the novel 706 radiographs. The neural network also displayed the probability (confidence) of the femoral stem classification for any input radiograph, and on the basis of the general model alone was able to achieve percent confidence ranging from 91.31% to 99.97% for seven different models. The neural network averaged a runtime of 1.03±0.05 seconds on an iPhone 6 to calculate a result from a given radiograph. From this, it was concluded that a relatively simple convolutional neural network is capable of achieving high accuracy in identifying implant designs, and can run on a personal device to offer additional benefits to a learning resident or attending surgeon.
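The confidence display described above follows from ending a classification network in a softmax: the top class's probability can be reported directly as the confidence of the femoral stem identification. The stem names and raw scores below are invented, not the eight commercial models from the study.

```python
import numpy as np

# Sketch of reporting a classification confidence via softmax, as described
# above. Stem names and the final-layer scores are hypothetical.

def softmax(scores):
    e = np.exp(scores - scores.max())   # shift by max for numerical stability
    return e / e.sum()

stems = ["Stem A", "Stem B", "Stem C"]
raw_scores = np.array([4.2, 1.1, 0.3])  # hypothetical final-layer outputs

probs = softmax(raw_scores)             # probabilities summing to 1
best = int(probs.argmax())
label, confidence = stems[best], probs[best]
```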
  • Another application is to use radiographs as inputs and dual-energy X-ray absorptiometry (DXA, or DEXA) scan results as outputs, thus presenting an option to create a convolutional neural network that may use radiographs to predict DEXA scan results as a means to predict osteopenia or osteoporosis.
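The cup-orientation relation recited in claim 17 below, θ = arcsin(2D3/D2 − 1), where D2 is the measured cup diameter and D3 the second measured dimension, can be checked numerically. The measurement values below are invented solely to exercise the formula.

```python
import math

# Numerical check of theta = arcsin(2*D3/D2 - 1) from claim 17.
# D2 is the cup diameter; D3 is the second projected measurement.
# The example measurements are hypothetical.

def cup_orientation_deg(d2, d3):
    """Cup orientation angle (degrees) from the two projected measurements."""
    ratio = 2.0 * d3 / d2 - 1.0
    if not -1.0 <= ratio <= 1.0:
        raise ValueError("inconsistent measurements: arcsin argument out of range")
    return math.degrees(math.asin(ratio))

# A cup of 54 mm diameter whose second dimension measures 46.5 mm
# yields an orientation of roughly 46 degrees; D3 equal to the full
# diameter corresponds to a cup viewed edge-on at 90 degrees.
theta = cup_orientation_deg(54.0, 46.5)
```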
  • While the invention has been described in terms of particular embodiments and investigations, it should be apparent that alternatives could be adopted by one skilled in the art. For example, the methods may be used to determine the orientation of implants or other objects other than acetabular cups of a hip prosthesis, the implant measured and its components could differ in appearance and construction from the embodiments described herein and shown in the drawings, and the measurements may be used to analyze image types other than radiographs. Accordingly, it should be understood that the invention is not necessarily limited to any embodiment described herein or illustrated in the drawings. It should also be understood that the phraseology and terminology employed above are for the purpose of describing the illustrated embodiments and investigations, and do not necessarily serve as limitations to the scope of the invention. Therefore, the scope of the invention is to be limited only by the following claims.

Claims (20)

1. A method of determining a three-dimensional orientation of an object and optionally other characteristics of the object based on its area projected onto a two-dimensional image and known or measured geometry.
2. The method of claim 1, wherein the other characteristics of the object comprise one or more of make, model, and material of the object based on the area thereof projected onto the two-dimensional image and the known or measured geometry and radiographic opacity thereof.
3. The method of claim 1, further comprising determining a three-dimensional orientation of the object relative to a second object in the vicinity of the object in the two-dimensional image.
4. The method of claim 1, wherein the object is a first object that is partially obscured by a second object, the method comprising:
measuring a combined two-dimensional area of the first and second objects as displayed in the two-dimensional image; and
using the combined two-dimensional area to determine the three-dimensional orientation of the first object by accounting for a measured or estimated two-dimensional area of overlap between the first and second objects, a measured or estimated two-dimensional area of the first object, and a measured or estimated two-dimensional area of the second object, wherein the two-dimensional area of overlap is estimated with known areas and measurements of the first and second objects and measured, known, or assumed orientations of the first and second objects, and wherein optionally the first and second objects have known or measured geometries and shapes and the two-dimensional areas of the first and second objects are estimated.
5. The method of claim 1, wherein the method uses artificial intelligence to determine the three-dimensional orientation of the object.
6. The method of claim 1, wherein the method uses deep learning to determine the three-dimensional orientation of the object.
7. The method of claim 1, wherein the method uses a convolutional neural network to determine the three-dimensional orientation of the object.
8. A method of determining the three-dimensional orientation of a first object that is partially obscured by a second object based on a two-dimensional image of the first and second objects, the method comprising:
measuring a combined area of the first and second objects as displayed in the image;
determining a two-dimensional area of overlap between the first and second object by adding known actual areas of the first and second objects and subtracting therefrom the combined area measured in the image; and
determining the orientation of the first object based on the two-dimensional area of overlap between the first and second objects.
9. The method of claim 8, wherein the first object is a first prosthetic component.
10. The method of claim 9, wherein the second object is a second prosthetic component.
11. The method of claim 8, wherein the first and second objects are an acetabular cup and a femoral head of a hip prosthesis.
12. The method of claim 8, wherein the image is a radiographic image.
13. The method of claim 8, further comprising determining other characteristics of the first object based on its area projected onto the two-dimensional image and the known or measured geometry thereof.
14. The method of claim 13, wherein the other characteristics of the first object comprise one or more of make, model, and material of the first object based on the area thereof projected onto the two-dimensional image and the known or measured geometry and radiographic opacity thereof.
15. The method of claim 13, further comprising determining the orientation and characteristics of a second object in the vicinity of the first object in the two-dimensional image and determining therefrom a relative orientation of the first object to the second object.
16. A method of determining the three-dimensional orientation of an acetabular cup of a hip prosthesis that is partially obscured by a femoral head of the hip prosthesis based on a two-dimensional radiographic image of the acetabular cup and the femoral head after the hip prosthesis has been implanted in a subject, the method comprising:
measuring a first dimension equal to a diameter of the acetabular cup;
measuring a second dimension equal to a maximum distance between a posterolateral edge of an opening of the acetabular cup and a point on an exterior surface of the acetabular cup along an axis thereof; and
determining the orientation of the acetabular cup based on the measured first and second dimensions.
17. The method of claim 16, wherein determining the orientation of the acetabular cup is based on the following equation:
θ = arcsin(2D3/D2 − 1)
where D2 and D3 are the first and second dimensions, respectively.
18. The method of claim 16, further comprising determining other characteristics of the hip prosthesis based on its area projected onto the two-dimensional image and the known or measured geometry thereof.
19. The method of claim 18, wherein the other characteristics of the hip prosthesis comprise one or more of make, model, and material of the hip prosthesis based on the area thereof projected onto the two-dimensional image and the known or measured geometry and radiographic opacity thereof.
20. The method of claim 18, further comprising determining the orientation and characteristics of a structure or tissue in the vicinity of the hip prosthesis and visible in the two-dimensional image and determining therefrom a relative orientation of the hip prosthesis to the structure or tissue.
US16/535,566 2018-08-08 2019-08-08 Methods of classifying and/or determining orientations of objects using two-dimensional images Active US11227385B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/535,566 US11227385B2 (en) 2018-08-08 2019-08-08 Methods of classifying and/or determining orientations of objects using two-dimensional images

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862715891P 2018-08-08 2018-08-08
US201962818929P 2019-03-15 2019-03-15
US16/535,566 US11227385B2 (en) 2018-08-08 2019-08-08 Methods of classifying and/or determining orientations of objects using two-dimensional images

Publications (2)

Publication Number Publication Date
US20200082526A1 true US20200082526A1 (en) 2020-03-12
US11227385B2 US11227385B2 (en) 2022-01-18

Family

ID=69415626

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/535,566 Active US11227385B2 (en) 2018-08-08 2019-08-08 Methods of classifying and/or determining orientations of objects using two-dimensional images

Country Status (2)

Country Link
US (1) US11227385B2 (en)
WO (1) WO2020033656A1 (en)


Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007524438A (en) * 2003-03-25 2007-08-30 イメージング セラピューティクス,インコーポレーテッド Compensation method in radiological image processing technology
DE102005012708A1 (en) * 2005-03-11 2006-09-21 Eberhard-Karls-Universität Tübingen Method for determining body orientations in space based on two x-ray images
WO2009087214A1 (en) * 2008-01-09 2009-07-16 Stryker Leibinger Gmbh & Co. Kg Stereotactic computer assisted surgery based on three-dimensional visualization
JP2009195490A (en) * 2008-02-21 2009-09-03 Lexi:Kk Program for preoperative plan of artificial hip joint replacement, and instrument for supporting the replacement
US8160326B2 (en) * 2008-10-08 2012-04-17 Fujifilm Medical Systems Usa, Inc. Method and system for surgical modeling
US20140276872A1 (en) * 2013-03-15 2014-09-18 Otismed Corporation Customized acetabular cup positioning guide and system and method of generating and employing such a guide
ES2683370T3 (en) * 2013-05-08 2018-09-26 Stryker European Holdings I, Llc C arm adjustment
US10258256B2 (en) * 2014-12-09 2019-04-16 TechMah Medical Bone reconstruction and orthopedic implants
US10758198B2 (en) * 2014-02-25 2020-09-01 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US10433914B2 (en) * 2014-02-25 2019-10-08 JointPoint, Inc. Systems and methods for intra-operative image analysis
WO2016123700A1 (en) * 2015-02-02 2016-08-11 Orthosoft Inc. Mechanically guided impactor for hip arthroplasty
US20180342315A1 (en) * 2015-08-31 2018-11-29 Halifax Biomedical Inc. Method for 3d imaging of mechanical assemblies transplanted into mammalian subjects
US10321961B2 (en) * 2015-11-05 2019-06-18 Howmedica Osteonics Corp. Patient specific implantation method for range of motion hip impingement
US20170245942A1 (en) * 2016-02-26 2017-08-31 Radlink, Inc. System and Method For Precision Position Detection and Reproduction During Surgery
US11071596B2 (en) * 2016-08-16 2021-07-27 Insight Medical Systems, Inc. Systems and methods for sensory augmentation in medical procedures
DE102017201164B3 (en) * 2017-01-25 2018-01-18 ACES Ing.-GmbH Method for measuring an X-ray image of a medical examination area together with the associated device and computer program
JP7227168B2 (en) * 2017-06-19 2023-02-21 モハメド・アール・マーフーズ Surgical Navigation of the Hip Using Fluoroscopy and Tracking Sensors
US11166764B2 (en) * 2017-07-27 2021-11-09 Carlsmed, Inc. Systems and methods for assisting and augmenting surgical procedures
WO2019079521A1 (en) * 2017-10-17 2019-04-25 Friedrich Boettner Fluoroscopy-based measurement and processing system and method
WO2020123928A1 (en) * 2018-12-14 2020-06-18 Mako Surgical Corp. Systems and methods for preoperative planning and postoperative analysis of surgical procedures

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220028113A1 (en) * 2018-11-26 2022-01-27 Metamorphosis Gmbh Artificial-intelligence based reduction support
US11954887B2 (en) * 2018-11-26 2024-04-09 Metamorphosis Gmbh Artificial-intelligence based reduction support
US20210327065A1 (en) * 2020-04-18 2021-10-21 Mark B. Wright Prosthesis scanning and identification system and method
CN113744214A (en) * 2021-08-24 2021-12-03 北京长木谷医疗科技有限公司 Femoral stem placement method and device based on deep reinforcement learning and electronic equipment
WO2024090050A1 (en) * 2022-10-27 2024-05-02 富士フイルム株式会社 Image processing device, method, and program, and learning device, method, and program

Also Published As

Publication number Publication date
WO2020033656A1 (en) 2020-02-13
US11227385B2 (en) 2022-01-18

Similar Documents

Publication Publication Date Title
US11227385B2 (en) Methods of classifying and/or determining orientations of objects using two-dimensional images
US20210290394A1 (en) Hybrid Tracking System
US20210169367A1 (en) Bone reconstruction and orthopedic implants
US20160317309A1 (en) Acquiring and Utilizing Kinematic Information for Patient-Adapted Implants, Tools and Surgical Procedures
Lazennec et al. Acetabular and femoral anteversions in standing position are outside the proposed safe zone after total hip arthroplasty
US11337760B2 (en) Automated hip analysis methods and devices
AU2016369607A1 (en) IMU calibration
Bayraktar et al. Accuracy of measuring acetabular cup position after total hip arthroplasty: comparison between a radiographic planning software and three-dimensional computed tomography
Thelen et al. Normative 3D acetabular orientation measurements by the low-dose EOS imaging system in 102 asymptomatic subjects in standing position: analyses by side, gender, pelvic incidence and reproducibility
Woerner et al. Visual intraoperative estimation of cup and stem position is not reliable in minimally invasive hip arthroplasty
Wright et al. Functional and anatomic orientation of the femoral head
Pineau et al. Dual mobility hip arthroplasty wear measurement: experimental accuracy assessment using radiostereometric analysis (RSA)
Guenoun et al. Reliability of a new method for evaluating femoral stem positioning after total hip arthroplasty based on stereoradiographic 3D reconstruction
Raymond et al. Magnetic resonance scanning vs axillary radiography in the assessment of glenoid version for osteoarthritis
Demzik et al. Inter-rater and intra-rater repeatability and reliability of EOS 3-dimensional imaging analysis software
US20230005232A1 (en) Systems and methods of using three-dimensional image reconstruction to aid in assessing bone or soft tissue aberrations for orthopedic surgery
Wang et al. Two-dimensional and three-dimensional cup coverage in total hip arthroplasty with developmental dysplasia of the hip
Shareghi et al. Clinical evaluation of model‐based radiostereometric analysis to measure femoral head penetration and cup migration in four different cup designs
Jaramaz et al. 2D/3D registration for measurement of implant alignment after total hip replacement
Hansen et al. Dynamic radiostereometric analysis for evaluation of hip joint pathomechanics
Reito et al. Assessment of inter-and intra-observer reliability in the determination of radiographic version and inclination of the cup in metal-on-metal hip resurfacing
Steppacher et al. Validation of a new method for determination of cup orientation in THA
Schindler et al. Comparison of radiographs and computed tomography for the screening of anterior inferior iliac spine impingement
Veilleux et al. Automated femoral version estimation without the distal femur
Agten et al. Measurement of acetabular version based on biplanar radiographs with 3D reconstructions in comparison to CT as reference standard in cadavers

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

AS Assignment

Owner name: LOYOLA UNIVERSITY CHICAGO, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MURPHY, MICHAEL PATRICK;KILLEN, CAMERON JAMES;WU, KAREN;REEL/FRAME:051108/0151

Effective date: 20191121

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE