CN117813060A - Intraoperative adjustment system and method for measuring patient position using radiography - Google Patents


Info

Publication number: CN117813060A
Application number: CN202280055429.3A
Authority: CN (China)
Prior art keywords: image, pelvic, distance, function, computing device
Legal status: Pending
Other languages: Chinese (zh)
Inventors: F·贝特纳, A·普勒蒙库马尔
Current assignee: Akupredic Co
Original assignee: Akupredic Co
Application filed by Akupredic Co
Priority claimed from PCT/US2022/033270 (WO2022261548A1)
Publication of CN117813060A

Landscapes

  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A system and method are provided for image-guided implant placement as a function of at least one intra-operative image during a surgical procedure. At least one computing device is configured by executing code stored in a non-transitory processor-readable medium to process at least one pre-operative image to evaluate axial rotation and/or sagittal pelvic tilt. Further, at least one of a distance, an angle, and an area is measured as a function of a plurality of identified anatomical landmarks in the at least one preoperative image. Thereafter, an axial rotation associated with the at least one image is determined as a function of calculations associated with the at least one of distance, angle, and area. Thereafter, at least one value associated with placement of the implant is adjusted during the surgical procedure, and information associated therewith is provided through a graphical user interface.

Description

Intraoperative adjustment system and method for measuring patient position using radiography
Technical Field
The present invention relates generally to the field of surgery and, more particularly, to image-guided total hip arthroplasty.
Background
Surgery is typically performed using radiographs, including intra-operative radiographs. Unfortunately, radiographs may be affected by the patient's position at the time of image capture, and measurements obtained using radiographs, particularly over time, may be misleading. This is especially true in the field of total joint replacement, where precise implant positioning, including of the acetabular and femoral components, is critical to a successful outcome.
It is with respect to these and other considerations that the present invention has been made herein.
Disclosure of Invention
A system and method for image-guided implant placement are provided. In one or more embodiments of the invention, at least one computing device is configured by executing code stored in a non-transitory processor-readable medium to determine a value representative of absolute axial rotation by processing at least one pelvic image presenting a lateral view and at least one pelvic image presenting an AP view. For example, the at least one lateral image is a preoperative image and the at least one AP image is an intra-operative image. At least one of a distance, an angle, and an area may be measured using a plurality of identified anatomical landmarks in at least one image. Thereafter, a value representative of absolute pelvic axial rotation can be determined as a function of calculations related to at least one of distance, angle, and area. The pelvic images may be provided by radiography, fluoroscopy, or both.
In one or more embodiments of the invention, at least one computing device is configured by executing code stored in a non-transitory processor-readable medium to determine a value representing a change in axial rotation by processing at least one pelvic image presenting a lateral view and at least two pelvic images presenting an AP view. For example, one AP image is a preoperative image and another AP image is an intra-operative image. At least one of a distance, an angle, and an area may be measured using a plurality of identified anatomical landmarks in at least one image. Thereafter, a value representative of the change in axial rotation may be determined as a function of calculations associated with at least one of the distance, the angle, and the area.
In one or more embodiments of the invention, the at least one computing device is configured by executing code stored in a non-transitory processor-readable medium to determine a value representing a change in sagittal pelvic tilt by processing at least one pelvic image presenting a lateral view and at least two pelvic images presenting an AP view. For example, the at least one lateral image is a preoperative image, and the at least two AP images include a preoperative image and an intra-operative image. At least one of a distance, an angle, and an area may be measured using a plurality of identified anatomical landmarks in at least one image. Thereafter, as a function of calculations related to at least one of distance, angle, and area, a value representing the change in pelvic sagittal inclination between the respective AP images may be determined. For example, the value may represent the degree of change in pelvic sagittal inclination from the preoperative to the intra-operative AP image.
In one or more embodiments of the invention, the at least one computing device is configured by executing code stored in a non-transitory processor-readable medium to determine, using machine learning and artificial intelligence, a value representing a predicted absolute axial rotation by processing at least one AP image and/or a value representing a predicted change in sagittal inclination of the pelvis by processing at least two AP images. For example, a plurality of training images (including lateral images and corresponding AP images) are processed during training to determine absolute axial rotation and pelvic sagittal inclination as described above. Anatomical landmarks in the training images may then be identified and used to measure at least one of distance, angle, and area. Thereafter, values representing absolute axial rotation and changes in pelvic sagittal inclination may be determined in the training images as a function of calculations relating to at least one of distance, angle, and area. Once trained, a single pelvic AP image can be used to predict a value representing absolute axial rotation as a function of artificial intelligence and machine learning, including based on at least one of a distance, an angle, and an area measured in the single pelvic AP image. Furthermore, two pelvic AP images may be used to predict a value representing a change in the sagittal inclination of the pelvis as a function of artificial intelligence and machine learning, including based on at least one of a distance, an angle, and an area measured in the two pelvic AP images.
Other features of the invention are shown and described in the present disclosure.
Drawings
The various aspects of the invention may be more readily understood upon review of the following detailed description of the various embodiments of the invention taken in conjunction with the accompanying drawings in which:
FIG. 1 shows an AP pelvic radiograph with three lines (line 1, line 2, and line 3) plotted thereon;
FIGS. 2A-2B illustrate steps in a method of assessing pelvic axial rotation in accordance with an embodiment of the invention;
FIGS. 3A-3C illustrate exemplary lateral pelvic radiographs;
FIG. 4 is a diagrammatic representation of three points on a lateral radiograph as shown and described in FIG. 3; and
FIGS. 5A-5E show exemplary radiographs and other variables that relate axial rotation to pelvic tilt.
Detailed Description
By way of overview and introduction, the present invention includes a number of technical features, implemented by specially configured hardware and a user computing device, associated with image-guided surgical implant positioning. The combination of features presented in the present invention includes, for example, a system and method to determine and adjust implant positioning after determining a change in the patient's intraoperative position relative to the patient's position in preoperative or anticipated postoperative images. Furthermore, as a function of the teachings of the present invention, one or more computing devices may be configured to detect changes in three-dimensional space. In one or more embodiments, radiographs or other images, such as of the pelvis, are analyzed that represent pre-operative, intra-operative, and/or intended post-operative views. Using automatically identified anatomical landmarks shown in the one or more radiographs, distances, angles, and areas can be generated and used to determine changes in patient positioning and to calculate a more accurate implant positioning, respectively. For example, measurements based on the locations of the identified anatomical landmarks may be used to adjust implant placement to improve accuracy.
In one or more embodiments, a system and method are provided that include at least one computing device that can interface with one or more devices for acetabular cup position adjustment (e.g., until the acetabular cup is consistent with the data (registered)). Further, one or more computing devices may provide, e.g., a graphical user interface configured to display one or more images (e.g., radiographs), as well as a tool for alerting a user, e.g., when implant positioning is achieved. One or more navigation instruments may be in communication with hardware, including the hardware shown and described in commonly owned U.S. patent No. 11,241,287, which is incorporated herein by reference, and may be configured to adjust the position of an acetabular cup. The one or more navigation instruments may include or provide navigation markers that may be used to calculate the location of the navigated instrument and, correspondingly, of the cup to which it may be coupled. Thus, acetabular cup movement may be detected and measured in substantially real time. Accordingly, the console or other hardware described herein may provide instructions to the user (which may be displayed on a display) indicating how the acetabular cup should be positioned and/or repositioned relative to the patient.
It should be appreciated that various forms of computing devices may be used and provided in accordance with the invention, including server computers, personal computers, tablet computers, notebook computers, mobile computing devices (e.g., smartphones), or other suitable devices configured to access one or more data communication networks and to communicate over the networks with various machines configured to send and receive content, data, and instructions. Content and data provided via one or more computing devices may include various forms of information, including text, audio, images, and video, as non-limiting examples, and may include embedded information, such as other resources, metadata, and/or machine-executable instructions linked to a network. Each computing device may be of conventional architecture and may be configured to provide different content and services to other devices, such as a mobile computing device, one or more server computing devices. As will be appreciated by those of ordinary skill in the art, the apparatus may comprise the same machine or may be distributed across multiple machines in a large scale embodiment. In a related aspect, each computer server has one or more processors, a computer readable memory storing code that configures the processor to perform at least one function, and a communication port for connection to a network. The code may comprise one or more programs, libraries, functions or routines, which for the purposes of this description may be described in terms of modules held in a representative code/instruction memory, which implement the different portions of the processes described herein.
Furthermore, a computer program (also commonly referred to in the present invention as computer control logic or computer readable program code), such as imaging software, may be stored in main memory and/or secondary memory and executed by one or more processors (controllers or the like) to cause the one or more processors to perform the functions of the present invention as described herein. In the present context, the terms "memory," "computer-readable medium," "computer program medium," and "computer-usable medium" are used to refer generally to, for example, random access memory (RAM); read-only memory (ROM); removable storage units (e.g., magnetic or optical disks, flash memory devices, or the like); a hard disk; or the like. It should be appreciated that for a mobile computing device (e.g., a tablet), a computer program such as imaging software may take the form of an application executing on the mobile computing device.
Referring to the drawings, wherein like reference numbers refer to like elements, FIG. 1 shows an exemplary graphical user interface provided in accordance with one or more embodiments of the present invention. As shown in fig. 1, an anteroposterior ("AP") radiograph of the pelvis is shown, on which three lines (line 1, line 2, and line 3) are drawn. In the embodiment shown in fig. 1, line 1 is drawn between the inferior aspects of the sacroiliac joints. Line 2 is drawn between the undersides of the acetabular teardrops. Line 3 is drawn between the lowest points of the two ischia and can be adjusted for significant skeletal abnormalities, such as the presence of excessive bone spurs. The distance between the midpoints of these lines is recorded on each AP pelvic image and used to measure the change in pelvic sagittal inclination between images (e.g., two AP pelvic radiographs).
The following symbols are shown in the drawings.
The preoperative distance W represents the distance, on the preoperative AP pelvic image, between the point on line 2 corresponding to the center of the pubic symphysis and the point where a line drawn upward from that point, at a 90-degree angle to line 2, intersects line 1.
The preoperative distance V represents the distance, on the preoperative AP pelvic image, between the same point on line 2 and the point where a line drawn downward from that point, at a 90-degree angle to line 2, intersects line 3.
The preoperative distance U represents the sum of the preoperative distance W and the preoperative distance V.
Similar measurements may be made during or after surgery.
As in the examples, for intra-operative measurements:
The intra-operative distance W represents the distance, on the intra-operative AP pelvic image, between the point on line 2 corresponding to the center of the pubic symphysis and the point where a line drawn upward from that point, at a 90-degree angle to line 2, intersects line 1.
The intra-operative distance V represents the distance, on the intra-operative AP pelvic image, between the same point on line 2 and the point where a line drawn downward from that point, at a 90-degree angle to line 2, intersects line 3.
The intra-operative distance U represents the sum of the intra-operative distance W and the intra-operative distance V.
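These distances can be computed directly from landmark coordinates. The following is a minimal sketch, assuming 2-D pixel coordinates and that each line is given by two points; the function names and helper structure are illustrative, not from the patent:

```python
import numpy as np

def length_along_perpendicular(p, line2_dir, target_a, target_b):
    """Length of the segment that starts at point p, runs perpendicular to
    line 2 (direction vector line2_dir), and ends where it meets the line
    through target_a and target_b."""
    p = np.asarray(p, float)
    n = np.array([-line2_dir[1], line2_dir[0]], float)  # normal to line 2
    n /= np.linalg.norm(n)
    a = np.asarray(target_a, float)
    d = np.asarray(target_b, float) - a
    # Solve p + t*n = a + s*d for t and s (a 2x2 linear system).
    t, _ = np.linalg.solve(np.column_stack([n, -d]), a - p)
    return abs(t)

def distances_w_v_u(symphysis, line2_dir, line1, line3):
    """W: symphysis point up to line 1; V: down to line 3; U = W + V."""
    w = length_along_perpendicular(symphysis, line2_dir, *line1)
    v = length_along_perpendicular(symphysis, line2_dir, *line3)
    return w, v, w + v
```

For a roughly horizontal line 2 through the symphysis at the origin, a line 1 at 10 px above, and a line 3 at 4 px below, this yields W = 10, V = 4, U = 14.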
Figures 2A-2B illustrate steps in a method of assessing pelvic axial rotation according to an embodiment of the invention. In fig. 2A, for example, an AP pelvic radiograph is obtained, and line 1 and line 2 are drawn as shown and described with reference to fig. 1. Next, as shown in fig. 2A, a line extending from line 1 to line 2 is drawn. The apex on line 1 corresponds to the intersection of line 1 and the center of the caudal sacral spine, and the line extends downward from that apex, perpendicular to line 2. The distance between the intersection of this line with line 2 and the center of the pubic symphysis on line 2 (the preoperative distance X on the preoperative image, the intraoperative distance X on the intraoperative image, or the like) is then measured. This measurement can help quantify pelvic axial rotation and changes in pelvic axial rotation between images. Fig. 2B shows another AP pelvic radiograph with lines 1 and 2 drawn, in accordance with one or more embodiments of the present invention.
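The distance X described above can be sketched geometrically as follows, assuming 2-D pixel coordinates; the function name and inputs are illustrative, and the sacral apex, symphysis center, and two points defining line 2 are taken as given:

```python
import math

def distance_x(sacral_apex, symphysis, line2_a, line2_b):
    """Distance X: drop a perpendicular from the sacral apex (on line 1)
    onto line 2, then measure from that foot point to the pubic symphysis
    center on line 2."""
    ax, ay = line2_a
    dx, dy = line2_b[0] - ax, line2_b[1] - ay
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm          # unit direction of line 2
    t = (sacral_apex[0] - ax) * ux + (sacral_apex[1] - ay) * uy
    foot = (ax + t * ux, ay + t * uy)      # perpendicular foot on line 2
    return math.hypot(foot[0] - symphysis[0], foot[1] - symphysis[1])
```

With line 2 along the x-axis, a sacral apex at (3, 8), and the symphysis center at (1, 0), the perpendicular foot is (3, 0) and X = 2.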
Figs. 3A, 3B, and 3C show exemplary lateral pelvic radiographs. Fig. 3A shows three points on the radiograph: the anterior aspect of the pubic symphysis, the inferior aspect of the ischium, and the point where the sacrum meets the posterior ilium. Fig. 3B shows how the image is calibrated using the measurements of the calibrated AP pelvic image (fig. 1). Fig. 3C shows the distances between these points (distance A, distance B, and distance C), which are measured and used as an integral part of measuring pelvic axial rotation and changes in pelvic sagittal inclination between radiographs.
To calculate the axial rotation on the preoperative image, Z is calculated as:
The axial rotation is calculated as:
Similarly, the axial rotation on any subsequent (intra- or post-operative) AP pelvic radiograph or C-arm (fluoroscopic) image can be calculated as:
furthermore, the change in the pelvic sagittal inclination on the intra-or post-operative image relative to the pelvic inclination on the pre-operative image can be calculated in the following manner. This change is calculated based on measurements of the preoperative lateral pelvic image (fig. 3A).
If all three distances (distance A, distance B, and distance C) are measured, the changes can be calculated as follows, where P_k is the distance k measured on the preoperative image and I_k is the distance k measured on the intra- or post-operative image:
Change in pelvic sagittal inclination based on distance A (fig. 3A):
Inclination change based on distance A = Change_A = I_A - P_A
Change in pelvic sagittal inclination based on distance B (fig. 3B):
Inclination change based on distance B = Change_B = I_B - P_B
Change in pelvic sagittal inclination based on distance C (fig. 3C):
Inclination change based on distance C = Change_C = I_C - P_C
If only one distance is available, then the change in pelvic tilt relative to the tilt on the preoperative image is calculated as Change_A, Change_B, or Change_C alone, rather than as the average of them.
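A minimal sketch of this calculation follows: the per-distance subtraction, plus the averaging implied when several distances are available. The function name and the averaging rule are illustrative assumptions, not terms from the patent:

```python
def sagittal_tilt_change(preop, followup):
    """Per-distance inclination changes, Change_k = I_k - P_k, for each
    distance k (A, B, C) present in both images, and their average when
    more than one distance is available."""
    changes = {k: followup[k] - preop[k]
               for k in ("A", "B", "C") if k in preop and k in followup}
    average = sum(changes.values()) / len(changes)
    return changes, average
```

For example, preoperative distances {A: 10, B: 20, C: 30} and intra-operative distances {A: 12, B: 23, C: 34} give per-distance changes of 2, 3, and 4, with an average of 3.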
Once the amount of axial rotation of the pelvis and sagittal inclination of the pelvis are known, the measurement of the acetabular component during total hip arthroplasty can be corrected, as can the measurement of the femoral offset and leg length when the femoral component is inserted simultaneously using X-ray imaging (fluoroscopy), in accordance with the teachings of the present invention. Axial rotation of the pelvis to the opposite position increases the anteversion of the acetabular component and decreases the inclination of the acetabular component on the radiograph. Furthermore, increasing the anterior sagittal inclination reduces the inclination and anteversion of the acetabular component, while increasing the posterior sagittal inclination (pelvic retroversion) increases the inclination and anteversion of the acetabular component. Changes in axial rotation of the pelvis also affect the measurement of leg length and double hip offset. The change can be calculated from the change in pelvic sagittal inclination and axial rotation and used for measurement correction.
Fig. 4 is a schematic view of three points on a lateral radiograph such as shown and described in connection with fig. 3. Fig. 4 shows how the distances measured between the three points, as well as the various distances previously obtained on the AP pelvic radiograph, can be used to calculate the change in sagittal pelvic tilt between two radiographs. For example, the change between the calculated angles x, y, and z (fig. 4) can be used to determine the change in sagittal pelvic position between AP radiographs.
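The angles x, y, and z of the triangle formed by the three landmark points can be computed from the measured distances via the law of cosines. The patent does not specify the exact formula, so the following is a standard construction offered as an assumption:

```python
import math

def triangle_angles(p1, p2, p3):
    """Interior angles x, y, z (in degrees) at points p1, p2, p3 of the
    triangle they form, computed via the law of cosines."""
    def dist(u, v):
        return math.hypot(u[0] - v[0], u[1] - v[1])
    # Side lengths opposite p1, p2, p3 respectively.
    a, b, c = dist(p2, p3), dist(p1, p3), dist(p1, p2)
    def angle(opp, s1, s2):
        return math.degrees(math.acos((s1 * s1 + s2 * s2 - opp * opp) / (2 * s1 * s2)))
    return angle(a, b, c), angle(b, a, c), angle(c, a, b)
```

For a 3-4-5 right triangle, the angles are 90, approximately 53.13, and approximately 36.87 degrees, summing to 180.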
In one or more embodiments, an artificial intelligence image recognition algorithm may be used to identify specific anatomical landmarks to facilitate measurement of angles, distances, or surface areas on radiographs or fluoroscopic/C-arm images (fig. 5A-5E). Initially, one or more computing devices may be configured to identify anatomical landmarks that may be used to determine, among other things, distances (distance 1 and distance 2, distance 3 and distance 4, distance 5 and distance 6, distance 7 and distance 8). Furthermore, the identified anatomical landmarks can be used to determine, among other things, angles and areas (angle 1 and angle 2, area 1, area 2, and area 3). Such distances, angles, and areas may be identified based on, for example, pixels within the image. Based on this identification, the markers may be positioned within the image substantially automatically.
Machine learning, including as further shown and described herein, may include corrections by human experts, who may correct the location of one or more markers in order to increase the accuracy of one or more measurements and/or placements. Over time, machine learning provides improvements and corrections that increase the accuracy of image recognition, measurement, and marker placement associated with image recognition algorithms. Accuracy improves to within the range of a human expert. Thus, the present invention may provide fully automatic and independent identification of patient anatomy, which may be used to measure the variables shown and described herein.
In one or more embodiments, artificial intelligence is used in conjunction with guided image recognition, including through the use of anatomical landmarks that are present in the image and are automatically identified. During machine learning, pelvic AP radiographs are analyzed to locate a specific point, such as the pubic symphysis. Multiple AP pelvic images may be submitted for training, for example, by using a tool provided by a GUI that can be used to mark the symphysis location with a rectangle, one corner of which is the symphysis location. For example, the rectangle is selected to define a unique (or nearly unique) set of pixels to minimize the likelihood of errors. Rectangles can be drawn over all training AP pelvic images and exported as a dataset. In one or more embodiments, Create ML can be used for modeling, and a corresponding input file compatible with Create ML can be generated and/or provided. Create ML provides a visual interface that can be used to train models using the Core ML framework. In connection with the present invention, models can be used to accomplish a wide variety of tasks that would otherwise be difficult or impractical to express in programming code. Thus, the model is trained to classify images or perform other tasks, such as detecting, as a function of pixels, specific anatomical landmarks (e.g., the pubic symphysis) within an image (e.g., an AP pelvic radiograph). Thus, in one or more embodiments, an iOS application executes on an iOS computing device, such as an iPad. Certain parameters are used in Create ML to optimize the training process for a particular situation, including based on the number of images used, how identifiable the pixels are, image color, or other variables.
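For illustration, the exported rectangle dataset might be written in the JSON annotation layout commonly used with Create ML's object-detection template (center x/y plus width/height in pixels). The function name, record structure, and filenames below are assumptions for this sketch:

```python
import json

def export_annotations(records, path):
    """Write landmark rectangles as a JSON annotation file.
    `records` maps an image filename to a list of
    (label, x_center, y_center, width, height) tuples in pixels."""
    data = [{"image": fname,
             "annotations": [{"label": label,
                              "coordinates": {"x": x, "y": y,
                                              "width": w, "height": h}}
                             for (label, x, y, w, h) in boxes]}
            for fname, boxes in records.items()]
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
```

A single training image with one marked symphysis rectangle would then produce one record with one annotation entry.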
Core ML can be used to run models optimized for iOS hardware and provides a smooth, responsive user experience. Of course, one of ordinary skill will recognize that other modeling techniques and application development environments may be used in conjunction with machine learning, artificial intelligence, and application (e.g., mobile application) development. The machine learning process includes executing an algorithm over a set of training images to create a model. Specific inputs are provided to identify a set of pixels (e.g., rectangular selections) and to identify features for training. Training may be an iterative process in which the model attempts to predict the location of a rectangle, including by comparing information determined from the selection to one or more values provided as input. Further, one or more inputs may be used to teach the model how to more closely approximate the desired output.
After training, the model may be used for automatic anatomical landmark detection and for predictive capabilities related to processing new input data. The model may analyze, for example, a newly submitted AP pelvic image provided by the user and, based on its learning, report a set of coordinates for anatomical landmarks (e.g., the pubic symphysis). The coordinates may be generated and then used by a graphical user interface to calculate angles and measure distances using basic equations, e.g., based on straight lines or ellipses drawn by the software as desired, and via one or more interfaces provided on a computing device, e.g., a mobile application running on a mobile device (e.g., an iPad).
The present invention recognizes that increasing axial rotation of the pelvis results in changes in the appearance of the patient's pelvis on AP pelvic radiographs or fluoroscopic/C-arm images. For example, axial rotation of the pelvis results in left-right asymmetry of the pelvis. This may result in increased differences between distance 1 and distance 2, distance 3 and distance 4, distance 5 and distance 6, and/or distance 7 and distance 8 (e.g., fig. 5A-5E). Furthermore, it may result in an increased difference between angle 1 and angle 2, and/or between area 1 and area 2. As shown and described herein, these differences may be related to the calculated axial rotation.
The present invention, as described herein, provides machine learning that includes, according to respective embodiments, applying multiple images, training on the changes represented across those images, and establishing overall statistical or artificial-intelligence correlations with respect to measurements of individual images. The present invention includes one or more computing devices specifically configured to automatically and accurately identify corresponding anatomical landmarks and to apply one or more corresponding calculations, such as those shown and described herein, to predict the amount of axial rotation, sagittal tilt, or another corresponding change or condition.
In one or more embodiments of the invention, the differences between distance 1 and distance 2, distance 3 and distance 4, distance 5 and distance 6, and/or distance 7 and distance 8 (e.g., fig. 5A-5E) can be measured and used to predict and correct for axial rotation of the pelvis. Further, such a configured computing device may predict and correct for the amount of rotation based on differences between angle 1 and angle 2, and/or between areas, such as area 1 and area 2 (e.g., fig. 5A-5E). The present invention provides significantly improved accuracy, including as a function of measurements calculated on lateral images of the pelvis (e.g., figs. 3A-3C and fig. 4). Thus, the present invention may eliminate the need for such lateral measurements to determine axial rotation on an AP pelvic radiograph or C-arm/fluoroscopic image.
Further, changes in sagittal pelvic tilt between successive radiographic/fluoroscopic/C-arm images can result in changes in distances W, V, and U (e.g., fig. 1), distances 1 and 2, distances 3 and 4, and distances 5 and 6 (e.g., fig. 5A-5E). Further, from one image to the next (e.g., in successive images), changes result in the measurements of angle 1 and angle 2, and in the relationship of area 1 and area 2 to area 3 (e.g., fig. 5A-5E). Such changes may be determined, for example, by identifying them in pre-operative AP radiographs and intra-operative fluoroscopic/C-arm images of the same patient. These differences are then compared and correlated to calculate changes in pelvic tilt and to adjust implant placement.
The artificial intelligence provided as a function of machine learning, trained using a sufficient number of images, effectively establishes an overall correlation between the types of changes shown and described herein. Furthermore, using measurements of the same sequential images, one or more computing devices configured in accordance with the teachings of the present invention can predict changes in sagittal pelvic tilt from one image to the next (sequential images of the same patient), including based on changes represented as distances W, V, and U (e.g., fig. 1), distances 1 and 2, distances 3 and 4, and distances 5 and 6 (e.g., fig. 5A-5E), and angles 1 and 2, and changes in area 1 and area 2 relative to area 3 (e.g., fig. 5A-5E). Once sufficiently high accuracy is achieved, measurement of lateral images of the pelvis (e.g., figs. 3A-3C and fig. 4) is no longer required to determine changes in sagittal pelvic inclination across successive AP pelvic radiographs and/or C-arm/fluoroscopic images of the same patient. Instead, changes in the variables described in the present invention can be sufficient to predict the change in sagittal pelvic inclination.
Further, in one or more embodiments, as shown in figs. 5A-5E, the amount of axial rotation in degrees and the change in sagittal pelvic tilt in degrees can be related to changes in distance 1, distance 2, distance 3, distance 4, distance 5, distance 6, distance 7, and/or distance 8, and/or angle 1 and angle 2 (figs. 5A-5E), and/or changes in surface areas including, but not limited to, area 1, area 2, and area 3. One or more specially configured computing devices may execute algorithms, including artificial intelligence and/or machine learning, to detect changes in distances, in two-dimensional bone surface areas, and in the "look" of the AP pelvic image, which may be used to predict changes in axial rotation or sagittal pelvic tilt.
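As an illustrative stand-in for the learned correlation (not the patent's actual model), a least-squares fit can map difference features such as distance 1 minus distance 2, or angle 1 minus angle 2, to rotation in degrees; real training would use many annotated images:

```python
import numpy as np

def fit_rotation_model(feature_rows, rotations_deg):
    """Ordinary least squares mapping image-derived difference features
    to axial rotation in degrees (a bias column is appended)."""
    X = np.column_stack([np.asarray(feature_rows, float),
                         np.ones(len(feature_rows))])
    coef, *_ = np.linalg.lstsq(X, np.asarray(rotations_deg, float), rcond=None)
    return coef

def predict_rotation(coef, feature_row):
    """Predict rotation in degrees for one row of difference features."""
    return float(np.append(np.asarray(feature_row, float), 1.0) @ coef)
```

On a toy dataset where rotation is exactly twice the single feature, the fit recovers that slope and extrapolates accordingly.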
Thus, the methods shown and described herein utilize specific radiographic measurements of anteroposterior (AP) pelvic and lateral radiographs to determine changes in pelvic position in three dimensions. These may be used by the surgeon for pre-operative planning or, alternatively (or additionally), for intra-operative assessment of pelvic position and of pelvic position changes between pelvic radiographs.
While the invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention. Thus, the present invention is defined not by the discussion presented above but by the following claims, the corresponding features recited in those claims, and equivalents of those features.
While the illustrated embodiments of the present invention have been shown and described, it should be understood that various changes, substitutions and alterations can be made herein by those having ordinary skill in the art without departing from the scope of the present invention.

Claims (21)

1. A system for image-guided implant placement during a surgical procedure using at least one intra-operative image as a function of a determined value representing axial rotation, the system comprising:
at least one computing device configured by executing code stored in a non-transitory processor-readable medium to process at least one pre-operative image:
process the at least one pelvic image presenting a lateral view and the at least one pelvic image presenting an AP view, including measuring at least one of a distance, an angle, and an area in the at least one image as a function of a plurality of identified anatomical landmarks;
determining a value representative of absolute pelvic axial rotation as a function of a calculation associated with at least one of the distance, angle, and area;
adjusting at least one value associated with implant placement during the surgical procedure using the determined value; and
providing, through a graphical user interface, information associated with the at least one adjusted value.
2. The system of claim 1, wherein the at least one pelvic image presenting a lateral view is a preoperative image and the at least one pelvic image presenting an AP view is an intra-operative image.
3. The system of claim 1, wherein the anatomical landmarks are automatically identified as a function of machine learning and artificial intelligence.
4. The system of claim 1, wherein the at least one computing device is further configured by executing code to:
provide, through a graphical user interface, a first line drawn between the inferior aspects of the sacroiliac joints identified in at least one of the pelvic images;
provide, through the graphical user interface, a second line drawn between the inferior aspects of the acetabular teardrops identified in the at least one pre-operative image; and
provide, through the graphical user interface, a third line drawn between the two lowermost aspects of the ischium.
5. The system of claim 1, wherein the at least one computing device is further configured by executing code to:
measure at least one distance between the midpoints of the first line, the second line, and the third line.
6. A system for image-guided implant placement during a surgical procedure using at least one intra-operative image as a function of a determined value representing axial rotation variation, the system comprising:
at least one computing device configured, by executing code stored in a non-transitory processor-readable medium, to:
process the at least one pelvic image presenting a lateral view and the at least two pelvic images presenting an AP view, including measuring at least one of a distance, an angle, and an area in the at least one image as a function of a plurality of identified anatomical landmarks;
determine a value representative of a change in pelvic axial rotation as a function of a calculation associated with at least one of the distance, angle, and area;
adjusting at least one value associated with implant placement during the surgical procedure using the determined value; and
providing, through a graphical user interface, information associated with the at least one adjusted value.
7. The system of claim 6, wherein the anatomical landmarks are automatically identified as a function of machine learning and artificial intelligence.
8. The system of claim 6, wherein the at least one computing device is further configured by executing code to:
provide, through a graphical user interface, a first line drawn between the inferior aspects of the sacroiliac joints identified in at least one of the pelvic images;
provide, through the graphical user interface, a second line drawn between the inferior aspects of the acetabular teardrops identified in the at least one pre-operative image; and
provide, through the graphical user interface, a third line drawn between the two lowermost aspects of the ischium.
9. The system of claim 6, wherein the at least one computing device is further configured by executing code to:
measure at least one distance between the midpoints of the first line, the second line, and the third line.
10. A system for image-guided implant placement during a surgical procedure using at least one intra-operative image as a function of a determined value representing a change in pelvic sagittal inclination, the system comprising:
at least one computing device configured, by executing code stored in a non-transitory processor-readable medium, to:
process the at least one pelvic image presenting a lateral view and at least two pelvic images presenting current AP views, including measuring at least one of a distance, an angle, and an area in the at least one image as a function of a plurality of identified anatomical landmarks;
determine a value representing a change in pelvic sagittal inclination between the at least two pelvic images presenting AP views as a function of a calculation associated with at least one of the distance, angle, and area;
adjusting at least one value associated with implant placement during the surgical procedure using the determined value; and
providing, through a graphical user interface, information associated with the at least one adjusted value.
11. The system of claim 10, wherein the determined value represents a degree of change in pelvic sagittal inclination from a preoperative image to an intra-operative image.
12. A system for image-guided implant placement as a function of machine learning and artificial intelligence, the system comprising:
at least one computing device configured, by executing code stored in a non-transitory processor-readable medium, to:
processing a plurality of training pelvic images presenting lateral views and at least one corresponding AP view, including measuring at least one of distance, angle, and area in the training images as a function of a plurality of identified anatomical landmarks;
determining, as a function of calculations associated with at least one of the distance, angle, and area in the training images, a respective value representative of absolute pelvic axial rotation, a respective value representative of pelvic sagittal inclination, or respective values representative of both absolute pelvic axial rotation and pelvic sagittal inclination; and
further, wherein the at least one computing device is configured, by executing code stored in a non-transitory processor-readable medium, to:
processing a single pelvic image presenting an AP view to predict a value representing absolute axial rotation as a function of artificial intelligence and machine learning of the training image, including based on at least one of distance, angle, and area measured in the single pelvic image presenting the AP view;
process the plurality of pelvic images presenting the AP view to predict a value representative of a change in pelvic sagittal inclination as a function of artificial intelligence and machine learning of the training images, including based on at least one of the distance, angle, and area measured in the plurality of pelvic images presenting the AP view.
13. The system of claim 12, wherein the at least one pelvic image presenting a lateral view is a preoperative image and the at least one pelvic image presenting an AP view is an intra-operative image.
14. The system of claim 12, wherein the anatomical landmarks are automatically identified as a function of machine learning and artificial intelligence.
15. The system of claim 12, wherein the at least one computing device is further configured by executing code to:
provide, through a graphical user interface, a first line drawn between the inferior aspects of the sacroiliac joints identified in at least one of the pelvic images;
provide, through the graphical user interface, a second line drawn between the inferior aspects of the acetabular teardrops identified in the at least one pre-operative image; and
provide, through the graphical user interface, a third line drawn between the two lowermost aspects of the ischium.
16. The system of claim 12, wherein the at least one computing device is further configured by executing code to:
measure at least one distance between the midpoints of the first line, the second line, and the third line.
17. A method for image-guided implant placement during a surgical procedure using at least one intra-operative image as a function of a determined value representing axial rotation, the method comprising:
processing, by at least one computing device configured by executing code stored in a non-transitory processor-readable medium, the at least one pelvic image presenting a lateral view and the at least one pelvic image presenting an AP view, including measuring at least one of a distance, an angle, and an area as a function of a plurality of identified anatomical landmarks in the at least one image;
determining, by the at least one computing device, a value representative of absolute pelvic axial rotation as a function of a calculation associated with at least one of distance, angle, and area;
using, by the at least one computing device, the determined values to adjust at least one value associated with implant placement during the surgical procedure; and
providing, through a graphical user interface, information associated with the at least one adjusted value.
18. A method for image-guided implant placement during a surgical procedure using at least one intra-operative image as a function of a determined value representing axial rotation variation, the method comprising:
processing, by at least one computing device configured by executing code stored in a non-transitory processor-readable medium, the at least one pelvic image presenting a lateral view and the at least two pelvic images presenting an AP view, including measuring at least one of a distance, an angle, and an area as a function of a plurality of identified anatomical landmarks in the at least one image;
determining, by the at least one computing device, a value representative of a change in axial rotation of the pelvis as a function of a calculation associated with at least one of distance, angle, and area;
using, by the at least one computing device, the determined values to adjust at least one value associated with implant placement during the surgical procedure; and
providing, through a graphical user interface, information associated with the at least one adjusted value.
19. A method for image-guided implant placement during a surgical procedure using at least one intra-operative image as a function of a determined value representing a change in pelvic sagittal inclination, the method comprising:
processing, by at least one computing device configured by executing code stored in a non-transitory processor-readable medium, the at least one pelvic image presenting a lateral view and the at least two pelvic images presenting an AP view, including measuring at least one of a distance, an angle, and an area as a function of a plurality of identified anatomical landmarks in the at least one image;
determining, by the at least one computing device, a value representing a change in pelvic sagittal inclination between at least two pelvic images presenting AP views as a function of a calculation associated with at least one of distance, angle, and area;
using, by the at least one computing device, the determined values to adjust at least one value associated with implant placement during the surgical procedure; and
providing, through a graphical user interface, information associated with the at least one adjusted value.
20. A method for image-guided implant placement as a function of machine learning and artificial intelligence, the method comprising:
processing, by at least one computing device configured by executing code stored in a non-transitory processor-readable medium, a plurality of training pelvic images presenting lateral views and at least one corresponding AP view, including measuring at least one of distance, angle, and area in the training images as a function of a plurality of identified anatomical landmarks;
determining, by the at least one computing device, as a function of a calculation associated with at least one of the distance, angle, and area in the training image, a respective value representative of absolute pelvic axial rotation, a respective value representative of pelvic sagittal inclination, or a respective value representative of both absolute pelvic axial rotation and pelvic sagittal inclination; and
processing, by the at least one computing device, a single pelvic image presenting the AP view to predict a value representative of absolute axial rotation as a function of artificial intelligence and machine learning of the training images, including based on at least one of the distance, angle, and area measured in the single pelvic image presenting the AP view.
21. The method of claim 20, further comprising:
processing, by the at least one computing device, a plurality of pelvic images presenting the AP view to predict a value representing a change in pelvic sagittal inclination as a function of artificial intelligence and machine learning of the training images, including based on at least one of the distance, angle, and area measured in the plurality of pelvic images presenting the AP view.
CN202280055429.3A 2021-06-11 2022-06-13 Intraoperative adjustment system and method for measuring patient position using radiography Pending CN117813060A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/209,656 2021-06-11
US202163279481P 2021-11-15 2021-11-15
US63/279,481 2021-11-15
PCT/US2022/033270 WO2022261548A1 (en) 2021-06-11 2022-06-13 Adjustment system and method for patient position intraoperatively using radiographic measurements

Publications (1)

Publication Number Publication Date
CN117813060A true CN117813060A (en) 2024-04-02

Family

ID=90422124

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280055429.3A Pending CN117813060A (en) 2021-06-11 2022-06-13 Intraoperative adjustment system and method for measuring patient position using radiography

Country Status (1)

Country Link
CN (1) CN117813060A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination