WO2022261548A1 - Adjustment system and method for patient position intraoperatively using radiographic measurements - Google Patents

Adjustment system and method for patient position intraoperatively using radiographic measurements

Info

Publication number
WO2022261548A1
WO2022261548A1 (PCT/US2022/033270)
Authority
WO
WIPO (PCT)
Prior art keywords
pelvic
image
function
images
computing device
Prior art date
Application number
PCT/US2022/033270
Other languages
English (en)
Inventor
Friedrich Boettner
Ajay Premkumar
Original Assignee
AccuJoint, Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AccuJoint, Inc filed Critical AccuJoint, Inc
Priority to CA3222041A (CA3222041A1)
Priority to CN202280055429.3A (CN117813060A)
Priority to JP2023576442A (JP2024523863A)
Priority to EP22821195.9A (EP4351446A1)
Priority to AU2022290870A (AU2022290870A1)
Publication of WO2022261548A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/02 Prostheses implantable into the body
    • A61F2/30 Joints
    • A61F2/32 Joints for the hip
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/02 Prostheses implantable into the body
    • A61F2/30 Joints
    • A61F2/30988 Other joints not covered by any of the groups A61F2/32 - A61F2/4425
    • A61F2002/30995 Other joints not covered by any of the groups A61F2/32 - A61F2/4425 for sacro-iliac joints

Definitions

  • the present disclosure relates, generally, to the field of surgery and, more particularly, to a specific application in image-guided total hip replacement.
  • Measurements are commonly performed utilizing radiographs, including intraoperative radiographs.
  • radiographs can be affected by patient positioning at the time of image capture, and measurements obtained using the radiographs, particularly changes over time, can be misleading. This is especially true in the field of total joint replacement, where precise implant positioning including the acetabular and femoral component is paramount for a successful outcome.
  • At least one computing device is configured by executing code stored in non-transitory processor readable media to determine a value representing absolute axial rotation by processing at least one pelvic image presenting a lateral view and at least one pelvic image presenting an AP view.
  • the at least one lateral image is a preoperative image
  • the at least one AP image is an intraoperative image.
  • At least one computing device is configured by executing code stored in non-transitory processor readable media to determine a value representing a change in axial rotation by processing at least one pelvic image presenting a lateral view and at least two pelvic images presenting an AP view.
  • at least one of the AP images is a preoperative image and at least one of the AP images is an intraoperative image.
  • measurements can be made for at least one of distances, angles, and areas. Thereafter, as a function of calculations associated with at least one of the distances, angles, and areas, the value representing a change in axial rotation can be determined.
  • At least one computing device is configured by executing code stored in non-transitory processor readable media to determine a value representing change in sagittal pelvic inclination by processing at least one pelvic image presenting a lateral view and at least two pelvic images presenting an AP view.
  • the at least one lateral image is a preoperative image.
  • measurements can be made of at least one of distances, angles, and areas.
  • the value representing change in pelvic sagittal inclination between the respective AP images can be determined.
  • the value can represent a number of degrees of change in pelvic sagittal inclination from a preoperative to an intraoperative AP image.
  • At least one computing device is configured by executing code stored in non-transitory processor readable media to use machine learning and artificial intelligence to determine a value representing predicted absolute axial rotation by processing at least one AP image and/or a value representing predicted change in pelvic sagittal inclination by processing at least two AP images.
  • a plurality of training images are processed for training to determine the absolute axial rotation and pelvic sagittal inclination as described above.
  • Anatomical landmarks in the training images can then be identified and used for measuring at least one of distances, angles, and areas.
  • values representing absolute axial rotation and change in pelvic sagittal inclination can be determined.
  • a value representing absolute axial rotation can be predicted using a single pelvic AP image, as a function of artificial intelligence and machine learning, including based on at least one of distances, angles, and areas measured in the single pelvic AP image.
  • a value representing change in sagittal pelvic inclination can be predicted using two pelvic AP images, as a function of artificial intelligence and machine learning, including based on at least one of distances, angles, and areas measured in the two pelvic AP images.
  • FIG. 1 illustrates an AP pelvis radiograph with three lines (Line 1, Line 2, and Line 3) drawn thereon;
  • FIGS. 2A-2B illustrate steps in a methodology to assess pelvic axial rotation, in accordance with an implementation of the present disclosure
  • FIGS. 3A-3C illustrate example lateral pelvic radiographs
  • FIG. 4 is a diagrammatic representation of the three points on a lateral radiograph, as shown and described in FIG. 3;
  • FIGS. 5A-5E illustrate example radiographs and other variables to correlate axial rotation and pelvic tilt.
  • the present disclosure includes a plurality of technological features, vis-a-vis user computing devices that include specially configured hardware for image guidance in connection with surgical implant positioning.
  • the combination of features set forth in the present disclosure includes, for example, providing a system and method to determine and adjust implant positioning after determining changes in intraoperative patient position, as opposed to patient position in preoperative or expected postoperative images.
  • one or more computing devices can be configured to detect changes in three-dimensional space, as a function of the teachings herein.
  • analyses are made of radiographs or other images that represent, for example, preoperative, intraoperative, and/or expected postoperative images of the pelvis.
  • respective distances, angles, and areas can be generated and used to determine changes in patient positioning and to calculate more accurate implant positioning. For example, adjustments can be made, using measurements based on locations of identified anatomical landmarks, to implant placement, thereby increasing accuracy.
  • a system and method include at least one computing device that can interface with one or more devices for acetabular cup position adjustment, such as until the cup is in line (in registration) with the data.
  • one or more computing devices can provide, for example, a graphical user interface that can be configured to display one or more images (e.g., radiographs), as well as tools for a user to be alerted, for example, when implant position is achieved.
  • One or more navigational instruments can be in communication with hardware, including as shown and described in commonly owned U.S. Patent Number 11,241,287, which is incorporated by reference herein, and can be configured to adjust the position of the acetabular cup.
  • One or more navigational instruments can include or provide navigational markers which are usable to calculate the location of the navigated instrument and, correspondingly, a cup that can be coupled thereto.
  • An acetabular cup’s movements, therefore, can be detected and measured substantially in real-time.
  • the control console or other hardware described herein can thus provide instructions (which can be displayed on the display) to the user directing how the acetabular cup should be positioned and/or repositioned relative to the patient.
  • computing devices can be used and provided in accordance with the present disclosure, including server computers, personal computers, tablet computers, laptop computers, mobile computing devices (e.g., smartphones), or other suitable device that is configured to access one or more data communication networks and can communicate over the network to the various machines that are configured to send and receive content, data, as well as instructions.
  • Content and data provided via one or more computing devices can include information in a variety of forms, including, as non-limiting examples, text, audio, images, and video, and can include embedded information such as links to other resources on the network, metadata, and/or machine executable instructions.
  • Each computing device can be of conventional construction, and may be configured to provide different content and services to other devices, such as mobile computing devices and one or more of the server computing devices.
  • each computer server has one or more processors, a computer-readable memory that stores code that configures the processor to perform at least one function, and a communication port for connecting to the network.
  • the code can comprise one or more programs, libraries, functions or routines which, for purposes of this specification, can be described in terms of a plurality of modules, residing in a representative code/instructions storage, that implement different parts of the process described herein.
  • computer programs can be stored in a main and/or secondary memory and implemented by one or more processors (controllers, or the like) to cause the one or more processors to perform the functions of the invention as described herein.
  • the terms “memory,” “machine readable medium,” “computer program medium” and “computer usable medium” are used to generally refer to media such as a random access memory (RAM); a read only memory (ROM); a removable storage unit (e.g., a magnetic or optical disc, flash memory device, or the like); a hard disk; or the like.
  • FIG. 1 illustrates an example graphical user interface provided in accordance with one or more implementations of the present disclosure.
  • an antero-posterior (“AP”) radiograph of a pelvis is shown, with three lines (Line 1, Line 2, and Line 3) drawn thereon.
  • Line 1 has been drawn between the inferior aspect of the sacroiliac joints.
  • Line 2 is drawn between the inferior aspect of the acetabular teardrops.
  • Line 3 is drawn between the two most inferior aspects of the ischium, and can be adjusted for significant bony abnormalities, such as the presence of excessive bone spurs.
  • the distances between the midpoint of these lines are recorded on each AP pelvis image and used to contribute to measuring change in pelvic sagittal tilt between images, such as two AP pelvic radiographs.
  • Distance W Preop represents the distance between the point on Line 2 which corresponds to the center of the pubic symphysis and a point directed superiorly at a 90-degree angle from this line to intersect Line 1 on the preoperative AP Pelvis Image.
  • Distance V Preop represents the distance between the point on Line 2 which corresponds to the center of the pubic symphysis and a point directed inferiorly at a 90-degree angle from this line to intersect Line 3 on the preoperative AP Pelvis Image.
  • Distance U Preop represents the sum of Distance W Preop and Distance V Preop.
  • Distance W Intraop represents the distance between the point on Line 2 which corresponds to the center of the pubic symphysis and a point directed superiorly at a 90-degree angle from this line to intersect Line 1 on the intraoperative AP Pelvis Image.
  • Distance V Intraop represents the distance between the point on Line 2 which corresponds to the center of the pubic symphysis and a point directed inferiorly at a 90-degree angle from this line to intersect Line 3 on an intraoperative AP Pelvis Image.
  • Distance U Intraop represents the sum of Distance W Intraop and Distance V Intraop.
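The Distance W, V, and U measurements above can be sketched in code. The landmark container, coordinate values, and the use of a scale-free W/U ratio for comparing images of different magnification are all illustrative assumptions; the disclosure itself only specifies that W, V, and U contribute to measuring change in pelvic sagittal tilt between images.

```python
from dataclasses import dataclass

@dataclass
class APPelvisLandmarks:
    """Hypothetical container for the calibrated y-coordinates (mm) where
    a vertical line through the pubic-symphysis point on Line 2
    intersects Lines 1, 2, and 3 of an AP pelvis image."""
    y_line1: float  # inferior aspect of the sacroiliac joints
    y_line2: float  # inferior aspect of the acetabular teardrops
    y_line3: float  # two most inferior aspects of the ischium

def distances_wvu(lm: APPelvisLandmarks) -> tuple[float, float, float]:
    """Distance W (Line 2 up to Line 1), Distance V (Line 2 down to
    Line 3), and Distance U = W + V, as described for FIG. 1."""
    w = abs(lm.y_line2 - lm.y_line1)
    v = abs(lm.y_line3 - lm.y_line2)
    return w, v, w + v

# Invented example coordinates for a preoperative and an intraoperative image.
preop = APPelvisLandmarks(y_line1=40.0, y_line2=130.0, y_line3=175.0)
intraop = APPelvisLandmarks(y_line1=55.0, y_line2=130.0, y_line3=172.0)

w_pre, v_pre, u_pre = distances_wvu(preop)
w_intra, v_intra, u_intra = distances_wvu(intraop)

# Comparing the W/U ratio across the two images is one scale-free way to
# isolate a change in sagittal tilt from differences in magnification.
print(w_pre / u_pre, w_intra / u_intra)
```

The ratio comparison is a design choice of this sketch, not a formula stated in the disclosure.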
  • FIGS. 2A-2B illustrate steps in a methodology to assess pelvic axial rotation, in accordance with an implementation of the present disclosure.
  • In FIG. 2A, for example, an AP pelvis radiograph is obtained, and Line 1 and Line 2 are drawn, as shown and described with reference to FIG. 1.
  • a line extending from Line 1 to Line 2 is drawn.
  • the apex on Line 1 corresponds to the intersection of Line 1 and the center of the sacrococcygeal spine. This line is extended inferiorly in a perpendicular fashion to Line 2.
  • FIG. 2B illustrates another AP pelvis radiograph and shows Line 1 and Line 2 in accordance with one or more implementations of the present disclosure.
  • FIGS. 3A, 3B and 3C illustrate example lateral pelvic radiographs.
  • FIG. 3A illustrates three points on the radiograph: the anterior aspect of the pubic symphysis, the inferior aspect of the ischium, and the point where the sacrum meets the posterior ilium.
  • FIG. 3B illustrates how the image is calibrated using measurements from the calibrated AP Pelvis Image (FIG. 1).
  • FIG. 3C illustrates how the distances (Distance A, B, C) between these points are measured and used as a component of a measurement for pelvic axial rotation, as well as of change in pelvic sagittal tilt between radiographs.
  • changes in pelvic sagittal tilt on an intraoperative or postoperative image in relationship to the pelvic tilt on the preoperative image can be calculated in the following fashion. This change is calculated based on the measurements on the preoperative lateral pelvis image (FIG. 3A).
  • the measurement of the acetabular component during total hip arthroplasty can be corrected, as can measurements for femoral offset and leg length when inserting the femoral component while using x-ray imaging (fluoroscopy).
  • Axial rotation of the pelvis to the opposite side increases anteversion and decreases inclination of the acetabular component on radiographs.
  • increased forward sagittal tilt decreases acetabular component inclination and anteversion
  • increased backward sagittal tilt (pelvic roll back) increases acetabular component inclination and anteversion.
  • Changes in pelvic axial rotation also impact the measurement of leg length and offset for both hips. The changes can be calculated based on changes in pelvic sagittal tilt and axial rotation and used for measurement corrections.
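The direction of the corrections described above can be illustrated with a minimal linear sketch. The function name and all coefficients below are hypothetical placeholders, not values from the disclosure; only the sign conventions follow the text (increased forward sagittal tilt decreases measured inclination and anteversion; axial rotation to the opposite side increases measured anteversion and decreases measured inclination).

```python
def corrected_cup_angles(meas_inclination: float, meas_anteversion: float,
                         tilt_forward_deg: float, rotation_deg: float,
                         k_tilt_incl: float = 0.2, k_tilt_antev: float = 0.7,
                         k_rot_incl: float = 0.3, k_rot_antev: float = 0.5):
    """Apply hypothetical linear corrections to radiographically measured
    acetabular-cup angles. Forward tilt reduces the measured angles, so
    the correction adds them back; rotation toward the opposite side
    inflates measured anteversion and deflates measured inclination, so
    the correction does the reverse. Coefficients are illustrative only."""
    incl = meas_inclination + k_tilt_incl * tilt_forward_deg + k_rot_incl * rotation_deg
    antev = meas_anteversion + k_tilt_antev * tilt_forward_deg - k_rot_antev * rotation_deg
    return incl, antev
```

A linear model is the simplest choice consistent with the qualitative relationships stated; the disclosure does not commit to a functional form.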
  • FIG. 4 is a diagrammatic representation of the three points on a lateral radiograph, such as shown and described in connection with FIG. 3.
  • FIG. 4 illustrates how the measured distance between the three points, and the various distances previously obtained on an AP pelvis radiograph, can be used to calculate changes in sagittal pelvic tilt between two radiographs. For example, changes between calculated angles x, y, and z (FIG. 4) can be used to determine changes in sagittal pelvic position between AP radiographs.
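One way the angles x, y, and z could be recovered from Distances A, B, and C is with the law of cosines, since the three lateral landmarks form a triangle. This is a sketch under that assumption; the disclosure does not specify the exact trigonometric construction, and the distance values below are invented.

```python
import math

def triangle_angles(a: float, b: float, c: float) -> tuple[float, float, float]:
    """Return the angles (degrees) opposite sides a, b, and c of a
    triangle, computed via the law of cosines."""
    ang_a = math.degrees(math.acos((b * b + c * c - a * a) / (2 * b * c)))
    ang_b = math.degrees(math.acos((a * a + c * c - b * b) / (2 * a * c)))
    ang_c = 180.0 - ang_a - ang_b
    return ang_a, ang_b, ang_c

# Hypothetical calibrated distances (mm) between the three lateral
# landmarks (Distance A, Distance B, Distance C).
x_deg, y_deg, z_deg = triangle_angles(60.0, 150.0, 170.0)
```

Comparing such angles between preoperative and intraoperative images would then give the change in sagittal pelvic position described in the text.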
  • an artificial intelligence image recognition algorithm can be used to recognize certain anatomic landmarks to facilitate measurements of angles, distances, or surface areas on a radiograph or fluoroscopic/C-arm image (FIGS. 5A to 5E).
  • one or more computing devices can be configured to identify anatomic landmarks that are usable to determine, among other distances, Distance 1 and Distance 2, Distance 3 and Distance 4, Distance 5 and Distance 6, Distance 7 and Distance 8.
  • the identified anatomic landmarks are usable to determine, among other angles and areas, Angle 1 and Angle 2, Area 1, Area 2 and Area 3.
  • Such distances, angles, and areas can be based on, for example, pixel recognition within the image. Markers can be positioned substantially automatically within an image, according to such recognition.
  • Machine learning can include corrections made by a human expert who can correct the position of one or more markers in order to increase the accuracy of one or more measurements and/or placements. Over time, machine learning provides for improvements and corrections, thereby increasing the accuracy of image recognition, measurements, and marker placements in connection with an image recognition algorithm. Accuracy improves until being within the range of a human expert. Accordingly, the present disclosure can provide for completely automatic and independent recognition of a patient’s anatomic landmarks, which are usable to measure the variables shown and described herein.
  • artificial intelligence is used in connection with guided image recognition, including by using anatomic landmarks that are present in an image and automatically recognized.
  • a pelvis AP radiograph is analyzed for locating a specific point, such as the pubic symphysis.
  • a number of AP Pelvis images can be submitted for training, for example, by using tools provided by a GUI that are usable for marking the location of the symphysis using a rectangle with one of its corners being the location of the symphysis. For example, a rectangle is selected to define a set of unique (or close thereto) pixels to minimize a likelihood of error.
  • Rectangles can be drawn on all the training AP Pelvis images and exported as a dataset.
  • Create ML can be used for modeling, and a respective input file, such as one compatible with Create ML, is generated and/or provided.
  • Create ML provides a visual interface that is usable for training models using the Core ML framework.
  • Models are usable in connection with the present disclosure to accomplish a wide variety of tasks that would otherwise be difficult or impractical to write in programming code. Accordingly, a model can be trained to categorize images or to perform other tasks, such as detecting specific anatomic landmarks (e.g., the symphysis) within an image (e.g., an AP Pelvis radiograph) as a function of pixels.
  • an iOS application executes on a computing device running iOS, e.g., an iPad.
  • Certain parameters are utilized in Create ML to optimize the training process for a particular case, including based on the number of images that are used, how recognizable the pixels are, the image colors, or other variables.
  • Core ML is usable and optimized for hardware running iOS and provides a smooth and desirable user experience.
  • Machine learning processes include executing an algorithm for a set of training images to create a model. Specific input is provided to identify a set of pixels (e.g., a rectangle selection) and to identify characteristics for training.
  • the training can be an iterative process, in which the model tries to predict the location of the rectangle, including by comparing information that is determined from the selection with one or more values provided as an input.
  • one or more entries can be used for teaching the model how to get closer to the desired output.
  • the model can be usable for automatic anatomical landmark detection and for predictive capabilities in connection with processing new input data.
  • the model can analyze newly input AP pelvis images, for example, provided by the user, and the model reports a set of coordinates for the anatomic landmark (e.g., the symphysis) based on what has been learned. Coordinates can be generated and, thereafter, used via a graphical user interface, for example, to draw a line or an ellipse based on what the software needs, and to calculate angles and measure distances using basic equations, via one or more interfaces provided on a computing device, such as a mobile app running on a mobile device (e.g., an iPad).
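The rectangle-based annotations described for training could be exported in the JSON format accepted by Create ML's object-detection template, an array of per-image records whose `coordinates` give the box center, width, and height. The label name, fixed rectangle size, and corner-to-center convention in this sketch are assumptions for illustration only.

```python
import json

BOX = 64  # hypothetical rectangle size, in pixels

def createml_annotations(marks):
    """Convert (filename, corner_x, corner_y) marks, where one corner of
    a fixed-size rectangle is the marked symphysis location, into a
    Create ML object-detection annotation JSON string."""
    records = []
    for filename, cx, cy in marks:
        records.append({
            "image": filename,
            "annotations": [{
                "label": "pubic_symphysis",
                # Create ML's object-detection format stores the box
                # center, so shift from the marked corner by half a side.
                "coordinates": {"x": cx + BOX / 2, "y": cy + BOX / 2,
                                "width": BOX, "height": BOX},
            }],
        })
    return json.dumps(records, indent=2)
```

Writing the returned string to, e.g., `annotations.json` alongside the training images would produce a dataset of the kind the text describes exporting.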
  • the present disclosure provides for machine learning, which includes applying a plurality of images, depending on the respective implementation, to train an established, sound statistical or artificial-intelligence correlation for variations represented across a plurality of images, as well as for measurements vis-a-vis a single image.
  • the present disclosure includes one or more computing devices specially configured to recognize respective anatomical landmarks automatically and accurately, and to apply one or more respective calculations, such as shown and described herein, to predict an amount of axial rotation, sagittal drift, or other respective change or condition.
  • Distance 1 and Distance 2, Distance 3 and Distance 4, Distance 5 and Distance 6, and/or Distance 7 and Distance 8 can be measured and used to predict and correct for axial rotation of the pelvis. Further, such configured computing device(s) can predict and correct for an amount of rotation based on a difference between Angle 1 and Angle 2, and/or areas, such as Area 1 and Area 2 (e.g., FIGS. 5A-5E).
  • the present disclosure provides for significantly improved accuracy, including as a function of calculated measurements on a lateral image of the pelvis (e.g., FIGS. 3A-3C and FIG. 4). Accordingly, the present disclosure can eliminate a need to determine axial rotation of an AP Pelvis Radiograph or C-arm/fluoroscopy image.
  • Changes in pelvic position between successive AP Pelvis radiographs and/or fluoroscopy/C-arm images can result in changes in Distance W, V, and U (e.g., FIG. 1), Distance 1 and Distance 2, Distance 3 and Distance 4, and Distance 5 and Distance 6 (e.g., FIGS. 5A-5E). Furthermore, changes result in Angle 1 and Angle 2, and in Area 1 and Area 2 in relationship to Area 3 (e.g., FIGS. 5A-5E), between measurements from one image to the next (e.g., in successive images). For example, such changes can be determined by comparing landmarks identified in a preoperative AP radiograph and an intraoperative fluoroscopy/C-arm image of the same patient. These differences can then be compared and correlated to calculate changes in pelvic tilt and to adjust for implant placement.
  • one or more computing devices configured in accordance with the teachings herein can predict changes in sagittal pelvic tilt from one image to the next (successive images of the same patient), including based on changes represented as a function of Distance W, V and U (e.g., FIG. 1), and Distance 1 and Distance 2, Distance 3 and Distance 4, Distance 5 and Distance 6 (e.g., FIGS. 5A-5E), as well as changes in Angle 1 and Angle 2, and Area 1 and Area 2 in relationship to Area 3 (e.g., FIGS. 5A-5E).
  • the measurements on a lateral image of the pelvis may no longer be required to determine changes in sagittal pelvic tilt across successive AP Pelvis radiographs and/or C-arm/fluoroscopy images of the same patient. Instead, changes in the variables, such as described herein, can be sufficient to predict the change in sagittal pelvic tilt.
  • the amount of axial rotation in degrees and the change in sagittal pelvic tilt in degrees can be correlated to changes in Distance 1, Distance 2, Distance 3, Distance 4, Distance 5, Distance 6, Distance 7, and/or Distance 8, and/or Angle 1, Angle 2 (FIGS. 5A-5E), and/or changes in surface areas including but not limited to Area 1, Area 2 and Area 3, as displayed in FIGS. 5A-5E.
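A minimal sketch of such a correlation is an ordinary least-squares fit from a change in one measured variable to degrees of tilt change. The choice of the W/U ratio as the predictor and the calibration pairs below are invented for illustration; the disclosure claims a correlation but does not specify its functional form.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration data: change in the W/U ratio between two AP
# images, paired with the known change in sagittal tilt (degrees).
ratio_change = [-0.04, -0.02, 0.0, 0.02, 0.05]
tilt_deg = [-8.0, -4.0, 0.0, 4.0, 10.0]

slope, intercept = fit_line(ratio_change, tilt_deg)

def predict_tilt_change(delta_ratio: float) -> float:
    """Predict degrees of sagittal tilt change from a measured change in
    the (hypothetical) W/U ratio, using the fitted line."""
    return slope * delta_ratio + intercept
```

A machine-learned model as described in the text would replace this single-variable fit with a multivariate one over the distances, angles, and areas named above.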
  • One or more specially configured computing devices can execute algorithms that include artificial intelligence and/or machine learning to detect changes in distances, angles, and two-dimensional bone surface areas, or in the overall “appearance” of the AP pelvis image, which can be used to predict axial rotation or changes in sagittal pelvic tilt.
  • the methodology shown and described herein utilizes specific radiographic measurements on antero-posterior (AP) and lateral pelvis radiographs to determine changes in pelvic position in three dimensions. This is usable by surgeons to preoperatively plan or, alternatively (or in addition), to intraoperatively assess changes in pelvic position between pelvic radiographs.


Abstract

A system and method according to the invention provide image-guided implant placement as a function of at least one intraoperative image during a surgical procedure. At least one computing device is configured, by executing code stored in non-transitory processor-readable media, to process at least one preoperative image to assess axial rotation and/or sagittal pelvic inclination. Further, as a function of a plurality of anatomical landmarks identified in the at least one preoperative image, at least one of distances, angles, and areas is measured. Thereafter, as a function of calculations associated with the at least one of the distances, angles, and areas, an axial rotation associated with at least one image is measured. Thereafter, at least one value associated with placement of an implant during the surgical procedure is adjusted, and information associated therewith is provided via a graphical user interface.
PCT/US2022/033270 2021-06-11 2022-06-13 Adjustment system and method for patient position intraoperatively using radiographic measurements WO2022261548A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CA3222041A CA3222041A1 (fr) 2021-06-11 2022-06-13 Adjustment system and method for patient position intraoperatively using radiographic measurements
CN202280055429.3A CN117813060A (zh) 2021-06-11 2022-06-13 术中使用射线照相测量患者位置的调整系统和方法
JP2023576442A JP2024523863A (ja) 2021-06-11 2022-06-13 X線画像計測を術中に使用する患者位置のための調整システム及び方法
EP22821195.9A EP4351446A1 (fr) 2021-06-11 2022-06-13 Adjustment system and method for patient position intraoperatively using radiographic measurements
AU2022290870A AU2022290870A1 (en) 2021-06-11 2022-06-13 Adjustment system and method for patient position intraoperatively using radiographic measurements

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163209656P 2021-06-11 2021-06-11
US63/209,656 2021-06-11
US202163279481P 2021-11-15 2021-11-15
US63/279,481 2021-11-15

Publications (1)

Publication Number Publication Date
WO2022261548A1 (fr) 2022-12-15

Family

Family ID: 84425373

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/033270 WO2022261548A1 (fr) 2021-06-11 2022-06-13 Adjustment system and method for patient position intraoperatively using radiographic measurements

Country Status (5)

Country Link
EP (1) EP4351446A1 (fr)
JP (1) JP2024523863A (fr)
AU (1) AU2022290870A1 (fr)
CA (1) CA3222041A1 (fr)
WO (1) WO2022261548A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6567681B1 (en) * 1998-04-01 2003-05-20 Medical Robotics I Stockholm Ab Method and arrangement for determining where to position fixation means
US20050251113A1 (en) * 2000-11-17 2005-11-10 Kienzle Thomas C Iii Computer assisted intramedullary rod surgery system with enhanced features
US20110249875A1 (en) * 2004-11-10 2011-10-13 Agfa Healthcare Method of performing measurements on digital images
US20150119966A1 (en) * 2013-10-31 2015-04-30 Pacesetter, Inc. Method and system for characterizing stimulus sites and providing implant guidance
US20200352529A1 (en) * 2014-02-25 2020-11-12 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Inaba, Yutaka; Kobayashi, Naomi; Suzuki, Haruka; Ike, Hiroyuki; Kubota, So; Saito, Tomoyuki: "Preoperative planning for implant placement with consideration of pelvic tilt in total hip arthroplasty: postoperative efficacy evaluation", BMC Musculoskeletal Disorders, vol. 17, no. 1, 1 December 2016, XP055806920, DOI: 10.1186/s12891-016-1120-x *

Also Published As

Publication number Publication date
JP2024523863A (ja) 2024-07-02
CA3222041A1 (fr) 2022-12-15
AU2022290870A1 (en) 2024-01-04
EP4351446A1 (fr) 2024-04-17

Similar Documents

Publication Publication Date Title
JP7203148B2 (ja) Systems and methods for intra-operative image analysis
US10991070B2 (en) Method of providing surgical guidance
US11241287B2 (en) Fluoroscopy-based measurement and processing system and method
US20240245468A1 (en) Adjustment system and method for patient position intraoperatively using radiographic measurements
Korez et al. A deep learning tool for fully automated measurements of sagittal spinopelvic balance from X-ray images: performance evaluation
AU2022200996B2 (en) Systems and methods for intra-operative image analysis
US11883219B2 (en) Artificial intelligence intra-operative surgical guidance system and method of use
US20230368922A1 (en) System and method for analyzing acetabular cup position
US20230105822A1 (en) Intraoperative guidance systems and methods
AU2022290870A1 (en) Adjustment system and method for patient position intraoperatively using radiographic measurements
US11887306B2 (en) System and method for intraoperatively determining image alignment
US20230109015A1 (en) Surgical impactor navigation systems and methods
US20230108487A1 (en) Intraoperative localisation systems and methods
CN117813060A (zh) Adjustment system and method for patient position intraoperatively using radiographic measurements

Legal Events

Code Description
121 Ep: the epo has been informed by wipo that ep was designated in this application; ref document number: 22821195; country of ref document: EP; kind code of ref document: A1
ENP Entry into the national phase; ref document number: 2023576442; country of ref document: JP; kind code of ref document: A
WWE Wipo information: entry into national phase; ref document number: 3222041; country of ref document: CA
WWE Wipo information: entry into national phase; ref document numbers: 2022290870 and AU2022290870; country of ref document: AU
WWE Wipo information: entry into national phase; ref document number: 202327088135; country of ref document: IN
ENP Entry into the national phase; ref document number: 2022290870; country of ref document: AU; date of ref document: 20220613; kind code of ref document: A
WWE Wipo information: entry into national phase; ref document number: 2022821195; country of ref document: EP
NENP Non-entry into the national phase; ref country code: DE
WWE Wipo information: entry into national phase; ref document number: 202280055429.3; country of ref document: CN
ENP Entry into the national phase; ref document number: 2022821195; country of ref document: EP; effective date: 20240111