AU2022235552A1 - A computer program, apparatus, and computer-implemented method of pre-operative planning for patient surgery - Google Patents


Info

Publication number
AU2022235552A1
Authority
AU
Australia
Prior art keywords
patient
personalised
operative
model
anatomical structure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2022235552A
Inventor
David BADE
Martina BARZAN
Chris CARTY
David Lloyd
Derek Smith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Griffith University
Original Assignee
Griffith University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Griffith University filed Critical Griffith University
Priority to AU2022235552A priority Critical patent/AU2022235552A1/en
Priority to PCT/AU2023/050905 priority patent/WO2024059902A1/en
Publication of AU2022235552A1 publication Critical patent/AU2022235552A1/en
Pending legal-status Critical Current


Classifications

    • A61B 5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114 Tracking parts of the body
    • A61B 17/151 Guides for corrective osteotomy
    • A61B 17/152 Guides for corrective osteotomy for removing a wedge-shaped piece of bone
    • A61B 17/1703 Guides or aligning means for drills, mills, pins or wires using imaging means, e.g. by X-rays
    • A61B 17/56 Surgical instruments or methods for treatment of bones or joints; devices specially adapted therefor
    • A61B 17/8095 Wedge osteotomy devices
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/389 Electromyography [EMG]
    • A61F 2/30 Joints (prostheses implantable into the body)
    • G16H 10/60 ICT specially adapted for the handling or processing of patient-specific data, e.g. for electronic patient records
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 30/40 ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/60 ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 50/50 ICT specially adapted for simulation or modelling of medical disorders
    • A61B 17/1728 Guides or aligning means for holes for bone plates or plate screws
    • A61B 17/8866 Osteosynthesis instruments for gripping or pushing bones, e.g. approximators
    • A61B 2017/00526 Methods of manufacturing
    • A61B 2017/565 Methods for bone or joint treatment for surgical correction of axial deviation, e.g. hallux valgus or genu valgus
    • A61B 2017/568 Surgical instruments or devices produced with shape and dimensions specific for an individual patient
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B 2034/104 Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A61B 2034/252 User interfaces for surgical systems indicating steps of a surgical procedure
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 2576/00 Medical imaging apparatus involving image processing or analysis
    • B33Y 80/00 Products made by additive manufacturing

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Databases & Information Systems (AREA)
  • Urology & Nephrology (AREA)
  • Physiology (AREA)
  • Robotics (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Neurology (AREA)
  • Vascular Medicine (AREA)
  • Transplantation (AREA)
  • Cardiology (AREA)
  • Prostheses (AREA)
  • Surgical Instruments (AREA)

Abstract

Embodiments include a computer-implemented method of pre-operative planning for patient surgery, the method including: generating a personalised pre-operative software model based on received pre-operative medical imagery and pre-operative patient movement analysis data related to movement of the patient anatomical structure of the individual patient; receiving by the at least one computer processor a software definition of a surgical procedure to be performed in relation to the personalised pre-operative software model of the patient anatomical structure; generating by the at least one computer processor a modified personalised post-operative software model of the patient anatomical structure; simulating by the at least one computer processor movement of the patient anatomical structure according to the modified personalised post-operative software model to generate simulation output; allowing by the at least one computer processor adjustment of the personalised post-operative software model based on the simulation output; generating by the at least one computer processor a personalised surgical cutting guide model; and generating by the at least one computer processor at least one of: 3D printing instructions based on the personalised surgical cutting guide model, or personalised surgical operation instructions to transmit to a robotic surgical system for use in performing the surgical procedure.
[Fig. 2 (sheet 2/10): flowchart of method 200. Pre-operative medical imagery 116 and pre-operative movement analysis data feed into: generate personalised pre-operative software model of the patient's anatomical structure (S202); receive software definition of surgical procedure (S204); generate personalised post-operative software model of the patient's anatomical structure (S206); simulate movement of the bones and muscles of the patient's anatomical structure to generate simulation output (S208); allow adjustment of the personalised post-operative software model based on simulation output (S210); generate personalised cutting guide model (S212); generate 3D printing instructions (S214) or generate personalised surgical operation instructions (S216), yielding 3D model printing instructions 120 and surgical operation instructions 122.]

Description

"A computer program, apparatus, and computer-implemented method of pre-operative planning for patient surgery"
Technical Field
[0001] Embodiments generally relate to methods, systems, and devices for modelling and simulation of orthopaedic surgery. In particular, embodiments relate to methods, systems, and devices for patient physiology modelling and surgical simulation for orthopaedic surgeries.
Background
[0002] Orthopaedic surgeons are increasingly relying on virtual surgery planning technologies to aid their clinical decision making. However, existing technologies are based on fitting three-dimensional (3D) images of a deformed bone to the 3D image of an idealised bone, with no consideration of functional consequences to a patient, i.e., muscular action, and local and whole-body movement. As a result, the patient's functional capacity might not improve after undergoing corrective surgery.
[0003] It is desired to address or ameliorate one or more shortcomings or disadvantages of prior virtual surgery planning technologies, or to at least provide a useful alternative thereto.
[0004] Any discussion of documents, acts, materials, devices, articles or the like which has been included in the present specification is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present disclosure as it existed before the priority date of each claim of this application.
[0005] Throughout this specification the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.
Summary
[0006] Embodiments include a computer-implemented method of pre-operative planning for patient surgery, the method including: generating by at least one computer processor a personalised pre-operative software model of patient anatomical structure of an individual patient based on received pre-operative medical imagery of the patient anatomical structure of the individual patient and pre-operative patient movement analysis data related to movement of the patient anatomical structure of the individual patient, the patient movement analysis data being generated at least in part by a movement sensor system observing movement of the patient anatomical structure of the individual patient over a period of time, wherein the personalised pre-operative software model of the patient includes bone position and dimension data of bones in the patient anatomical structure of the individual patient, muscle position and dimension data of muscles in the patient anatomical structure of the individual patient, and relationship definition data defining relationships between bones and muscles in the patient anatomical structure of the individual patient; receiving by the at least one computer processor a software definition of a surgical procedure to be performed in relation to the personalised pre-operative software model of the patient anatomical structure of the individual patient; generating by the at least one computer processor a modified personalised post-operative software model of the patient anatomical structure of the individual patient based on the personalised pre-operative software model and the software definition of the surgical procedure; simulating by the at least one computer processor movement of the bones and muscles of the patient anatomical structure of the individual patient according to the modified personalised post-operative software model to generate simulation output; allowing by the at least one computer processor adjustment of the personalised post-operative software model based on the simulation output; generating by the at least one computer processor a personalised surgical cutting guide model to facilitate the surgical procedure based on the personalised post-operative software model or the adjusted personalised post-operative software model; and generating by the at least one computer processor at least one of: 3D printing instructions based on the personalised surgical cutting guide model to transmit to a 3D printer to form a surgical cutting guide; or personalised surgical operation instructions to transmit to a robotic surgical system for use in performing the surgical procedure.
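By way of non-limiting illustration, the sequence of claimed steps can be outlined in Python; every function name, field, and the toy "simulation" below are hypothetical stand-ins for the personalised models described above, not the claimed implementation:

```python
# Illustrative outline of the claimed planning pipeline (S202-S216).
# All names and data shapes are assumptions for the sketch.
from dataclasses import dataclass


@dataclass
class PatientModel:
    bones: dict            # bone position and dimension data
    muscles: dict          # muscle position and dimension data
    relationships: dict    # bone-muscle relationship definitions


def generate_pre_operative_model(imagery, movement_data):
    # S202: fuse medical imagery with movement-analysis data
    return PatientModel(bones=dict(imagery["bones"]),
                        muscles=dict(imagery["muscles"]),
                        relationships=dict(movement_data["relationships"]))


def apply_procedure(model, procedure):
    # S204-S206: apply the software-defined procedure to obtain the
    # post-operative model (here a copy annotated with the planned cuts)
    post = PatientModel(dict(model.bones), dict(model.muscles),
                        dict(model.relationships))
    post.bones["planned_cuts"] = procedure["osteotomies"]
    return post


def simulate(post_model):
    # S208: movement simulation producing a summary output (toy version)
    return {"n_cuts": len(post_model.bones.get("planned_cuts", []))}


def plan(imagery, movement_data, procedure):
    pre = generate_pre_operative_model(imagery, movement_data)
    post = apply_procedure(pre, procedure)
    sim = simulate(post)                       # S208 simulation output
    guide = {"slots": sim["n_cuts"]}           # S212 cutting-guide model
    return {"printing_instructions": guide,    # S214 / S216 outputs
            "surgical_instructions": post.bones["planned_cuts"]}
```

The adjustment step (S210) is omitted here; in the claimed method the simulation output feeds back into the post-operative model before the cutting guide is generated.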
[0007] Optionally, the surgery is one from among: an osteotomy; a femoral osteotomy; a proximal femoral osteotomy; a tibial osteotomy; a high tibial osteotomy.
[0008] Optionally, the surgical procedure includes one or more osteotomies, and the simulation output comprises the surgical cutting guide model, the surgical cutting guide model defining one or more osteotomy planes in which to cut a bone or bones of the patient anatomical structure to facilitate reconfiguration of the patient anatomical structure in accordance with the modified software model or the adjusted modified software model.
[0009] Optionally, the surgical procedure includes one or more osteotomies, and the simulation output comprises an implant configured to secure a first portion of a bone cut by the one or more osteotomies to a second, separate, portion of the same bone cut by the one or more osteotomies, in a configuration determined in accordance with the modified software model or the adjusted modified software model.
[0010] Optionally, the software definition of the surgical procedure defines a value range for each of the following parameters: a number of osteotomies in each of one or more specified bones; for each osteotomy, a specific bone from the patient anatomical structure to be cut by the osteotomy; for each osteotomy, a position and orientation of an osteotomy plane; for each osteotomy, a relative position and orientation of the two or more post-osteotomy distinct bone portions; wherein the simulation iteratively constrains the software definition of the surgical procedure to include a patient-specific osteotomy plan by determining a specific value from within the value range for each of the parameters, and wherein the surgical cutting guide model and the implant configuration implement the specific values for the parameters.
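The iterative constraining of parameter value ranges described in [0010] can be illustrated, purely as a sketch, by an exhaustive search over discretised ranges with an invented surrogate cost; the parameter names and cost function are hypothetical, not taken from the disclosure:

```python
# Toy sketch of constraining a value range per parameter to a single
# patient-specific value: enumerate candidate combinations and keep the
# lowest-cost one. A real planner would use simulation output as the cost.
import itertools


def constrain(ranges, cost):
    """ranges: {name: list of candidate values}; returns lowest-cost combo."""
    names = list(ranges)
    best = min(itertools.product(*(ranges[n] for n in names)),
               key=lambda combo: cost(dict(zip(names, combo))))
    return dict(zip(names, best))


ranges = {"n_osteotomies": [1, 2],
          "plane_angle_deg": [5, 10, 15],
          "wedge_mm": [4, 6, 8]}

# Hypothetical cost: deviation from a 10-degree target correction,
# penalising extra osteotomies.
plan = constrain(ranges, lambda p: abs(p["plane_angle_deg"] - 10)
                                   + p["n_osteotomies"])
# plan["plane_angle_deg"] == 10 and plan["n_osteotomies"] == 1
```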
[0011] Optionally, the software definition of the surgical procedure defines a value range for each of the following parameters: a repositioning of one or more specified bones within an allowable range; a range of available implants with corresponding chisels; wherein the simulation iteratively constrains the software definition of the surgical procedure to include a patient-specific implant plan by determining one or more repositionings and, for each determined repositioning, a selected implant and chisel from the available range.
[0012] Optionally, the surgical cutting guide model implements the specific values for one or more osteotomies from the patient-specific osteotomy plan, wherein the surgical cutting guide model is generated by, for each osteotomy: defining a mask portion of the surgical cutting guide model configured to conform to one or more curves or other geometric features of the bone position and dimension data of bones in the patient anatomical structure of the individual patient.
[0013] Optionally, the surgical cutting guide model is further generated by, for each osteotomy: defining an osteotomy slot portion of the surgical cutting guide model being an aperture in the mask portion positioned and orientated according to the specific values for the osteotomy; defining an osteotomy saw blade insertion profile and extruding the surgical cutting guide model according to the saw blade insertion profile to a predefined distance proximally and distally of the defined slot portion.
[0014] Optionally, the surgical cutting guide model is further generated by, for each osteotomy: defining an osteotomy chisel insertion as a location on the defined mask portion and a direction relative to the mask, and extruding the surgical cutting guide model according to the location and direction by a predefined distance distally from the osteotomy chisel insertion location.
[0015] Optionally, the surgical cutting guide model is further generated by, for each osteotomy: defining one or more implant fixation slots as locations on the defined mask portion based on a shaft surface of the implant, and extruding the surgical cutting guide model at the one or more defined implant fixation slots by a predefined distance distally.
[0016] Optionally, the defined osteotomy chisel insertion further comprises a hole in the surgical cutting guide model configured to removably receive a guide wire and a guide wire seat, and wherein the surgical cutting guide model is extruded distally around the hole to define the guide wire seat, the guide wire seat being configured for insertion into the hole at one end and to longitudinally receive the guide wire.
[0017] Optionally, the surgical cutting guide model is converted to 3D printing instructions, and the method further comprises 3D printing the surgical cutting guide from the 3D printing instructions.
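One possible concrete form for such 3D printing instructions is an STL file; the minimal ASCII STL writer below is an illustrative assumption, since the disclosure does not mandate any particular file format:

```python
# Sketch: serialising a triangulated cutting-guide surface as ASCII STL,
# a format widely accepted by 3D-printing toolchains.
import math


def write_ascii_stl(path, triangles, name="cutting_guide"):
    """triangles: list of facets, each three (x, y, z) vertex tuples."""
    def normal(a, b, c):
        # unit normal of the facet via the cross product of two edges
        ux, uy, uz = (b[i] - a[i] for i in range(3))
        vx, vy, vz = (c[i] - a[i] for i in range(3))
        n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
        length = math.sqrt(sum(x * x for x in n)) or 1.0
        return tuple(x / length for x in n)

    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for a, b, c in triangles:
            nx, ny, nz = normal(a, b, c)
            f.write(f"  facet normal {nx} {ny} {nz}\n    outer loop\n")
            for v in (a, b, c):
                f.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")
```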
[0018] Optionally, the received pre-operative medical imagery of the patient anatomical structure of the individual patient is obtained by one or more from among: anthropometric data acquisition; attachment of MRI-compatible markers to the individual patient and MRI scanning thereof, and analysis of the MRI scanning to obtain MRI scans of the patient anatomical structure of the individual patient; placement of electromyography (EMG) units on the skin of the individual patient and measurement and analysis of EMG signals generated by the EMG units.
[0019] Optionally, generating the personalised pre-operative software model of the patient includes: generating at least one MRI-reconstructed portion by one or more from among: obtaining the bone position and dimension data of bones in the patient anatomical structure of the individual patient by executing a segmentation process on the MRI scans; obtaining the muscle position and dimension data of muscles in the patient anatomical structure of the individual patient by executing a segmentation process on the MRI scans; obtaining position and dimension data of a growth plate in one or more bones, by executing a segmentation process on the MRI scans.
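A toy stand-in for the claimed segmentation process (real pipelines typically use trained models or semi-automatic tools): threshold the intensity volume and keep the largest 6-connected component as the structure mask. The intensity bounds and connectivity are illustrative assumptions:

```python
# Crude segmentation sketch: intensity thresholding plus largest
# connected component, using only numpy and the standard library.
from collections import deque
import numpy as np


def segment_bone(volume, lo, hi):
    """Return a boolean mask of the largest 6-connected region whose
    voxel intensities fall in [lo, hi]."""
    mask = (volume >= lo) & (volume <= hi)
    seen = np.zeros_like(mask, dtype=bool)
    best = np.zeros_like(mask, dtype=bool)
    for start in zip(*np.nonzero(mask)):
        if seen[start]:
            continue
        comp, q = [], deque([start])
        seen[start] = True
        while q:                                  # breadth-first flood fill
            z, y, x = q.popleft()
            comp.append((z, y, x))
            for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (z + dz, y + dy, x + dx)
                if all(0 <= n[i] < mask.shape[i] for i in range(3)) \
                        and mask[n] and not seen[n]:
                    seen[n] = True
                    q.append(n)
        if len(comp) > best.sum():                # keep the largest component
            cur = np.zeros_like(mask, dtype=bool)
            cur[tuple(np.array(comp).T)] = True
            best = cur
    return best
```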
[0020] Optionally, generating the personalised pre-operative software model of the patient includes: generating at least one CT-reconstructed portion by imaging the patient anatomical structure of the individual patient by a computerized tomography scan to obtain at least one CT scan, and obtaining the bone position and dimension data of bones in the patient anatomical structure of the individual patient by executing a segmentation process on the at least one CT scan.
[0021] Optionally, methods further comprise registering common landmarks in the bone position and dimension data of bones in the CT-reconstructed portion and the MRI-reconstructed portion.
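Registration of common landmarks between the CT- and MRI-reconstructed portions can be achieved, for example, by a least-squares rigid fit (the Kabsch algorithm). This is an illustrative sketch, not the specific registration claimed:

```python
# Rigid landmark registration sketch: find rotation R and translation t
# so that x_ct ≈ R @ x_mri + t in a least-squares sense (Kabsch).
import numpy as np


def register_landmarks(ct_pts, mri_pts):
    """Both inputs are (N, 3); row i is the same anatomical landmark in
    each reconstruction. Returns (R, t)."""
    ct_c, mri_c = ct_pts.mean(0), mri_pts.mean(0)
    H = (mri_pts - mri_c).T @ (ct_pts - ct_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct_c - R @ mri_c
    return R, t
```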
[0022] Optionally, generating the personalised pre-operative software model of the patient includes a 3D anatomical analysis of the bone position and dimension data, including defining one or more axes and planes in the bone position and dimension data, and measuring one or more 3D angles between the defined one or more axes and planes.
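For a given pair of defined axes or planes, the 3D anatomical analysis of [0022] reduces to standard vector geometry; a minimal sketch (which axes and planes are defined is up to the caller):

```python
# Measuring 3D angles between defined axes and planes, as in the
# anatomical analysis step. An axis is a direction vector; a plane is
# represented by its normal vector.
import numpy as np


def angle_between(u, v, degrees=True):
    """Angle between two direction vectors."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    ang = np.arccos(np.clip(cosang, -1.0, 1.0))
    return np.degrees(ang) if degrees else ang


def axis_plane_angle(axis, plane_normal):
    # angle between an axis and a plane = 90 degrees minus the angle
    # between the axis and the plane's normal
    return 90.0 - angle_between(axis, plane_normal)
```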
[0023] Optionally, generating the post-operative personalised software model of the patient anatomical structure of the individual patient includes: combining a CAD model of the selected implant and chisel with the bone position and dimension data to verify that adequate bone thickness remains in place once the implant is implanted into a bone orifice created by the chisel.
[0024] Optionally, generating the post-operative personalised software model of the patient anatomical structure of the individual patient includes: combining a CAD model of the selected implant and chisel with the growth plate position and dimension data to verify that adequate growth plate volume remains once the implant is implanted into a bone orifice created by the chisel.
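As a much-simplified sketch of the verifications described above, the bone or growth plate and the orifice created by the chisel/implant can be represented as voxel sets and boolean-subtracted. The 80% retention threshold below is an invented illustration, not a clinical value; a real workflow would perform boolean operations on the CAD meshes themselves:

```python
def remaining_volume_mm3(structure, orifice, voxel_mm3):
    """Volume of a structure (set of voxel coordinates) left after the
    orifice created by the chisel/implant is subtracted."""
    return len(structure - orifice) * voxel_mm3

def retains_adequate_volume(structure, orifice, voxel_mm3, min_fraction=0.8):
    """True if at least min_fraction of the original volume remains
    (min_fraction is a hypothetical threshold)."""
    before = len(structure) * voxel_mm3
    after = remaining_volume_mm3(structure, orifice, voxel_mm3)
    return after >= min_fraction * before

# Toy 10x10x1 growth-plate slab with a 2x10 orifice removed:
plate = {(x, y, 0) for x in range(10) for y in range(10)}
orifice = {(x, y, 0) for x in range(10) for y in range(2)}
```

The same subtraction applied to a cortical-bone voxel set gives the remaining bone thickness check of paragraph [0023].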
[0025] Optionally, the patient movement analysis data is measured motion data of the individual patient.
[0026] Optionally, the patient movement analysis data is measured motion data of the individual patient obtained by motion capture while the individual patient is walking or performing another natural body movement.
[0027] Optionally, the personalised pre-operative software model of patient anatomical structure of the individual patient based on received pre-operative medical imagery of the patient anatomical structure of the individual patient and pre-operative patient movement analysis data related to movement of the patient anatomical structure of the individual patient, is a 4D personalised functional model, generated by one or more steps from among: identifying surfaces and landmarks in the received pre-operative medical imagery, and fitting the body parts to which the identified surfaces and landmarks belong to an anatomical structure reference system by mapping the identified surfaces and landmarks to equivalents in the anatomical structure reference system; adding body parts fitted to the anatomical reference system including one or more from among: bones; joints; muscles; to the 4D personalised functional model.
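One small piece of the fitting step described above, adjusting a reference-system body part to the patient's landmark geometry, can be sketched as a uniform scaling between a pair of corresponding landmarks. Real fitting would use many landmarks and full surface registration; all coordinates here are invented:

```python
import math

def scale_part_to_patient(patient_lms, reference_lms, reference_part):
    """Uniformly scale reference-part vertices so the distance between two
    reference landmarks matches the same inter-landmark distance measured
    on the patient's imagery."""
    s = (math.dist(patient_lms[0], patient_lms[1])
         / math.dist(reference_lms[0], reference_lms[1]))
    return [tuple(s * c for c in v) for v in reference_part]
```

A fuller pipeline would repeat this per body part (bones, joints, muscles) and then pose the scaled parts over time using the movement analysis data, giving the 4D aspect of the model.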
[0028] Optionally, the post-operative personalised software model of the patient anatomical structure of the individual patient based on the personalised pre-operative software model and the software definition of the surgical procedure, is a 4D personalised functional model, generated by: identifying surfaces and landmarks in the pre-operative software model of patient anatomical structure as modified by the software definition of the surgical procedure, optionally as iteratively constrained according to claim 5 and claim 6, and fitting the body parts to which the identified surfaces and landmarks belong to an anatomical structure reference system by mapping the identified surfaces and landmarks to equivalents in the anatomical structure reference system; adding body parts fitted to the anatomical reference system including one or more from among: bones; joints; muscles; to the 4D personalised functional model.
[0029] Optionally, the method further includes simulating by the at least one computer processor movement of the bones and muscles of the patient anatomical structure of the individual patient according to the post-operative personalised software model to generate simulation output, wherein the simulation comprises comparing the 4D personalised functional model of the pre-operative software model of patient anatomical structure with the 4D personalised functional model of the post-operative personalised software model of patient anatomical structure.
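The comparison of pre- and post-operative 4D models can be reduced, for illustration, to a difference metric between equally sampled joint-angle curves over a gait cycle (the curves below are invented):

```python
import math

def rms_difference(pre, post):
    """Root-mean-square difference between two equally sampled joint-angle
    curves, e.g. hip rotation from the pre- vs post-operative 4D models."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(pre, post)) / len(pre))
```

A smaller RMS difference against a healthy reference curve (rather than between the two models directly) is one plausible way to score a candidate procedure.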
[0030] Optionally, the software definition of the surgical procedure is iteratively constrained to a defined solution surgical procedure within the software definition of the surgical procedure by a machine learning model or another solving algorithm seeking to achieve a defined optimum outcome in the comparison of the 4D personalised functional model of the pre-operative software model of patient anatomical structure with the 4D personalised functional model of the post-operative personalised software model of patient anatomical structure.
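The iterative constraining described above can be sketched, under strong simplifying assumptions, as a one-parameter interval search over a candidate derotation angle. Here `simulate` stands in for the 4D simulation and `cost` for the pre/post comparison; both are hypothetical placeholders, and a machine learning model or gradient-based solver could replace the search:

```python
def constrain_procedure(lo, hi, simulate, cost, iters=30):
    """Ternary search narrowing [lo, hi] degrees of derotation toward the
    angle whose simulated outcome minimises the cost. Assumes the cost is
    unimodal over the interval."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if cost(simulate(m1)) < cost(simulate(m2)):
            hi = m2  # minimum lies left of m2
        else:
            lo = m1  # minimum lies right of m1
    return (lo + hi) / 2.0
```

Each iteration shrinks the interval by a factor of 2/3, so 30 iterations localise the optimum to well under a hundredth of a degree on a 40-degree range.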
[0031] Embodiments may include a computer program which, when executed by a computing apparatus comprising processor hardware and memory hardware, causes the processor hardware to perform a computer-implemented method of an embodiment.
[0032] The computer program may be stored on a computer-readable medium.
[0033] The computer-readable medium storing the computer program may be non-transitory.
[0034] Embodiments may include an apparatus comprising a processor and a memory, the processor being configured to execute processing instructions stored by the memory, and by executing the processing instructions to perform a computer-implemented method of pre-operative planning for patient surgery, the method including: generating by at least one computer processor a personalised pre-operative software model of patient anatomical structure of an individual patient based on received pre-operative medical imagery of the patient anatomical structure of the individual patient and pre-operative patient movement analysis data related to movement of the patient anatomical structure of the individual patient, the patient movement analysis data being generated at least in part by a movement sensor system observing movement of the patient anatomical structure of the individual patient over a period of time, wherein the personalised pre-operative software model of the patient includes bone position and dimension data of bones in the patient anatomical structure of the individual patient, muscle position and dimension data of muscles in the patient anatomical structure of the individual patient, and relationship definition data defining relationships between bones and muscles in the patient anatomical structure of the individual patient; receiving by the at least one computer processor a software definition of a surgical procedure to be performed in relation to the personalised pre-operative software model of the patient anatomical structure of the individual patient; generating by the at least one computer processor a modified personalised post-operative software model of the patient anatomical structure of the individual patient based on the personalised pre-operative software model and the software definition of the surgical procedure; simulating by the at least one computer processor movement of the bones and muscles of the patient anatomical structure of the individual patient according to the modified personalised post-operative software model to generate simulation output; allowing by the at least one computer processor adjustment of the personalised post-operative software model based on the simulation output; generating by the at least one computer processor a personalised surgical cutting guide model to facilitate the surgical procedure based on the personalised post-operative software model or the adjusted personalised post-operative software model; and generating by the at least one computer processor at least one of: 3D printing instructions based on the personalised surgical cutting guide model to transmit to a 3D printer to form a surgical cutting guide; or personalised surgical operation instructions to transmit to a robotic surgical system for use in performing the surgical procedure.
[0035] Embodiments may include a computer-implemented method of pre-operative planning for patient surgery, the method including: generating a personalised pre-operative software model based on received pre-operative medical imagery and pre-operative patient movement analysis data related to movement of the patient anatomical structure of the individual patient; receiving by the at least one computer processor a software definition of a surgical procedure to be performed in relation to the personalised pre-operative software model of the patient anatomical structure; generating by the at least one computer processor a modified personalised post-operative software model of the patient anatomical structure; simulating by the at least one computer processor movement of the patient anatomical structure according to the modified personalised post-operative software model to generate simulation output; allowing by the at least one computer processor adjustment of the personalised post-operative software model based on the simulation output; generating by the at least one computer processor a personalised surgical cutting guide; and generating by the at least one computer processor at least one of: 3D printing instructions based on the personalised surgical cutting guide model or personalised surgical operation instructions to transmit to a robotic surgical system for use in performing the surgical procedure.
Brief Description of Drawings
[0036] Embodiments are described in further detail below, by way of example and with reference to the accompanying drawings, in which:
[0037] Figure 1 shows a schematic diagram of an example orthopaedic surgery planning device, according to some embodiments;
[0038] Figure 2 shows a flowchart of a method of generation of surgical operation instructions and three-dimensional model printing instructions for an orthopaedic surgery, according to some embodiments;
[0039] Figure 3 shows a flowchart of a method of performing a medical imagery segmentation process, according to some embodiments;
[0040] Figures 4A to 4F illustrate patient muscular skeletal systems according to some embodiments;
[0041] Figures 5A to 5C illustrate planned rotational corrections according to some embodiments;
[0042] Figures 6A to 6D illustrate constraints on software definition of surgical procedure according to some embodiments;
[0043] Figure 7 illustrates muscle segmentation according to an embodiment;
[0044] Figure 8 illustrates movement analysis data according to some embodiments;
[0045] Figure 9 illustrates surgical cutting guide design according to some embodiments;
[0046] Figure 10 illustrates surgical cutting guide manufacture according to some embodiments.
Description of Embodiments
[0047] Embodiments generally relate to methods, systems, and devices for modelling and simulation of orthopaedic surgery. Particular embodiments relate to methods, systems, and devices for patient physiology modelling and surgical simulation for orthopaedic surgeries, specifically osteotomy surgery. Embodiments also include computer programs, processing instructions, and/or other forms of software, for performing the method and processing steps (other than those specified as being performed by a human expert). Such software may be stored on a computer-readable medium such as a non-transitory computer-readable medium.
[0048] Referring to the drawings, Figure 1 shows a schematic illustration of an orthopaedic surgery planning device 100 (planning device 100) for generating orthopaedic surgery instructions and three-dimensional (3D) model printing instructions for an orthopaedic surgery, according to some embodiments. A corresponding worked example is illustrated in Figures 4A to 10. The worked example is presented in the context of a proximal femoral osteotomy. However, methods, systems, and devices, of embodiments may also be applied to other osteotomies, such as tibial osteotomy. The worked example is presented in the context of a proximal femoral osteotomy in order to enable the concepts disclosed herein to be understood in a consistent manner in a particular anatomical implementation example.
[0049] In some embodiments, planning device 100 comprises device processor circuitry 111 (described herein as a processor 111 for convenient reference) and a memory 114 accessible to device processor circuitry 111. Processor 111 may be configured to access data stored in memory 114, to execute instructions stored in memory
114, and to read and write data to and from memory 114. Device processor circuitry 111 (i.e. processor 111) may comprise one or more microprocessors, microcontrollers, central processing units (CPUs), application specific instruction set processors (ASIPs), or other processor capable of reading and executing instruction code.
[0050] Memory 114 may comprise one or more volatile or non-volatile memory types, such as RAM, ROM, EEPROM, or flash, for example. Memory 114 may be configured to store executable applications for execution by processor 111. For example, memory 114 may store at least one 3D anatomical modelling module 126 configured to allow a user to generate at least one 3D anatomical model for use in orthopaedic surgery planning. Memory 114 may also store pre-operative medical imagery 116, pre-operative movement analysis data 118, personalised surgical operation instructions 120, 3D model printing instructions 122, surgery planning module 124, and/or software definition module 128, for example.
[0051] Exemplary use cases are illustrated in Figures 4A to 4C, showing hip deformities in juvenile patients. Figure 4A shows an image of a hip with Perthes' disease, Figure 4B shows an image of a hip with deformity subsequent to prior management of a slipped capital femoral epiphysis, and Figure 4C shows a hip joint of a patient with a neuromuscular disease such as cerebral palsy. Each case is a candidate for proximal femoral osteotomy: a complex corrective surgical procedure. In some embodiments, pre-operative medical imagery 116 may include a plurality of medical imaging scans of an individual patient, such as magnetic resonance imaging (MRI) scans and/or computed tomography (CT) scans. For example, pre-operative medical imagery 116 may include at least one of: a full lower limb MRI scan, a full length femur MRI scan, an affected hip MRI scan, and/or a full length femur CT scan. In some embodiments, a plurality of MRI-compatible markers may be used in obtaining the MRI scans of the pre-operative medical imagery 116. MRI and CT scans may be an input to a step of anatomical modelling. Figure 4D illustrates images collected as pre-operative medical imagery 116: a pelvis and full-length femurs MRI in Figure 4D (obtained with 1.5T, 3D PD SPACE, slice thickness 1.1mm, voxel size 0.83x0.83x1.0mm) and a right hip CT scan in Figure 4E (slice thickness 1.1mm). It is noted that the hip imaged in Figures 4D and 4E has deformity which has developed following previous stabilisation management of a right slipped capital femoral epiphysis fracture with a cannulated screw. The pre-operative medical imagery may be, for example, segmented by an AI algorithm to generate a personalised pre-operative software model of the patient. It is noted that, in the alternative anatomical context of a tibial osteotomy, the scanning would be of a tibia and may also include the knee joint, the ankle joint, and optionally also the fibula.
[0052] For example, a segmentation algorithm may execute on the collected pre-operative medical imagery by processing either the MRI scans, or by combining the MRI scans with the CT scans, to: obtain the bone position and dimension data of bones in the patient anatomical structure of the individual patient; obtain the muscle position and dimension data of muscles in the patient anatomical structure of the individual patient; and/or obtain position and dimension data of a growth plate in one or more bones. Figure 4F illustrates, in 2D, 3D segmented femur MRI-reconstructed portions obtained by executing a segmentation algorithm on the obtained pre-operative medical imagery.
[0053] In some embodiments, the full lower limb MRI scan may be acquired with the individual patient lying supine with their legs extended. The full lower limb MRI scan may be conducted from the most superior point of the ilium to the feet. The full lower limb MRI scan may have a '3D PD SPACE' sequence, for example. The full lower limb MRI scan may have a slice thickness of about 1.1mm, for example. The full lower limb MRI scan may have a voxel size of about 0.83x0.83x1.0mm, for example.
[0054] In some embodiments, the full length femur MRI scan may be acquired with the individual patient lying supine with their legs extended. The full length femur MRI scan may be conducted from the most superior point of the ilium to below the femoral condyles. The full length femur MRI scan may have a '3D DIXON' sequence, for example. The full length femur MRI scan may have a slice thickness of about 0.9mm, for example. The full length femur MRI scan may have a voxel size of about 0.88x0.88x0.88mm, for example.
[0055] In some embodiments, the affected hip MRI scan may be acquired with the individual patient lying supine with their legs extended. The affected hip MRI scan may be conducted from the anterior superior iliac spine of the affected hip to the lesser trochanter. The affected hip MRI scan may have a '3D T2' sequence, for example. The affected hip MRI scan may have a slice thickness of about 0.7mm, for example. The affected hip MRI scan may have a voxel size of about 0.43x0.43x0.43mm, for example.
[0056] In some embodiments, the full length femur CT scan may be acquired with the individual patient lying supine with their legs extended. The full length femur CT scan may be conducted from the anterior superior iliac spine of the affected hip to below the femoral condyles. The full length femur CT scan may have a slice thickness of less than about 1.0mm, for example.
[0057] In some embodiments, regardless of whether the anatomical context is a femoral osteotomy procedure or a tibial osteotomy procedure, pre-operative movement analysis data 118 may include at least one of: anthropometric data, electromyography (EMG) data, and/or 3D gait data. Anthropometric data may include at least one of the individual patient's: height, mass, leg length, frontal plane knee alignment, knee width, and/or ankle width. EMG data may be acquired via a motion capture system, such as a 'Vicon system', for example. EMG data may be acquired via the use of a plurality of EMG units attached to the individual patient's body. That is, a plurality of EMG units may be attached on the skin of the individual patient and measurement and analysis of EMG signals generated by the EMG units conducted to obtain EMG data. In some embodiments, the pre-operative movement analysis data 118 may be measured motion data of the individual patient. In some embodiments, 3D gait data may be acquired via a motion capture system, such as a 'Vicon system', for example. 3D gait data may be acquired via a standing calibration trial and/or at least 10 walking trials, for example.
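As one illustrative processing step for the EMG data mentioned above, a linear envelope can be computed by full-wave rectification followed by a moving average. The window length below is arbitrary, and clinical pipelines typically band-pass filter the raw signal first; this is a sketch, not the embodiments' actual processing chain:

```python
def emg_envelope(signal, window=5):
    """Full-wave rectify an EMG trace, then smooth it with a centred
    moving average to obtain a linear envelope."""
    rect = [abs(s) for s in signal]   # full-wave rectification
    half = window // 2
    out = []
    for i in range(len(rect)):
        seg = rect[max(0, i - half):i + half + 1]  # centred window, clipped at edges
        out.append(sum(seg) / len(seg))
    return out
```

The resulting envelope is the kind of muscle-activation trace that can drive muscle elements in the personalised functional model.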
[0058] Planning device 100 further comprises an electronic interface 113 to allow communication between planning device 100 and a user. Electronic interface 113 may comprise one or more of a camera, a speaker, a mouse, a keyboard, a touchpad, buttons, sliders, and LEDs, for example. In some embodiments, electronic interface 113 may be used to alert the user of a particular event, such as the device being ready for use, for example.
[0059] To facilitate communication with external and/or remote devices, planning device 100 further comprises a communications module 112. Communications module 112 may allow for wired and/or wireless communication between planning device 100 and external computing devices and components. Communications module 112 may facilitate communication via Bluetooth, USB, Wi-Fi, Ethernet, or via a telecommunications network, for example. According to some embodiments, communication module 112 may facilitate communication with external devices and systems via a network 140. The external devices may include a computer server (or server system), a user device, such as a handheld computing device or other form of computing device, and a doctor device, which may also be a handheld computing device or other form of computing device. In some embodiments, processor 111, memory 114, and communications module 112 may be in the form of a microcontroller, such as an Arduino, for example.
[0060] Network 140 may comprise one or more local area networks or wide area networks that facilitate communication between planning device 100 and other devices, such as servers or computers, connected to network 140. For example, according to some embodiments, network 140 may be the internet. However, network 140 may comprise at least a portion of any one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, etc. one or more messages, packets, signals, some combination thereof, or so forth. Network 140 may include, for example, one or more of: a wireless network, a wired network, an internet, an intranet, a public network, a packet-switched network, a circuit switched network, an ad hoc network, an infrastructure network, a public-switched telephone network (PSTN), a cable network, a cellular network, a satellite network, a fibre-optic network, or some combination thereof.
[0061] Figure 2 shows a flowchart of a method 200 for pre-operative planning for patient surgery, according to some embodiments. The pre-operative planning for patient surgery may include personalised surgical operation instructions 120 and 3D model printing instructions 122 for an orthopaedic corrective surgery. Processor 111 begins method 200 on execution of surgery planning module 124. At step S202, processor 111 generates a personalised pre-operative software model of the individual patient's anatomical structure (pre-operative software model). In some embodiments, the personalised pre-operative software model is based on at least one of: the pre-operative medical imagery 116 and the pre-operative movement analysis data 118. In some embodiments, the pre-operative software model may include bone position and dimension data of bones in the anatomical structure of the individual patient. In some embodiments, the pre-operative software model may include muscle position and dimension data of muscles in the anatomical structure of the individual patient. In some embodiments, the pre-operative software model may include relationship definition data defining relationships between bones and muscles in the anatomical structure of the individual patient. Figure 4F represents 3D segmented femurs in a personalised pre-operative software model of an individual patient's anatomical structure. In the alternative anatomical context of a tibial osteotomy, a tibia may be segmented from surrounding muscles, bones and joints.
[0062] In some embodiments, generation of the personalised pre-operative software model of the patient anatomical structure of an individual patient includes generation of at least one MRI-reconstructed portion. In some embodiments, generation of the personalised pre-operative software model of the patient anatomical structure of an individual patient includes generation of at least one CT-reconstructed portion. To generate the at least one MRI-reconstructed portion and/or the at least one CT-reconstructed portion, processor 111 performs the steps of method 300 (Figure 3). That is, method 300 may form a part of step S202 of method 200, for example. In some embodiments, the steps of method 300 may be performed by processor 111 on execution of 3D anatomical modelling module 126. In some embodiments, the steps of method 300 may be performed by processor 111 during execution of the surgery planning module 124. In some embodiments, the 3D anatomical modelling module 126 is a commercially available solution, such as 'Mimics' and/or '3-matic', for example.
[0063] Figure 3 shows a flowchart of method 300 for segmenting pre-operative medical imagery 116 of the individual patient's anatomical structure, according to some embodiments. In some embodiments, performing method 300 generates at least one MRI-reconstructed portion by performing a segmentation process on the MRI scans included in pre-operative medical imagery 116. In some embodiments, performing method 300 generates at least one CT-reconstructed portion by performing a segmentation process on the CT scan included in pre-operative medical imagery 116. In some embodiments, processor 111 may perform the steps of method 300 multiple times to generate a plurality of reconstructed portions. That is, processor 111 may perform the steps of method 300 a first time to generate MRI-reconstructed bone portions, for example. Processor 111 may perform the steps of method 300 a second time to generate MRI-reconstructed muscle portions, for example. Processor 111 may perform the steps of method 300 a third time to generate an MRI-reconstructed growth plate portion, for example. Processor 111 may perform the steps of method 300 a fourth time to generate CT-reconstructed bone portions, for example. Figures 4D and 4E illustrate inputs to the segmentation process: MRI and CT scans; Figure 4F illustrates an output: a segmented bone image.
[0064] In some embodiments, the MRI-reconstructed bone portions are generated based on the full lower limb MRI scan and the affected hip MRI scan included in the pre-operative medical imagery 116. In the alternative anatomical context of the tibial osteotomy it may be one or both of an affected knee and an affected ankle MRI scan included in the pre-operative medical imagery. Returning to the context of a proximal femoral osteotomy, the MRI-reconstructed bone portions may be of the individual patient's lower limb bones, affected hip, and/or pelvis, for example. In some embodiments, the MRI-reconstructed muscle portions are generated based on the full length femur MRI scan included in the pre-operative medical imagery 116. The MRI-reconstructed muscle portions may be of the individual patient's glutei muscles, for example. In the context of a tibial osteotomy, the reconstructed muscle portions may be of the individual patient's gastrocnemius muscle, as a particular muscle part of interest. In the proximal femoral osteotomy context, the MRI-reconstructed growth plate portion is generated based on the full length femur MRI scan included in the pre-operative medical imagery 116. The MRI-reconstructed growth plate portion may be of the individual patient's growth plate, for example. In some embodiments, the CT-reconstructed bone portions are generated based on the full length femur CT scan included in the pre-operative medical imagery 116. The CT-reconstructed bone portions may be of the individual patient's affected femur and pelvis, for example.
[0065] To generate the MRI-reconstructed bone portions, processor 111, at step S302, creates a mask for each of the individual patient's lower limb bones, affected hip, and pelvis (noting that the selection of bones and joints is peculiar to the surgery being performed and in the case of a tibial osteotomy may include knee and/or ankle and exclude hip and/or pelvis). That is, processor 111, receiving the full lower limb MRI scan and the affected hip MRI scan as input, isolates, or segments, each lower limb bone, the affected hip, and the pelvis in the full lower limb MRI scan and the affected hip MRI scan into a plurality of lower limb bone masks, a hip mask, and a pelvis mask, for example. In some embodiments, the created bone masks are for both legs of the individual patient. Processor 111, at step S304, then generates a plurality of 3D lower limb bone, hip, and pelvis parts, one for each of the individual patient's lower limb bones, affected hip, and pelvis, based on the corresponding lower limb bone masks, hip mask, and pelvis mask. Processor 111, at step S306, wraps each of the plurality of 3D generated lower limb bone, hip, and pelvis parts, resulting in a plurality of wrapped 3D lower limb bone, hip, and pelvis parts. In some embodiments, a smallest detail setting is configured to be equal to 0.3mm. In some embodiments, a gap closing distance setting is configured to be equal to 0.5 mm. Again, it is noted that the generation of MRI-reconstructed bone portions follows an equivalent procedure in the context of a tibial osteotomy, save for the identity of the bones and affected joints. The bones include at least the tibia and may also include the fibula and may also include the femur. The affected joints may include one or both of the knee joint and the ankle joint. In some embodiments, even in the context of a tibial osteotomy, the affected joints may include the hip joint.
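The mask creation of step S302 can be sketched as intensity thresholding followed by retention of the largest connected region, shown here on a single 2D slice with invented intensity bounds (real segmentation, e.g. in Mimics or via an AI model, is considerably more sophisticated):

```python
from collections import deque

def threshold_mask(scan, lo, hi):
    """Step S302-style mask: mark voxels whose intensity lies in [lo, hi]."""
    return [[1 if lo <= v <= hi else 0 for v in row] for row in scan]

def largest_component(mask):
    """Keep only the largest 4-connected region, isolating one structure
    from other tissue that fell in the same intensity band."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                comp, q = [], deque([(y, x)])
                seen[y][x] = True
                while q:  # breadth-first flood fill of one region
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    out = [[0] * w for _ in range(h)]
    for y, x in best:
        out[y][x] = 1
    return out
```

Running this per slice (or in 3D with 6-connectivity) yields one mask per bone, the input to the 3D part generation of step S304.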
[0066] Optionally, a smoothing step may be included after the wrapping S306. The smoothing step may comprise smoothing each of the wrapped 3D lower limb, hip, and pelvis parts to generate a plurality of final lower limb, hip, and pelvis parts. However, the smoothing step may be omitted. For example, omitting the smoothing step may be preferable in order that the 3D generated parts remain true to the original anatomy. Thus, the plurality of final lower limb, hip, and pelvis parts may be the output of the wrapping step S306, or where smoothing is included, the smoothing step.
[0067] After wrapping (and optionally smoothing) each of the plurality of lower limb bone, hip, and pelvis parts, processor 111, at step S310, compares each of the plurality of final 3D lower limb bone, hip, and pelvis parts to the plurality of medical imaging scans of the pre-operative medical imagery 116 (noting that the selection of bones and joints is peculiar to the surgery being performed and in the case of a tibial osteotomy may include knee and/or ankle and exclude hip and/or pelvis). That is, processor 111 compares the contours of each of the plurality of final 3D lower limb bone, hip, and pelvis parts to the plurality of medical imaging scans of the pre-operative medical imagery 116 to determine part generation accuracy. In some embodiments, a user (e.g. a medical expert) may interact with the electronic interface 113 to confirm the accuracy of each of the plurality of final 3D lower limb bone, hip, and pelvis parts. In some embodiments, processor 111, upon determining that the plurality of generated final 3D lower limb bone, hip, and pelvis parts are inaccurate, may return to step S302 and repeat the previously described steps, S302, S304, S306, and S310 in relation to generating the MRI-reconstructed bone portions.
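The contour comparison at step S310 can be quantified, for illustration, by a mean nearest-point distance between the reconstructed surface and points sampled from the scan contours. This brute-force sketch and its tolerance are hypothetical; production tools typically use precomputed distance maps:

```python
import math

def mean_nearest_distance(part_points, contour_points):
    """Average distance from each reconstructed-part surface point to the
    nearest scan-contour point (brute force)."""
    return sum(min(math.dist(p, q) for q in contour_points)
               for p in part_points) / len(part_points)

def is_accurate(part_points, contour_points, tol_mm=1.0):
    """Hypothetical pass/fail: mean deviation within tol_mm."""
    return mean_nearest_distance(part_points, contour_points) <= tol_mm
```

A failed check corresponds to the return to step S302 described above; a passed check allows progression to remeshing at S312.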
[0068] In some embodiments, processor 111, upon determining that the plurality of generated final 3D lower limb bone, hip, and pelvis parts are accurate, moves to step S312. At S312, processor 111 performs remeshing on each of the plurality of final 3D lower limb bone, hip, and pelvis parts to generate the MRI-reconstructed bone portions. Remeshing rebuilds the geometry of each of the plurality of final 3D lower limb bone, hip, and pelvis parts with an increased uniform topology. In some embodiments, remeshing may change the number, shape, size, or arrangement of polygons forming a surface mesh of each of the plurality of final 3D lower limb bone, hip, and pelvis parts.
In some embodiments, remeshing may result in a simpler and less computationally difficult surface mesh for each of the MRI-reconstructed bone portions.
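The remeshing at step S312 can be illustrated with a vertex-clustering simplification: vertices landing in the same grid cell are merged into their average, producing a coarser, more uniform vertex distribution. Real remeshers also rebuild the triangle connectivity; the cell size here is arbitrary:

```python
def cluster_vertices(vertices, cell=1.0):
    """Vertex-clustering simplification: snap each vertex to a uniform grid
    cell and merge all vertices in a cell into their average position."""
    clusters = {}
    for v in vertices:
        key = tuple(int(c // cell) for c in v)  # integer grid-cell index
        clusters.setdefault(key, []).append(v)
    return [tuple(sum(axis) / len(vs) for axis in zip(*vs))
            for vs in clusters.values()]
```

Choosing a larger cell size yields fewer merged vertices, i.e. the "simpler and less computationally difficult surface mesh" described above.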
[0069] In some embodiments, processor 111, upon generating the MRI-reconstructed bone portions may perform method 300 to generate the MRI-reconstructed muscle portions. Figure 7 illustrates the function of muscle segmentation in generating the personalised pre-operative software model of patient anatomical structure by reference to a two-dimensional image of a hip joint in which the gluteus medius and gluteus minimus muscles are segmented (again, noting that in the context of a tibial osteotomy the muscles of interest may be gastrocnemius and/or soleus). That is, processor 111 may perform method 300 again to generate the MRI-reconstructed muscle portions, for example. To generate the MRI-reconstructed muscle portions, processor 111, at step S302, creates a mask for the individual patient's gluteus minimus and gluteus medius of the affected limb. That is, processor 111, receiving the full length femur MRI scan as input, isolates, or segments, the gluteus minimus and the gluteus medius of the affected limb in the full length femur MRI scan into a plurality of muscle masks, for example. Processor 111, at step S304, then generates a plurality of 3D muscle parts, one for each of the gluteus minimus and the gluteus medius, based on the corresponding muscle masks. Processor 111, at step S306, wraps each of the plurality of 3D generated muscle parts, resulting in a plurality of wrapped 3D muscle parts. Wrapping provides a continuous surface around the mesh. In some embodiments, a smallest detail setting is configured to be equal to 0.3 mm. In some embodiments, a gap closing distance setting is configured to be equal to 0.5 mm. In the context of a tibial osteotomy, the muscle of interest may be gastrocnemius in particular, and may also include soleus.
[0070] In the context of planning a proximal tibia osteotomy, in order to enhance estimation of post-operative knee range of motion, segmentation processing may be applied to the knee ligaments and knee cartilage in addition to the bones and muscles.
[0071] Optionally, a smoothing step may be included after the wrapping S306. The smoothing step may comprise smoothing each of the wrapped 3D muscle parts to generate a plurality of final muscle parts. However, the smoothing step may be omitted.
For example, omitting the smoothing step may be preferable in order that the 3D generated parts remain true to the original anatomy. Thus, the plurality of final muscle parts may be the output of the wrapping step S306, or where smoothing is included, the smoothing step.
[0072] At step S310, processor 111 compares each of the plurality of final 3D muscle parts to the plurality of medical imaging scans of the pre-operative medical imagery 116. That is, processor 111 compares the contours of each of the plurality of final 3D muscle parts to the plurality of medical imaging scans of the pre-operative medical imagery 116 to determine part generation accuracy. In some embodiments, a user (e.g. a medical expert) may interact with the electronic interface 113 to confirm the accuracy of each of the plurality of final 3D muscle parts. In some embodiments, processor 111, upon determining that the plurality of generated final 3D muscle parts are inaccurate, may return to step S302 and repeat the previously described steps, S302, S304, S306, and S310 in relation to generating the MRI-reconstructed muscle portions.
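The part generation accuracy comparison at S310 is described qualitatively. One hedged illustration of how contours might be compared numerically is a mean nearest-neighbour distance between contour point sets (a brute-force sketch with illustrative names; a real pipeline would use a spatial index):

```python
import math

def mean_contour_distance(part_points, scan_points):
    """Mean nearest-neighbour distance from each generated-part contour
    point to the scan contour points: a simple numeric accuracy score
    for a comparison step such as S310."""
    return sum(
        min(math.dist(p, q) for q in scan_points)
        for p in part_points
    ) / len(part_points)
```

A threshold on this score could then decide whether to accept the part or return to step S302, alongside the manual confirmation via electronic interface 113 described above.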
[0073] In some embodiments, processor 111, upon determining that the plurality of generated final 3D muscle parts are accurate, moves to step S312. At S312, processor 111 performs remeshing on each of the plurality of final 3D muscle parts to generate the MRI-reconstructed muscle portions. Remeshing rebuilds the geometry of each of the plurality of final 3D muscle parts with a more uniform topology. In some embodiments, remeshing may change the number, shape, size, or arrangement of polygons forming a surface mesh of each of the plurality of final 3D muscle parts. In some embodiments, remeshing may result in a simpler and less computationally demanding surface mesh for each of the MRI-reconstructed muscle portions.
[0074] In some embodiments, processor 111, upon generating the MRI-reconstructed muscle portions, may perform method 300 to generate the MRI-reconstructed growth plate portion. That is, processor 111 may perform method 300 again to generate the MRI-reconstructed growth plate portion, for example. To generate the MRI-reconstructed growth plate portion, processor 111, at step S302, creates a mask for the individual patient's femoral physis of the affected limb. That is, processor 111, receiving the full length femur MRI scan as input, isolates, or segments, the femoral physis of the affected limb in the full length femur MRI scan into a growth plate mask, for example. Processor 111, at step S304, then generates a 3D growth plate part for the femoral physis based on the corresponding growth plate mask. Processor 111, at step S306, wraps the 3D generated growth plate part, resulting in a wrapped 3D growth plate part. Wrapping provides a continuous surface around the mesh. In some embodiments, a smallest detail setting is configured to be equal to 0.3 mm. In some embodiments, a gap closing distance setting is configured to be equal to 0.5 mm. In the context of a tibial osteotomy the same process is applied to the tibia rather than the femur.
[0075] Optionally, a smoothing step may be included after the wrapping S306. The smoothing step may comprise smoothing the wrapped 3D growth plate part to generate a final 3D growth plate part. However, the smoothing step may be omitted. For example, omitting the smoothing step may be preferable in order that the 3D generated part remains true to the original anatomy. Thus, the final growth plate part may be the output of the wrapping step S306, or where smoothing is included, the smoothing step.
[0076] At step S310, processor 111 compares the final 3D growth plate part to the plurality of medical imaging scans of the pre-operative medical imagery 116. That is, processor 111 compares the contours of the final 3D growth plate part to the plurality of medical imaging scans of the pre-operative medical imagery 116 to determine part generation accuracy. In some embodiments, a user may interact with the electronic interface 113 to confirm the accuracy of the final 3D growth plate part. In some embodiments, processor 111, upon determining that the generated final 3D growth plate part is inaccurate, may return to step S302 and repeat the previously described steps, S302, S304, S306, and S310 in relation to generating the MRI-reconstructed growth plate portion.
[0077] In some embodiments, processor 111, upon determining that the final 3D growth plate part is accurate, moves to step S312. At S312, processor 111 performs remeshing on the final 3D growth plate part to generate the MRI-reconstructed growth plate portion. Remeshing rebuilds the geometry of the final 3D growth plate part with a more uniform topology. In some embodiments, remeshing may change the number, shape, size, or arrangement of polygons forming a surface mesh of the final 3D growth plate part. In some embodiments, remeshing may result in a simpler and less computationally demanding surface mesh for the MRI-reconstructed growth plate portion.
[0078] In some embodiments, processor 111, upon generating the MRI-reconstructed growth plate portion may perform method 300 to generate the CT-reconstructed bone portions. That is, processor 111 may perform method 300 again to generate the CT-reconstructed bone portions, for example. To generate the CT-reconstructed bone portions, processor 111, at step S302, creates a mask for each of the individual patient's affected femur and pelvis. That is, processor 111, receiving the full length femur CT scan as input, isolates, or segments, each of the affected femur and pelvis in the full length femur CT scan into a plurality of bone masks, for example. In some embodiments, processor 111 may eliminate unwanted pixels and fill part cavities in the plurality of bone masks prior to moving to step S304.
[0079] Processor 111, at step S304, then generates a plurality of 3D bone parts, one for each of the individual patient's affected femur and pelvis, based on the corresponding bone masks (or for tibia, and/or knee, and/or ankle, in the context of a tibial osteotomy). Processor 111, at step S306, wraps each of the plurality of 3D generated bone parts, resulting in a plurality of wrapped 3D bone parts. Wrapping provides a continuous surface around the mesh. In some embodiments, a smallest detail setting is configured to be equal to 0.15 mm. In some embodiments, a gap closing distance setting is configured to be equal to 0.3 mm. Optionally, a smoothing step may be included after the wrapping S306. The smoothing step may comprise smoothing each of the wrapped 3D bone parts to generate a plurality of final bone parts. However, the smoothing step may be omitted. For example, omitting the smoothing step may be preferable in order that the 3D generated bone parts remain true to the original anatomy. Thus, the plurality of final bone parts may be the output of the wrapping step S306, or where smoothing is included, the smoothing step.
[0080] At S310, processor 111 compares each of the plurality of final 3D bone parts to the plurality of medical imaging scans of the pre-operative medical imagery 116. That is, processor 111 compares the contours of each of the plurality of final 3D bone parts to the plurality of medical imaging scans of the pre-operative medical imagery 116 to determine part generation accuracy. In some embodiments, a user may interact with the electronic interface 113 to confirm the accuracy of each of the plurality of final 3D bone parts. In some embodiments, processor 111, upon determining that the plurality of generated final 3D bone parts are inaccurate, may return to step S302 and repeat the previously described steps, S302, S304, S306, (and optional smoothing step where included), and S310 in relation to generating the CT-reconstructed bone portions.
[0081] In some embodiments, processor 111, upon determining that the plurality of generated final 3D bone parts are accurate, moves to step S312. At S312, processor 111 performs remeshing on each of the plurality of final 3D bone parts to generate the CT-reconstructed bone portions. Remeshing rebuilds the geometry of each of the plurality of final 3D bone parts with a more uniform topology. In some embodiments, remeshing may change the number, shape, size, or arrangement of polygons forming a surface mesh of each of the plurality of final 3D bone parts. In some embodiments, remeshing may result in a simpler and less computationally demanding surface mesh for each of the CT-reconstructed bone portions.
[0082] In some embodiments, processor 111, upon completing step S312 of method 300 may move to step S314. At step S314, processor 111 assigns labels to each of the reconstructed portions with the individual patient's full name, unit record number, date of birth, and body side. In some embodiments, step S314 may be performed after generating each of the reconstructed portions. That is, step S314 may be performed each time method 300 is performed by processor 111. In some embodiments, step S314 may be performed after generating all of the reconstructed portions. That is, step S314 may be performed after generating the MRI-reconstructed bone portions, the MRI-reconstructed muscle portions, the MRI-reconstructed growth plate portion, and the CT-reconstructed bone portions, for example.
[0083] Processor 111, after generating the MRI-reconstructed bone portions, and/or the MRI-reconstructed muscle portions, and/or the MRI-reconstructed growth plate portion, and/or the CT-reconstructed bone portions, and performing step S314, continues to generate the personalised pre-operative software model of the patient's anatomical structures. In some embodiments, processor 111, continuing step S202, performs an alignment of the related MRI-reconstructed portions and the CT-reconstructed portions. Processor 111 matches common landmarks on the femur and pelvis MRI-reconstructed portions and the femur and pelvis CT-reconstructed portions. Processor 111 then aligns the common landmarks of the femur and pelvis MRI-reconstructed portions and the femur and pelvis CT-reconstructed portions. In some embodiments, alignment of the femur and pelvis MRI-reconstructed portions and the femur and pelvis CT-reconstructed portions is performed by processor 111 until an average distance error of less than 0.1mm between the corresponding portions is obtained, for example. In some embodiments, alignment of the femur and pelvis MRI-reconstructed portions and the femur and pelvis CT-reconstructed portions generates a surfaced 3D model of the patient's pelvis, hip, and lower limb bones. That is, processor 111 may generate a simulated model of the patient's pre-operative bone structure (simulated bone structure model), for example. It is noted that the equivalent process is performed for the tibia with the knee joint and/or the ankle joint in the context of a tibial osteotomy procedure.
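As an illustrative sketch of the landmark alignment and the sub-0.1mm average distance error check, the following Python fragment aligns corresponding landmark sets by translating one centroid onto the other and reports the average point-to-point error. A full rigid registration would additionally solve for rotation (for example via the Kabsch algorithm); that step is omitted here for brevity, and all names are illustrative:

```python
import math

def align_landmarks(mri_pts, ct_pts):
    """Translate the CT landmarks so their centroid matches the MRI
    landmark centroid, then report the average point-to-point distance
    error (the quantity compared against the 0.1mm criterion)."""
    n = len(mri_pts)
    cm = tuple(sum(p[i] for p in mri_pts) / n for i in range(3))
    cc = tuple(sum(p[i] for p in ct_pts) / n for i in range(3))
    shift = tuple(cm[i] - cc[i] for i in range(3))
    moved = [tuple(p[i] + shift[i] for i in range(3)) for p in ct_pts]
    err = sum(math.dist(a, b) for a, b in zip(mri_pts, moved)) / n
    return moved, err
```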
[0084] In some embodiments, generating the personalised pre-operative software model of the patient's anatomical structures includes a 3D anatomical analysis. That is, a 3D anatomical analysis of the bone position and dimension data, for example. In some embodiments, the 3D anatomical analysis may include defining one or more axes and planes in the bone position and dimension data. In some embodiments, the 3D anatomical analysis may include measuring one or more 3D angles between the defined one or more axes and planes. Processor 111, continuing step S202 of method 200, may perform the 3D anatomical analysis on the previously generated MRI-reconstructed bone portions and/or the CT-reconstructed bone portions, or simulated bone structure model. In some embodiments, any one of the MRI-reconstructed bone portions or the CT-reconstructed bone portions may contain a plurality of surfaced bone models, such as a femur bone model or a tibial bone model, for example.
[0085] Processor 111 may process different portions of the femur bone model to define the one or more axes and planes in the bone position and dimension data. Processor 111, in performing the 3D anatomical analysis, may define an axis of the femoral neck of the femur bone model. To determine the axis of the femoral neck, processor 111 performs slicing of the femoral neck at set distance intervals to produce a plurality of slices. Processor 111 then computes a centroid for each of the plurality of slices. The axis of the femoral neck is then defined as the line of best fit through the centroids of the plurality of slices. In some embodiments, the set distance intervals are about 1mm, for example. In the context of a tibial osteotomy, the one or more axes and planes may include an axis or plane of the medial condyle, the lateral condyle, the intercondylar eminence, and/or the medial malleolus, and may include an indication of the length of each by defining end points of the respective axes.
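The femoral neck axis construction described above (slice at roughly 1mm intervals, take a centroid per slice, fit a line through the centroids) can be sketched in Python as follows. The line of best fit is taken as the principal direction of the centroid cloud, found here via power iteration on its covariance matrix; all names are illustrative:

```python
from collections import defaultdict

def slice_centroids(points, axis=2, interval=1.0):
    """Bucket surface vertices into slabs of `interval` mm along `axis`
    and return one centroid per slab."""
    buckets = defaultdict(list)
    for p in points:
        buckets[int(p[axis] // interval)].append(p)
    cents = []
    for k in sorted(buckets):
        pts = buckets[k]
        n = len(pts)
        cents.append(tuple(sum(q[i] for q in pts) / n for i in range(3)))
    return cents

def best_fit_line(centroids, iters=200):
    """Total-least-squares line through the centroids: mean point plus the
    principal direction of the cloud (power iteration on the 3x3
    covariance matrix; a sketch, not a production eigen-solver)."""
    n = len(centroids)
    mean = tuple(sum(c[i] for c in centroids) / n for i in range(3))
    d = [[c[i] - mean[i] for i in range(3)] for c in centroids]
    cov = [[sum(r[i] * r[j] for r in d) / n for j in range(3)] for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v
```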
[0086] Processor 111, in performing the 3D anatomical analysis, may define a posterior condylar axis of the femur bone model. To determine the posterior condylar axis, processor 111 first identifies the medial femoral condyle of the femur bone model and the lateral femoral condyle of the femur bone model. Processor 111 then computes a first most posterior point on the identified medial femoral condyle of the femur bone model. Processor 111 then computes a second most posterior point on the identified lateral femoral condyle of the femur bone model. The posterior condylar axis is then defined as the line that connects the first most posterior point and the second most posterior point. An equivalent process may be performed for the condyles, eminences, and malleolus of the tibia.
[0087] Processor 111, in performing the 3D anatomical analysis, may define a long axis of the femur bone model, or of the tibial bone model, depending on anatomical context. In this particular example, determination of the long axis of the femur bone model is presented. To determine the long axis of the femur, processor 111 first identifies the femoral condyles and the lesser trochanter. Processor 111 then computes the centroid of the widest cross-sectional area of the femoral condyles. Processor 111 then identifies a proximal femoral slice positioned just below the lesser trochanter. Processor 111 then computes the centroid of the proximal femoral slice. The long axis of the femur is then defined as the line that connects the computed centroid of the widest cross-sectional area of the femoral condyles and the computed centroid of the proximal femoral slice.
[0088] Processor 111, in performing the 3D anatomical analysis, may define a plane across the base of the epiphysis of the femur bone model. To determine the plane across the base of the epiphysis, processor 111 first identifies the epiphysis. Processor 111 then computes a plane that best fits the base of the epiphysis. The plane across the base of the epiphysis is then defined as the computed plane that best fits the base of the epiphysis.
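The plane that best fits the base of the epiphysis can be illustrated with an ordinary least-squares plane fit z = a·x + b·y + c, solved here by Cramer's rule on the 3x3 normal equations. This is a sketch only: it assumes the plane is not near-vertical in the chosen coordinate frame, where a total-least-squares (PCA) fit would be preferred:

```python
def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through 3D points,
    solved via the 3x3 normal equations using Cramer's rule.
    Returns the coefficients (a, b, c)."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points); syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    b = [sxz, syz, sz]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)

    def repl(col):  # replace column `col` of A with b (Cramer's rule)
        return [[b[i] if j == col else A[i][j] for j in range(3)] for i in range(3)]

    return tuple(det3(repl(c)) / d for c in range(3))
```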
[0089] Processor 111, in performing the 3D anatomical analysis, may measure a femoral anteversion angle. Processor 111, in measuring the femoral anteversion angle, first defines a plane perpendicular to the defined long axis of the femur. Processor 111 then computes the angle between the defined axis of the femoral neck and the defined posterior condylar axis. The computed angle is the femoral anteversion angle. In some embodiments, the femoral anteversion angle is measured as the angle between the defined axis of the femoral neck and the defined posterior condylar axis when projected onto the defined plane perpendicular to the defined long axis of the femur.
[0090] Processor 111, in performing the 3D anatomical analysis, may measure a femoral neck shaft angle. Processor 111, in measuring the femoral neck shaft angle, computes the 3D angle between the axis of the femoral neck and the long axis of the femur. The computed angle is the femoral neck shaft angle. Processor 111, in performing the 3D anatomical analysis, may measure an inferior slip angle. Processor 111, in measuring the inferior slip angle, first defines a line perpendicular to the plane across the base of the epiphysis. Processor 111 then computes the 3D angle between the defined line perpendicular to the plane across the base of the epiphysis and the long axis of the femur. The computed angle is the inferior slip angle.
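The anteversion and neck-shaft angle measurements above reduce to two vector operations: projecting an axis onto the plane perpendicular to the long axis, and measuring the angle between two directions. A minimal Python sketch, assuming the axes are supplied as 3-tuples of direction components (illustrative names):

```python
import math

def _unit(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def angle_deg(u, v):
    """3D angle between two direction vectors in degrees
    (as used for the neck-shaft angle)."""
    u, v = _unit(u), _unit(v)
    d = sum(a * b for a, b in zip(u, v))
    return math.degrees(math.acos(max(-1.0, min(1.0, d))))

def project_onto_plane(v, normal):
    """Remove the component of v along the plane normal, so an angle can
    be measured in the plane perpendicular to the long axis."""
    n = _unit(normal)
    d = sum(a * b for a, b in zip(v, n))
    return tuple(a - d * b for a, b in zip(v, n))

def anteversion_deg(neck_axis, condylar_axis, long_axis):
    """Femoral anteversion: angle between the neck axis and the posterior
    condylar axis, projected onto the plane perpendicular to the long axis."""
    return angle_deg(project_onto_plane(neck_axis, long_axis),
                     project_onto_plane(condylar_axis, long_axis))
```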
[0091] Processor 111, in performing the 3D anatomical analysis, may measure a posterior slip angle. Processor 111, in measuring the posterior slip angle, first manipulates the previously generated 3D surfaced model of the patient's pre-operative bone structure such that the knee is flexed at a 30° angle, and the hip is abducted at a 45° angle. That is, processor 111 manipulates the 3D surfaced model of the patient's pre-operative bone structure into a paediatric hip view (frog leg lateral view), for example. Processor 111 then defines a line perpendicular to the plane across the base of the epiphysis. Processor 111 then computes the 3D angle between the defined line perpendicular to the plane across the base of the epiphysis and the long axis of the femur. The computed angle is the posterior slip angle.
[0092] Processor 111, in performing the 3D anatomical analysis, may measure a length of the femur. Processor 111, in measuring the length of the femur, first computes a most proximal point on the femur. Processor 111 then computes a most distal point on the femur. Processor 111 then computes the length of the femur to be the distance between the most proximal point on the femur and the most distal point on the femur. In some embodiments, a user may interact with the electronic interface 113 to manually determine, or measure, any of the aforementioned measurements. Processor 111, upon completion of the 3D anatomical analysis, compiles the aforementioned generated data to finalise the personalised pre-operative software model of the individual patient's anatomical structure. Processor 111, on finalising the pre-operative software model, moves onto step S204.
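The femur length measurement (distance between the most proximal and most distal points) might be sketched as follows, assuming proximal/distal is judged along a supplied long-axis direction (an assumption of this sketch; the specification does not state how the extreme points are found):

```python
import math

def bone_length(points, long_axis=(0.0, 0.0, 1.0)):
    """Length of a bone model: the distance between its most proximal and
    most distal vertices, with 'proximal/distal' taken along `long_axis`."""
    proj = lambda p: sum(a * b for a, b in zip(p, long_axis))
    proximal = max(points, key=proj)
    distal = min(points, key=proj)
    return math.dist(proximal, distal)
```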
[0093] At S202, in addition to generating a personalised pre-operative software model of patient anatomical structure of an individual patient based on received pre-operative medical imagery of the patient anatomical structure, patient movement analysis data is used to generate the personalised pre-operative software model of patient anatomical structure. The patient movement analysis data may be generated by virtually mobilising the relevant joint in an anatomical simulator such as OpenSim. For example, an estimation of a muscle moment arm may be made in this manner and stored or otherwise used as all or part of the patient movement analysis data.
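One established way to estimate a muscle moment arm from a virtually mobilised joint, as in OpenSim-style models, is the tendon-excursion method: the moment arm equals the derivative of muscle-tendon length with respect to joint angle. The sketch below applies this to a deliberately simple planar geometry; the origin position, insertion radius, and all values are illustrative assumptions, not the patent's anatomy:

```python
import math

def muscle_length(theta, origin=(0.0, 0.30), insertion_radius=0.05):
    """Toy planar geometry: a muscle runs from a fixed origin on one
    segment to an insertion at `insertion_radius` on a segment rotating
    about the joint (at the coordinate origin) by angle `theta` (radians)."""
    ix = insertion_radius * math.sin(theta)
    iy = -insertion_radius * math.cos(theta)
    return math.dist(origin, (ix, iy))

def moment_arm(theta, h=1e-5):
    """Tendon-excursion method: moment arm = dL/dtheta,
    approximated here by a central finite difference."""
    return (muscle_length(theta + h) - muscle_length(theta - h)) / (2 * h)
```

In a real workflow the length function would come from the anatomical simulator's muscle path rather than this closed-form toy.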
[0094] Alternatively or additionally, the patient movement analysis data may be generated by a movement sensor system observing movement of the patient anatomical structure of the individual patient over a period of time. The period of time may be, for example, the time taken to perform a prescribed movement one or more times, or a set of prescribed movements. The sensor system may include accelerometers attached to the skin of the patient, and may include cameras and other imaging technology. Figure 8 illustrates patient movement analysis data of a hip joint of a sample patient, wherein the pre-operative information is the relevant data in generating the personalised pre-operative software model of patient anatomical structure. Muscle moments of the glutei muscles at a range of hip flexion angles are calculated as patient movement analysis data and included in the personalised pre-operative software model of patient anatomical structure. In the context of a tibial osteotomy, equivalent analysis of movement around a knee joint and/or an ankle joint may be performed.
[0095] At step S204, processor 111 receives a software definition of a surgical procedure to be performed in relation to the pre-operative software model. In some embodiments, processor 111 may receive the software definition from software definition module 128. In some embodiments, the software definition may be received as input from a user via electronic interface 113. In some embodiments, the orthopaedic surgery planning device 100 may receive the software definition from an external device via communications module 112 over network 140.
[0096] In embodiments where the software definition is received via user input or an external device, the software definition may define a value range for each of: at least one osteotomy in at least one specified bone; for each osteotomy, a specified bone from the patient's anatomical structure to be cut by the osteotomy; for each osteotomy, a position and orientation of an osteotomy plane; and, for each osteotomy, a relative position and orientation of at least two post-osteotomy distinct bone portions. Processor 111, upon receiving the software definition, may iteratively constrain the software definition of the surgical procedure to include a patient-specific osteotomy plan via simulation. To constrain the received software definition, processor 111 may determine a specific value from within each of the plurality of defined value ranges for the surgical procedure.
[0097] In embodiments where the software definition is received via user input or an external device, the software definition may further define a value range for each of: a repositioning of at least one specified bone within an allowable range; and a range of available implants with corresponding chisels. Processor 111, upon receiving the software definition, may iteratively constrain the software definition of the surgical procedure to include a patient-specific implant plan via simulation. To constrain the software definition, processor 111 may determine at least one repositioning, and for each of the at least one determined repositionings, processor 111 determines a selected implant and corresponding chisel from the available range. In some embodiments, the software definition includes a 3D model of the selected implant (implant model) and corresponding chisel (chisel model).
[0098] In embodiments where the software definition is received via software definition module 128, processor 111 executes the software definition module 128 to generate the receivable software definition. Processor 111, in executing the software definition module 128, defines the plurality of value ranges for the surgical procedure. Processor 111 establishes the at least one osteotomy in the bone position and dimension data. In some embodiments, a user may interact with the electronic interface 113 to manually define the at least one osteotomy. Processor 111, for each defined osteotomy, determines the specific bone from the patient's anatomical structure to be cut by the osteotomy. Processor 111, for each defined osteotomy, determines the position and orientation of an osteotomy plane. Processor 111, for each defined osteotomy, determines the relative position and orientation of at least two post-osteotomy distinct portions. In some embodiments, processor 111 may iteratively constrain the software definition of the surgical procedure, received via the software definition module 128, to include the patient-specific osteotomy plan via simulation. To constrain the software definition, processor 111 may determine a specific value from within each of the plurality of defined value ranges for the surgical procedure.
[0099] In some embodiments, Processor 111, in executing the software definition module 128, may further define the value range for each of: the repositioning of at least one specified bone within an allowable range; and the range of available implants with corresponding chisels. The allowable range is a range wherein a range of motion of the at least one repositioned specified bone is within feasible kinematic movement constraints of the related joint. Processor 111, in defining the value range of the at least one specified bone, manipulates the simulated bone structure model to reposition the proximal femur. Processor 111 repositions the proximal femur until the desired corrections are achieved. In some embodiments, the desired corrections include increased range of motion in at least one of: hip flexion, hip adduction, and/or hip rotation, for example. In some embodiments, a user may interact with the electronic interface 113 to manually reposition the proximal femur to achieve the desired corrections. In the context of a tibial osteotomy, an equivalent process may be performed for a knee joint and/or an ankle joint.
[0100] Processor 111, upon achieving the desired corrections, determines a desired implant from the range of available implants in order to implement the post-operative patient anatomical structure in the surgical procedure. Processor 111 may choose the desired implant such that it satisfies a combination of: the largest implant possible, the longest implant possible, and the most stable implant possible. In some embodiments, processor 111 may choose a plurality of implants that satisfy the above criteria, presenting a user with a plurality of options. In some embodiments, processor 111 may perform a plurality of simulations of the simulated bone structure model to determine said implant. In some embodiments, a user may interact with the electronic interface 113 to manually select an implant from the range of available implants. Processor 111, upon determining the implant from the range of available implants, determines the corresponding chisel, or chisels, for the selected implant. In some embodiments, processor 111 may iteratively constrain the software definition, received via the software definition module 128, to include a patient-specific implant plan via simulation. To constrain the software definition, processor 111 may determine at least one repositioning, and for each of the at least one determined repositionings, processor 111 determines a selected implant and corresponding chisel from the available range.
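The "largest, longest, most stable" selection criteria could be combined as a weighted score over candidate implants, as in this hedged sketch. The candidate fields and weights are illustrative; a clinical implementation would derive the stability term from the simulations described above:

```python
def select_implant(candidates, weights=(1.0, 1.0, 1.0)):
    """Rank candidate implants by a weighted combination of size, length,
    and stability, mirroring the 'largest / longest / most stable'
    criteria. Each candidate is a dict with illustrative fields
    'name', 'size', 'length', and 'stability'."""
    ws, wl, wt = weights
    def score(c):
        return ws * c["size"] + wl * c["length"] + wt * c["stability"]
    return max(candidates, key=score)
```

To reflect the embodiment that presents a user with a plurality of options, the same score could instead sort the candidates and return the top few.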
[0101] Processor 111, upon constraining the software definition, continues at step S206. At step S206, processor 111 generates a personalised post-operative software model of the patient's anatomical structure (post-operative software model). Processor 111 generates the post-operative software model based on the personalised pre-operative software model and the received software definition of the surgical procedure. In generating the post-operative software model, processor 111 integrates the received 3D model of the selected implant into the bone of interest of the simulated bone structure model, for example the femur or tibia. Processor 111 performs the integration of the implant model such that the implant model is in proximity to the centre of the femoral head. Processor 111 further integrates the implant model such that the implant model is: at least 2.5mm from the cortical bone; at least 3mm from the cortical bone; or at least 5mm from the cortical bone. Optionally, processor 111 may initially seek to satisfy a relatively greater minimum cortical bone separation, for example 5mm, and if that is unachievable, may then (for example, after notifying an operator and optionally seeking approval) relax the minimum cortical bone separation requirement to a relatively smaller minimum cortical bone separation such as 3mm or 2.5mm. Such relaxation of minimum cortical bone separation may be particularly helpful in abnormal anatomy implementations. Processor 111 further integrates the implant model such that the implant model has adequate structural fixation within the femur bone. The result of the integration of the implant model with the femur of the simulated bone structure model is a corrected femur model. The result of the integration of the implant model with the tibia of the simulated bone structure model is a corrected tibia model. Processor 111 then integrates the received chisel model into the corrected femur model.
Processor 111 integrates the chisel model by overlapping it with the already integrated implant model.
[0102] Upon integration of the implant model and the chisel model with the simulated bone structure model, processor 111 continues step S206 and performs a bone thickness analysis. The bone thickness analysis verifies that adequate bone thickness remains in place once the chisel or implant are inserted into the bone. To perform the bone thickness analysis, processor 111 generates a post-operative software model of the proximal femur by subtracting the model of the chisel or implant from the pre-operative software model of the proximal femur. Processor 111 then analyses the generated post-operative software model of the proximal femur to ensure that the thickness of the bone is greater than 2.5mm, greater than 3mm, or greater than 5mm. Optionally, processor 111 may initially seek to satisfy a relatively greater minimum bone thickness, for example 5mm, and if that is unachievable, may then (for example, after notifying an operator and optionally seeking approval) relax the minimum bone thickness requirement to a relatively smaller minimum bone thickness such as 3mm or 2.5mm. Such relaxation of minimum bone thickness may be particularly helpful in abnormal anatomy implementations. If the thickness of the bone is less than the minimum bone thickness, processor 111 may return to S204 and require receipt of an alternate software definition.
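The relaxation logic described for both the minimum cortical bone separation and the minimum bone thickness (try 5mm first, then 3mm, then 2.5mm, otherwise fall back to S204 for an alternate software definition) can be captured in a small helper. Here `is_achievable` stands in for the simulation and operator-approval step and is an assumption of this sketch:

```python
def choose_min_clearance(is_achievable, thresholds=(5.0, 3.0, 2.5)):
    """Try the largest minimum clearance first (e.g. 5mm separation from
    cortical bone, or 5mm remaining wall thickness) and relax toward
    smaller clearances only when the larger one is unachievable. Returns
    None when even the smallest threshold fails, signalling that an
    alternate software definition is required (return to S204)."""
    for t in thresholds:
        if is_achievable(t):
            return t
    return None
```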
[0103] Processor 111, upon completing the bone thickness analysis, performs a growth plate violation analysis. The growth plate violation analysis verifies that adequate growth plate volume remains once the implant is implanted into a bone orifice created by the chisel. To perform the growth plate violation analysis, processor 111 generates a post-operative software model of the growth plate by subtracting the model of the implant from the pre-operative software model of the growth plate. Processor 111 then analyses the pre-operative software model of the growth plate to compute a volume of the pre-operative software model of the growth plate. Processor 111 then analyses the generated post-operative software model of the growth plate to compute a volume of the post-operative software model of the growth plate. Processor 111 then compares the volume of the post-operative software model of the growth plate to the volume of the pre-operative software model of the growth plate.
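A voxel-based sketch of this subtract-and-compare step: remove the implant-occupied voxels from the pre-operative growth plate and report the fraction of volume lost. The voxel representation and the voxel size are illustrative assumptions; the patent's models are surface meshes, for which a boolean subtraction would be used instead:

```python
def growth_plate_violation(plate_voxels, implant_voxels, voxel_mm=0.5):
    """Growth plate violation sketch: the post-operative plate is the
    pre-operative plate minus the voxels the implant occupies, and the
    violation is the fraction of growth plate volume removed."""
    pre = set(plate_voxels)
    post = pre - set(implant_voxels)
    vol = voxel_mm ** 3
    pre_vol = len(pre) * vol
    post_vol = len(post) * vol
    return pre_vol, post_vol, 1.0 - post_vol / pre_vol
```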
[0104] Processor 111, upon generating the personalised post-operative software model of the patient's anatomical structure, continues at step S208. At step S208, processor 111 simulates movement of the bones and muscles of the patient's anatomical structure to generate a simulation output. To simulate movement of the bones and muscles of the patient's anatomical structure, processor 111 first generates a pre-operative 4D personalised functional model of the patient's anatomical structure. Processor 111 then generates a post-operative 4D personalised functional model of the patient's anatomical structure. In some embodiments, a user may manually generate the pre-operative 4D personalised functional model via the electronic interface 113. In some embodiments, a user may manually generate the post-operative 4D personalised functional model via the electronic interface 113.
[0105] Processor 111, further to generating the personalised post-operative software model of the patient anatomical structure at S206, simulates movement of the bones and muscles of the patient's anatomical structure to generate simulation output at S208. The simulation of movement predicts the range of movement and moment arms at different angles that will be achievable post-operation. Furthermore, a machine learning module may be trained to predict range of movement and/or moment arms at a range of angles. The training input data encodes patient anatomical structure (for example, two-dimensional or three-dimensional images of hip joints including the hip socket, femur, and glutei muscles; or two-dimensional or three-dimensional images of knee joints or ankle joints including the gastrocnemius and/or soleus) and is labelled with ground truths being one or more parameters representing range of movement and muscle moment arms at one angle or a range of angles. The prediction may be illustrated in the post-operative 4D personalised functional model, for example. The machine learning module is thereby trained to predict the parameters representing range of movement and muscle moment arms based on arbitrary input patient anatomical structure (such as the personalised post-operative software model of the patient anatomical structure at S206). In a particular example, an optimisation algorithm iteratively running different constraints on the software definition of the surgical procedure (for example via simulated annealing, backward error propagation, or another optimisation technique) optimises the predicted parameters including range of movement and moment arm at one or more angles.
[0106] In generating the pre-operative 4D personalised functional model, processor 111 identifies surfaces and landmarks of the plurality of bones in the simulated bone structure model. Processor 111 utilises the identified surfaces and landmarks to create body and joint reference systems. The body reference systems relate to the pelvis and femurs. The joint reference system relates to the hip. Processor 111 also identifies centroids of the glutei muscle attachment areas on the plurality of bones in the simulated bone structure model. The glutei muscle attachment areas may include a muscle origin and insertion point for each muscle. In some embodiments, a user may manually identify the surfaces and landmarks of the plurality of bones in the simulated bone structure model. In some embodiments, a user may manually identify the centroids of the glutei muscle attachment areas on the plurality of bones in the simulated bone structure model. In the context of the tibial osteotomy, an equivalent procedure is performed for the centroids of the gastrocnemius and/or the soleus.
[0107] In some embodiments, processor 111 creates the body reference system for the pelvis using a stereophotogrammetric system and anatomical landmark calibration. Methods of stereophotogrammetry and anatomical landmark calibration can be found in "Cappozzo, A., Catani, F., Croce, U. D., & Leardini, A. (1995). Position and orientation in space of bones during movement: anatomical frame definition and determination. Clinical biomechanics (Bristol, Avon), 10(4), 171-178. https://doi.org/10.1016/0268-0033(95)91394-t", for example.
[0108] In some embodiments, processor 111 creates the body reference system for the femur using a navigation system and anatomical reference frame definition methods. Methods of navigation systems and anatomical reference frame definition methods can be found in "Belvedere, C., Ensini, A., Leardini, A., Bianchi, L., Catani, F. and Giannini, S. (2007), Alignment of resection planes in total knee replacement obtained with the conventional technique, as assessed by a modern computer-based navigation system. Int. J. Med. Robotics Comput. Assist. Surg., 3: 117-124. https://doi.org/10.1002/rcs.131", for example.
[0109] Processor 111 then creates a parent reference system and a child reference system for each hip. Processor 111 centres the parent and child reference systems on the hip joint centre for each hip. Processor 111 then pairs the appropriate bones and hip joints together to generate a 4D functional bone structure. Processor 111 then integrates the glutei muscles with the 4D functional bone structure by defining lines of action of the glutei muscles between each identified muscle origin and insertion point. The 4D functional bone structure integrated with the glutei muscles results in the pre-operative 4D personalised functional model.
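Geometrically, defining a line of action between an identified muscle origin and insertion point amounts to a normalised direction vector between the two centroids. A minimal sketch follows; Python, the function name, and the argument names are illustrative assumptions, not part of the disclosed system.

```python
import numpy as np

def line_of_action(origin_centroid, insertion_centroid):
    """Unit direction of a muscle's line of action, origin to insertion.

    Centroids are 3-D points expressed in the body reference system;
    names are illustrative, not from the source.
    """
    d = np.asarray(insertion_centroid, float) - np.asarray(origin_centroid, float)
    return d / np.linalg.norm(d)  # normalise to unit length
```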
[0110] Upon generating the pre-operative 4D personalised functional model, processor 111 generates the post-operative 4D personalised functional model. Processor 111 identifies surfaces and landmarks of the plurality of bones in the simulated bone structure model and the post-operative software model. Specifically, processor 111 utilises the corrected femur model in generating the post-operative 4D personalised functional model. Processor 111 utilises the identified surfaces and landmarks to create corrected body and corrected joint reference systems. The corrected body reference systems relate to the pelvis and corrected femur. The corrected joint reference system relates to the hip. Processor 111 also identifies centroids of the glutei muscle attachment areas on the plurality of bones in the simulated bone structure model and the post-operative software model. The glutei muscle attachment areas may include a muscle origin and insertion point for each muscle. In some embodiments, a user may manually identify the surfaces and landmarks of the plurality of bones in the simulated bone structure model and the post-operative software model. In some embodiments, a user may manually identify the centroids of the glutei muscle attachment areas on the plurality of bones in the simulated bone structure model, noting that an equivalent procedure may be performed in the context of other bones and muscles such as the tibia, gastrocnemius, and soleus.
[0111] The processor 111 may iteratively repeat the process of constraining the software definition and generating the 4D personalised functional model. For example, an algorithm may calculate one or more parameter values or metrics from the 4D personalised functional model, such as range of movement, gait symmetry (i.e. comparison of stepping with left foot down to stepping with right foot down), power, among others, to represent, analyse, or otherwise assess the 4D personalised functional model. Based on a computational solver technique such as backward error propagation or simulated annealing, for example, the processor 111 may be configured to modify the constraints applied to the software definition in order to optimise the calculated parameter value or metric to achieve a maximum/minimum, a local maximum/minimum, or a predefined target. The software definition is effectively a generic definition of a surgical procedure with one or more degrees of freedom, wherein each degree of freedom is selectable within a defined range. The processor 111 may be configured, by the iterative process outlined herein, to calculate, select, or otherwise to determine, the values within the defined range that provide an optimised output, that is, to generate a specific version of the generic definition of the surgical procedure. Optimised outputs may be absolute or may be based on comparison of the pre-operative and post-operative functional models of the patient. As an example, an absolute range of movement about the hip may be sought as an optimum, or an improvement of X degrees or Y percent.
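One way to realise the iterative constraining described above is a stochastic solver over the degrees of freedom. The sketch below uses simulated annealing (one of the techniques named in the text) over a generic bounds dictionary; the `metric` callable stands in for evaluating the 4D personalised functional model, which is far more involved in practice, and all names are illustrative.

```python
import math
import random

def optimise_constraints(bounds, metric, iters=2000, seed=0):
    """Simulated-annealing sketch of constraining the software definition.

    bounds: {parameter_name: (lo, hi)} degrees of freedom of the generic
    surgical procedure definition. metric: callable mapping a candidate
    parameter dict to a score to maximise (e.g. predicted range of
    movement). Both are placeholders for the models described in the text.
    """
    rng = random.Random(seed)
    current = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
    score = metric(current)
    best, best_score = dict(current), score
    for i in range(iters):
        temp = max(1e-3, 1.0 - i / iters)          # cooling schedule
        cand = dict(current)
        k = rng.choice(list(bounds))               # perturb one parameter
        lo, hi = bounds[k]
        cand[k] = min(hi, max(lo, cand[k] + rng.gauss(0, 0.1 * (hi - lo))))
        s = metric(cand)
        # accept improvements always; worse moves with Boltzmann probability
        if s > score or rng.random() < math.exp((s - score) / temp):
            current, score = cand, s
            if s > best_score:
                best, best_score = dict(cand), s
    return best, best_score
```

A run against a toy metric with a single degree of freedom converges toward the metric's optimum while always respecting the defined range, mirroring how specific values are determined within the value ranges of the software definition.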
[0112] The processor 111 may, based on the post-operative 4D personalised functional model, determine one or more rotational corrections that will improve patient range of motion, power, or gait, and then determine an osteotomy plane or planes and implant selection and/or positions to achieve those rotational corrections. Figures 5A to 5C illustrate planned rotational corrections in the specific example of the patient imaged in Figures 4D to 4F. The femoral neck shaft angle is to be rotated by 17.5 degrees in a first rotational direction. The femoral anteversion is to be rotated 38.8 degrees in a second rotational direction opposing the first rotational direction. A femoral head flexion angle is to be rotated 32.7 degrees. The determined osteotomy plane or planes and implant selection and/or positions are exemplary of constraints on the software definition of the surgical procedure.
[0113] For example, the rotational corrections resulting from osteotomies may be calculated deterministically by an algorithm. A machine learning algorithm may be trained, via ground truth training data comprising pre-osteotomy bone (and/or muscle) images, osteotomy planes and positions, and resultant rotational corrections, to predict a rotational correction resulting from osteotomies at particular planes and positions on a particular pre-osteotomy bone. Then, by a solving algorithm such as backward error propagation or simulated annealing, the machine learning algorithm determines, based on input images of a bone and desired rotational corrections, osteotomy planes and positions to achieve the desired rotational corrections or to optimise correlation between the desired rotational corrections and the predicted rotational corrections. Similarly, the machine learning algorithm is trained, given the determined osteotomy planes and positions and the desired rotational corrections, to select an implant (from a predefined list forming part of the software definition of the surgical procedure) and implant position to best correlate the predicted rotational correction (as modified by the implant selection and position) with the desired rotational correction. Figure 6A illustrates a bone portion from the personalised pre-operative model of the patient anatomical structure, onto which candidate osteotomy planes and positions are drawn for illustrative purposes. A predicted rotational correction resulting from the osteotomies of Figure 6A is illustrated in Figure 6B, representative of a personalised post-operative model of the patient anatomical structure (e.g. predicted outcome of the surgical procedure). Figure 6C illustrates a selection and position of an implant to best achieve the desired rotational correction, and Figure 6D illustrates an initial position of the implant.
The machine learning algorithm may be trained by multivariate analysis to predict how variables including osteotomy plane, osteotomy position, implant selection, and implant position will influence the rotational correction to a bone (represented by the personalised pre-operative model of the patient anatomical structure). Therefore, in use, the trained machine learning algorithm can, based on the personalised pre-operative model of the patient anatomical structure, determine one or more parameters from among osteotomy plane, osteotomy position, implant selection, and implant position, to best correlate a predicted rotational correction with a desired rotational correction. Those parameters are selectable within ranges or other borders or limits defined in the software definition of the surgical procedure.
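As a deliberately simplified illustration of the multivariate training step, a linear least-squares fit can relate candidate parameter vectors to observed rotational corrections. Real training on image data would use a far richer model; the variable encoding and function names below are assumptions.

```python
import numpy as np

def fit_correction_model(X, y):
    """Least-squares stand-in for multivariate training.

    X: rows of candidate parameter vectors, e.g. [osteotomy plane angle,
    osteotomy position, implant selection index, implant position]
    (illustrative encoding); y: ground-truth rotational corrections in
    degrees for those training cases.
    """
    X1 = np.hstack([X, np.ones((X.shape[0], 1))])  # append affine (bias) column
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef

def predict_correction(coef, x):
    """Predicted rotational correction for one candidate parameter vector."""
    return float(np.append(np.asarray(x, float), 1.0) @ coef)
```

With such a predictor in hand, a solver can search the allowable parameter ranges for the vector whose predicted correction best matches the desired correction.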
[0114] In some embodiments, processor 111 creates the corrected body reference system for the pelvis using the stereophotogrammetric system and anatomical landmark calibration. In some embodiments, processor 111 creates the corrected body reference system for the corrected femur using the navigation system and anatomical reference frame definition methods. Processor 111 then creates a corrected parent reference system and a corrected child reference system for each hip. Processor 111 centres the corrected parent and corrected child reference systems on the hip joint centre for each hip. Processor 111 then pairs the appropriate bones, corrected femur, and hip joints together to generate a 4D corrected functional bone structure. Processor 111 then integrates the glutei muscles with the 4D corrected functional bone structure by defining lines of action of the glutei muscles between each identified muscle origin and insertion point. The 4D corrected functional bone structure integrated with the glutei muscles results in the post-operative 4D personalised functional model, or the corrected-anatomy model.
[0115] Processor 111, upon generating the pre-operative 4D personalised functional model and the post-operative 4D personalised functional model, simulates movement of the bones and muscles of the patient's anatomical structure to generate a simulation output. Processor 111 generates the simulation output by determining hip range of motion limits and estimating glutei muscle moment arms. To determine the hip's range of motion limits, processor 111 mobilises the hip through its passive range of motion and the range of motion in each degree of freedom. That is, processor 111 mobilises the hip such that it undergoes hip flexion, hip extension, internal rotation, external rotation, hip abduction, and hip adduction, for example. The range of motion limits are then defined by processor 111 when there is bone-on-bone contact during mobilisation. Processor 111 then estimates the glutei muscle moment arms by performing a muscle analysis. Muscle moment arms are the perpendicular distance of the muscle line of action from a joint axis. In the alternative context of a tibial osteotomy, an equivalent process may be performed in the context of a knee joint, noting that rotation may be excluded in that context.
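The moment arm definition above (perpendicular distance of the muscle line of action from a joint axis) reduces to the common-normal distance between two 3-D lines. A minimal geometric sketch follows, with illustrative argument names; the disclosed muscle analysis is not limited to this formulation.

```python
import numpy as np

def moment_arm(origin, insertion, joint_centre, joint_axis):
    """Moment arm as the common-normal distance between the muscle line of
    action and the joint axis (both treated as infinite 3-D lines)."""
    p1 = np.asarray(origin, float)
    d1 = np.asarray(insertion, float) - p1          # muscle line of action
    p2 = np.asarray(joint_centre, float)
    d2 = np.asarray(joint_axis, float)              # joint axis direction
    n = np.cross(d1, d2)                            # common normal
    if np.linalg.norm(n) < 1e-12:                   # degenerate: parallel lines
        return float(np.linalg.norm(np.cross(p2 - p1, d1 / np.linalg.norm(d1))))
    return float(abs(np.dot(p2 - p1, n)) / np.linalg.norm(n))
```

For example, a muscle line running parallel to the femoral shaft at a 3-unit offset from a mediolateral joint axis yields a 3-unit moment arm.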
[0116] Upon generating the simulation output, processor 111 moves to step S210. At step S210, processor 111 may adjust the personalised post-operative software model based on the simulation output. That is, processor 111 may alter the positioning of the implant or the positioning of the corrected body and corrected joint reference systems to improve the range of motion limits and estimated glutei muscle moment arms, for example. In some embodiments, processor 111 may utilise artificial intelligence and/or machine learning to adjust the post-operative software model based on the simulation output. An artificial intelligence and/or machine learning code module for performing the adjustment of the post-operative software model based on the simulation output may be implemented within the surgery planning module 124. The machine learning algorithm may be trained using desired, or ideal, 4D functional software models, for example.
[0117] Processor 111, upon completing adjustment of the personalised post-operative software model based on the simulation output (if required), continues executing method 200 and proceeds to step S212. At step S212, processor 111 generates a personalised surgical cutting guide model (cutting guide model) based on the personalised post-operative software model or the adjusted personalised post-operative software model. Surgical guide design is illustrated in Figure 9. In generating the cutting guide model, processor 111 defines a mask portion 910 of the surgical cutting guide model. The mask portion is configured to conform to the at least one curve or other geometric feature of the bone position and dimension data of bones in the patient's anatomical structure. Processor 111 defines each mask portion such that it has a minimum span of at least 160° over the contour of the respective bone 905. On defining each mask portion, processor 111 then uniformly externally offsets each mask portion. In some embodiments, the uniform external offset is about 3.2mm. Processor 111 then smooths the contour of each mask portion. In some embodiments, processor 111 performs smoothing of each mask portion with an influence distance of 0.5mm. Processor 111 then finalises each mask portion by ensuring that the mesh of each respective bone has no errors. If the mesh of the respective bone includes at least one error, processor 111 may automatically fix the at least one error.
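The uniform external offset of each mask portion can be approximated per vertex. The sketch below displaces sampled surface points along their outward normals by the 3.2mm figure mentioned above; a production solid offset would also handle self-intersection and remeshing, which this deliberately omits, and the names are illustrative.

```python
import numpy as np

def offset_mask_vertices(vertices, normals, offset_mm=3.2):
    """Uniform external offset of a mask surface (per-vertex approximation).

    vertices: (N, 3) surface points sampled over the bone contour;
    normals: (N, 3) outward normals at those points (need not be unit
    length). Returns the offset points.
    """
    v = np.asarray(vertices, float)
    n = np.asarray(normals, float)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)  # ensure unit normals
    return v + offset_mm * n
```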
[0118] In some embodiments, after defining each mask portion, processor 111 may further generate the cutting guide model by defining an osteotomy slot portion 920 of the surgical cutting guide model for each osteotomy. Each osteotomy slot portion may be an aperture in the respective mask portion, positioned and orientated according to the specific values for the respective osteotomy. In defining each osteotomy slot portion, processor 111 generates a sketch that best fits the respective osteotomy on the pre-operative software model. Processor 111 then generates a profile for each osteotomy slot portion 920. Each profile is generated by externally (distally) offsetting the respective bone contour of the respective osteotomy. In some embodiments, the external offset is at least about 15mm. Processor 111 then extrudes each profile in both the proximal and distal directions. In some embodiments, each profile is extruded by about 1.6mm in both the proximal and distal directions. The configuration of the surgical cutting guide model by the processor 111 is to transform the skeleton of the patient from the pre-operative patient anatomical structure imaged and digitalised at S202 to a physical realisation of the post-operative software model of the patient anatomical structure generated at S206 as informed by S208 and S210.
[0119] After defining the osteotomy slot portion, processor 111 then defines an osteotomy saw blade insertion profile for each osteotomy slot portion. Processor 111 generates each saw blade profile for the respective osteotomy slot portion with respect to the respective bone contour. Processor 111 then extrudes each saw blade profile in both the proximal and distal directions. In some embodiments, each saw blade profile is extruded by about 0.4mm in both the proximal and distal directions. Processor 111 then removes the saw blade insertion profile from the osteotomy slot portion. Processor 111 then finalises the osteotomy slot portion by ensuring that the mesh of each respective bone has no errors. If the mesh of the respective bone includes at least one error, processor 111 may automatically fix the at least one error.
[0120] In some embodiments, after defining the mask portion 910 for each osteotomy, processor 111 may further generate the cutting guide model by defining one or more from among a saw blade orientation and an osteotomy chisel insertion slot 930 for each osteotomy. Each osteotomy chisel insertion slot 930 may be defined as a location on the respective defined mask portion. Each osteotomy chisel insertion slot 930 may be defined in a direction relative to the respective mask portion 910. Processor 111 generates a sketch of the chisel insertion slot. In some embodiments, the sketch of the chisel insertion slot is perpendicular to the respective saw blade insertion profile for each osteotomy on the pre-operative software model. Processor 111 then inserts the respective chisel model received at step S204 into the sketch of the chisel insertion slot for each osteotomy. Processor 111 then externally (distally) offsets each inserted chisel model. In some embodiments, the external (distal) offset is about 2.5mm. Processor 111 then externally extrudes each inserted chisel model. In some embodiments, each chisel model is extruded by a distance of at least about 15mm. Processor 111 then finalises each chisel insertion slot by ensuring that the mesh of each respective bone has no errors. If the mesh of the respective bone includes at least one error, processor 111 may automatically fix the at least one error.
[0121] In some embodiments, the defined chisel insertion slot of each osteotomy may further comprise a hole to removably receive a guide wire 1010 and a guide wire seat as illustrated in the manufactured surgical guide of Figure 10. That is, processor 111 may further define the surgical cutting guide model by defining a chisel insertion hole 1020. In embodiments that include a chisel insertion hole, the chisel model includes a hole for the guide wire insertion. Processor 111, after finalising the chisel insertion slot utilising the chisel model that includes the hole for the guide wire insertion, generates a removable guide wire seat. In some embodiments, the removable guide wire seat may be L-shaped. In some embodiments, the guide wire seat may be configured for insertion into the hole at one end. In some embodiments, the guide wire seat may be configured to longitudinally receive the guide wire. Processor 111 then merges the chisel insertion slot with the generated removable guide wire seat. Processor 111 then finalises each chisel insertion slot with the chisel insertion hole by ensuring that the mesh of each respective bone has no errors. If the mesh of the respective bone includes at least one error, processor 111 may automatically fix the at least one error.
[0122] In some embodiments, after defining the mask portion for each osteotomy, processor 111 may further generate the cutting guide model by defining at least one implant fixation slot. The at least one implant fixation slot may be defined as a location on the respective defined mask portion based on a shaft surface of the implant. Processor 111 generates a sketch of the implant fixation slot for each osteotomy on the pre-operative software model by best fitting the implant shaft surface. Processor 111 then inserts the respective implant model received at step S204 into the sketch of the implant fixation slot for each osteotomy. In some embodiments, the implant model may include at least one implant fixation screw. Processor 111 then externally offsets each inserted implant model. In some embodiments, the external (distal) offset is about 2mm. Processor 111 then externally extrudes each inserted implant model. In some embodiments, each implant model is extruded by a distance of at least about 15mm. Processor 111 then finalises each implant model by ensuring that the mesh of each respective bone has no errors. If the mesh of the respective bone includes at least one error, processor 111 may automatically fix the at least one error.
[0123] In some embodiments, processor 111, after defining at least one of: the saw blade insertion portion; the chisel insertion slot; and/or the at least one implant fixation screw; may remove at least one of: the saw blade insertion portion; the chisel insertion slot; or the at least one implant fixation screw; from the defined mask portion of the respective osteotomy. That is, processor 111 may remove the saw blade insertion portion from the mask portion of the respective osteotomy, for example. That is, processor 111 may remove the chisel insertion slot from the mask portion of the respective osteotomy, for example. That is, processor 111 may remove the at least one implant fixation screw from the mask portion of the respective osteotomy, for example.
[0124] In some embodiments, processor 111, after removing at least one of: the saw blade insertion portion; the chisel insertion slot; and/or the at least one implant fixation screw; from the defined mask portion of the respective osteotomy, may combine, for each osteotomy, the mask portion and at least one of: the osteotomy slot portion; the chisel insertion slot; and/or the implant fixation slot. That is, processor 111 may combine the mask portion with at least one of the previously defined slots or portions to further generate the surgical cutting guide model. In some embodiments, processor 111 may then assign data points (labels) and/or patient identifiers to the generated surgical cutting guide model. Processor 111 then finalises the surgical cutting guide model by ensuring that the mesh of each respective bone has no errors. If the mesh of the respective bone includes at least one error, processor 111 may automatically fix the at least one error.
[0125] Upon finalising the surgical cutting guide model, processor 111 moves to step S214. At step S214, processor 111 generates the personalised surgical operation instructions 120. The personalised surgical operation instructions 120 may include the personalised pre-operative software model, the personalised post-operative software model, the selected implant and corresponding chisel, and/or the personalised cutting guide model.
[0126] Processor 111, upon completion of step S214, moves to step S216. In some embodiments, processor 111 may perform step S216 prior to step S214. At step S216, processor 111 generates 3D model printing instructions 122 of at least one from: the personalised pre-operative software model, the personalised post-operative software model, and/or the personalised cutting guide model. Upon completion of both steps S214 and S216, processor 111 stops executing method 200 of surgery planning module 124. 3D printing may use a biocompatible PA2200 polyamide powder.
[0127] In some embodiments, at least part of method 200 may be performed by a user via electronic interface 113. The user may utilise any one of, or a combination of, commercially available software solutions such as 'Mimics', '3-matic', 'MATLAB', 'Materialise Magics', and/or 'EOS Print', for example.
[0128] It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the above-described embodiments, without departing from the broad general scope of the present disclosure. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.

Claims (31)

CLAIMS:
1. A computer-implemented method of pre-operative planning for patient surgery,
the method including:
generating by at least one computer processor a personalised pre-operative
software model of patient anatomical structure of an individual patient based on received
pre-operative medical imagery of the patient anatomical structure of the individual
patient and pre-operative patient movement analysis data related to movement of the
patient anatomical structure of the individual patient, the patient movement analysis data
being generated at least in part by a movement sensor system observing movement of the
patient anatomical structure of the individual patient over a period of time, wherein the
personalised pre-operative software model of the patient includes bone position and
dimension data of bones in the patient anatomical structure of the individual patient,
muscle position and dimension data of muscles in the patient anatomical structure of the
individual patient, and relationship definition data defining relationships between bones
and muscles in the patient anatomical structure of the individual patient;
receiving by the at least one computer processor a software definition of a surgical
procedure to be performed in relation to the personalised pre-operative software model
of the patient anatomical structure of the individual patient;
generating by the at least one computer processor a modified personalised
post-operative software model of the patient anatomical structure of the individual patient
based on the personalised pre-operative software model and the software definition of
the surgical procedure; simulating by the at least one computer processor movement of the bones and muscles of the patient anatomical structure of the individual patient according to the modified personalised post-operative software model to generate simulation output; allowing by the at least one computer processor adjustment of the personalised post-operative software model based on the simulation output; generating by the at least one computer processor a personalised surgical cutting guide model to facilitate the surgical procedure based on the personalised post-operative software model or the adjusted personalised post-operative software model; and generating by the at least one computer processor at least one of:
3D printing instructions based on the personalised surgical cutting guide
model to transmit to a 3D printer to form a surgical cutting guide; or
personalised surgical operation instructions to transmit to a robotic surgical
system for use in performing the surgical procedure.
2. The computer-implemented method of claim 1, wherein the surgery is one from
among:
an osteotomy;
a femoral osteotomy;
a proximal femoral osteotomy;
a tibial osteotomy;
a high tibial osteotomy.
3. The computer-implemented method of claim 1 or claim 2, wherein the surgical
procedure includes one or more osteotomies, and the simulation output comprises the
surgical cutting guide model, the surgical cutting guide model defining one or more
osteotomy planes in which to cut a bone or bones of the patient anatomical structure to
facilitate reconfiguration of the patient anatomical structure in accordance with the
modified software model or the adjusted modified software model.
4. The computer-implemented method of any one of claims 1 to 3, wherein the surgical
procedure includes one or more osteotomies, and the simulation output comprises an
implant configured to secure a first portion of a bone cut by the one or more osteotomies
to a second, separate, portion of the same bone cut by the one or more osteotomies, in a
configuration determined in accordance with the modified software model or the adjusted
modified software model.
5. The computer-implemented method of any one of claims 1 to 4, wherein the
software definition of the surgical procedure defines a value range for each of the
following parameters:
a number of osteotomies in each of one or more specified bones;
for each osteotomy, a specific bone from the patient anatomical structure to be
cut by the osteotomy;
for each osteotomy, a position and orientation of an osteotomy plane;
for each osteotomy, a relative position and orientation of the two or more
post-osteotomy distinct bone portions; wherein the simulation iteratively constrains the software definition of the surgical procedure to include a patient-specific osteotomy plan by determining a specific value from within the value range for each of the parameters, and wherein the surgical cutting guide model and the implant configuration implement the specific values for the parameters.
6. The computer-implemented method of claim 5, wherein the software definition
of the surgical procedure defines a value range for each of the following parameters:
a repositioning of one or more specified bones within an allowable range;
a range of available implants with corresponding chisels;
wherein the simulation iteratively constrains the software definition of the
surgical procedure to include a patient-specific implant plan by determining one or more
repositionings and, for each determined repositioning, a selected implant and chisel from
the available range.
7. The computer-implemented method of claim 5 or claim 6, wherein the surgical
cutting guide model implements the specific values for one or more osteotomies from
the patient-specific osteotomy plan, wherein the surgical cutting guide model is
generated by, for each osteotomy:
defining a mask portion of the surgical cutting guide model configured to
conform to one or more curves or other geometric features of the bone position and
dimension data of bones in the patient anatomical structure of the individual patient.
8. The computer-implemented method of claim 7, wherein the surgical cutting
guide model is further generated by, for each osteotomy: defining an osteotomy slot portion of the surgical cutting guide model being an aperture in the mask portion positioned and orientated according to the specific values for the osteotomy; defining an osteotomy saw blade insertion profile and extruding the surgical cutting guide model according to the saw blade insertion profile to a predefined distance proximally and distally of the defined slot portion.
9. The computer-implemented method of claim 4 and any one of claims 7 to 8,
wherein the surgical cutting guide model is further generated by, for each osteotomy:
defining an osteotomy chisel insertion as a location on the defined mask portion
and a direction relative to the mask, extruding the surgical cutting guide model according
to the location and direction by a predefined distance distal from the osteotomy chisel
insertion location.
10. The computer-implemented method of claim 4 and any one of claims 7 to 9,
wherein the surgical cutting guide model is further generated by, for each osteotomy:
defining one or more implant fixation slots as a location on the defined mask
portion based on a shaft surface of the implant, extruding the surgical cutting guide model
at the one or more defined implant fixation slots by a predefined distance distally.
11. The computer-implemented method of claim 10, wherein the defined osteotomy
chisel insertion further comprises a hole in the surgical cutting guide model configured
to removably receive a guide wire and a guide wire seat, and wherein the surgical cutting
guide model is extruded distally around the hole to define the guide wire seat, the guide wire seat being configured for insertion into the hole at one end and to longitudinally receive the guide wire.
12. The computer-implemented method of any one of claims 5 to 11, wherein the
surgical cutting guide model is converted to 3D printing instructions, and the method
further comprises 3D printing a surgical cutting guide from the 3D printing instructions.
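As a hedged sketch of the hand-off in claim 12: a triangle-mesh guide model is commonly serialised to STL, which slicer software then converts into printer instructions. The function name, the ASCII variant, and leaving normals for the slicer to recompute are illustrative choices, not the claimed method:

```python
def mesh_to_ascii_stl(triangles, name="cutting_guide"):
    """Serialise a triangle soup [((x,y,z), (x,y,z), (x,y,z)), ...] as ASCII STL.

    STL is the usual intermediate between a CAD model and the slicer that
    produces the printer's instructions; zero normals are left for the
    slicer to recompute from vertex winding.
    """
    lines = [f"solid {name}"]
    for a, b, c in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (a, b, c):
            lines.append(f"      vertex {x:.6f} {y:.6f} {z:.6f}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# Minimal example: a single triangle.
stl = mesh_to_ascii_stl([((0, 0, 0), (1, 0, 0), (0, 1, 0))])
print(stl.splitlines()[0])  # solid cutting_guide
```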
13. The computer-implemented method of any one of claims 1 to 12, wherein the received
pre-operative medical imagery of the patient anatomical structure of the individual
patient is obtained by one or more from among:
anthropometric data acquisition;
attachment of MRI-compatible markers to the individual patient and MRI
scanning thereof, and analysis of the MRI scanning to obtain MRI scans of the
patient anatomical structure of the individual patient;
placement of electromyography (EMG) units on the skin of the individual
patient and measurement and analysis of EMG signals generated by the EMG units.
14. The computer-implemented method of any one of claims 1 to 13, wherein generating
the personalised pre-operative software model of the patient includes:
generating at least one MRI-reconstructed portion by one or more from among:
obtaining the bone position and dimension data of bones in the patient
anatomical structure of the individual patient by executing a segmentation process on the
MRI scans;
obtaining the muscle position and dimension data of muscles in the patient anatomical structure of the individual patient by executing a segmentation process on the
MRI scans;
obtaining position and dimension data of a growth plate in one or more
bones, by executing a segmentation process on the MRI scans.
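A minimal, illustrative stand-in for the segmentation bookkeeping described above. Real musculoskeletal MRI segmentation typically relies on statistical shape models or learned segmenters rather than intensity thresholds; the threshold values and toy volume here are assumptions, and only the position-and-dimension outputs mirror the claim:

```python
import numpy as np
from scipy import ndimage

def segment_by_threshold(volume, lo, hi):
    """Very simplified one-class segmentation of a 3D scan.

    Returns a labelled volume, per-component bounding boxes (dimension data)
    and centroids (position data).
    """
    mask = (volume >= lo) & (volume <= hi)
    labels, n = ndimage.label(mask)           # connected components
    boxes = ndimage.find_objects(labels)      # bounding slices per component
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return labels, boxes, centroids

# Toy 3D "scan": two bright blobs on a dark background.
vol = np.zeros((10, 10, 10))
vol[1:3, 1:3, 1:3] = 100.0
vol[6:9, 6:9, 6:9] = 100.0
labels, boxes, centroids = segment_by_threshold(vol, 50.0, 200.0)
print(len(boxes))  # 2
```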
15. The computer-implemented method of any one of claims 1 to 14, wherein generating
the personalised pre-operative software model of the patient includes:
generating at least one CT-reconstructed portion by imaging the patient
anatomical structure of the individual patient by a computerized tomography scan to
obtain at least one CT scan, and obtaining the bone position and dimension data of bones
in the patient anatomical structure of the individual patient by executing a segmentation
process on the at least one CT scan.
16. The computer-implemented method of claim 14 and claim 15, further
comprising registering common landmarks in the bone position and dimension data of
bones in the CT-reconstructed portion and the MRI-reconstructed portion.
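Registering common landmarks between two reconstructions is classically done with a rigid Kabsch (Procrustes) fit; a self-contained sketch, offered as one standard technique rather than the patented one, with hypothetical landmark data:

```python
import numpy as np

def register_landmarks(mri_pts, ct_pts):
    """Rigid Kabsch registration of paired landmarks.

    Returns rotation R and translation t such that R @ p + t maps an
    MRI-space landmark p onto its CT-space counterpart.
    """
    mri_c = mri_pts.mean(axis=0)
    ct_c = ct_pts.mean(axis=0)
    H = (mri_pts - mri_c).T @ (ct_pts - ct_c)     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct_c - R @ mri_c
    return R, t

# Example: CT landmarks are the MRI landmarks rotated 90° about z and shifted.
mri = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
ct = mri @ Rz.T + np.array([5.0, -2.0, 3.0])
R, t = register_landmarks(mri, ct)
aligned = mri @ R.T + t
print(np.allclose(aligned, ct))  # True
```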
17. The computer-implemented method of any one of claims 13 to 16, wherein
generating the personalised pre-operative software model of the patient includes a 3D
anatomical analysis of the bone position and dimension data, including defining one or
more axes and planes in the bone position and dimension data, and measuring one or
more 3D angles between the defined one or more axes and planes.
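Measuring a 3D angle between defined axes (or plane normals) reduces to a normalised dot product; a brief sketch, where the axis names and values are hypothetical:

```python
import numpy as np

def angle_between(u, v):
    """3D angle in degrees between two axes or plane normals."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    return float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))

# Illustrative axes only (e.g. an anatomical vs. a mechanical axis).
anatomical_axis = np.array([0.0, 0.0, 1.0])
mechanical_axis = np.array([0.0, 1.0, 1.0])   # tilted 45° in the y-z plane
print(round(angle_between(anatomical_axis, mechanical_axis), 6))  # 45.0
```

The `np.clip` guards against `arccos` receiving values marginally outside [-1, 1] from floating-point round-off.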
18. The computer-implemented method of claim 6 and claim 17, wherein
generating the post-operative personalised software model of the patient anatomical
structure of the individual patient includes:
combining a CAD model of the selected implant and chisel with the bone
position and dimension data to verify that adequate bone thickness remains in place once
the implant is implanted into a bone orifice created by the chisel.
19. The computer-implemented method of claim 6 and claim 17, wherein
generating the post-operative personalised software model of the patient anatomical
structure of the individual patient includes:
combining a CAD model of the selected implant and chisel with the growth
plate position and dimension data to verify that adequate growth plate volume remains
once the implant is implanted into a bone orifice created by the chisel.
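One hedged way to picture the "adequate growth plate volume" check of claim 19 is a voxel overlap test between the segmented plate and the chisel orifice, assuming both have been resampled onto one registered image grid; the masks below are toy data and the voxel approach is a stand-in for the CAD Boolean described in the claim:

```python
import numpy as np

def remaining_growth_plate_fraction(plate_mask, orifice_mask):
    """Fraction of growth-plate voxels that survive the chisel orifice.

    Both masks are boolean volumes on the same grid after registration.
    """
    plate = plate_mask.astype(bool)
    removed = plate & orifice_mask.astype(bool)
    return 1.0 - removed.sum() / plate.sum()

plate = np.zeros((20, 20, 20), dtype=bool)
plate[8:12, :, :] = True                  # slab-like growth plate
orifice = np.zeros_like(plate)
orifice[8:12, 9:11, 9:11] = True          # chisel path through the slab
frac = remaining_growth_plate_fraction(plate, orifice)
print(round(frac, 3))  # 0.99
```

A planning tool would compare this fraction (or an absolute remaining volume) against a clinically defined threshold.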
20. The computer-implemented method of any one of claims 1 to 19, wherein the
patient movement analysis data is measured motion data of the individual patient.
21. The computer-implemented method of claim 20, wherein the patient movement
analysis data is measured motion data of the individual patient obtained by motion
capture while the individual patient is walking or performing another natural body
movement.
22. The computer-implemented method of any one of claims 1 to 21, wherein the
personalised pre-operative software model of patient anatomical structure of the
individual patient based on received pre-operative medical imagery of the patient anatomical structure of the individual patient and pre-operative patient movement analysis data related to movement of the patient anatomical structure of the individual patient, is a 4D personalised functional model, generated by one or more steps from among:
identifying surfaces and landmarks in the received pre-operative medical imagery, and fitting the body parts to which the identified surfaces and landmarks belong to an anatomical structure reference system by mapping the identified surfaces and landmarks to equivalents in the anatomical structure reference system;
adding body parts fitted to the anatomical reference system including one or more from among: bones; joints; muscles; to the 4D personalised functional model.
23. The computer-implemented method of claim 22, wherein the post-operative
personalised software model of the patient anatomical structure of the individual patient
based on the personalised pre-operative software model and the software definition of
the surgical procedure, is a 4D personalised functional model, generated by:
identifying surfaces and landmarks in the pre-operative software model of
patient anatomical structure as modified by the software definition of the surgical
procedure, optionally as iteratively constrained according to claim 5 and claim 6, and
fitting the body parts to which the identified surfaces and landmarks belong to an
anatomical structure reference system by mapping the identified surfaces and landmarks
to equivalents in the anatomical structure reference system;
adding body parts fitted to the anatomical reference system including one or
more from among: bones; joints; muscles; to the 4D personalised functional model.
24. The computer-implemented method of claim 22 and claim 23, further including
simulating by the at least one computer processor movement of the bones and muscles
of the patient anatomical structure of the individual patient according to the post-operative personalised software model to generate simulation output, wherein the simulation comprises comparing the 4D personalised functional model of the pre-operative software model of patient anatomical structure with the 4D personalised
functional model of the post-operative personalised software model of patient anatomical
structure.
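A comparison between pre- and post-operative 4D functional models could, under many possible scoring choices, reduce to differences between simulated curves, for example an RMS error over a normalised gait cycle. The joint, the sinusoidal curves, and the single-scalar score below are hypothetical:

```python
import numpy as np

def trajectory_rmse(pre, post):
    """RMS difference between two sampled trajectories (e.g. joint angles, degrees).

    One scalar per compared quantity; an overall outcome score could weight
    several such curves (angles, moments, muscle lengths) together.
    """
    return float(np.sqrt(np.mean((post - pre) ** 2)))

t = np.linspace(0.0, 1.0, 101)                 # one normalised gait cycle
pre_knee = 30.0 * np.sin(2 * np.pi * t)        # hypothetical pre-operative curve
post_knee = 25.0 * np.sin(2 * np.pi * t)       # hypothetical simulated post-op curve
print(trajectory_rmse(pre_knee, post_knee) > 0.0)  # True
```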
25. The computer-implemented method of claim 24, wherein the software definition
of the surgical procedure is iteratively constrained to a defined solution surgical
procedure within the software definition of the surgical procedure by a machine learning
model or another solving algorithm seeking to achieve a defined optimum outcome in
the comparison of the 4D personalised functional model of the pre-operative software
model of patient anatomical structure with the 4D personalised functional model of the
post-operative personalised software model of patient anatomical structure.
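The "machine learning model or another solving algorithm" of claim 25 could range from exhaustive search to a learned surrogate; the sketch below shows the simplest case, a grid search over one plan parameter within its allowable range. The parameter name, range, step, and quadratic cost are all assumed stand-ins for running the full post-operative simulation and comparison:

```python
import numpy as np

def plan_cost(wedge_angle_deg):
    """Surrogate outcome score for one free parameter of the surgical plan.

    Stands in for simulating the post-operative 4D model and comparing it with
    the pre-operative one; hypothetical quadratic with its optimum at 7.5°.
    """
    return (wedge_angle_deg - 7.5) ** 2

# Exhaustive search within the allowed value range, one possible "solving algorithm".
candidates = np.arange(0.0, 15.0 + 1e-9, 0.5)   # allowable range in 0.5° steps
best = min(candidates, key=plan_cost)
print(best)  # 7.5
```

In practice the search space is multi-dimensional (several osteotomy values plus implant selection), which is why gradient-free optimisers or learned models are attractive.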
26. The steps, features, integers, compositions and/or compounds disclosed herein
or indicated in the specification of this application individually or collectively, and any
and all combinations of two or more of said steps or features.
27. A computer program which, when executed by a computing apparatus
comprising processor hardware and memory hardware, causes the processor hardware to
perform the computer-implemented method according to any of claims 1 to 25.
28. A computer-readable medium storing the computer program according to
claim 27.
29. A non-transitory computer-readable medium storing the computer program
according to claim 27.
30. An apparatus comprising a processor and a memory, the processor being
configured to execute processing instructions stored by the memory, and by executing
the processing instructions to perform a computer-implemented method of
pre-operative planning for patient surgery, the method including:
generating by at least one computer processor a personalised pre-operative
software model of patient anatomical structure of an individual patient based on received
pre-operative medical imagery of the patient anatomical structure of the individual
patient and pre-operative patient movement analysis data related to movement of the
patient anatomical structure of the individual patient, the patient movement analysis data
being generated at least in part by a movement sensor system observing movement of the
patient anatomical structure of the individual patient over a period of time, wherein the
personalised pre-operative software model of the patient includes bone position and
dimension data of bones in the patient anatomical structure of the individual patient,
muscle position and dimension data of muscles in the patient anatomical structure of the
individual patient, and relationship definition data defining relationships between bones
and muscles in the patient anatomical structure of the individual patient;
receiving by the at least one computer processor a software definition of a surgical procedure to be performed in relation to the personalised pre-operative software model of the patient anatomical structure of the individual patient;
generating by the at least one computer processor a modified personalised post-operative software model of the patient anatomical structure of the individual patient based on the personalised pre-operative software model and the software definition of the surgical procedure;
simulating by the at least one computer processor movement of the bones and muscles of the patient anatomical structure of the individual patient according to the modified personalised post-operative software model to generate simulation output;
allowing by the at least one computer processor adjustment of the personalised post-operative software model based on the simulation output;
generating by the at least one computer processor a personalised surgical cutting guide model to facilitate the surgical procedure based on the personalised post-operative software model or the adjusted personalised post-operative software model; and
generating by the at least one computer processor at least one of:
3D printing instructions based on the personalised surgical cutting guide
model to transmit to a 3D printer to form a surgical cutting guide; or
personalised surgical operation instructions to transmit to a robotic surgical
system for use in performing the surgical procedure.
31. An apparatus comprising a processor and a memory, the processor being
configured to execute processing instructions stored by the memory, and by executing the processing instructions to perform a computer-implemented method according to any of claims 1 to 25.
[Fig. 1 (sheet 1/10): Orthopaedic surgery planning device comprising a processor, memory, comms module and electronic interface, with modules for pre-operative medical imagery, pre-operative movement analysis data, software definition, surgery planning, 3D anatomical modelling, 3D model printing instructions, and personalised surgical operation instructions, connected to a network.]
AU2022235552A 2022-09-20 2022-09-20 A computer program, apparatus, and computer-implemented method of pre-operative planning for patient surgery Pending AU2022235552A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2022235552A AU2022235552A1 (en) 2022-09-20 2022-09-20 A computer program, apparatus, and computer-implemented method of pre-operative planning for patient surgery
PCT/AU2023/050905 WO2024059902A1 (en) 2022-09-20 2023-09-20 A computer program, apparatus, and computer-implemented method of pre-operative planning for patient surgery


Publications (1)

Publication Number Publication Date
AU2022235552A1 true AU2022235552A1 (en) 2024-04-04




Also Published As

Publication number Publication date
WO2024059902A1 (en) 2024-03-28
