CN117500451A - Systems, methods, and devices for dental implant surgery enhanced using kinematic data - Google Patents

Info

Publication number
CN117500451A
CN117500451A (application CN202280042860.4A)
Authority
CN
China
Prior art keywords
implant
patient
computing system
computer
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280042860.4A
Other languages
Chinese (zh)
Inventor
马克西姆·贾伊松
安托伊内·朱莱斯·罗德里久伊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Chew Co
Original Assignee
Magic Chew Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Chew Co
Publication of CN117500451A
Legal status: Pending

Classifications

    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/108: Computer-aided selection or customisation of medical implants or cutting guides
    • A61B 2034/2048: Tracking techniques using an accelerometer or inertia sensor
    • A61B 2034/2055: Optical tracking systems
    • A61B 2090/3983: Reference marker arrangements for use with image-guided surgery
    • A61B 2090/502: Headgear, e.g. helmet, spectacles
    • A61C 8/00: Means to be fixed to the jaw-bone for fixing dental prostheses; dental implants; implanting tools
    • A61C 9/0046: Data acquisition means or methods for taking digitized impressions
    • A61C 9/0053: Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • A61C 19/045: Measuring instruments specially adapted for dentistry for recording mandibular movement, e.g. face bows
    • B33Y 50/00: Data acquisition or data processing for additive manufacturing
    • B33Y 80/00: Products made by additive manufacturing
    • G06N 20/00: Machine learning
    • G16H 20/40: ICT specially adapted for mechanical, radiation or invasive therapies, e.g. surgery
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/50: ICT specially adapted for simulation or modelling of medical disorders

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Epidemiology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Primary Health Care (AREA)
  • Veterinary Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Surgery (AREA)
  • Dentistry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Chemical & Material Sciences (AREA)
  • Manufacturing & Machinery (AREA)
  • Materials Engineering (AREA)
  • Data Mining & Analysis (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Optics & Photonics (AREA)
  • Urology & Nephrology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Robotics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)

Abstract

The present disclosure provides systems and methods for modeling and planning a dental implant procedure. A method for modeling and planning a dental implant procedure may include: receiving a patient profile, the patient profile comprising a model of the patient's maxilla or mandible and kinematic data associated with movement of the patient's jaw; identifying one or more candidate sites for a dental implant; generating one or more dental implant parameters; determining an indication of a functional cone; determining one or more dental implant contact points; generating a constraint map; selecting an implant model; and generating a modified model.

Description

Systems, methods, and devices for dental implant surgery enhanced using kinematic data
Incorporation by Reference of Any Priority Applications
Any and all applications for which a foreign or domestic priority claim is identified in the application data sheet filed with the present application are hereby incorporated by reference under 37 CFR 1.57.
The present application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application No. 63/213,607, entitled "SYSTEMS, METHODS, AND DEVICES FOR AUGMENTED DENTAL IMPLANT SURGERY USING KINEMATIC DATA," filed June 22, 2021, the entire contents of which are hereby incorporated herein by reference for all purposes.
Technical Field
The present application relates generally to dental implant surgery.
Background
Dental implant procedures may involve replacing a tooth root with a metal post and/or replacing a damaged or missing tooth with an artificial tooth (also referred to as an implant crown) that is shaped like, and functions as, a natural tooth, providing a more permanent and/or aesthetic alternative to a denture or bridge. Planning a surgical procedure to place an implant typically requires the surgeon to consider several variables in the patient's physiology. For example, when selecting the type, size, depth, and/or angular orientation of the implant to be used, it may be necessary to consider the patient's bone structure, gums, surrounding teeth, and/or other factors.
Disclosure of Invention
The present disclosure describes a surgical planning system (referred to herein as the "system") and methods of using the same for performing dental implant surgery. In particular, some embodiments relate to dental implant surgery enhanced using kinematic data derived from captured movements of a patient's jaw. In some embodiments, the system includes a jaw motion tracking subsystem connected to a computer system configured with surgical planning software. According to some embodiments, the jaw motion tracking subsystem may be connected to a computer system that is networked to a remote computer system configured with surgical planning software. In some embodiments, the jaw motion tracking subsystem includes a detector, a wearable headset with tracking markers, and software configured to record data from the detector. In some embodiments, the motion tracking subsystem may include only a detector. In some embodiments, the jaw motion tracking subsystem is used to capture the patient's jaw motion (e.g., by recording video), and the captured motion is used to construct kinematic data representing the motion of the patient's jaw. In some embodiments, the constructed kinematic data may be used to present a visual representation of the movement of the patient's jaw to assist the surgeon in selecting and positioning potential implant targets. In some embodiments, the computer system may be further configured to calculate optimized parameters for implant placement, including but not limited to the depth of the implant, its angle with respect to the bone, its size, and the like.
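The pose-capture step described above can be made concrete with a short sketch. The snippet below is illustrative only, not the patent's implementation; the function names and the assumption that the tracker emits 4x4 homogeneous transforms per video frame are our own. It expresses tracked mandible poses relative to the skull and traces a landmark (e.g. a lower-incisor tip) through the recorded motion:

```python
import numpy as np

def relative_jaw_poses(skull_poses, mandible_poses):
    """Express each tracked mandible pose in the skull's reference frame.

    Both inputs are (N, 4, 4) arrays of homogeneous transforms in the
    detector's coordinate system, one sample per captured frame.
    """
    skull_poses = np.asarray(skull_poses, dtype=float)
    mandible_poses = np.asarray(mandible_poses, dtype=float)
    # T_rel = inv(T_skull) @ T_mandible for every frame (batched).
    return np.linalg.inv(skull_poses) @ mandible_poses

def point_trajectory(rel_poses, point):
    """Trace a landmark (given in mandible coordinates) through the
    recorded motion; returns an (N, 3) path in the skull frame."""
    p = np.append(np.asarray(point, dtype=float), 1.0)
    return (rel_poses @ p)[:, :3]
```

The resulting path is what a planning UI could render as the visual representation of jaw movement mentioned above.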
In some embodiments, the computer system may be configured to output a 3D printable model that may be used to fabricate a surgical guide for guiding a procedure. In some embodiments, the computer system may be connected to the 3D printer directly or through a network. In some embodiments, the computer system may be configured to output a navigated surgical plan for use with the surgical navigation system. In some embodiments, the computer system may be connected to the surgical navigation system directly or through a network. In some embodiments, the system may be compatible with external systems to output the navigated surgical plan.
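For the 3D-printable output, the guide mesh ultimately has to be serialized in a format a printer toolchain accepts, commonly STL. The patent does not specify a file format; the minimal ASCII STL writer below is a hedged sketch, and `write_ascii_stl` is a hypothetical helper:

```python
def write_ascii_stl(path, triangles, name="surgical_guide"):
    """Serialize triangles (iterables of three (x, y, z) vertices) as ASCII STL.

    Normals are written as zeros; most slicers recompute them from the
    vertex winding order.
    """
    with open(path, "w") as f:
        f.write("solid {}\n".format(name))
        for v0, v1, v2 in triangles:
            f.write("  facet normal 0 0 0\n")
            f.write("    outer loop\n")
            for x, y, z in (v0, v1, v2):
                f.write("      vertex {:.6f} {:.6f} {:.6f}\n".format(x, y, z))
            f.write("    endloop\n")
            f.write("  endfacet\n")
        f.write("endsolid {}\n".format(name))
```

A production pipeline would more likely hand a binary STL or a vendor-native format to the printer, but the ASCII form makes the structure easy to inspect.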
For purposes of this summary, certain aspects, advantages and novel features of the invention are described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or a set of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
In some aspects, the techniques described herein relate to a computer-implemented method for oral surgery planning, the computer-implemented method comprising: receiving, by the computing system, a patient profile, wherein the patient profile includes: patient anatomical data, wherein the patient anatomical data comprises one or more models of a patient's maxilla or mandible; and kinematic data associated with movement of the jaw of the patient; identifying, by the computing system, one or more candidate sites for the dental implant based at least in part on the received patient profile; and generating, by the computing system, one or more dental implant parameters based at least in part on the identified one or more candidate sites and the kinematic data.
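At a high level, the claimed method is a pipeline from patient profile to implant parameters. The sketch below shows one way the inputs and outputs could be structured; every type name, field, and threshold here is an assumption for illustration, not the patent's data model:

```python
from dataclasses import dataclass

@dataclass
class PatientProfile:
    jaw_model: list        # e.g. mesh vertices of the maxilla/mandible model
    kinematic_data: list   # recorded jaw poses
    bone_density: dict     # site id -> normalized density estimate (illustrative)

@dataclass
class ImplantParameters:
    site: str
    depth_mm: float
    angle_deg: float
    diameter_mm: float

def identify_candidate_sites(profile, min_density=0.6):
    """Toy screen: keep sites whose normalized density clears a threshold;
    a real system would analyze the 3D model and CT data directly."""
    return [s for s, d in profile.bone_density.items() if d >= min_density]

def generate_implant_parameters(profile, sites):
    """Emit one hypothetical default parameter set per candidate site."""
    return [ImplantParameters(site=s, depth_mm=10.0, angle_deg=0.0,
                              diameter_mm=4.1) for s in sites]
```

The kinematic data would then refine these defaults, as the functional-cone and contact-point aspects below describe.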
In some aspects, the techniques described herein relate to a computer-implemented method wherein the patient profile includes any combination of one or more of bone volume, bone density, relative bone density, nerve location, or sinus location.
In some aspects, the techniques described herein relate to a computer-implemented method, further comprising: determining, by the computing system, the proposed crown geometry; determining, by the computing system, an indication of a functional cone based at least in part on the kinematic data; determining, by the computing system, one or more crown contact points based at least in part on the patient profile and the proposed crown geometry; generating, by the computing system, a constraint map based at least in part on the one or more crown contact points; selecting, by the computing system, an implant model based at least in part on the constraint map; and generating, by the computing system, a modified implant model based at least in part on the constraint map and the implant model.
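The "indication of the functional cone" can be thought of geometrically: the envelope of directions along which the jaw approaches or loads a contact point during recorded motion. One minimal way to summarize that envelope as an axis plus a half-angle is sketched below, under the assumption (ours, not the patent's) that per-site motion direction vectors have already been extracted from the kinematic data:

```python
import numpy as np

def functional_cone(directions):
    """Fit a cone (mean axis + enclosing half-angle, in degrees) around
    unit motion directions sampled at a contact point."""
    d = np.asarray(directions, dtype=float)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    axis = d.mean(axis=0)
    axis = axis / np.linalg.norm(axis)
    # Widest deviation of any sample from the mean axis.
    cos_angles = d @ axis
    half_angle = float(np.degrees(np.arccos(np.clip(cos_angles.min(), -1.0, 1.0))))
    return axis, half_angle
```

An implant whose long axis falls inside (or close to) this cone would see more axial and less lateral loading, which is the intuition behind placing implants "in view of the functional cone."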
In some aspects, the techniques described herein relate to a computer-implemented method, further comprising: determining, by the computing system, the proposed crown geometry; automatically determining, by the computing system, an indication of a functional cone based at least in part on the kinematic data; automatically determining, by the computing system, one or more crown contact points based at least in part on the patient profile; and automatically selecting, by the computing system, an implant model based at least in part on the crown contact points.
In some aspects, the techniques described herein relate to a computer-implemented method in which generating a modified model includes minimizing one or more stresses on a dental implant.
In some aspects, the techniques described herein relate to a computer-implemented method, wherein identifying one or more candidate sites for a dental implant includes comparing one or more models of a patient's maxilla or mandible to one or more reference models.
In some aspects, the techniques described herein relate to a computer-implemented method, wherein identifying one or more candidate sites for a dental implant comprises automatically analyzing a patient's bone to determine any combination of one or more of: dental arch, inter-dental space, bone volume and relative bone density.
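One concrete way "relative bone density" analysis is often operationalized in dental planning software is by mapping CT Hounsfield values onto the Misch D1-D4 bone-quality scale. The sketch below uses commonly cited thresholds; the patent does not commit to any particular scale, and a clinical tool would calibrate per scanner:

```python
import numpy as np

def classify_bone_density(hu_values):
    """Map CT Hounsfield values to Misch-style D1-D4 bone-quality classes.

    Thresholds (approximate, commonly cited): D1 > 1250 HU, D2 850-1250,
    D3 350-850, D4 below 350.
    """
    hu = np.asarray(hu_values, dtype=float)
    classes = np.full(hu.shape, "D4", dtype=object)
    classes[hu > 350] = "D3"
    classes[hu > 850] = "D2"
    classes[hu > 1250] = "D1"
    return classes
```

Dense D1/D2 sites favor primary implant stability, which is why candidate-site screening weighs density alongside arch position and inter-dental space.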
In some aspects, the techniques described herein relate to a computer-implemented method wherein the one or more dental implant parameters include any combination of one or more of the following: the location of the dental implant relative to the bone surface, the implant type, the implant material, the burial depth, the implant angle relative to the bone surface, the implant size, the crown size, and the crown geometry.
In some aspects, the techniques described herein relate to a computer-implemented method in which at least one of the crown size and crown geometry is based at least in part on a prosthetic projection of a patient, a prosthetic tooth, or an existing tooth.
In some aspects, the techniques described herein relate to a computer-implemented method, further comprising: by the computing system, it is determined, based on the patient profile, that the one or more candidate sites have insufficient bone volume or bone density to perform the implantation procedure.
In some aspects, the techniques described herein relate to a computer-implemented method wherein determining one or more dental implant contact points includes determining contact at one or more phases of jaw motion based at least in part on an indication of a functional cone and patient anatomy, wherein jaw motion includes recorded motion, simulated motion, or both.
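Determining "contact at one or more phases of jaw motion" amounts to replaying the recorded (or simulated) poses and flagging frames in which the proposed crown surface comes within tolerance of the opposing dentition. A brute-force point-cloud sketch follows; the data layout is hypothetical, and a production system would use proper mesh collision queries:

```python
import numpy as np

def contact_events(crown_points, opposing_points, rel_poses, tol=0.1):
    """Return indices of motion frames in which any proposed crown point
    comes within `tol` (mm) of the opposing dentition point cloud."""
    crown = np.hstack([np.asarray(crown_points, dtype=float),
                       np.ones((len(crown_points), 1))])
    opposing = np.asarray(opposing_points, dtype=float)
    hits = []
    for i, T in enumerate(rel_poses):
        moved = (crown @ np.asarray(T, dtype=float).T)[:, :3]
        # Pairwise distances between moved crown points and opposing points.
        d = np.linalg.norm(moved[:, None, :] - opposing[None, :, :], axis=2)
        if d.min() <= tol:
            hits.append(i)
    return hits
```

The flagged frames identify which phases of motion (e.g. protrusion, laterotrusion) produce contact, feeding the constraint map.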
In some aspects, the techniques described herein relate to a computer-implemented method in which selecting an implant model includes selecting a preconfigured model from a model database using an artificial intelligence engine.
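In sketch form, the "artificial intelligence engine" that picks a preconfigured implant from a model database could be as simple as nearest-neighbour matching over site feature vectors. The feature choice and catalog entries below are invented for illustration:

```python
import numpy as np

def select_implant_model(site_features, catalog):
    """Pick the catalog entry whose feature vector (e.g. [ridge width,
    normalized bone density]) is closest to the candidate site's."""
    names = list(catalog)
    feats = np.array([catalog[n] for n in names], dtype=float)
    dists = np.linalg.norm(feats - np.asarray(site_features, dtype=float), axis=1)
    return names[int(np.argmin(dists))]
```

A learned model would replace the raw Euclidean distance with an outcome-aware similarity, but the interface (site features in, catalog entry out) stays the same.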
In some aspects, the techniques described herein relate to a computer-implemented method, wherein generating implant parameters includes: patient data is provided to an artificial intelligence model configured to generate implant parameters.
In some aspects, the techniques described herein relate to a computer-implemented method, further comprising: receiving, by the computing system, an indication of a surgical outcome; and retraining, by the computing system, the artificial intelligence model using the received indication of the surgical result.
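The retraining loop described above (feeding reported surgical outcomes back into the model) can be illustrated with a minimal online-learning stand-in. Everything below is an assumed simplification; a real system would retrain a far richer model on batched outcome data:

```python
import numpy as np

class OutcomeModel:
    """Tiny logistic model predicting implant success from a parameter
    vector; `update` performs one SGD step per reported outcome."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)
        self.lr = lr

    def predict(self, x):
        # Sigmoid of the linear score: estimated probability of success.
        return 1.0 / (1.0 + np.exp(-np.dot(self.w, x)))

    def update(self, x, outcome):
        # outcome: 1 = reported success, 0 = reported failure.
        x = np.asarray(x, dtype=float)
        self.w += self.lr * (outcome - self.predict(x)) * x
```

Each reported outcome nudges the weights toward predicting that result for similar parameter vectors, which is the essence of the claimed retraining step.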
In some aspects, the techniques described herein relate to a computer-implemented method, further comprising: the user is provided with an interface for modifying one or more implant parameters.
In some aspects, the techniques described herein relate to a computer-implemented method, further comprising: a surgical guide is created, wherein the surgical guide includes a 3D model of the guide that may be used during a surgical procedure.
In some aspects, the techniques described herein relate to a computer-implemented method further comprising providing the surgical guide to a 3D printer.
In some aspects, the techniques described herein relate to a computer-implemented method further comprising generating a surgical navigation plan.
In some aspects, the techniques described herein relate to a computer-implemented method further comprising providing a visualization and interactive interface.
In some aspects, the techniques described herein relate to an oral surgical planning system comprising: a computing system, comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the computing system to: receiving a patient profile, wherein the patient profile comprises: patient anatomical data, wherein the patient anatomical data comprises one or more models of a patient's maxilla or mandible; and kinematic data associated with movement of the jaw of the patient; identifying one or more candidate sites for the dental implant based at least in part on the received patient profile; and generating one or more dental implant parameters based at least in part on the identified one or more candidate sites and the kinematic data.
In some aspects, the techniques described herein relate to an oral surgical planning system in which patient anatomical data includes one or more models of a patient's maxilla or mandible.
In some aspects, the techniques described herein relate to an oral surgical planning system in which the patient profile includes any combination of one or more of bone volume, bone density, relative bone density, nerve location, or sinus location.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: determine the proposed crown geometry; determine an indication of a functional cone based at least in part on the kinematic data; determine one or more crown contact points based at least in part on the patient profile and the proposed crown geometry; generate a constraint map based at least in part on the one or more crown contact points; select an implant model based at least in part on the constraint map; and generate a modified implant model based at least in part on the constraint map and the implant model.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: determine the proposed crown geometry; automatically determine an indication of a functional cone based at least in part on the kinematic data; automatically determine one or more crown contact points based at least in part on the patient profile; and automatically select an implant model based at least in part on the crown contact points.
In some aspects, the techniques described herein relate to an oral surgical planning system in which generating a modified model includes minimizing one or more stresses on a dental implant.
In some aspects, the techniques described herein relate to an oral surgical planning system in which identifying one or more candidate sites for a dental implant includes comparing one or more models of a patient's maxilla or mandible to one or more reference models.
In some aspects, the techniques described herein relate to an oral surgical planning system wherein identifying one or more candidate sites for a dental implant includes automatically analyzing a patient's bone to determine any combination of one or more of: dental arch, inter-dental space, bone volume and relative bone density.
In some aspects, the techniques described herein relate to an oral surgical planning system wherein the one or more dental implant parameters include any combination of one or more of the following: the location of the dental implant relative to the bone surface, the implant type, the implant material, the burial depth, the implant angle relative to the bone surface, the implant size, the crown size, and the crown geometry.
In some aspects, the techniques described herein relate to an oral surgical planning system in which at least one of the crown size and crown geometry is based at least in part on a prosthetic projection of a patient, a prosthetic tooth, or an existing tooth.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: determining, based on the patient profile, that the one or more candidate sites have insufficient bone volume or bone density to perform the implantation procedure.
In some aspects, the techniques described herein relate to an oral surgical planning system in which determining one or more dental implant contact points includes determining contact at one or more phases of jaw movement based at least in part on an indication of a functional cone and patient anatomical data, wherein jaw movement includes recorded movement, simulated movement, or both.
In some aspects, the techniques described herein relate to an oral surgical planning system in which selecting an implant model includes selecting a preconfigured model from a model database using an artificial intelligence engine.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: patient data is provided to an artificial intelligence model configured to generate implant parameters.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: receiving an indication of a surgical outcome; and retraining the artificial intelligence model using the received indication of the surgical result.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: the user is provided with an interface for modifying one or more implant parameters.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to: a surgical guide is created, wherein the surgical guide includes a 3D model of the guide that may be used during a surgical procedure.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to provide the surgical guide to the 3D printer.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to generate a surgical navigational plan.
In some aspects, the techniques described herein relate to an oral surgical planning system, wherein the program instructions further include instructions that, when executed by the one or more processors, cause the computing system to provide a visualization and interactive interface.
In some aspects, the techniques described herein relate to an oral surgical planning system, further comprising: a jaw motion tracking headset; and a jaw motion tracking detector.
All of these embodiments are intended to be within the scope of the invention disclosed herein. These and other embodiments will become readily apparent to those skilled in the art from the following detailed description, taken in conjunction with the accompanying drawings, and the invention is not limited to any particular embodiment disclosed.
Drawings
These and other features, aspects, and advantages of the present invention are described with reference to the drawings of certain embodiments, which are intended to illustrate, but not limit the present disclosure. It should be understood that the drawings, which are incorporated in and constitute a part of this specification, are for the purpose of illustrating the concepts disclosed herein and may not be drawn to scale.
Fig. 1A shows an example of implant placement without regard to the functional cone.
Fig. 1B shows an example of implant placement according to some embodiments, in which the functional cone is considered.
Fig. 2 shows a functional cone and placement of an implant in view of the functional cone, according to some embodiments.
Fig. 3A is a schematic diagram illustrating an example embodiment of a jaw motion tracking system and modeling/planning system.
Fig. 3B is a schematic diagram illustrating an example embodiment of a jaw motion tracking system and modeling/planning system.
Fig. 3C is a schematic diagram illustrating an example embodiment of a jaw motion simulation system and modeling/planning system.
Fig. 4A is an example of a tracking headset that may be used to capture jaw movements according to some embodiments.
Fig. 4B is an example of a tracking camera system according to some embodiments.
Fig. 5 is a schematic diagram illustrating an example embodiment of an automated modeling and planning system for dental implant surgery enhanced using kinematic data, depicting various components of the system.
Fig. 6 is a schematic diagram illustrating an example embodiment of an implant design module of an automated modeling and planning system for dental implant surgery enhanced using kinematic data, depicting various components of the system.
Fig. 7A is a schematic diagram illustrating an example embodiment of a system for dental implant surgery enhanced using kinematic data, depicting various components of the system.
Fig. 7B is a schematic diagram illustrating an example embodiment of a system for dental implant surgery enhanced using kinematic data operating via a network, depicting various components of the system.
Fig. 8 is a flow chart showing an overview of an example embodiment for performing dental implant surgery enhanced using kinematic data.
Fig. 9 is a flowchart illustrating an overview of an example embodiment for training and using an AI engine to provide parameterized implant suggestions, according to some embodiments herein.
Fig. 10 is a flow chart illustrating a procedure for training an artificial intelligence or machine learning model according to some embodiments.
Figs. 11A and 11B are images depicting feature determination according to some embodiments herein.
Fig. 12 is an image depicting feature identification and implantation site determination in accordance with some embodiments.
Fig. 13 shows example images and graphs of radiodensity according to some embodiments.
Fig. 14 illustrates implantation site identification according to some embodiments herein.
Figs. 15A and 15B illustrate bone core identification according to some embodiments.
Fig. 16 illustrates identification of implant locations according to some embodiments.
Fig. 17 is a diagram illustrating a collaboration platform, according to some embodiments.
Fig. 18 is a diagram of an example computer system configured for use with an example embodiment of a system for dental implant surgery enhanced using kinematic data.
Detailed Description
While several embodiments, examples and illustrations are disclosed below, it will be understood by those of ordinary skill in the art that the invention described herein extends beyond the specifically disclosed embodiments, examples and illustrations to other uses of the invention and obvious modifications and equivalents thereof. Embodiments of the present invention are described with reference to the drawings, wherein like reference numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because the terminology is being used in conjunction with a detailed description of some specific embodiments of the invention. Furthermore, embodiments of the invention may include several novel features, and no single feature is solely responsible for its desirable attributes or necessary to practice the invention described herein.
As discussed above, some embodiments described herein relate to automating dental implant surgery using kinematic data describing patient jaw movements. For example, in some embodiments, the system may be configured to utilize kinematic data derived from capturing movements of a patient's jaw to provide an enhanced dental implant procedure. Without using or considering kinematic data, in some embodiments, the surgeon may have to rely on experience and guesswork to determine appropriate parameters for the implant, which may yield suboptimal surgical results; this can lead to fracture or chipping of the dental cap or crown, bone resorption around the implant or post (e.g., due to lateral forces), fracture of the abutment and/or connection between the implant and abutment, discomfort or adverse health events for the patient, or unsightly tooth geometry. In such cases, the surgeon may also need more time to identify parameters of the implant that meet the patient's needs.
Some embodiments of the systems, methods, and devices described herein are directed to addressing these technical shortcomings. In particular, in some embodiments described herein, a surgeon may utilize automated calculation and generation of implant parameters to optimize implant parameters based on kinematic data derived from patient jaw movements (which may be captured and/or simulated) to save time and achieve better results.
In some embodiments, the implant may be an artificial substitute for a missing tooth, the implant comprising a post that is attached to or inserted into the jawbone of the patient. In some embodiments, a cap or crown mimicking the appearance of a natural tooth may be permanently attached to the implant. In some embodiments, the kinematic data may be captured by using one or more detectors to record movement of the patient's jaw. In some embodiments, video data may be recorded while the patient wears one or more visual markers, which may then be used to convert the video into kinematic data. In some embodiments, the one or more detectors may be connected to a computer configured to accept the kinematic data and import it into one or more surgical planning software packages. In some embodiments, the kinematic data may be used to present a visual representation of the movement of the patient's jaw to assist the surgeon in selecting and locating potential implant targets. In some embodiments, the kinematic data may be further used to calculate a functional cone. The functional cone represents an envelope whose limits are determined by the displacement of points during mandibular movement. The mandibular movement may be the recorded actual movement of the patient, may be generated by simulating the mastication system as a mechanical system, or both. The functional cone may represent the average angles and stresses involved in the movement of the patient's jaw.
Fig. 1A shows an example of implant placement without consideration of the functional cone. For example, the implant may be placed at a non-ideal angle, which may have several undesirable results as described herein. Fig. 1B shows an example of an implant that has been positioned with the functional cone in mind. The implant in fig. 1B may have several advantages over the implants in fig. 1A, as discussed herein.
Fig. 2 shows a functional cone and placement of an implant in view of the functional cone, according to some embodiments. As shown in fig. 2, the functional cone has an emergence point 202, a 3D envelope 204 of functional movement, and a 3D envelope 206 of boundary movement. The implant 208 may be placed close to the centroid of the functional cone. The 3D envelope 204 of functional movement may correspond to the range of movement of the patient's jaw during normal function. The 3D envelope 206 of boundary movement may correspond to the maximum range of jaw movement.
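As an illustration only (not the patent's actual computation), the relationship between a recorded landmark trajectory and the centroid of its motion envelope can be sketched as follows; the function names and the crude axis-aligned envelope approximation are assumptions for the sketch.

```python
import numpy as np

def functional_cone_centroid(trajectory):
    """Approximate the centroid of a functional-movement envelope.

    trajectory: (N, 3) array of positions of a single jaw landmark
    recorded over time. The mean position serves as a simple stand-in
    for the centroid of the motion envelope near which the implant
    might be placed.
    """
    pts = np.asarray(trajectory, dtype=float)
    return pts.mean(axis=0)

def envelope_extent(trajectory):
    """Axis-aligned extent of the recorded motion (a crude 3D envelope)."""
    pts = np.asarray(trajectory, dtype=float)
    return pts.max(axis=0) - pts.min(axis=0)
```

A real system would likely fit a convex hull or similar envelope to the displacement data rather than an axis-aligned box.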
In some embodiments, the computer system may be further configured to automatically calculate optimized parameters for implant placement, including but not limited to depth, angle, size, and/or type of implant. In some embodiments, the functional cone may be used in these calculations to determine placement parameters that resist bite loading, distribute stresses in the surrounding bone, avoid excessive bone resorption, apply uniform (or near-uniform) stress at the junction between the abutment and the implant, and avoid crown and abutment fractures, among other benefits. For example, the functional cone may be used to determine an angle that minimizes shear stress on the implant, minimizes stress on surrounding bone tissue, and/or to select an appropriate implant type and/or geometry. In these embodiments, the implantation site may be further determined by applying the functional cone to the point on the patient's jaw where the positioned implant emerges.
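The angle-selection idea above can be sketched generically: decompose a representative bite-force vector into axial and shear components for each candidate implant axis, and keep the axis with the least shear. The force and axis values below are illustrative assumptions, not clinical data.

```python
import numpy as np

def shear_magnitude(force, axis):
    """Magnitude of the force component perpendicular to the implant axis."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    force = np.asarray(force, dtype=float)
    axial = np.dot(force, axis) * axis   # component along the implant axis
    return float(np.linalg.norm(force - axial))

def best_axis(force, candidate_axes):
    """Candidate implant axis producing the least shear for a given force."""
    return min(candidate_axes, key=lambda a: shear_magnitude(force, a))
```

For a purely vertical 10 N bite force, a vertical implant axis carries the whole load axially (zero shear), while a 45-degree axis sees roughly 7.1 N of shear.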
As discussed herein, a "dental implant" is defined to include any type of dental implant, including posts, implants, bridges, dental fixtures, crowns, dentures, or any other dental fastener. As discussed herein, "dental implant surgery" is defined to include any treatment or procedure for planning, developing, modeling, preparing, creating, inserting, and/or attaching any of the foregoing dental implants.
Jaw motion tracking
As shown in fig. 3A, in some embodiments, the system may include a jaw motion tracking system 301, which may further include a detector 302 (e.g., a jaw motion tracking detector) and a motion tracking headset 303. In some embodiments, the detector may be connected to the modeling and planning system 501. In some embodiments, the detector may be connected to the modeling and planning system via a data transfer cable, a network cable, or a wireless network. The detector may be any device or combination of devices capable of recording movement of the patient's mandible relative to its maxilla over time, for example, a camera without depth sensing capabilities. In some embodiments, the camera system may be capable of depth sensing. For example, the camera system may use stereo vision, structured light, time of flight, light detection and ranging (LIDAR), or other depth sensing principles. As shown in fig. 3B, in some embodiments, the motion tracking headgear may further include a maxillary marker 304 and/or a mandibular marker 305. In some embodiments, the maxillary and/or mandibular markers further include fiducial markers that may be used to track relative movement of the patient's mandible and/or maxillary bone. In some embodiments, inertial measurement units may also be used for motion tracking instead of or in addition to optical detection systems. For example, accelerometers, magnetometers, gyroscopes, and the like may be used to monitor movement. In some embodiments, the inertial measurement unit may be a microelectromechanical system (MEMS) device.
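The two-marker arrangement described above can be illustrated with a generic rigid-body computation (an assumption for illustration, not the patent's implementation): if the detector reports each marker's pose as a 4x4 homogeneous transform in the camera frame, composing the mandibular pose with the inverse of the maxillary pose yields mandible motion relative to the maxilla, cancelling overall head movement.

```python
import numpy as np

def relative_pose(t_maxilla, t_mandible):
    """Pose of the mandibular marker expressed in the maxillary marker frame.

    Both inputs are 4x4 homogeneous transforms (marker frame -> detector
    frame). The composition cancels any common head motion, leaving only
    jaw movement relative to the skull.
    """
    return np.linalg.inv(np.asarray(t_maxilla, dtype=float)) @ \
        np.asarray(t_mandible, dtype=float)
```

For example, if both markers share the camera's orientation and sit 1 mm and 3 mm along x respectively, the relative pose is a pure 2 mm translation.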
In some embodiments, a jaw motion simulation system may be used to simulate movements of the patient's jaw, for example by treating the mastication system as a mechanical system, instead of, or in addition to, using a jaw motion tracking system. Fig. 3C depicts an example embodiment in which jaw motion simulation system 306 communicates with modeling and planning system 501. In some embodiments, the jaw motion simulation system 306 may receive data from the jaw motion tracking system 301, such as condyle points, Bennett angles, etc., that may be used to simulate the jaw motion of the patient. In some embodiments, the modeling and planning system may use only jaw motion tracking system data, only jaw motion simulation system data, or a combination of data from both the jaw motion tracking system and the jaw motion simulation system.
In some embodiments, the maxillary marker 304 can include any device that can be attached, connected, or related to the patient's maxilla that can enable the detector to track movement of the patient's maxilla. In some embodiments, the mandible marker 305 may include any device that may be attached, connected, or related to the patient's mandible that may enable the detector to track movements of the patient's mandible. It should be understood that other configurations are possible, which may use additional or different markers. Fig. 4A depicts an example embodiment of a tracking device that includes a headset 401 attached to the patient's head and a mandible marker 402 attached to and moving with the patient's mandible. In some embodiments, the headset may include one or more sensors that enable it to track motion directly (e.g., without using an external detector). In some embodiments, the external detector and one or more sensors contained in the headset may be used together to track movement of the patient. Fig. 4B illustrates a motion capture device that may be used to capture patient motion, for example, by detecting motion of the headset 401 and the mandibular marker 402 of fig. 4A. The motion capture device may include a stereo camera system 403 configured to capture 3D motion. In some embodiments, the motion capture device may include a display 404. The display 404 may be used to monitor the motion capture process.
Modeling and planning system
As shown in fig. 5, in some embodiments, the modeling and planning system 501 may include an implant design module 502, an AI engine 503, a surgical guide generator 504, a navigated surgical planner 505, and/or a system database 513. In some embodiments, the modeling and planning system includes software configured to run on a computer system.
In some embodiments, the AI engine 503 may include a machine learning algorithm, one or more neural networks, a heuristic engine, and/or a stochastic model.
In some embodiments, the surgical guide generator 504 may be configured to output a 3D model representing one or more surgical guides. In some embodiments, the surgical guide may be a device attached to the patient's mouth that assists the surgeon in manipulating one or more tools and/or implants during surgery to improve the surgical outcome. In some embodiments, the 3D model may be configured to be manufacturable in the field, or may be manufacturable by a manufacturer using one or more manufacturing and/or 3D printing techniques. For example, in some embodiments, modeling and planning system 501 may be connected to a 3D printer and/or milling machine, while in other embodiments, the 3D model may be transferred to another system for production, e.g., using a web file transfer protocol, an application software programming interface, email, etc.
In some embodiments, navigated surgical planner 505 may be configured to output one or more surgical navigation plans. In some embodiments, the surgical navigational plan may include data compatible with one or more surgical navigational systems. In some embodiments, to generate a surgical navigational plan, the navigated surgical planner may be configured to transmit data to an external navigated surgical planning system. The one or more surgical navigation systems may include a computerized system that may include sensors and/or indicators to assist a surgeon in guiding one or more tools and/or implants during a procedure to improve the outcome of the procedure.
In some embodiments, the implant design module 502 may be configured to allow a surgeon and/or medical personnel to manually or automatically generate dental implant parameters and design and/or reconfigure the parameters.
In some embodiments, system database 513 may include a database engine configured to store one or more patient profiles, system settings, and/or usage information. In some embodiments, the one or more patient profiles may each include a patient history, a model of patient anatomy, and/or a medical image, such as a dental X-ray, which may contain data describing its existing natural teeth, existing prosthetic teeth, virtual prosthetic projections, jaw, nerves, bones, and/or other features. In some embodiments, each patient profile may further include data related to a planned procedure, a procedure guide model, and/or a procedure navigation plan generated by the system. In some embodiments, the system settings may include settings related to the operation of the graphical user interface, connection to external services, and/or device settings. In some embodiments, the usage information may include statistics regarding usage software (e.g., login), access to patient data, and/or log information describing various system activities. In some embodiments, the usage information may further include settings and data related to access control, which may include user login and password information, access rights to patient data, and/or audit information describing user activity on the system.
In some embodiments, images and/or models of the patient's face may be imported into modeling and planning system 501 and displayed in conjunction with other patient data and models, e.g., to provide context for the patient and/or surgeon or medical personnel.
Implant design module
As shown in fig. 6, in some embodiments, the implant design module 502 may include an implant site identifier 606, an implant site analyzer 607, a jaw kinematics analyzer 608, an implant geometry analyzer 609, an implant geometry generator 610, and/or a visualization and interaction interface 612. In some embodiments, the implant design module may be used to automatically generate implant parameters based on patient profiles and/or kinematic data derived from patient jaw movements. In some embodiments, the user may choose to automatically generate one or more parameters. In some embodiments, the inputs available to generate parameters may be user-defined.
In some embodiments, the implantation site identifier 606 may include a set of algorithms configured to receive one or more models of the patient's maxilla and/or mandible and identify one or more candidate sites for a dental implant. In some embodiments, the one or more models may include 3D images of patient anatomy acquired by X-ray Computed Tomography (CT) and/or other medical imaging modalities. In some embodiments, the identification of the implantation site may be based on a comparison of a model of the patient's maxilla and/or mandible with one or more stored models of human anatomy to identify missing teeth. Some embodiments may further analyze additional data to determine an implantation site, such as x-ray data from Cone Beam Computed Tomography (CBCT), for example, to analyze radiodensity (e.g., using a linear attenuation coefficient that may be expressed in Hounsfield units). In some embodiments, the implantation site identifier 606 may distinguish between relatively higher density and relatively lower density regions. In some embodiments, the implantation site identifier 606 may determine and/or apply one or more density thresholds that may be used to distinguish between bone segments that are suitable for implantation and bone segments that are unsuitable for implantation. In some embodiments, the identification of the implantation site may be based on automated bone analysis. In some embodiments, the implantation site identifier may perform a 3D reconstruction of the implantation site to facilitate calculation of the bone volume and density of the patient. In these embodiments, the implantation site may be determined to maximize the average bone density around the implant. In some embodiments, the implantation site identifier may not calculate an absolute bone density value. In some embodiments, the implantation site identifier may compare relative bone densities.
For example, the implantation site identifier may compare a relatively high density region (e.g., a tooth root) to a relatively low density region (e.g., a jawbone).
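The density-threshold idea can be sketched as follows; the threshold band and voxel values are illustrative assumptions only, not clinical values.

```python
import numpy as np

def suitable_bone_mask(volume, low=300.0, high=1800.0):
    """Mark voxels whose attenuation falls in a plausible implantable-bone band.

    volume: 3D array of CT attenuation values (Hounsfield-style scale).
    Values below `low` (soft tissue, marrow) and above `high` (enamel,
    metal artifacts) are excluded; the band itself is illustrative.
    """
    v = np.asarray(volume, dtype=float)
    return (v >= low) & (v <= high)

def mean_density_around(volume, mask):
    """Mean density over the masked candidate region (relative comparison)."""
    v = np.asarray(volume, dtype=float)
    return float(v[mask].mean())
```

Relative comparisons between candidate sites (rather than absolute calibrated densities) only require that all sites are measured on the same scale.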
In some embodiments, the identification of potential implantation sites may be presented to the user via the visualization and interaction interface 612, and the user may remove, add, or edit one or more of the identified implantation sites. In some embodiments, the identified implantation site is stored as part of a patient profile.
In some embodiments, the implantation site analyzer 607 may include a set of algorithms configured to receive one or more models of a patient's maxilla and/or mandible and a set of implantation sites. The implantation site analyzer may analyze the potential implantation sites using a variety of methods. Such methods may include, for example, automated methods for computing dental arches, defining interdental spaces, and the like. In some embodiments, the implantation site analyzer 607 may be further configured to generate parameters of the dental implant for each of the one or more implantation sites. In some embodiments, the implant parameters may include one or more of implant position relative to a bone surface of the patient, implant type, implant material, burial depth, implant angle relative to the bone surface, implant size, crown size, and/or geometry of the implant cap. In some embodiments, generating the implant parameters may further include identifying a need for a bone graft to support the implant based on, for example, localized bone volume and/or density deficiencies.
In some embodiments, generating the implant parameters may include considering one or more anatomical features of the patient. In some embodiments, these anatomical features may include bone volume, bone width, mass and density around the implantation site, height of the patient's gums above the bone, the position of the patient's sinuses, nerves, and mental foramina, etc.; in some embodiments, an optimal emergence point through the gums may be obtained taking into account one or more of these parameters. The emergence point may be the intersection between the axis of the implant (characterized as a straight line) and the bone or gum surface, corresponding to the area where the prosthetic device (e.g., crown) connects to its implant. In some embodiments, generating the implant parameters may further take into account biomechanical parameters of the patient's jaw. In some embodiments, the biomechanical parameters may include kinematic data describing the movements of the patient's jaw. In some embodiments, this kinematic data may be received in the form of a functional cone. In some embodiments, the biomechanical behavior of bone may be used to predict stress on the bone.
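For a locally planar surface patch, the intersection of the implant axis (a straight line) with the bone or gum surface reduces to a standard line-plane intersection. A hypothetical sketch, with illustrative geometry:

```python
import numpy as np

def emergence_point(axis_point, axis_dir, plane_point, plane_normal):
    """Intersection of the implant axis with a locally planar surface patch.

    axis_point / axis_dir define the implant axis as a line; plane_point /
    plane_normal approximate the gum or bone surface locally. Returns None
    when the axis is parallel to the surface.
    """
    p0 = np.asarray(axis_point, dtype=float)
    d = np.asarray(axis_dir, dtype=float)
    q0 = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = np.dot(d, n)
    if abs(denom) < 1e-12:
        return None  # axis parallel to the surface patch
    t = np.dot(q0 - p0, n) / denom
    return p0 + t * d
```

A production system would intersect the axis with a triangulated surface mesh rather than a single plane, but the per-triangle computation is the same.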
In some embodiments, the jaw kinematic analyzer 608 may include a set of algorithms configured to receive raw jaw motion data from a jaw motion tracking system. In these embodiments, the jaw kinematics analyzer 608 may analyze the raw data and output a functional cone. In some embodiments, the functional cone may include a plurality of displacement vectors. In some embodiments, the functional cone may include a set of data describing a maximum range of motion of the patient's jaw. In some embodiments, the functional cone may further include a set of vectors describing stresses that may be generated by the patient's jaw at various locations. For example, the displacement of the mandible is the result of contraction and/or relaxation of various muscles oriented along multiple axes. Because the motions, displacement vectors, and forces produced by the muscles are known or can be estimated, the jaw kinematics analyzer may determine the magnitude and/or direction of the stresses produced on the bone. In some cases, the analyzer may determine the stress on the implant and/or abutment, for example, when its biomechanical behavior is known or estimated. Advantageously, if the implant is located at the centroid of movement, the implant may absorb forces in a way that reduces the risk of damage. In some embodiments, one or more portions of the functional cone may be provided as input to the jaw kinematic analyzer 608, and the analyzer may derive the remainder of the functional cone from the provided data.
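The resultant-force idea above (summing known or estimated muscle force vectors to obtain the magnitude and direction of loading on the bone) can be sketched generically; the muscle vectors below are arbitrary illustrative values.

```python
import numpy as np

def net_jaw_force(muscle_forces):
    """Resultant of individual muscle force vectors acting on the mandible.

    muscle_forces: iterable of 3-vectors, one per modeled muscle group
    (e.g., masseter, temporalis), each oriented along its line of action.
    Returns (magnitude, unit_direction); the direction is the zero vector
    when there is no net force.
    """
    total = np.sum(np.asarray(list(muscle_forces), dtype=float), axis=0)
    magnitude = float(np.linalg.norm(total))
    direction = total / magnitude if magnitude > 0 else total
    return magnitude, direction
```

The resulting direction can then be compared against a candidate implant axis to estimate axial versus shear loading, as in the earlier shear-decomposition sketch.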
In some embodiments, the implant geometry analyzer 609 may comprise a set of algorithms configured to receive a functional cone, a model of the patient's mandible and/or maxilla, and/or the proposed crown geometry. In some embodiments, the implant geometry analyzer 609 may be further configured to generate a map of the proposed contact between the implanted crown and other features of the patient (e.g., other teeth, crowns, implants, etc.) at various stages of jaw movement based on the anatomy and function cone of the patient. In some embodiments, the implant geometry analyzer 609 may be further configured to generate a map of stress present at each of the previously described contact points based on a function cone (hereinafter referred to as a constraint map). In some embodiments, a constraint map may be generated with respect to one or more amounts of space surrounding one or more implantation sites. In some embodiments, the implant geometry analyzer may be configured to follow a decision tree in generating the implant parameters. For example, the decision tree process flow may begin by determining boundary conditions (e.g., locations of nerves, bones, etc.), and then may refine implant parameters based on additional information (e.g., bone density, stress, etc.). In some embodiments, one or more portions of the generated data may be submitted to the AI engine 503 for modification. In some embodiments, one or more portions of the generated data may be initially generated by the AI engine 503.
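The decision-tree flow described above (boundary conditions first, then refinement against additional information) can be illustrated with a hypothetical length-selection rule; the safety margin and standard implant lengths are made-up illustrative values, not clinical recommendations.

```python
def propose_implant_length(site_depth_mm, nerve_depth_mm,
                           safety_margin_mm=2.0,
                           standard_lengths=(8.0, 10.0, 11.5, 13.0)):
    """Decision-tree sketch: enforce the boundary condition first (stay
    clear of the nerve canal by a safety margin), then refine by picking
    the longest standard implant that fits the usable depth.

    Returns the chosen length in mm, or None when no standard length fits
    (e.g., a case where a bone graft might be flagged instead).
    """
    usable = min(site_depth_mm, nerve_depth_mm - safety_margin_mm)
    fitting = [length for length in standard_lengths if length <= usable]
    return max(fitting) if fitting else None
```

Real parameter generation would branch over many more constraints (bone density, sinus position, stress maps), but the structure, hard limits before optimization, is the same.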
In some embodiments, the implant geometry generator 610 may include a set of algorithms configured to generate one or more models representing the geometry of the dental implant. In some embodiments, the generation of the one or more models may be based on a constraint map generated by an implant geometry analyzer. In some embodiments, the generation of the one or more models may be further configured to generate a model that minimizes shear stress on the implant generated by movement of the patient's jaw. In some embodiments, the generation of one or more models may be further configured to generate a model that minimizes stress on certain portions of the patient's jaw. In some embodiments, the model may be generated by modifying one or more preconfigured implant models stored in system database 513. In some embodiments, the selection of one or more preconfigured implant models may be performed by the AI engine 503. In some embodiments, the selection of one or more preconfigured implant models may be performed by a surgeon or medical staff, or may be performed automatically by the implant geometry generator 610. In some embodiments, the selection of one or more preconfigured implant models may be performed according to one or more rules stored in system database 513.
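Selecting a preconfigured implant model from a stored library might, in a simplified form, filter by geometric fit and minimize an estimated stress score; the record schema below is a hypothetical stand-in, not the system database's actual layout.

```python
def select_preconfigured_model(models, max_diameter, max_length):
    """Pick the stored implant model that fits the site with the lowest
    estimated stress score.

    models: list of dicts such as
        {"name": ..., "diameter": ..., "length": ..., "stress_score": ...}
    where stress_score is a precomputed estimate (e.g., from a constraint
    map). Returns None when no library model fits the site.
    """
    fitting = [m for m in models
               if m["diameter"] <= max_diameter and m["length"] <= max_length]
    return min(fitting, key=lambda m: m["stress_score"]) if fitting else None
```

A rule set stored in the system database could replace the fixed filter here, and an AI engine could replace the stress-score ranking.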
In some embodiments, the visualization and interaction interface 612 includes a graphical user interface. The graphical user interface may further include a visualization window that may display 2D and/or 3D images and/or text. The visualization window may be further configured to display data representing anatomical features, implant geometry, jaw movement paths, statistics, and/or patient information, for example, prior to, during, and/or after a dental procedure. In some embodiments, the graphical user interface may be further configured to interact, allowing a user to manipulate the displayed data. In some embodiments, the enabled manipulations may include editing parameters and geometries of the implant, for example, for planning purposes. In some embodiments, the graphical user interface may be further configured to allow a user to generate and/or output images and video based on the presented data.
Integrated system for dental implant surgery enhanced using kinematic data
As shown in fig. 7A, in some embodiments, the system may include a jaw motion tracking system 301, a modeling and planning system 501, an operator console 708, a 3D printer 706, and/or a surgical navigation system 707.
In some embodiments, the jaw motion tracking system 301 may be configured to record the motion of the patient's jaw and transmit the data to the modeling and planning system 501. In some embodiments, the jaw motion tracking system 301 may analyze and/or format the data prior to transmitting the data to the modeling and planning system. In some embodiments, the jaw motion tracking system may store data or transmit data to a storage system for later use.
In some embodiments, the modeling and planning system 501 may be a computer system configured to receive motion tracking data from the jaw motion tracking system 301 and convert the motion tracking data to kinematic data. In some embodiments, the modeling and planning system 501 may be further configured to receive one or more patient profiles as input. In some embodiments, the modeling and planning system 501 may be further configured to automatically determine parameters of one or more dental implants that may be surgically implanted in the patient. In some embodiments, modeling and planning system 501 may be connected to 3D printer 706 and may be further configured to output a 3D printable surgical guide model that may be printed on the 3D printer. In some embodiments, additional or alternative computer-aided manufacturing hardware may be connected to the modeling and planning system 501, such as a milling device. In some embodiments, the modeling and planning system may be connected to the surgical navigation system 707 and/or may be further configured to output one or more surgical navigation plans that may be utilized by the surgical navigation system.
In some embodiments, operator console 708 may be a computer system configured to provide a user interface for a surgeon and/or surgical personnel to interact with modeling and planning system 501. In some embodiments, the operator console may include a display 709 and/or one or more input devices 710. In some embodiments, modeling and planning system 501 and operator console 708 may comprise a single computer system. In some embodiments, operator console 708 may include a thin client or other computing device that may interact with modeling and planning system 501 via a network.
In some embodiments, 3D printer 706 may be connected to modeling and planning system 501. In some embodiments, 3D printer 706 may be configured to use stereolithography, digital light processing, fused deposition modeling, selective laser sintering, multi-jet fusion, polyjet, direct metal laser sintering, and/or electron beam fusion. In some embodiments, the modeling and planning system may be configured to interact with a connected milling machine.
Distributed system for dental implant surgery enhanced using kinematic data
As shown in fig. 7B, in some embodiments, the system may include a jaw motion tracking system 301, an operator console 708, a 3D printer 706, a surgical navigation system 707, an AI engine 503, and/or a modeling and planning system 501. In some embodiments, modeling and planning system 501 may be connected to other modules via one or more computer networks. Computer networks 721, 722, 723, and 724 may include wireless and/or wired networks.
Method for automating dental implant surgery using kinematic data
Fig. 8 is a flow chart showing an overview of an example embodiment of performing dental implant surgery enhanced using kinematic data. In some embodiments, the procedure may deviate from that shown in fig. 8. Some embodiments may include more steps, fewer steps, and/or steps performed in an order different than shown. As shown in fig. 8, in some embodiments, at block 802, a surgeon or medical staff may capture static physical data about a patient. In some embodiments, at block 804, the surgeon or medical personnel may attach the mandibular marker 305 and the maxillary marker 304 and/or the headset 303 to the patient. The static data may include x-rays from CBCT or CT scans, 3D models of teeth from intraoral scanners or laboratory scanners, and the like.
In some embodiments, the detector may identify the location of a fiducial marker attached to the patient at block 806 and/or capture moving image data at block 808. In some embodiments, the surgeon or medical personnel may instruct the patient to move their jaw during the capture process.
In some embodiments, at block 810, the jaw motion tracking system tracks and communicates jaw movements of the patient to the modeling and planning system based on the captured image data. Alternatively or additionally, jaw movement of the patient may be simulated, for example, by treating the mastication system as a mechanical system. In some embodiments, the simulated movement may be customized based at least in part on the recorded patient motion. For example, the recorded motion may have various imperfections or gaps, but may still be suitable for determining parameters such as condylar slope, Bennett angle, etc., which may help to improve the accuracy of the simulated movement.
In some embodiments, at block 812, the modeling and planning system 501 may store kinematic data based on the captured jaw movements as part of the patient profile. In some embodiments, the modeling and planning system 501 may further identify the implantation target and analyze it in conjunction with the captured kinematic data.
In some embodiments, static physical data may be accessed at block 814, and the kinematic data may be analyzed and/or implantation targets identified at block 816 using the kinematic data, patient profile, and static data. At block 818, the data may be analyzed by the AI engine 503, which may include receiving data from and/or providing data to a model database 820. At block 822, the AI analysis may then be utilized by the modeling and planning system 501 to automatically generate implant size, type, and/or placement parameters.
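The automatic generation of implant size, type, and placement parameters at block 822 can be illustrated with a toy rule-based stand-in for the AI engine. The field names, safety margins, and density threshold below are assumptions for the sketch, not values specified in this document:

```python
from dataclasses import dataclass

@dataclass
class PatientSite:
    bone_height_mm: float   # available vertical bone at the candidate site
    bone_width_mm: float    # available horizontal bone
    bone_density_hu: float  # mean radiodensity in Hounsfield units (assumed scale)

@dataclass
class ImplantSuggestion:
    length_mm: float
    diameter_mm: float
    implant_type: str

def suggest_implant(site: PatientSite) -> ImplantSuggestion:
    # Illustrative rule: leave ~2 mm of vertical safety margin to vital structures.
    length = min(13.0, site.bone_height_mm - 2.0)
    # Illustrative rule: keep ~1.5 mm of bone on each side of the implant body.
    diameter = min(4.5, site.bone_width_mm - 3.0)
    # Illustrative rule: softer bone (lower HU) may call for a tapered design.
    implant_type = "standard" if site.bone_density_hu >= 850 else "tapered"
    return ImplantSuggestion(round(length, 1), round(diameter, 1), implant_type)

suggestion = suggest_implant(PatientSite(12.0, 7.0, 900.0))
print(suggestion)  # ImplantSuggestion(length_mm=10.0, diameter_mm=4.0, implant_type='standard')
```

In the system described above, these hand-written rules would be replaced by a model learned from past surgical outcomes, but the input/output shape would be similar.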
In some embodiments, at block 824, the surgeon and/or surgical personnel may choose to accept or modify the implant parameters suggested by the modeling and planning system 501. If the surgeon and/or operator rejects the suggested parameters at 824, the surgeon and/or operator may modify the implant design at block 826. In some embodiments, the AI engine 503 may receive the modification and retrain the AI model. For example, the AI engine 503 may determine which parameter or parameters were modified (e.g., emergence point, bone volume, stress map, placement of the implant relative to the functional cone, etc.). Different providers may have different preferences, and the AI engine 503 may learn those preferences.
In some embodiments, modeling and planning system 501 may be further configured to generate a surgical guide model at block 828 and/or a surgical navigation plan at block 830.
In some embodiments, at block 832, one or more surgical guides may be manufactured based on a previously generated surgical guide model. In some embodiments, at block 834, one or more implants may be selected and/or manufactured based on previously established parameters. Alternatively, in some embodiments, existing implants that match previously established parameters may be selected from a library of implants.
In some embodiments, at block 836, the surgeon may perform the implantation procedure using the generated surgical guide and/or navigation plan. In some embodiments, at block 838, the surgeon may record the post-operative results.
In some embodiments, at block 840, the AI engine 503 may be further configured to update its internal model based on the patient profile and post-operative surgical results, which may include data regarding implant failure.
AI engine for automating parameterized implants
FIG. 9 is a flow chart showing an overview of an example embodiment of training and using an AI engine to provide parameterized implant suggestions. In some embodiments, the AI engine 503 may include one or more analysis algorithms and a model database 820. In some embodiments, model database 820 may contain one or more preconfigured models. In some embodiments, the system developer or provider may collect training data at block 902 to perform initial training of the AI engine 503 at block 904. In some embodiments, the training data may include data regarding past surgeries, in the form of implant parameters and patient parameters paired with corresponding surgical outcome data. In some embodiments, the implant parameters may include one or more of implant position relative to a bone surface of the patient, implant type, implant material, burial depth, implant angle relative to the bone surface, implant size (e.g., length, diameter), crown size, and/or crown geometry. In some embodiments, patient parameters may include mandible and/or maxilla geometry, bone mass, bone density, nerve and mental foramen positions, sinus position, jaw kinematic data and/or functional cones, and/or vital information such as age, gender, and/or medical history. In some embodiments, the surgical outcome data may include one or more of perioperative complications, the post-operative lifetime of the implant, adverse events associated with the implant, and/or patient satisfaction.
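One plausible shape for a single training record pairing implant and patient parameters with an observed surgical outcome is sketched below. The field names are illustrative assumptions, not a schema defined by this document:

```python
# Hypothetical training record: implant parameters and patient parameters
# paired with the corresponding surgical outcome data.
training_example = {
    "implant": {"type": "tapered", "material": "titanium", "length_mm": 10.0,
                "diameter_mm": 4.0, "angle_deg": 12.0, "burial_depth_mm": 1.0},
    "patient": {"age": 54, "sex": "F", "bone_density_hu": 820.0,
                "medical_history": ["hypertension"]},
    "outcome": {"perioperative_complications": [], "implant_lifetime_years": 8.5,
                "adverse_events": [], "patient_satisfaction": 4},
}

def is_complete(example: dict) -> bool:
    """Check that a record carries all three sections needed for training."""
    return all(k in example for k in ("implant", "patient", "outcome"))

print(is_complete(training_example))  # True
```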
FIG. 10 depicts a flowchart for training an artificial intelligence or machine learning model, according to some embodiments. In some embodiments, training step 904 of fig. 9 may implement a procedure similar to or the same as training procedure 1000 depicted in fig. 10. At block 1001, the system may receive a dataset including various information, such as patient profile information, jaw movement information, and the like. At block 1002, one or more transforms may be performed on the data. For example, the data may need to be transformed to conform to an intended input format, such as an intended date format or a particular tooth numbering system (e.g., the Universal Numbering System, FDI World Dental Federation notation, or Palmer notation). In some embodiments, the data may undergo conversion in preparation for training an AI or ML algorithm, which typically operates on data that has undergone some form of normalization or other modification. For example, categorical data may be encoded in a particular manner. Nominal data may be encoded using one-hot encoding, binary encoding, feature hashing, or other suitable encoding methods. Ordinal data may be encoded using ordinal encoding, polynomial encoding, Helmert encoding, and the like. Numerical data may be normalized, for example, by scaling the data to a maximum of 1 and a minimum of 0 or -1. At block 1003, the system may create training, tuning, and test/validation datasets from the received dataset. The training data set 1004 may be used during training to determine the variables that form a predictive model.
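The transforms described for block 1002 can be sketched minimally: one-hot encoding for nominal data, integer codes for ordinal data, and min-max scaling of numeric data to [0, 1]. The category lists and the HU range used here are illustrative assumptions:

```python
def one_hot(value, categories):
    # Nominal data: no natural order, so emit one indicator per category.
    return [1 if value == c else 0 for c in categories]

def ordinal_encode(value, ordered_levels):
    # Ordinal data: the position in the ordered list carries the meaning.
    return ordered_levels.index(value)

def min_max_scale(x, lo, hi):
    # Numeric data: scale into [0, 1] given assumed bounds.
    return (x - lo) / (hi - lo)

# Nominal: implant material has no inherent ordering.
print(one_hot("titanium", ["titanium", "zirconia", "alloy"]))  # [1, 0, 0]
# Ordinal: bone quality grades D1 (densest) through D4 are ordered.
print(ordinal_encode("D2", ["D1", "D2", "D3", "D4"]))          # 1
# Numeric: scale 850 HU within an assumed [0, 1700] range.
print(min_max_scale(850.0, 0.0, 1700.0))                       # 0.5
```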
The tuning data set 1005 may be used to select a final model and to prevent or correct overfitting that may occur during training with the training data set 1004, because the trained model should generally be applicable to a wide range of patients rather than being specialized to a particular training data set (e.g., one biased toward patients with relatively high or relatively low bone density, wide or narrow dental arches, etc.). After training and tuning, the model may be evaluated using the test dataset 1006. For example, the test dataset 1006 may be used to check whether the model has overfitted to the training dataset. In training loop 1014, the system may train model 1007 using training data set 1004. Training may be performed in a supervised, unsupervised, or partially supervised manner. At block 1008, the system may evaluate the model according to one or more evaluation criteria. For example, the evaluation may include determining the frequency at which an implant proposal fits the patient based on various criteria, such as contact points, shear stress, bone density at the implant site, and the like. At block 1009, the system may determine whether the model meets the one or more evaluation criteria. If the model fails the evaluation, the system may tune the model using the tuning data set 1005 at block 1010, repeating training 1007 and evaluation 1008 until the model passes the evaluation at block 1009. Once the model passes the evaluation at 1009, the system may exit the model training loop 1014. The test dataset 1006 may then be passed through the trained model 1011, and at block 1012 the system may evaluate the results. If the evaluation fails at block 1013, the system may re-enter the training loop 1014 for additional training and tuning. If the model passes, the system may stop the training process, producing a trained model 1011. In some embodiments, the training procedure may be modified.
For example, in some embodiments, the system may not use the test dataset 1006. In some embodiments, the system may use a single data set, two data sets, or more than three data sets.
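The split at block 1003 into training, tuning, and test/validation sets can be sketched as follows. The 70/15/15 proportions are an illustrative assumption, not values specified in this document:

```python
import random

def split_dataset(records, seed=0, train_frac=0.7, tune_frac=0.15):
    # Shuffle deterministically, then slice into three disjoint subsets.
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * train_frac)
    n_tune = int(len(shuffled) * tune_frac)
    return (shuffled[:n_train],                       # training data set
            shuffled[n_train:n_train + n_tune],       # tuning data set
            shuffled[n_train + n_tune:])              # test/validation data set

records = list(range(100))
train_set, tune_set, test_set = split_dataset(records)
print(len(train_set), len(tune_set), len(test_set))  # 70 15 15
```

A variant that omits the test set, as mentioned above, would simply fold the last slice back into training or tuning.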
Returning to fig. 9, in some embodiments, at block 906, the surgeon and/or medical personnel may submit the patient profile to an AI engine (e.g., AI engine 503). In some embodiments, the patient profile may include mandible and/or maxilla geometry, bone mass, bone density, nerve and mental foramen locations, sinus location, jaw kinematic data and/or functional cones, tooth positions and structures, and/or vital information about the patient, such as age, gender, and/or medical history. In some embodiments, the patient profile may include other parameters not specified herein.
In some embodiments, at block 908, the patient profile may be analyzed by the AI engine 503, and the AI engine 503 may then generate suggested implant parameters. In some embodiments, the AI analysis may be configured to analyze a subset of the patient profile. In some embodiments, the subset may include one or more of patient mandible and maxilla geometry, bone mass, bone density, nerve and mental foramen positions, sinus position, jaw kinematic data or functional cones, vital information, and/or surgical history. In some embodiments, a subset of the data may be used. For example, in some embodiments, only a portion of the jaw kinematic data may be used: less data may be used in order to reduce computational requirements, or some data may be excluded, such as jaw kinematic data representing extreme movements that the patient does not typically make. In some embodiments, the AI engine may analyze all or part of an existing prosthesis, a prosthesis projection (e.g., a simulated or future shape of a prosthetic tooth), or the like. In some embodiments, the implant may be placed under a virtually designed tooth. In some embodiments, the surgeon and/or medical personnel may adjust the suggested parameters at block 910, and the surgeon may perform the procedure based on the suggested parameters at block 912.
In some embodiments, after the procedure, the surgeon and/or medical personnel may submit the procedure and post-procedure data to the AI engine 503 at block 914. In some embodiments, the surgical and/or post-surgical data may include a list of perioperative complications, post-operative lifetime of the implant, adverse events related to the implant, and/or patient satisfaction. In some embodiments, the submitted data may be used by the AI engine 503 to update the model in the model database 820 at block 916.
As shown in fig. 7B, in some embodiments, AI engine 503 may be connected to a modeling and planning system via a network. In some embodiments, the AI engine 503 can be hosted by a commercial service provider. In some embodiments, the AI engine 503 may be software configured to run on a cloud computing platform. In some embodiments, the AI engine 503 may be trained and deployed on a first computing system. In some embodiments, the AI engine 503 may be trained on a first computing system and deployed on a second computing system. For example, it may be advantageous to have a large amount of computing resources during model training, but such computing power may not be required for deployment. Similarly, network bandwidth may be relatively unimportant during training but more important at deployment, so that the AI engine 503 can handle incoming requests.
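Training on one computing system and deploying on another typically involves serializing the fitted model parameters and shipping only those. The toy example below uses JSON for a linear model; real systems generally use a framework's own checkpoint format, and the parameter values here are invented:

```python
import json

# Produced on the (resource-rich) training system.
trained_params = {"weights": [0.3, -1.2, 0.8], "bias": 0.05, "version": 1}
blob = json.dumps(trained_params)

# Loaded on the (lighter-weight) deployment system.
restored = json.loads(blob)

def predict(params, features):
    # Simple dot product plus bias, standing in for the deployed model.
    s = params["bias"]
    for w, x in zip(params["weights"], features):
        s += w * x
    return s

print(round(predict(restored, [1.0, 1.0, 1.0]), 2))  # -0.05
```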
Example method of automating implant surgery
The systems and methods described herein may be used to automate various aspects of dental treatment and planning, including simple and complex treatment situations. In some embodiments, the system may be configured to determine the final placement of teeth by proposing one or more implant locations for missing teeth. This may include capturing kinematic data, capturing intraoral scan information to determine the shape and placement of the patient's existing teeth and bone structures, and so forth. In some cases, the patient may not have missing teeth, but may instead need one or more existing teeth replaced, in which case information about the shape, structure, and/or positioning of the patient's current teeth may be captured so that a suitable substitute can be formed. The system may determine the positioning of the implant and produce a surgical guide to assist in performing the procedure and/or generate a plan for a navigated procedure.
In some embodiments, the patient may have an existing prosthesis, and the implant may be used to place and/or secure the prosthesis. In some cases, the implantation region may be located in front of or behind the prosthetic tooth. In this case, the procedure for automatically determining the placement of the implant may include consideration of information such as tooth centroid, bone volume, bone density, and/or functional cone. In some cases, the implantation area may be identified using automated modeling and/or segmentation of the patient's existing prosthesis and/or natural teeth. In some cases, the patient may have a removable prosthesis that is to be replaced with a fixed prosthesis; the shape of the existing prosthesis may then define the location of the implant.
In some embodiments, the patient may not have an existing prosthesis and/or may lack teeth. In this case, more complex methods may be used, as the information about existing (natural or prosthetic) teeth is reduced or absent. In some embodiments, a method may include: identifying the dental arch; establishing a dental implant plan; determining a volumetric region for implantation; performing bone reconstruction; and identifying a functional tooth centroid (which may be determined from the functional cone). The functional cone may be centered at the emergence point of the implant, the implant dimensions (e.g., diameter, length, width, height, etc.) may be determined, and the planned implantation or working area may be identified.
In some embodiments, the implant and/or crown may be selected from one or more libraries. For example, a library may include implants from different manufacturers. In some embodiments, a provider (e.g., a dentist) may select a manufacturer, and an appropriate implant diameter and/or length may then be selected using an algorithm based at least in part on a combination of one or more of bone volume, prosthesis location, emergence points, functional cone, and the like.
In some embodiments, the dental arch may be identified by the system using a 3D model of the dental arch, a CBCT scan, an x-ray, or other image. In some embodiments, CBCT images may contain artifacts that make them unsuitable or undesirable for identifying dental arches. Fig. 11A illustrates the identification of an arch 1101. In some embodiments, the crowns may be identified in part by reorienting at least one model along the occlusal plane, as depicted in fig. 11B showing the crowns 1102.
From the image of fig. 11A, for example, various features may be determined, as shown in fig. 12. For example, the image may include teeth 1201, missing tooth regions 1202, and gaps 1203. In some embodiments, multiple images may be used, or multiple measurements may be made to determine an average dental arch.
Once the arch has been identified, cross-sections perpendicular to the arch can be analyzed to optimize position determination and identify the desired area, as shown in fig. 13. For example, the gray scale of an x-ray or CT scan may indicate radiodensity, and different materials (e.g., bone, tooth root, soft tissue, etc.) may have different radiodensities, which in some embodiments may be measured in Hounsfield units. In some embodiments, the radiodensity data may help identify the implant region, as shown in fig. 14. For example, the system may be configured to use imaging data to determine bone density, bone volume, tooth positions, boundary regions between teeth, and the like. In some embodiments, bone volume calculations may be used to determine the center position of the implant by, for example, selecting a region with relatively high bone volume. In some embodiments, selecting the region with a relatively large bone volume may not produce the desired implantation result; for example, a region of relatively higher bone density may be selected instead. In some embodiments, the system may determine that a bone graft or other procedure would be advantageous based on determining that the patient lacks a region of suitable bone volume and density for performing the implantation procedure.
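A minimal sketch of using radiodensity to separate bone from other materials and estimate bone volume is shown below. The Hounsfield-unit thresholds are commonly cited approximations, not values from this document, and the voxel size is an assumption:

```python
def classify_hu(hu: float) -> str:
    # Approximate HU bands: air is strongly negative, soft tissue near 0-100,
    # and bone above that. Real pipelines use finer, calibrated ranges.
    if hu < -200:
        return "air"
    if hu < 100:
        return "soft tissue"
    return "bone"

def bone_volume_mm3(voxels_hu, voxel_mm3=0.125):
    """Estimate bone volume as (number of bone voxels) x (volume per voxel)."""
    return sum(voxel_mm3 for hu in voxels_hu if classify_hu(hu) == "bone")

sample = [-500.0, 40.0, 300.0, 1200.0, 60.0, 900.0]
print(classify_hu(300.0))       # bone
print(bone_volume_mm3(sample))  # 0.375  (three bone voxels x 0.125 mm^3)
```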
In some embodiments, the system may identify a bone centroid, which may be the center of the bone volume, as shown in figs. 15A and 15B. Where teeth are absent, the bone centroid may enable the system to compensate for the missing teeth in the dental arch. In some embodiments, the system may consider the centroid and the top of the ridge to correct the axis for placement of the implant and obtain the emergence point, as shown in fig. 16. Thus, the system may enable a surgeon to improve planning by taking into account biomechanical data and bone density information. In some embodiments, the bone centroid may enable the determination of a functional tooth centroid that accounts for the dental arch.
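One simple way to compute a bone centroid, sketched here, is a density-weighted mean of bone voxel positions; the (x, y, z, density) representation is an assumption for the example:

```python
def bone_centroid(voxels):
    """voxels: iterable of (x, y, z, density) tuples; returns the
    density-weighted centroid of the bone volume."""
    total = sum(d for _, _, _, d in voxels)
    cx = sum(x * d for x, _, _, d in voxels) / total
    cy = sum(y * d for _, y, _, d in voxels) / total
    cz = sum(z * d for _, _, z, d in voxels) / total
    return (cx, cy, cz)

# Two unit-density voxels and one double-density voxel pull the centroid
# toward the denser region.
voxels = [(0.0, 0.0, 0.0, 1.0), (2.0, 0.0, 0.0, 1.0), (1.0, 3.0, 0.0, 2.0)]
print(bone_centroid(voxels))  # (1.0, 1.5, 0.0)
```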
In some embodiments, the system may calculate a functional cone by starting at the emergence point and applying the movement (e.g., kinematic data captured from the patient). Thus, the implant may be oriented at least in part by taking the functional cone into account.
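One illustrative construction of a functional cone, not necessarily the specific algorithm used here, takes the movement directions recorded at the emergence point, uses their mean as the cone axis, and takes the largest deviation from that axis as the half-angle:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def functional_cone(directions):
    # Axis: normalized mean of the unit movement directions.
    unit = [normalize(d) for d in directions]
    axis = normalize(tuple(sum(c) for c in zip(*unit)))
    # Half-angle: largest angular deviation of any direction from the axis.
    half_angle = max(
        math.degrees(math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(u, axis))))))
        for u in unit
    )
    return axis, half_angle

# Four symmetric jaw-movement directions around the vertical (z) axis.
dirs = [(0.1, 0.0, 1.0), (-0.1, 0.0, 1.0), (0.0, 0.1, 1.0), (0.0, -0.1, 1.0)]
axis, half_angle = functional_cone(dirs)
print(round(half_angle, 1))  # 5.7  (axis is approximately (0, 0, 1))
```

An implant axis lying inside this cone would then be compatible with the recorded movements.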
In some embodiments, the implant may be selected based on a prosthetic projection; even in the absence of teeth, the dental arch itself may serve as a prosthetic projection.
Collaboration platform
In some embodiments, for example as shown in fig. 7A, a dentist may have on-site access to equipment for preparing a surgical guide. However, in some cases the dentist may not have access to such equipment and may therefore outsource manufacturing to a third party. Similarly, dentists will often use an external laboratory to provide implants, crowns, and the like. Thus, it would be advantageous for dentists and other providers to be able to interact easily with laboratories to facilitate the preparation of implants, crowns, surgical guides, and the like.
FIG. 17 illustrates an example embodiment of a collaboration platform connecting providers and laboratories. For example, as depicted in fig. 17, the dental offices 1702, 1704, and 1706 and the dental laboratories 1708, 1710, and 1712 may communicate with the cloud server 1714. The cloud server 1714 may communicate with a data store 1716. The cloud server 1714 may enable the dental offices 1702, 1704, 1706 to communicate with the dental laboratories 1708, 1710, 1712. For example, a dental office may send data to the cloud server 1714, where the data may be stored in the data store 1716 and accessed by a dental laboratory. In some embodiments, a dental laboratory may provide data to the cloud server 1714, which may then be accessed by one or more of the dental offices. In some embodiments, the dental offices and/or dental laboratories may communicate using smartphones, tablet computers, desktop computers, laptop computers, or other devices. For example, in some embodiments, the apparatus for collecting patient data and/or planning the procedure may itself be capable of communicating over a network.
In some embodiments, a dental office may provide the relevant parameters and information that a dental laboratory may use to prepare implants, crowns, etc. for a particular procedure. The dental laboratory may provide information the dental office may find useful, such as available implant depths and diameters, available materials, available crown shapes and sizes, and the like. In some embodiments, the dental office may provide information about the patient, such as name, contact information, insurance information, billing information, etc., to the dental laboratory, for example where the dental laboratory bills the patient and/or the patient's insurer directly.
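A hypothetical example of the kind of order a dental office might send to a laboratory through the collaboration platform is sketched below; the field names, identifiers, and values are illustrative assumptions, not a format defined by this document:

```python
import json

order = {
    "office_id": "office-1702",
    "lab_id": "lab-1708",
    "case": {
        "implant": {"diameter_mm": 4.0, "length_mm": 10.0, "material": "titanium"},
        "crown": {"shade": "A2", "tooth_number": 19},
    },
}

payload = json.dumps(order)     # serialized and sent to the cloud server
received = json.loads(payload)  # read back by the laboratory
print(received["case"]["implant"]["length_mm"])  # 10.0
```

In practice such a payload would travel over an authenticated, encrypted channel, consistent with the privacy features described below.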
In some embodiments, different dental offices may use the platform depicted in fig. 17 to work with each other. For example, the dental office 1702 may wish to collaborate on a procedure with the dental offices 1704 and 1706. Using the platform, the dental office 1702 may make some or all of the relevant patient data available to the other dental offices.
In some embodiments, the collaboration platform may include various features for protecting data and patient privacy. For example, in some embodiments, the information may be stored in an encrypted format in data store 1716. In some embodiments, the collaboration platform may have a user permission system that enables users of the platform to control access to information. For example, a user of the platform may give access to some users but not others, or a user may wish to allow another user to access information temporarily (e.g., for a collaboration with another dentist, or when one dentist or provider covers for another).
In some embodiments, a user of the platform may upload information for processing by the platform. For example, storing kinematic data, storing patient profiles, storing static physical data, identifying implant targets, analyzing kinematic data, generating implant parameters, generating surgical guides, generating surgical navigation plans, analyzing implant parameters, and the like may all run on a platform. In some embodiments, some portions of the dental planning procedure may run on the provider's own local system, while other portions may run remotely. For example, computationally intensive tasks such as running and/or training a machine learning model may be offloaded to a cloud server.
Computer system
In some embodiments, the systems, programs, and methods described herein may be implemented using one or more computing systems (e.g., the computing system shown in fig. 18). The example computer system 1802 communicates with one or more computing systems 1820, one or more portable devices 1815, and/or one or more data sources 1822 via one or more networks 1818. While FIG. 18 illustrates an embodiment of a computing system 1802, it should be appreciated that the functionality provided in the components and modules of the computer system 1802 may be combined into fewer components and modules or further separated into additional components and modules.
The computer system 1802 may include a jaw motion tracking and/or implant planning module 1814 that performs the functions, methods, acts, and/or procedures described herein. The jaw motion tracking and implant planning module 1814 is executed on the computer system 1802 by the central processing unit 1806, discussed further below.
In general, the term "module" as used herein refers to logic embodied in hardware or firmware, or to a collection of software instructions having entry and exit points. Modules may be written in a programming language such as Java, C, C++, or Python. Software modules may be compiled and linked into an executable program or installed in a dynamic link library, or may be written in an interpreted language such as BASIC, Perl, Lua, or Python. Software modules may be invoked from other modules or from themselves, and/or may be invoked in response to a detected event or interrupt. Modules implemented in hardware include connected logic units such as gates and flip-flops, and/or may include programmable units such as programmable gate arrays or processors.
In general, the modules described herein refer to logic modules that may be combined with other modules or divided into sub-modules, regardless of the physical organization or storage of the logic modules. Modules are executed by one or more computing systems and may be stored on or within any suitable computer-readable medium, or implemented in whole or in part within specially designed hardware or firmware. While the above-described methods, calculations, processes, or analyses may be facilitated through the use of a computer, not all calculations, analyses, and/or optimizations require the use of a computer system. Moreover, in some embodiments, the program blocks described herein may be altered, rearranged, combined, and/or omitted.
The computer system 1802 includes one or more processing units (CPUs) 1806, which may include a microprocessor. The computer system 1802 further includes physical memory 1810, such as Random Access Memory (RAM) for temporarily storing information and Read Only Memory (ROM) for permanently storing information, and a mass storage device 1804, such as a backing store, hard drive, rotating magnetic disk, Solid State Drive (SSD), flash memory, Phase Change Memory (PCM), 3D XPoint memory, magnetic disk, or optical media storage. Alternatively, the mass storage device may be implemented in an array of servers. Typically, the components of computer system 1802 are connected using a standards-based bus system. The bus system may be implemented using a variety of protocols, such as Peripheral Component Interconnect (PCI), Micro Channel, SCSI, Industry Standard Architecture (ISA), and Extended ISA (EISA) architectures.
The computer system 1802 includes one or more input/output (I/O) devices and interfaces 1812, such as a keyboard, mouse, touch pad, and printer. The I/O devices and interfaces 1812 may include one or more display devices, such as monitors, that allow visual presentation of data to the participant. More specifically, for example, a display device provides a presentation of a GUI as application software data, and a multimedia presentation. The I/O devices and interfaces 1812 may also provide a communications interface with various external devices. For example, the computer system 1802 may include one or more multimedia devices 1808, such as speakers, graphics cards, graphics accelerators, and microphones.
The computer system 1802 can run on a variety of computing devices, such as a server, a Windows server, a Structured Query Language (SQL) server, a Unix server, a personal computer, a laptop computer, and so on. In other embodiments, computer system 1802 may run on a clustered computer system, a mainframe computer system, and/or another computing system adapted to control and/or communicate with large databases, perform high-volume transaction processing, and generate reports from large databases. The computing system 1802 is typically controlled and coordinated by operating system software, such as z/OS, Windows, Linux, UNIX, BSD, SunOS, Solaris, macOS, or other compatible operating systems, including proprietary operating systems. The operating system controls and schedules computer processes for execution, performs memory management, provides file systems, networking, and I/O services, and provides user interfaces, such as graphical user interfaces (GUIs), among other things.
The computer system 1802 shown in fig. 18 is coupled to a network 1818, such as a LAN, WAN, or the internet, via a communications link 1816 (wired, wireless, or a combination thereof). The network 1818 communicates with various computing devices and/or other electronic devices. The network 1818 communicates with one or more computing systems 1820, one or more portable devices 1815, and/or one or more data sources 1822. The jaw motion tracking and implant planning module 1814 may access or be accessible by the computing system 1820 and/or the data source 1822 through a user access point having internet access functionality. The connections may be direct physical connections, virtual connections, and other connection types. The user access point with internet functionality may include a browser module that presents data using text, graphics, audio, video, and other media and allows interaction with the data via the network 1818.
Access to the jaw motion tracking and implant planning module 1814 of the computer system 1802 by the computing system 1820, the portable device 1815, and/or the data source 1822 may be through an internet-enabled user access point, such as a personal computer, cellular telephone, smart phone, laptop computer, tablet computer, e-reader device, audio player, or other device capable of connecting to the network 1818. Such a device may have a browser module implemented as a module that presents data using text, graphics, audio, video, and other media and allows interaction with the data via the network 1818.
The output module may be implemented as an all-points-addressable display, such as a cathode ray tube (CRT), liquid crystal display (LCD), plasma display, or other type and/or combination of displays. The output module may be implemented to communicate with the input devices 1812 and may also include software with an appropriate interface that allows a user to access data through programmatic screen elements such as menus, windows, dialog boxes, toolbars, and controls (e.g., radio buttons, check boxes, sliders, etc.). Further, the output module may communicate with a set of input and output devices to receive signals from the user.
The input device may include a keyboard, a roller ball, a pen and stylus, a mouse, a trackball, a voice recognition system, or pre-specified switches or buttons. The output device may include a speaker, a display screen, a printer, or a voice synthesizer. Additionally, the touch screen may act as a hybrid input/output device. In another embodiment, the user may interact with the system more directly, such as through a system terminal connected to the score generator, rather than communicating via the Internet, WAN, or LAN or similar network.
In some embodiments, system 1802 may include a physical or logical connection that has been established between a remote microprocessor and a mainframe computer for the express purpose of uploading, downloading, or viewing interactive data and databases on-line in real-time. The remote microprocessor may be operated by an entity that operates a computer system 1802, including a client server system or a host server system, and/or may be operated by one or more of a data source 1822, a portable device 1815, and/or a computing system 1820. In some embodiments, terminal emulation software can be used on the microprocessor to participate in micro-mainframe links.
In some embodiments, a computing system 1820 internal to the entity operating computer system 1802 may access jaw motion tracking and implant planning module 1814 internally when CPU 1806 runs an application or program.
The computing system 1802 may include one or more internal and/or external data sources (e.g., data sources 1822). In some embodiments, one or more of the data stores and data sources described above may be implemented using a relational database, such as DB2, Sybase, Oracle, CodeBase, or SQL Server, or another type of database, such as a flat-file database, an entity-relationship database, an object-oriented database, and/or a record-based database.
The computer system 1802 may also access one or more databases 1822. Database 1822 may be stored in a database or data repository. The computer system 1802 may access one or more databases 1822 through the network 1818, or may directly access a database or data repository through the I/O devices and interfaces 1812. A data store that stores one or more databases 1822 may reside within the computer system 1802.
In some embodiments, one or more features of the systems, methods, and devices described herein may utilize URLs and/or cookies to store and/or transmit data or user information, for example. A Uniform Resource Locator (URL) may contain a web address and/or a reference to a web page resource stored on a database and/or server. The URL may specify a location of a computer and/or resource on a computer network. The URL may include a mechanism to retrieve the network resource. The source of the web resource may receive the URL, identify the location of the web resource, and transmit the web resource back to the requestor. The URL may be converted to an IP address and a Domain Name System (DNS) may look up the URL and its corresponding IP address. The URL may refer to a web page, file transfer, email, database access, and other applications. The URL may include a sequence of characters that identify a path, domain name, file extension, hostname, query, fragment, scheme, protocol identifier, port number, username, password, flag, object, resource name, etc. The systems disclosed herein may generate, receive, transmit, apply, parse, serialize, visualize, and/or perform actions on URLs.
Cookies, also known as HTTP cookies, web cookies, internet cookies, and browser cookies, may contain data sent from a website and/or stored on a user's computer. This data may be stored by the user's web browser while the user is browsing. Cookies may contain information useful for a website to remember prior browsing information, such as a shopping cart on an online store, clicks on buttons, login information, and/or records of web pages or web resources visited in the past. Cookies may also contain information entered by the user, such as names, addresses, passwords, credit card information, etc. Cookies may also perform computer functions. For example, an application (e.g., a web browser) may use an authentication cookie to identify whether a user is logged in (e.g., to a website). Cookie data may be encrypted to provide security for the consumer. Tracking cookies may be used to compile historical browsing histories of individuals. The systems disclosed herein may generate and use cookies to access data of individuals. The systems may also generate and use JSON web tokens to store authentication information, HTTP authentication as an authentication protocol, IP address tracking of sessions or identification information, URLs, and the like.
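As a non-limiting sketch of the cookie handling described above, the Python standard library can both emit a Set-Cookie value and parse a Cookie header; the cookie names and values are illustrative only:

```python
from http.cookies import SimpleCookie

# Build a cookie of the kind a server would send, e.g. an authentication
# cookie; "session_id" and its value are hypothetical.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["httponly"] = True
header = cookie["session_id"].OutputString()  # e.g. "session_id=abc123; HttpOnly"

# Parse the Cookie header a browser would send back on later requests.
received = SimpleCookie()
received.load("session_id=abc123; theme=dark")
print(received["session_id"].value)  # abc123
print(received["theme"].value)       # dark
```

Marking the cookie HttpOnly keeps it out of reach of page scripts, one of the security measures alluded to above; encrypting or signing the value (as a JSON web token would) is a separate, complementary step.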
OTHER EMBODIMENTS
While the invention has been disclosed in the context of certain embodiments and examples, it will be understood by those skilled in the art that the invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and obvious modifications and equivalents thereof. In addition, while several variations of the embodiments of the invention have been shown and described in detail, other modifications within the scope of the invention will be apparent to those skilled in the art based upon the present disclosure. Furthermore, it is contemplated that various combinations or sub-combinations of the specific features and aspects of the embodiments can be made and still fall within the scope of the invention. It should be understood that various features and aspects of the disclosed embodiments can be combined with or substituted for one another in order to form varying modes of the disclosed embodiments of the invention. Any methods disclosed herein need not be performed in the order recited. Therefore, it is intended that the scope of the invention herein disclosed should not be limited by the particular embodiments described above.
Conditional language, such as "can," "could," "might," or "may," unless specifically stated otherwise or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment. The headings used herein are for the convenience of the reader only and are not meant to limit the scope of the claims or the invention.
Further, while the methods and apparatus described herein are susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the invention is not to be limited to the particular forms or methods disclosed, but to the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the various embodiments described and the appended claims. Furthermore, the disclosure herein of any particular feature, aspect, method, property, characteristic, quality, attribute, element, etc. in connection with an embodiment or example can be used with all other embodiments or examples set forth herein. Any methods disclosed herein need not be performed in the order recited. The methods disclosed herein involve certain actions taken by a participant; however, the methods may also include any third-party instruction of those actions, either expressly or implicitly. The scope of the disclosure herein also encompasses any and all overlaps, sub-ranges, and combinations thereof. Language such as "up to," "at least," "greater than," "less than," "between," and the like includes the number recited. Terms such as "about" or "approximately" that precede a number include the recited number and should be interpreted on a case-by-case basis (e.g., as accurate as reasonably possible under the circumstances, such as ±5%, ±10%, ±15%, etc.). For example, "about 3.5 mm" includes "3.5 mm." Phrases preceded by a term such as "substantially" include the recited phrase and should be interpreted on a case-by-case basis (e.g., as much as reasonably possible under the circumstances). For example, "substantially constant" includes "constant." Unless otherwise stated, all measurements are under standard conditions, including temperature and pressure.
As used herein, a phrase referring to "at least one of" a list of items refers to any combination of those items, including single members. For example, unless specifically stated otherwise, "at least one of A, B, or C" is intended to cover: A; B; C; A and B; A and C; B and C; and A, B, and C. Conjunctive language such as the phrase "at least one of X, Y, and Z," unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be at least one of X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.

Claims (40)

1. A computer-implemented method for oral surgery planning, the computer-implemented method comprising:
receiving, by a computing system, a patient profile, wherein the patient profile includes:
patient anatomical data, wherein the patient anatomical data comprises one or more models of a maxilla or mandible of the patient; and
kinematic data associated with movements of the jaw of the patient;
identifying, by the computing system, one or more candidate sites for a dental implant based at least in part on the received patient profile; and
generating, by the computing system, one or more dental implant parameters based at least in part on the identified one or more candidate sites and the kinematic data.
2. The computer-implemented method of claim 1, wherein the patient profile includes any combination of one or more of bone volume, bone density, relative bone density, nerve location, or sinus location.
3. The computer-implemented method of claim 1, further comprising:
determining, by the computing system, a proposed crown geometry;
determining, by the computing system, an indication of a functional cone based at least in part on the kinematic data;
determining, by the computing system, one or more crown contact points based at least in part on the patient profile and the proposed crown geometry;
generating, by the computing system, a constraint map based at least in part on the one or more crown contact points;
selecting, by the computing system, an implant model based at least in part on the constraint map; and
generating, by the computing system, a modified implant model based at least in part on the constraint map and the implant model.
4. The computer-implemented method of claim 1, further comprising:
determining, by the computing system, a proposed crown geometry;
automatically determining, by the computing system, an indication of a functional cone based at least in part on the kinematic data;
automatically determining, by the computing system, one or more crown contact points based at least in part on the patient profile; and
automatically selecting, by the computing system, an implant model based at least in part on the crown contact points.
5. The computer-implemented method of claim 3, wherein generating the modified implant model comprises minimizing one or more stresses on the dental implant.
6. The computer-implemented method of claim 1, wherein identifying one or more candidate sites for a dental implant comprises comparing the one or more models of the maxilla or mandible of the patient to one or more reference models.
7. The computer-implemented method of claim 1, wherein identifying one or more candidate sites for a dental implant comprises automatically analyzing a bone of the patient to determine any combination of one or more of: dental arch, inter-dental space, bone volume and relative bone density.
8. The computer-implemented method of claim 1, wherein the one or more dental implant parameters comprise any combination of one or more of: the location of the dental implant relative to the bone surface, implant type, implant material, burial depth, implant angle relative to the bone surface, implant size, crown size, and crown geometry.
9. The computer-implemented method of claim 8, wherein at least one of the crown size and the crown geometry is based at least in part on a prosthetic projection of a patient, a prosthetic tooth, or an existing tooth.
10. The computer-implemented method of claim 1, further comprising:
determining, by the computing system, that one or more candidate sites have insufficient bone volume or bone density to perform an implantation procedure based on the patient profile.
11. The computer-implemented method of claim 4, wherein determining one or more dental implant contact points comprises determining contact at one or more phases of jaw movement based at least in part on the indication of the functional cone and the patient anatomical data, wherein the jaw movement comprises recorded movement, simulated movement, or both.
12. The computer-implemented method of claim 1, wherein selecting an implant model comprises selecting a preconfigured model from a model database using an artificial intelligence engine.
13. The computer-implemented method of claim 1, wherein generating implant parameters comprises:
providing patient data to an artificial intelligence model configured to generate implant parameters.
14. The computer-implemented method of claim 13, further comprising:
receiving, by the computing system, an indication of a surgical outcome; and
retraining, by the computing system, the artificial intelligence model using the received indication of the surgical outcome.
15. The computer-implemented method of claim 1, further comprising:
the user is provided with an interface for modifying one or more implant parameters.
16. The computer-implemented method of claim 1, further comprising:
creating a surgical guide, wherein the surgical guide includes a 3D model of a guide that can be used during a surgical procedure.
17. The computer-implemented method of claim 16, further comprising providing the surgical guide to a 3D printer.
18. The computer-implemented method of claim 1, further comprising generating a surgical navigational plan.
19. The computer-implemented method of claim 1, further comprising providing a visualization and interactive interface.
20. An oral surgical planning system, comprising:
a computing system, comprising:
a computer readable storage medium having program instructions embodied therewith; and
one or more processors configured to execute the program instructions to cause the computing system to:
receive a patient profile, wherein the patient profile comprises:
patient anatomical data, wherein the patient anatomical data comprises one or more models of a maxilla or mandible of the patient; and
kinematic data associated with movements of the jaw of the patient;
identify one or more candidate sites for a dental implant based at least in part on the received patient profile; and
generate one or more dental implant parameters based at least in part on the identified one or more candidate sites and the kinematic data.
21. The oral surgical planning system of claim 20 wherein the patient anatomical data comprises one or more models of a maxilla or mandible of the patient.
22. The oral surgical planning system of claim 20 wherein the patient profile comprises any combination of one or more of bone volume, bone density, relative bone density, nerve location, or sinus location.
23. The oral surgical planning system of claim 20, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
determine a proposed crown geometry;
determine an indication of a functional cone based at least in part on the kinematic data;
determine one or more crown contact points based at least in part on the patient profile and the proposed crown geometry;
generate a constraint map based at least in part on the one or more crown contact points;
select an implant model based at least in part on the constraint map; and
generate a modified implant model based at least in part on the constraint map and the implant model.
24. The oral surgical planning system of claim 20, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
determine a proposed crown geometry;
automatically determine an indication of a functional cone based at least in part on the kinematic data;
automatically determine one or more crown contact points based at least in part on the patient profile; and
automatically select an implant model based at least in part on the crown contact points.
25. The oral surgical planning system of claim 23 wherein generating the modified implant model comprises minimizing one or more stresses on the dental implant.
26. The oral surgical planning system of claim 20 wherein identifying one or more candidate sites for a dental implant comprises comparing the one or more models of the maxilla or mandible of the patient to one or more reference models.
27. The oral surgical planning system of claim 20 wherein identifying one or more candidate sites for a dental implant comprises automatically analyzing the patient's bone to determine any combination of one or more of: dental arch, inter-dental space, bone volume and relative bone density.
28. The oral surgical planning system of claim 20 wherein the one or more dental implant parameters comprise any combination of one or more of: the location of the dental implant relative to the bone surface, implant type, implant material, burial depth, implant angle relative to the bone surface, implant size, crown size, and crown geometry.
29. The oral surgical planning system of claim 28 wherein at least one of the crown size and the crown geometry is based at least in part on a prosthetic projection of a patient, a prosthetic tooth, or an existing tooth.
30. The oral surgical planning system of claim 20, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
determine, based on the patient profile, that one or more candidate sites have insufficient bone volume or bone density to perform an implantation procedure.
31. The oral surgical planning system of claim 23 wherein determining one or more dental implant contact points comprises determining contact at one or more phases of jaw movement based at least in part on the indication of the functional cone and the patient anatomical data, wherein the jaw movement comprises recorded movement, simulated movement, or both.
32. The oral surgical planning system of claim 20 wherein selecting an implant model comprises selecting a preconfigured model from a model database using an artificial intelligence engine.
33. The oral surgical planning system of claim 20, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
provide patient data to an artificial intelligence model configured to generate implant parameters.
34. The oral surgical planning system of claim 33, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
receive an indication of a surgical outcome; and
retrain the artificial intelligence model using the received indication of the surgical outcome.
35. The oral surgical planning system of claim 20, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
the user is provided with an interface for modifying one or more implant parameters.
36. The oral surgical planning system of claim 20, wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to:
create a surgical guide, wherein the surgical guide includes a 3D model of a guide that can be used during a surgical procedure.
37. The oral surgical planning system of claim 36 wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to provide the surgical guide to a 3D printer.
38. The oral surgical planning system of claim 20 wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to generate a surgical navigational plan.
39. The oral surgical planning system of claim 20 wherein the program instructions further comprise instructions that, when executed by the one or more processors, cause the computing system to provide a visualization and interactive interface.
40. The oral surgical planning system of claim 20, further comprising:
jaw movement tracking headgear; and
jaw motion tracking detector.
CN202280042860.4A 2021-06-22 2022-06-21 Systems, methods, and devices for dental implant surgery enhanced using kinematic data Pending CN117500451A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202163213607P 2021-06-22 2021-06-22
US63/213,607 2021-06-22
PCT/IB2022/000368 WO2022269359A1 (en) 2021-06-22 2022-06-21 Systems, methods, and devices for augmented dental implant surgery using kinematic data

Publications (1)

Publication Number Publication Date
CN117500451A true CN117500451A (en) 2024-02-02

Family

ID=82846283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280042860.4A Pending CN117500451A (en) 2021-06-22 2022-06-21 Systems, methods, and devices for dental implant surgery enhanced using kinematic data

Country Status (3)

Country Link
EP (1) EP4358889A1 (en)
CN (1) CN117500451A (en)
WO (1) WO2022269359A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120214121A1 (en) * 2011-01-26 2012-08-23 Greenberg Surgical Technologies, Llc Orthodontic Treatment Integrating Optical Scanning and CT Scan Data
US20140329194A1 (en) * 2013-05-05 2014-11-06 Rohit Sachdeva Orthodontic treatment planning using biological constraints
WO2018045135A1 (en) * 2016-08-31 2018-03-08 Lucas Kelly System and method for producing dental solutions incorporating a guidance package

Also Published As

Publication number Publication date
WO2022269359A1 (en) 2022-12-29
EP4358889A1 (en) 2024-05-01

Similar Documents

Publication Publication Date Title
US11676701B2 (en) Systems and methods for automated medical image analysis
US20220218449A1 (en) Dental cad automation using deep learning
US10984529B2 (en) Systems and methods for automated medical image annotation
US10143536B2 (en) Computational device for an orthodontic appliance for generating an aesthetic smile
US10534869B2 (en) Method for designing and manufacturing a bone implant
US20210343400A1 (en) Systems and Methods for Integrity Analysis of Clinical Data
US11963846B2 (en) Systems and methods for integrity analysis of clinical data
US20220296344A1 (en) Method, system and devices for instant automated design of a customized dental object
WO2021046241A1 (en) Automated medical image annotation and analysis
WO2018112427A1 (en) Augmented reality planning and viewing of dental treatment outcomes
US11357604B2 (en) Artificial intelligence platform for determining dental readiness
US20220084267A1 (en) Systems and Methods for Generating Quick-Glance Interactive Diagnostic Reports
WO2022011342A9 (en) Systems and methods for integrity analysis of clinical data
US20240029901A1 (en) Systems and Methods to generate a personalized medical summary (PMS) from a practitioner-patient conversation.
KR102464472B1 (en) Orthodontic recommendation system using artificial intelligence and method thereof
KR102041888B1 (en) Dental care system
CN117500451A (en) Systems, methods, and devices for dental implant surgery enhanced using kinematic data
US20230252748A1 (en) System and Method for a Patch-Loaded Multi-Planar Reconstruction (MPR)
US20240161317A1 (en) Enhancing dental video to ct model registration and augmented reality aided dental treatment
WO2023041986A1 (en) Systems, devices, and methods for tooth positioning
CN118235209A (en) Systems, devices, and methods for tooth positioning
WO2023203385A1 (en) Systems, methods, and devices for facial and oral static and dynamic analysis
CN117999615A (en) Automatic tooth management in dental restoration workflow

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination