CN118235209A - Systems, devices, and methods for tooth positioning - Google Patents

Systems, devices, and methods for tooth positioning

Info

Publication number
CN118235209A
Authority
CN
China
Prior art keywords
teeth
library
arc
patient
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280075745.7A
Other languages
Chinese (zh)
Inventor
马克西姆·贾伊松
安托伊内·朱莱斯·罗德里久伊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Chew Co
Original Assignee
Magic Chew Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Chew Co
Publication of CN118235209A


Abstract

The present disclosure relates to systems and methods for determining a dental treatment plan. Some embodiments relate to determining a double helix structure. Some embodiments relate to optimizing the positioning of natural teeth, prosthetic teeth, or both to achieve desirable functional and aesthetic characteristics. In some embodiments, a machine learning model may be used to determine a treatment plan for a dental patient.

Description

Systems, devices, and methods for tooth positioning
Incorporation by reference to any priority applications
The present application claims the benefit of U.S. Provisional Application No. 63/245072, filed on September 16, 2021, and U.S. Provisional Application No. 63/364102, filed on May 3, 2022, and these applications are hereby incorporated by reference in their entirety for all purposes.
Technical Field
The present application relates to systems, devices and methods for determining, generating and/or assisting in the positioning of teeth of a patient.
Background
The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.
Proper placement of the patient's own teeth, artificial teeth, or both may be important for aesthetic and functional reasons. Current methods often fail to take into account important information in determining the placement of teeth, which can lead to poor aesthetics and/or functional results.
Disclosure of Invention
The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be briefly described.
In some aspects, the technology described herein relates to a computer-implemented method for dental treatment planning, comprising: receiving, by a computing system, patient data associated with a patient; determining, by the computing system, at least one arc corresponding to an anatomical point of a tooth library; determining, by the computing system, a double helix based on the at least one arc, the double helix to be used to fit a library of teeth; determining, by the computing system, a position of a tooth of the tooth library on the double helix; and optimizing teeth of the library of teeth by the computing system.
In some aspects, the techniques described herein relate to a method wherein the patient data comprises dental data.
In some aspects, the techniques described herein relate to a method, wherein the patient data comprises morphometric data.
In some aspects, the techniques described herein relate to a method, wherein determining at least one arc comprises providing the patient data to an AI model trained to identify anatomical points of teeth of the tooth library.
In some aspects, the techniques described herein relate to a method wherein determining a double helix includes providing the at least one arc to an AI model configured to determine the double helix based at least in part on the at least one arc.
In some aspects, the techniques described herein relate to a method wherein optimizing teeth of the library of teeth includes determining a position, rotation, or both, of each tooth of the library of teeth using an AI model configured to optimize functional and aesthetic positioning of the teeth.
In some aspects, the techniques described herein relate to a method further comprising performing, by the computing system, a dynamic assessment of a position of a tooth of the tooth library.
In some aspects, the techniques described herein relate to a method, wherein determining at least one arc includes determining an aesthetic arc, and wherein determining the aesthetic arc includes: projecting, by the computing system, one or more control points onto an image of the patient; defining, by the computing system, an initial curve based at least in part on the one or more control points; determining a final curve by modifying at least one control point by the computing system; and determining, by the computing system, a location of one or more anatomical points based at least in part on the final curve, the location of one or more anatomical points at least in part defining the aesthetic arc.
In some aspects, the techniques described herein relate to a method, wherein determining the at least one arc includes determining an aesthetic arc, a centering arc, and a fitting arc.
In some aspects, the techniques described herein relate to a method, wherein determining the at least one arc further includes determining a guide arc associated with the mandibular teeth.
In some aspects, the techniques described herein relate to a method wherein optimizing teeth of the library of teeth includes adjusting relative positioning of one or more teeth in the library of teeth.
In some aspects, the techniques described herein relate to a method, wherein adjusting the relative positioning includes adjusting an overbite value and an overjet value.
In some aspects, the techniques described herein relate to a method wherein optimizing teeth of the library of teeth includes adjusting any combination of one or more of the size, shape, or rotation of at least one tooth of the library of teeth.
In some aspects, the techniques described herein relate to a method, wherein the library of teeth comprises a library of teeth of the patient, and wherein the method further comprises: identifying, by the computing system, one or more teeth of the library of teeth; and labeling, by the computing system, one or more anatomical points of each of the one or more teeth of the library of teeth.
In some aspects, the techniques described herein relate to a method, wherein the library of teeth comprises a library of artificial teeth, and wherein the method further comprises: selecting, by the computing system, a dental library from a plurality of prosthetic dental libraries based at least in part on the captured patient data.
In some aspects, the techniques described herein relate to a method wherein optimizing teeth of the dental library includes determining points of contact between the patient's maxillary teeth and the patient's mandibular teeth, wherein performing dynamic assessment includes determining a relationship of contact between the patient's maxillary teeth and the patient's mandibular teeth during movement of the patient's jaw.
In some aspects, the techniques described herein relate to a system for dental treatment planning, comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the system to: receiving patient data associated with a patient; determining at least one arc, the arc corresponding to an anatomical point of a tooth of the library of teeth; determining a double helix based on the at least one arc, the double helix to be used to fit a dental library; determining the position of teeth of the dental library on the double helix; and optimizing teeth of the dental library.
In some aspects, the techniques described herein relate to a system wherein the patient data includes dental data.
In some aspects, the techniques described herein relate to a system wherein the patient data comprises morphometric data.
In some aspects, the techniques described herein relate to a system, wherein determining at least one arc includes providing the patient data to an AI model trained to identify anatomical points of teeth of the tooth library.
In some aspects, the techniques described herein relate to a system in which determining a double helix includes providing the at least one arc to an AI model configured to determine the double helix based at least in part on the at least one arc.
In some aspects, the techniques described herein relate to a system wherein optimizing teeth of the library of teeth includes determining a position, rotation, or both, of each tooth of the library of teeth using an AI model configured to optimize functional and aesthetic positioning of the teeth.
In some aspects, the techniques described herein relate to a system, wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: perform a dynamic assessment of the position of the teeth of the library of teeth.
In some aspects, the techniques described herein relate to a system wherein determining at least one arc includes determining an aesthetic arc, and wherein determining the aesthetic arc includes: projecting one or more control points onto an image of the patient; defining an initial curve based at least in part on the one or more control points; defining a final curve by modifying at least one of the one or more control points; and determining a location of one or more anatomical points based at least in part on the final curve, the location of the one or more anatomical points at least in part defining the aesthetic arc.
In some aspects, the techniques described herein relate to a system, wherein determining the at least one arc includes determining an aesthetic arc, a centering arc, and a fitting arc.
In some aspects, the techniques described herein relate to a system wherein determining the at least one arc further includes determining a guide arc associated with the mandibular teeth.
In some aspects, the techniques described herein relate to a system wherein optimizing teeth of the library of teeth includes adjusting relative positioning of one or more teeth in the library of teeth.
In some aspects, the techniques described herein relate to a system, wherein adjusting the relative positioning includes adjusting an overbite value and an overjet value.
In some aspects, the techniques described herein relate to a system wherein optimizing teeth of the teeth library includes adjusting any combination of one or more of the size, shape, or rotation of at least one tooth of the teeth library.
In some aspects, the techniques described herein relate to a system wherein the library of teeth includes a library of teeth of the patient, and wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: identifying one or more teeth of the library of teeth; and labeling one or more anatomical points of each of the one or more teeth of the library of teeth.
In some aspects, the techniques described herein relate to a system, wherein the library of teeth comprises a library of artificial teeth, and wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: select a dental library from a plurality of prosthetic dental libraries based at least in part on the patient data.
In some aspects, the techniques described herein relate to a system wherein optimizing teeth of the teeth library includes determining points of contact between the patient's maxillary teeth and the patient's mandibular teeth, wherein performing dynamic assessment includes determining a relationship of contact between the patient's maxillary teeth and the patient's mandibular teeth during movement of the patient's jaw.
For purposes of this summary, certain aspects, advantages and novel features of the invention are described herein. It should be understood that not all such advantages may be realized in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or a set of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
Drawings
These and other features, aspects, and advantages of the present disclosure are described with reference to the drawings of certain embodiments, which are intended to illustrate, but not limit the present disclosure. It should be understood that the drawings, which are incorporated in and constitute a part of this specification, are for the purpose of illustrating the concepts disclosed herein and may not be drawn to scale.
FIG. 1 illustrates an example process for generating a dental localization plan for a patient, according to some embodiments.
Fig. 2A-2I illustrate example implementations of orthodontic procedures according to some embodiments.
Fig. 3A-3K illustrate example implementations of a prosthetic process according to some embodiments.
Fig. 4 illustrates an example tooth profile that can be used to segment teeth in accordance with some embodiments.
FIG. 5 illustrates an example process for training a machine learning model that may be used to practice some embodiments described herein.
Fig. 6 illustrates labeling of maxillary incisors 11, 12, 21, and 22 according to some embodiments.
Fig. 7 illustrates labeling of maxillary canines 13 and 23, in accordance with some embodiments.
Fig. 8 illustrates labeling of maxillary premolars 14, 15, 24 and 25 according to some embodiments.
Fig. 9 illustrates labeling of maxillary molars 16, 17, 26, and 27 according to some embodiments.
Fig. 10 illustrates labeling of mandibular incisors 31, 32, 41, and 42, according to some embodiments.
Fig. 11 illustrates labeling of mandibular premolars 34 and 44, according to some embodiments.
Fig. 12 illustrates labeling of mandibular premolars 35 and 45 according to some embodiments.
Fig. 13 illustrates labeling of mandibular molars 36 and 46, according to some embodiments.
Fig. 14 illustrates labeling of mandibular molars 37 and 47, according to some embodiments.
Fig. 15 illustrates an example process for determining tooth positioning according to some embodiments.
Fig. 16 illustrates an embodiment of an aesthetic arc that can be manipulated using control points.
Fig. 17 is a block diagram illustrating an example process for defining a 3D curve, according to some embodiments.
Fig. 18 depicts an example of a 3D capture mapping anatomical points from a 2D projection to teeth of a patient, according to some embodiments.
Fig. 19 illustrates an example of anatomical points to which an initial curve has been fitted according to some embodiments.
FIG. 20 illustrates an example process for locating a dental library, according to some embodiments.
Fig. 21 illustrates an example process for optimizing static positioning of a patient's teeth according to some embodiments.
Fig. 22 illustrates an example of points for defining an aesthetic arc of a maxillary tooth, according to some embodiments.
Fig. 23 illustrates an example of points defining a centering arc for maxillary teeth, according to some embodiments.
Fig. 24 illustrates an example of points defining a fitted arc for maxillary teeth, according to some embodiments.
Fig. 25 illustrates an example of defining aesthetic arcs, centering arcs, and points of fitting arcs for maxillary teeth, according to some embodiments.
Fig. 26 illustrates an example of points defining a fitted arc for mandibular teeth, according to some embodiments.
Fig. 27 illustrates an example of points defining a centering arc for mandibular teeth, according to some embodiments.
Fig. 28 illustrates an example of points defining a guide arc for mandibular teeth, according to some embodiments.
Fig. 29 to 33B show example angles and spatial relationships between teeth and a reference plane.
Fig. 34 shows examples of overbite and overjet.
Fig. 35A-C illustrate example relationships between overbite, overjet, and various arches.
Fig. 36 illustrates an example process for developing a treatment plan for a patient.
FIG. 37 illustrates an example computer system that can be used to practice one or more embodiments disclosed herein.
Detailed Description
While several embodiments, examples and illustrations are disclosed below, it will be understood by those of ordinary skill in the art that the invention described herein extends beyond the specifically disclosed embodiments, examples and illustrations to other uses of the invention and obvious modifications and equivalents thereof. Embodiments of the present invention are described with reference to the drawings, wherein like reference numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because the terminology is being used in conjunction with a detailed description of some specific embodiments of the invention. Furthermore, embodiments of the invention may include several novel features, and no single feature is solely responsible for its desirable attributes or necessary to practicing the invention described herein.
As used herein, dental notation (e.g., tooth numbering) conforms to the FDI World Dental Federation notation system (ISO 3950), unless otherwise indicated.
Overview
As briefly discussed above, proper placement of the patient's own teeth, artificial teeth, or both, may be important for both aesthetic and functional reasons. Various embodiments described herein relate to systems, methods, and devices for determining, generating, and/or assisting in the positioning of teeth of a patient. In some embodiments, the systems, methods, and devices herein may be used to determine, generate, and/or assist in tooth shaping and/or sizing.
Often, when a practitioner plans or performs an orthodontic or prosthetic operation, the practitioner may lack information that would aid in locating teeth, selecting an appropriate artificial tooth, and so forth. For example, a practitioner may lack information regarding movement of the patient's jaw and/or other morphometric parameters, such as the location of a reference plane (e.g., an orbital plane), which may make it difficult to consider functions such as chewing when determining placement of teeth or prostheses. For example, a practitioner may rely on limited static views (e.g., x-rays, cone beam computed tomography (CBCT) scans, etc.), which may result in the practitioner failing to take into account the overall oral and/or facial structure of the patient, which may result in time-consuming procedures (and possibly additional procedures) and/or poor results. For example, with currently limited information, a practitioner may ignore the overall architecture of the patient and instead focus on, for example, the positioning and/or placement of individual teeth. In some embodiments, the morphometric parameters may be unique to the patient. In some embodiments, the morphometric parameters may be partially or fully normalized, for example, by using a standard orbital plane.
In some cases, a practitioner may develop a treatment plan focused on aesthetics. While this approach may produce desirable aesthetic results, it may lead to functional problems. For example, if functional aspects of a patient's teeth are ignored, the patient may experience premature wear of the tooth surfaces (e.g., due to erosion or abrasion), increased vulnerability to cracking or chipping, difficulty eating or speaking, and the like. In some cases, the practitioner may choose from a limited set of idealized arch forms to develop a treatment plan for the patient, which may not take into account, for example, the contact surfaces between the patient's upper and lower teeth.
The present disclosure may lead to improved aesthetics, improved functionality, and/or a better patient experience. The present disclosure may enable a practitioner to better account for the overall architecture of a patient's teeth (and their placement relative to one another, for example), which may lead to improved results.
In some embodiments, the systems, methods, and devices described herein are configured to identify one or more parameters that may be used to evaluate, recreate, and/or alter the positioning of poorly positioned or missing structures. Although in some embodiments the missing or poorly positioned structure is a tooth, the present disclosure is not limited to teeth. For example, the present disclosure may be applied to other structures, such as root and/or bone structures. For example, the basal bone of the maxilla or mandible supporting the teeth may not accommodate the determined desired positions of the patient's teeth. Thus, in some embodiments, the processes herein may be used to determine a new location of a bone that may be used by a maxillofacial surgeon to surgically reposition the bone.
Teeth may be organized in a system and may have non-random locations and/or non-random shapes. Thus, it may be important to consider morphometric parameters that may be specific to a patient when adding, recreating, moving, and/or realigning teeth. Morphometric parameters may include, for example, lip position, arch position, bone position, and so forth. In some embodiments, parameters such as static bite and/or dynamic bite may be considered in determining the placement of a tooth or prosthesis. Determining these parameters can be a difficult and/or time-consuming process for the practitioner, particularly in the case of significant rehabilitation or complex diagnostics. For example, treating an edentulous patient may be particularly challenging because the patient has no teeth present. Aspects of the present disclosure may be used to make it easier to evaluate, recreate, and/or realign missing or poorly positioned structures.
In some embodiments, a library of teeth (e.g., a collection of predefined tooth shapes) may be created that facilitates automatic positioning of the teeth. For example, the library may facilitate automatic positioning of teeth using morphometric data of a patient. In some embodiments, the library may contain representations of the patient's teeth, representations of artificial teeth, or both. For example, the actual teeth of the patient may be used when orthodontic treatment is planned, while the artificial teeth (or both the artificial teeth and the patient's own teeth) may be used when prosthetic treatment is planned. In some embodiments, both the patient's own teeth and the artificial teeth may be used when planning orthodontic and/or prosthetic treatments, for example, to help ensure that the prosthetic teeth fit well with the patient's existing teeth. In some embodiments, the practitioner may modify the positioning of one or more teeth. In some embodiments, an artificial intelligence or machine learning (AI/ML) model may be used to improve the positioning of one or more of a patient's teeth, artificial teeth, or both. In some embodiments, positioning may be adjusted based at least in part on patient preferences, the country or region in which the patient resides, or where orthodontic or prosthetic treatment is performed, and so forth.
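For illustration only, one possible in-memory representation of such a tooth library is sketched below in Python; the class and field names are hypothetical and are not taken from this disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict
import numpy as np

@dataclass
class LibraryTooth:
    """One entry of a tooth library: a predefined tooth shape plus its anatomical landmarks."""
    iso_number: int                # FDI/ISO 3950 tooth number, e.g. 11 for the upper right central incisor
    mesh_vertices: np.ndarray      # (N, 3) vertex positions of the tooth surface mesh
    mesh_faces: np.ndarray         # (M, 3) triangle indices into mesh_vertices
    landmarks: Dict[str, np.ndarray] = field(default_factory=dict)  # e.g. {"incisal_edge": xyz}

@dataclass
class ToothLibrary:
    """A collection of predefined tooth shapes (patient teeth, artificial teeth, or both)."""
    name: str
    teeth: Dict[int, LibraryTooth] = field(default_factory=dict)

    def add(self, tooth: LibraryTooth) -> None:
        self.teeth[tooth.iso_number] = tooth
```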
In some embodiments, the positioning of one or more teeth may be automatic or partially automatic. In some embodiments, the position of one or more teeth may be modified to change aesthetics. In some embodiments, the position of one or more teeth may be modified to change dynamics, such as to improve functions such as eating or speaking. In some embodiments, the orientation of one or more posterior teeth may be modified, for example, with respect to the frontal plane, the sagittal plane, or both. In some embodiments, the tilt of one or more teeth may be modified. The characteristic points of each tooth may be connected to a corresponding arc to define a helix. In some cases, steeper inclinations may increase contact during movement. In some embodiments, automatic, partially automatic, or manual movement may be made with respect to, for example, canine guidance, progressive functions, group functions, substantially balanced occlusion, or any combination of these. In some embodiments, teeth may be fitted on a double helix. Thus, adjusting the shape of the double helix may alter the guiding function of the teeth. The double helix may be made up of multiple sections or facets. As one example, the fitted arc may be connected with an aesthetic arc for the premolars and molars, thereby creating a facet. The facet may have an inclination relative to the orbital plane. In some embodiments, this facet may be parallel to the orbital plane such that the maxillary and mandibular molars and premolars do not contact during lateral excursive movements (e.g., left and/or right lateral bite movements), but contact may be maintained on the canines.
In some embodiments, if jaw movement data is available, the contact relationship between teeth may be determined, and the contact point may be determined after positioning the teeth on the double helix. In some embodiments, contact may be undesirable, and the design of the double helix and tooth positioning thereon may be modified to alter the position and/or shape of the teeth, thereby modifying contact between the mandibular and maxillary teeth. In some embodiments, the AI/ML model can be trained to output a desired positioning of the teeth, the shape of the teeth, and so forth. Additional details regarding contact relationships and points of contact between teeth are provided later within this disclosure.
Fig. 1 illustrates an example process 100 for generating a dental localization plan for a patient, according to some embodiments. The steps shown in fig. 1 are merely examples. In some embodiments, the process may include more steps, fewer steps, and/or the steps may be performed in a different order than shown in fig. 1.
At block 102, a practitioner may collect data about a patient, such as facial and dental information. At block 104, the system may be used to prepare data for creating a treatment plan or designing a smile. At block 106, the system may determine one or more arcs and may position teeth on the one or more arcs. At block 108, a geometry, such as a double helix, may be generated using the system. At block 110, the system may perform a static optimization of the patient's teeth. At block 112, the system may perform a dynamic assessment of the patient's teeth, jaw, etc., which may result in further refinement of the positioning of the teeth. Each of these steps is discussed in more detail below.
Fig. 2A-2I illustrate example implementations of processes for orthodontic treatment planning according to some embodiments. In fig. 2A, a user may select a patient to be treated, and multiple arcs may be associated with the patient, such as an aesthetic arc (hollow circles), a centering arc (solid circles), and a fitting arc (cross shapes). For greater clarity, example points and arcs corresponding to the points and arcs shown in fig. 2A through 2I and 3A through 3K are shown in greater detail in fig. 22 through 28. In fig. 2B, the user may define an aesthetic curve or arc. In fig. 2C and 2D, the user can select and locate a library. As shown in fig. 2C and 2D, a user may select from an orthodontic library (e.g., the patient's own teeth) or from one or more prosthetic libraries. In fig. 2E through 2G, a user may locate the library on a double helix. In fig. 2H, the user may position the library after modifying the bite vertical distance to achieve the desired overbite and/or overjet values. In fig. 2I, the user can calculate the contact relationship between the upper and lower teeth, as indicated by the shaded areas of the teeth. In fig. 2K, the user may calculate the contact points between the upper and lower teeth during jaw movement and/or at a particular jaw position.
Fig. 3A-3K illustrate example implementations of processes for prosthetic treatment planning according to some embodiments. The process shown in fig. 3A-3J may be broadly similar to the orthodontic process depicted in fig. 2A-2I. In fig. 3A, the user may select a patient for whom to perform treatment planning. In fig. 3B and 3C, a user may define an aesthetic curve for the patient. In fig. 3D, the user may select a library. In fig. 3E through 3G, a user may locate the library on a double helix. In fig. 3H and 3J, the user can adjust the relative positioning of the teeth to achieve the desired overbite and/or overjet. In fig. 3J and 3K, the user can use the system to calculate and visualize the points of contact between the upper and lower teeth during jaw movement and/or at a particular jaw position, for example, as indicated by the shaded areas of the teeth in fig. 3K.
Data collection and preparation
As mentioned briefly above, better therapeutic results may be achieved if the practitioner considers a more complete data set about the patient. Preferably, the practitioner considers both aesthetic aspects and functional aspects when determining a treatment plan. Thus, it may be advantageous to collect a considerable amount of data about the patient's teeth, facial structures, jaw alignment, temporomandibular joint movement, jaw movement, bone structure, etc.
In some embodiments, facial and/or jaw movements of the patient may be considered when making a diagnosis and/or treatment plan. In some cases, a motion capture system may be used to map movements of the patient's face and jaw during actions such as speaking, smiling, and chewing. For example, markers may be applied to the patient's face and movement tracked using an infrared camera. In some embodiments, dedicated hardware and/or software may be used for recording and/or simulating jaw movements of a patient, such as described in U.S. Patent No. 10,265,149, issued April 23, 2019, the contents of which are hereby incorporated by reference in their entirety.
However, some providers may not have access to dedicated devices for facial motion capture. Thus, in some embodiments, the provider may capture facial movements without the need for specialized equipment, such as using consumer imaging hardware.
The capture of the patient's face and movements may be used in combination with 3D representations of the patient's teeth, bones, and/or other anatomical features as part of the process of determining optimal placement of the teeth. The 3D representation of the patient's teeth may be obtained from, for example, an intraoral scanner, a laboratory scan of a mold of the patient's teeth, a cone-beam computed tomography scan, and the like. These techniques are typically available to dental practitioners. Preferably, the teeth may be segmented as described more fully below. For example, each tooth and gum may be treated separately, the teeth may be divided into groups and treated separately, or individual teeth may be divided into more than one segment. In some cases, different segments (e.g., partial teeth, complete teeth, multiple teeth) may be used to develop a single patient treatment plan. In some cases, the information for each tooth and gum may be stored in a separate file, but need not be.
In some embodiments, certain points and parameters may be set manually, partially manually, or automatically, such as the condyle, the position of the dental arch relative to the lips, the size of the dental arch relative to the dimensions of the mouth, and so forth. Preferably, the alignment of the patient's face and teeth may be performed automatically.
In some embodiments, data describing a 3D dental architecture of a patient may be collected. For example, the dental architecture data may include information about various angles and/or reference planes (e.g., inclination angles of the condyles, orbital planes, etc.), mandibular movement, lip position, and the like. In some embodiments, the data may include information describing, for example, the position of the dental arch relative to the temporomandibular joint. In some embodiments, the practitioner may use this data to develop a treatment plan tailored to the individual patient. In some embodiments, the data may include a description of tooth position, static positioning of the jaw, and/or dynamic movement. In some embodiments, the data may describe aesthetic aspects, functional aspects, or both.
As briefly mentioned above, in some embodiments, individual teeth may be identified in segments and/or individually. For example, the system may be configured to segment teeth automatically, semi-automatically, or manually (e.g., depending on user input). As discussed in more detail below, various methods may be used to segment the teeth. In some embodiments, the segments may be individual teeth, but this need not be the case. For example, a segment may include a portion of a tooth, a plurality of teeth, a portion of a plurality of teeth, a combination of a complete tooth and a portion of a tooth, and the like. In some embodiments, when a segment includes an individual tooth, the system may automatically assign a name or identifier to the tooth, e.g., according to the ISO 3950 standard, the Universal Numbering System, the Palmer notation, and so on. For example, a machine learning algorithm may be trained to automatically identify teeth and assign appropriate designations (e.g., "canine 13" for the upper right canine of the patient according to the ISO 3950 standard).
In some embodiments, metadata describing, for example, particular points, areas, features, etc., of the tooth surface may be determined. In some embodiments, a virtual surface (e.g., a double helix) may be created, and the geometry of the virtual surface may be used to indicate and/or determine the positioning of teeth and/or facing areas between one or more teeth of the mandibular arch and one or more teeth of the maxillary arch. For example, the facing region may describe points of contact, angles, etc. between the mandibular and maxillary teeth. The facing area may be defined as the area where the mesh of one tooth contacts the mesh of another tooth. In some embodiments, there may be contact between the teeth of the mandibular and maxillary dental arches. The facing region may include a static bite (e.g., jaw closed and stationary), a dynamic bite (e.g., contact made as the jaw moves), or both.
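As an illustration of how a facing or contact region between opposing tooth meshes might be approximated numerically, the following Python sketch flags vertices of one mesh that lie within a small tolerance of the other; the function name and tolerance are illustrative assumptions, not part of this disclosure.

```python
import numpy as np
from scipy.spatial import cKDTree

def contact_vertices(mesh_a: np.ndarray, mesh_b: np.ndarray, tolerance_mm: float = 0.1) -> np.ndarray:
    """Return indices of vertices of mesh_a lying within tolerance_mm of mesh_b.

    mesh_a, mesh_b: (N, 3) arrays of vertex positions in millimeters.
    The returned index set approximates the static facing/contact region between two teeth.
    """
    tree = cKDTree(mesh_b)
    distances, _ = tree.query(mesh_a)          # nearest-neighbor distance for every vertex of mesh_a
    return np.nonzero(distances <= tolerance_mm)[0]

# Example (hypothetical variables): vertices of a mandibular molar touching the opposing maxillary molar
# contact_idx = contact_vertices(lower_molar_vertices, upper_molar_vertices, tolerance_mm=0.05)
```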
In some embodiments, the systems, methods, and devices described herein may provide automated and/or semi-automated solutions for determining boundaries between teeth, gum lines, and the like. In some embodiments, the three-dimensional dental arch may be represented by a point cloud, a mesh, or the like. In some embodiments, a representation of the dental arch may be segmented or subdivided into sub-units, which may be given a particular name or identifier. For example, a subunit may be a single tooth, more than one tooth, a portion of a tooth, and so forth. In some embodiments, the segmentation may be based on color differentiation, geometric transition changes, and so forth. For example, the gum line may be identified by a change in color, or the teeth may be distinguished from one another by looking for a sharp change in the slope of the profile or a change in the sign of the slope. For example, fig. 4 shows a profile view in which the magnitude of the slope increases sharply as the interface between two teeth is reached, and the sign of the slope changes rapidly at or near the interface between two teeth. As shown in fig. 4, tooth 400 may have a contour 402. The slope 404 relatively far from the interface between the teeth may be shallower than the slope 406 near the interface between the two teeth. Slope 408 may have a different sign than slope 406 and may correspond to another tooth. For example, as shown in fig. 4, slope 406 may be relatively large and negative, while slope 408 may be relatively large and positive. In some embodiments, the interface between teeth may be determined as an inflection point where the slope changes from negative to positive. In some embodiments, artificial intelligence (AI) and/or machine learning (ML) models may be used for segmentation. In some embodiments, the boundaries or edges of the teeth may be determined based on the local 3D curvature. For example, the boundary may be indicated by a high slope change in a relatively small region. In some embodiments, AI and/or ML models may improve efficiency, accuracy, or both.
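A minimal sketch of the slope-based boundary cue described above is shown below; it marks candidate tooth interfaces along a sampled profile where the slope flips from negative to positive and changes sharply. The threshold value is an illustrative assumption.

```python
import numpy as np

def interface_candidates(x: np.ndarray, height: np.ndarray, jump_threshold: float = 2.0) -> np.ndarray:
    """Return indices along a tooth-row profile where the slope flips from - to + and changes sharply.

    x: sampled positions along the arch; height: profile height at each position.
    """
    slope = np.gradient(height, x)                     # local slope of the profile
    sign_flip = (slope[:-1] < 0) & (slope[1:] > 0)     # slope changes from negative to positive
    sharp = np.abs(np.diff(slope)) > jump_threshold    # large change in slope over a small region
    return np.nonzero(sign_flip & sharp)[0] + 1
```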
FIG. 5 depicts a flowchart for training an artificial intelligence or machine learning model, according to some embodiments. The training process depicted in fig. 5 may be used to train models for a variety of applications. For example, the training process 500 can be used to train a model to identify arcs (e.g., aesthetic arcs, centering arcs, fitting arcs, etc., as described herein), segment teeth, position teeth, and so forth. In some embodiments, the model may be trained to identify a library of prostheses that may be used to treat a patient, such as selecting the most appropriate library or libraries from a set of standard libraries. In some embodiments, the model may be trained to generate a library of teeth. At block 501, the system may receive a dataset containing various information for training a model, such as facial captures, jaw motion captures, tooth positioning data, images of teeth and/or gums, and so forth. At block 502, one or more transforms may be performed on the data. For example, the data may need to be transformed to conform to an expected input format, such as to conform to an expected data format or to a particular tooth numbering system (e.g., the Universal Numbering System, FDI World Dental Federation notation, or Palmer notation). In some embodiments, the data may undergo conversion in preparation for training an AI or ML algorithm, which typically operates on data that has undergone some form of normalization or other modification. For example, categorical data may be encoded in a particular manner. Nominal data may be encoded using one-hot encoding, binary encoding, feature hashing, or other suitable encoding methods. Ordinal data may be encoded using ordinal encoding, polynomial encoding, Helmert encoding, and the like. Numerical data may be normalized, for example, by scaling the data to a maximum of 1 and a minimum of 0 or -1. Image data may undergo various transformations. For example, the channel values may be converted from a range of 0-255 to a range of 0-1, the image resolution may be set to a standardized value, and so on.
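The transformations described for block 502 are standard preprocessing steps; the following Python sketch shows how a few of them might look, with the column names and specific choices being illustrative assumptions rather than part of the disclosed system.

```python
import numpy as np
import pandas as pd

def prepare_features(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # One-hot encode a nominal column (e.g., which tooth-numbering system the source record uses)
    out = pd.get_dummies(out, columns=["numbering_system"])
    # Min-max scale a numeric column to the 0..1 range
    col = out["arch_width_mm"]
    out["arch_width_mm"] = (col - col.min()) / (col.max() - col.min())
    return out

def prepare_image(image_uint8: np.ndarray) -> np.ndarray:
    # Convert 0-255 channel values to the 0-1 range expected by most models
    return image_uint8.astype(np.float32) / 255.0
```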
At block 503, the system may create training, tuning, and test/verify datasets from the received datasets. The training data set 504 may be used during training to determine variables used to form a predictive model. The tuning data set 505 may be used to select a final model and prevent or correct overfitting that may occur during training with the training data set 504, as the trained model should generally be suitable for a wide range of patients, rather than a special training data set (e.g., where the training data set is biased toward patients with relatively high or relatively low bone density, wide or narrow dental arches, etc.). After training and tuning, the model may be evaluated using the test dataset 506. For example, the test dataset 506 may be used to check whether the model is over-fitted to the training dataset. In training loop 514, the system may train the model using training data set 504 at 507. Training may be performed in a supervised, unsupervised or partially supervised manner. At block 508, the system may evaluate the model according to one or more evaluation criteria. For example, the evaluation may include determining whether the segmentation is accurate, determining whether a suggested library is appropriate, determining whether a suggested dental arch is properly identified, determining whether the teeth are properly positioned, or any other criteria that may be desirable. At block 509, the system may determine whether the model meets the one or more evaluation criteria. If the model fails the evaluation, the system may tune the model using tuning data set 505 at block 510, repeating training 507 and evaluation 508 until the model passes the evaluation at block 509. Once the model passes the evaluation at 509, the system may exit the model training loop 514. The test dataset 506 may be passed through a trained model 511 and at block 512, the system may evaluate the results. If the evaluation fails at block 513, the system may reenter training loop 514 for additional training and tuning. If the model passes, the system may stop the training process, producing a trained model 511. In some embodiments, the training process may be modified. For example, in some embodiments, the system may not use the test dataset 506. In some embodiments, the system may use a single data set. In some embodiments, the system may use two data sets. In some embodiments, the system may use more than three data sets. In some embodiments, the model may not use a tuning dataset. For example, the model may have a training data set and a test data set.
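The split and train/tune/test loop of FIG. 5 can be outlined roughly as follows. This is a generic skeleton, not the disclosed implementation; the callables passed in (train_one_pass, tune_step, evaluate) are placeholders supplied by the caller.

```python
import numpy as np

def split_dataset(data: np.ndarray, train_frac: float = 0.7, tune_frac: float = 0.15, seed: int = 0):
    """Split an array of samples into training, tuning, and test/validation subsets (blocks 503-506)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(data))
    n_train = int(train_frac * len(data))
    n_tune = int(tune_frac * len(data))
    return data[idx[:n_train]], data[idx[n_train:n_train + n_tune]], data[idx[n_train + n_tune:]]

def training_loop(model, train_set, tune_set, test_set,
                  train_one_pass, tune_step, evaluate, threshold: float, max_rounds: int = 100):
    """Iterate training and tuning until the evaluation passes, then check against the test set (loop 514)."""
    for _ in range(max_rounds):
        model = train_one_pass(model, train_set)       # block 507
        if evaluate(model, tune_set) >= threshold:     # blocks 508-509
            break
        model = tune_step(model, tune_set)             # block 510
    if evaluate(model, test_set) < threshold:          # blocks 512-513
        raise RuntimeError("model failed on the test set; more training/tuning is needed")
    return model                                       # trained model 511
```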
As briefly discussed above, metadata may be used to describe the nature of individual teeth, segments, and the like. In some embodiments, the metadata may describe tooth morphology. In some embodiments, the metadata may include information about structures such as points, pits, ridges, grooves, inflection regions, larger contour areas, and the like. In some embodiments, metadata may be manipulated to move a segment in space, deform the segment, resize the segment in whole or in part, and so forth. In some embodiments, metadata for different regions may be considered separately or together.
In some embodiments, metadata may be determined for existing teeth of a patient. In some embodiments, metadata may be determined, for example, for a library of standardized or artificial teeth, for example, if the treatment plan includes replacement of diseased or lost teeth with artificial teeth. In some embodiments, the AI/ML model can be used to determine metadata for existing teeth. In some embodiments, the AI/ML model can be used to recognize, process, etc., metadata of existing teeth. In some embodiments, the AI/ML model can be trained using a database of teeth that have been manually annotated by a human. For example, ridges, points, pits, grooves, inflection regions, larger contour regions, etc. may be manually annotated in the training dataset so that an AI/ML model may be trained to recognize one or more of these features. The AI model may be updated periodically, for example, by providing additional annotation data.
Fig. 6 through 14 illustrate examples of tooth labeling according to some embodiments. Fig. 6 illustrates labeling of maxillary incisors 11, 12, 21, and 22 according to some embodiments. Fig. 7 illustrates labeling of maxillary canines 13 and 23, in accordance with some embodiments. Fig. 8 illustrates labeling of maxillary premolars 14, 15, 24 and 25 according to some embodiments. Fig. 9 illustrates labeling of maxillary molars 16, 17, 26, and 27 according to some embodiments. Fig. 10 illustrates labeling of mandibular incisors 31, 32, 41, and 42, according to some embodiments. Fig. 11 illustrates labeling of mandibular premolars 34 and 44, according to some embodiments. Fig. 12 illustrates labeling of mandibular premolars 35 and 45 according to some embodiments. Fig. 13 illustrates labeling of mandibular molars 36 and 46, according to some embodiments. Fig. 14 illustrates labeling of mandibular molars 37 and 47, according to some embodiments. In some embodiments, for example, and without limitation, the callout can include a mesial vestibule point, mesial jawbone point, distal mesial vestibule point, distal mesial jawbone point, mesial lingual point, distal lingual point, vestibular point, palate point, distal point, median cutting edge, central cutting edge, distal cutting edge, canine tip, median crest, distal crest, median pit, central pit, distal pit, lingual elevation, major groove, median contact point, distal contact point, median midpoint, distal point, mid-cervical cheek point, mid-cervical palate point, and the like.
Arc determination and positioning
Fig. 15 illustrates an example process 1500 for determining a treatment plan for a patient. Additional details of each step of process 1500 are described herein. At block 1502, the system may be configured to determine an arc for target positioning of a patient's teeth. At 1502a, the system can be configured to determine an aesthetic arc. As described in more detail below, aesthetic arcs can be created using specific points on the teeth (e.g., ridges, points, pits, edges, etc.). In some embodiments, the practitioner may provide input defining the desired final location (e.g., using control point settings adjusted on the 2D photograph of the patient and/or on the 3D model of the patient). In some embodiments, the tooth edges may be used to construct an aesthetic arc, and may be automatically detected from photographs, facial scans, or other patient data. In some embodiments, facial landmarks may be used to determine aesthetic arcs. In some embodiments, 2D images may be used and the system may map from two dimensions to three dimensions. In some embodiments, filtering and/or smoothing algorithms may be used to smooth the aesthetic arc (or other arcs as described herein).
At 1502b, the system may be configured to determine a centering arc, which may be based at least in part on the aesthetic arc determined at block 1502a. At block 1502c, the system may be configured to determine a fitted arc. In some embodiments, more arcs, fewer arcs, or different arcs may be determined. Various arcs that can be used to determine the positioning and other properties of teeth are described in more detail below. At block 1504, the system may determine a double helix based on the arcs determined at block 1502. At block 1506, the system may be configured to adjust the double helix. For example, the system may provide automated, semi-automated, and/or manual adjustment functionality (e.g., in some embodiments, a practitioner may manually edit the double helix or recalculate one or more arcs of the double helix). At block 1508, the system may be configured to calculate tooth positions based on the double helix and/or the aesthetic arc. At block 1510, the system may be used to automatically, semi-automatically, and/or manually adjust the position, orientation, shape, and/or size of one or more teeth using the double helix. At block 1512, the system may be configured to adjust the relative positions of the mandibular and maxillary teeth, for example, by taking into account the contact relationships between the teeth, the dynamic behavior of the teeth and/or jaw, and the desired overbite and/or overjet characteristics.
In some embodiments, the aesthetic arc may be a 3D line that joins the buccal edges, incisal edges, cuspid tips, buccal cusps, and the like of the maxillary teeth. In some embodiments, the aesthetic arc may be based on a freehand line drawn by the practitioner, a line drawn with the aid of pre-existing dental preforms, a line generated by a computer system, and so forth. In some embodiments, previously taken photographs may be superimposed on the 3D model and may help locate the aesthetic line. In some embodiments, the stopping point of the aesthetic line may correspond to the location of the trailing edge of the last tooth of the arc (e.g., the second molar). In some embodiments, a double helix geometry may be used, and the aesthetic arc may define the outer limits of the double helix. In some embodiments, the double helix may have a first twist that describes the inclination of the tooth surfaces and a second twist that corresponds to the shape of the dental arch.
A plurality of arcs may be associated with the patient's teeth, as will be explained in more detail below, for example in fig. 24. In some embodiments, the aesthetic arc may be used to define the outer limits of the double helix. Additional arcs, such as a fitted arc and a centering arc, may be used to further define the double helix. In some embodiments, determining the outer limit of the double helix via the aesthetic arc may be an early or initial step in determining the double helix. As described in more detail below, in some embodiments, other arcs may be determined based at least in part on the location of the aesthetic arc.
In some embodiments, the aesthetic arc for the patient may be calculated from, for example, a maxillary dental mesh and a patient facial image captured by an intraoral scanner and a facial scanning device (which may be a dedicated or non-dedicated device such as a smartphone, tablet computer, depth sensing camera, etc.). In some embodiments, landmarks between the maxillary dental mesh and the facial image may be mapped. Internal parameters of the camera used to capture the facial image may also be considered. For example, it may be important to know the focal length of the camera. In some cases, it may be useful to know the resolution of the camera or other parameters of the camera. In some embodiments, information about the camera (e.g., focal length) may be used to remove distortion, such as fish-eye effects that may result from capturing images using a wide angle lens. In some embodiments, the practitioner may define control points for calculating the aesthetic arc. In some embodiments, three control points may be used, but the number of control points is not necessarily limited. In some embodiments, the control point may have an initial position. In some embodiments, the control point may have both an initial position and a modified position.
As depicted in fig. 16, in some embodiments, the aesthetic arc 1601 may include three control points 1702a-c. The user interface of the system may show the aesthetic arc 1601 and control points 1702a-c overlaid on an image of the patient's face and/or teeth. In some embodiments, the practitioner may manipulate the control points to define the desired locations on the patient's smile. In some embodiments, the control points may correspond to, for example, midpoints between molars 16 and 26 and incisors 11 and 21, although other control point placements are possible.
In some embodiments, the initial design of the aesthetic arc as depicted in fig. 16 may be performed using a 2D projection of the patient's teeth. In some embodiments, the system may be configured to detect landmarks on a picture of the patient's face. For example, the system may use a picture of the patient and may be configured to detect the patient's face, identify one or more landmarks, and draw smile lines and/or other reference points and/or lines that may be used to design the patient's smile.
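Facial landmark detection of this kind can be performed with off-the-shelf tools; the sketch below uses MediaPipe's face mesh as one possible example, which the disclosure does not name or require.

```python
import cv2
import mediapipe as mp

def detect_face_landmarks(image_path: str):
    """Return detected facial landmarks as (x, y) pixel coordinates, or an empty list if no face is found."""
    image = cv2.imread(image_path)
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as face_mesh:
        results = face_mesh.process(rgb)
    if not results.multi_face_landmarks:
        return []
    h, w = image.shape[:2]
    # MediaPipe returns normalized coordinates; convert them to pixel coordinates
    return [(lm.x * w, lm.y * h) for lm in results.multi_face_landmarks[0].landmark]
```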
While the design may be performed using 2D projection, preferably the dental diagnosis and/or treatment plan should take into account the 3D positioning of the patient's teeth. Fig. 17 depicts an example process 1700 for determining a 3D curve according to some embodiments. In other embodiments, the process shown in fig. 17 may include fewer or additional steps. In some embodiments, it may be advantageous to map between a 2D image of the patient's teeth (e.g., a photograph of the patient's face) and a 3D capture. If done improperly, the mapping may result in significant distortion, which may limit the usefulness of the mapping for tooth alignment/positioning.
At block 1702, the system may be configured to project 2D control points (e.g., of the aesthetic arc depicted in fig. 22) defining a desired smile into a 3D space. Similarly, anatomical points on both the 2D image and the 3D capture may be mapped to each other. Fig. 18 depicts an example of mapping anatomical points between a 2D projection 1802 and a 3D capture 1804 of a patient's teeth. The mapping between the 2D facial image of the patient's teeth and the 3D capture may be approximated as a perspective-n-point problem, given a set of n points in 3D space and their corresponding 2D projections. Given a set of known 3D points and their 2D projections, it may be possible to determine the camera pose (i.e., rotation in roll, pitch, and yaw, and translation along three orthogonal axes). In determining the projection between the 2D image and the 3D space, intrinsic parameters of the camera (e.g., focal length) may be used. For example, the transformation may be performed according to the relation s·p_c = K·R·p_w, where s is a scaling factor, p_c is the 2D image point, K is the matrix of intrinsic camera parameters, R is the rotation matrix (extrinsic parameters), and p_w is the corresponding point in 3D space. In some embodiments, the rotation matrix R may actually be a translation matrix T, or the transformation between 2D and 3D space may include both a translation matrix T and a rotation matrix R.
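The perspective-n-point relation above is commonly solved with existing computer vision libraries; the following sketch uses OpenCV's solver as one possible approach (the disclosure does not specify a library), assuming lens distortion has already been removed.

```python
import numpy as np
import cv2

def estimate_camera_pose(points_3d: np.ndarray, points_2d: np.ndarray,
                         focal_length: float, image_size: tuple):
    """Solve s * p_c = K [R|t] p_w for the camera rotation and translation.

    points_3d: (n, 3) anatomical points on the 3D dental mesh (world coordinates).
    points_2d: (n, 2) corresponding points on the 2D facial photograph (pixels).
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    K = np.array([[focal_length, 0, cx],
                  [0, focal_length, cy],
                  [0, 0, 1]], dtype=np.float64)          # intrinsic camera parameters
    dist_coeffs = np.zeros(4)                            # assume distortion already removed
    ok, rvec, tvec = cv2.solvePnP(points_3d.astype(np.float64),
                                  points_2d.astype(np.float64), K, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)                           # rotation matrix (extrinsic parameters)
    return ok, R, tvec
```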
In some cases, projection from a 2D image into 3D space may be complicated due to the lack of information about a third dimension (e.g., depth) in the 2D image. In some embodiments, stereoscopic vision may be used to aid in mapping 2D images to 3D space. For example, two cameras may be placed with some separation between them, and the images may be compared to determine depth information.
In some embodiments, the system may not project from 2D to 3D. This may mean that, for example, the curve displayed on the user interface to indicate the aesthetic arc may not be the same as the aesthetic curve determined for diagnostic and/or therapeutic purposes. In some embodiments, a user of the system may be allowed to move the control point vertically because there is little change in depth along the vertical axis. However, the user may not be able to adjust the control point in the horizontal direction because even small changes in horizontal position may correspond to large changes in depth. For example, returning to fig. 16, a user of the system may move three control points up and down to alter the aesthetic arc, but may not be able to move the control points horizontally. It will be appreciated that such limitations may not exist when the user is working with control points defined on a 3D scene.
At block 1704, an initial 3D curve may be created by traversing a series of points on an outward-facing surface of the maxillary 3D mesh. In some embodiments, spline fitting may be used to produce a smooth curve through the points. In some embodiments, a B-spline algorithm may be used to calculate a 3D spline representing the dental arch. In some embodiments, a standard 3D curve may be selected from one or more template 3D splines representing dental arches. Template splines may be advantageous in some situations, such as when shape memory alloy wires are used to move teeth. In some embodiments, the aesthetic arc may be a preformed arc selected from a catalog or database of aesthetic arcs. In some embodiments, the preformed aesthetic arc may be used to calculate a double helix. While conventional approaches may consider only aesthetic arcs, the use of a double helix as described in the present disclosure may enable optimization of tooth orientation, tilt, etc., which may be difficult or even impossible when considering the aesthetic arc alone. Such optimization may improve functionality, reduce premature wear, and so forth. In some embodiments, the initial 3D curve may consider only the maxillary teeth of the patient. Fig. 19 depicts an example of anatomical points 1902 to which an initial curve 1904 has been fitted by the system.
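The spline fit at block 1704 can be performed with a standard B-spline routine; a brief sketch using SciPy is shown below, with the smoothing factor and sample count being illustrative choices.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_arch_spline(points_3d: np.ndarray, n_samples: int = 200, smoothing: float = 1.0) -> np.ndarray:
    """Fit a smooth 3D B-spline through ordered anatomical points and resample it.

    points_3d: (n, 3) points along the outward-facing surface of the maxillary mesh,
    ordered from one end of the arch to the other.
    """
    tck, _ = splprep([points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]], s=smoothing)
    u = np.linspace(0.0, 1.0, n_samples)
    x, y, z = splev(u, tck)
    return np.column_stack([x, y, z])     # (n_samples, 3) points on the smooth arch curve
```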
At block 1706, a user of the system may distort the initial 3D curve by, for example, moving one or more control points using a user interface, similar to how the user may modify the aesthetic arc by moving control points in fig. 16. The system may calculate a distorted 3D curve from the shifted control points. In some embodiments, the system may calculate the distortion that occurs due to each control point moving alone, and may combine the individual distortions to determine the overall distortion of the initial 3D curve. In some embodiments, the distortion of each tooth may vary based on the distance from the moved control point. For example, the distortion of a point may be weighted according to the distance from the point to the control point. In some embodiments, the distortion of a point may be calculated as sin²((1 − d_i,cp/d_max)·π/2), where d_i,cp is the distance between the i-th point of the curve and the control point, and d_max is the maximum distance between a point on the curve and the control point. In some embodiments, the distance is a straight-line distance between points.
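A minimal sketch of this weighting is given below. It reads the expression above as sin²((1 − d_i,cp/d_max)·π/2), which is an assumption about the grouping of terms, and it combines the contributions of several control points by summing their weighted displacements.

import numpy as np

def distortion_weights(curve_pts, control_pt):
    """Weight each curve point by sin^2((1 - d/d_max) * pi / 2), so points nearest
    the moved control point deform most and the farthest point does not move."""
    d = np.linalg.norm(curve_pts - control_pt, axis=1)   # straight-line distances
    d_max = d.max()
    return np.sin((1.0 - d / d_max) * np.pi / 2.0) ** 2

def distort_curve(curve_pts, control_pts, displacements):
    """Combine the distortions produced by each control point moving individually."""
    distorted = curve_pts.astype(float).copy()
    for cp, disp in zip(control_pts, displacements):
        w = distortion_weights(curve_pts, cp)
        distorted += w[:, None] * disp   # apply the weighted displacement of this control point
    return distorted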
At block 1708, the system may be configured to move the anatomical point according to the distorted 3D curve. For example, the system may determine the point on the initial 3D curve that is closest to each anatomical point, and the anatomical points may be distorted based on the distortion of the closest point on the initial 3D curve. The anatomical points may be, for example, points along the surface of the tooth.
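For illustration, the nearest-point lookup could be implemented with a KD-tree, transferring to each anatomical point the displacement of its closest point on the initial 3D curve; the data layout below is an assumption.

import numpy as np
from scipy.spatial import cKDTree

def distort_anatomical_points(anat_pts, initial_curve, distorted_curve):
    """Move each anatomical point by the displacement of its nearest point on the initial curve."""
    tree = cKDTree(initial_curve)             # index the initial 3D curve
    _, nearest_idx = tree.query(anat_pts)     # closest curve point per anatomical point
    displacement = distorted_curve - initial_curve
    return anat_pts + displacement[nearest_idx]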
After the aesthetic 3D curve is determined, the patient's teeth or artificial teeth may be positioned. FIG. 20 depicts an example process 2000 for locating a library, according to some embodiments. The library may be, for example, a library of artificial teeth or may be the patient's own teeth. Such libraries may be used in some embodiments for prosthetic and/or orthodontic treatment. The computer system may be configured to perform process 2000. In some embodiments, the process may include fewer or additional steps. At block 2002, the system may receive a maxillary and mandibular mesh of a patient. At block 2004, the system may receive a maxillary and mandibular mesh of a library, which may be, for example, a patient's own teeth or artificial teeth. At block 2006, the system may receive an aesthetic 3D curve, such as a curve generated according to process 1700. At block 2008, the system may orient the library mesh, and at block 2010, the library mesh may be scaled in one or more dimensions to fit the patient. At block 2012, the system may apply a global rigid transformation to the library mesh, for example, to align the library mesh to the patient by performing translation in one or more directions, rotation in one or more directions, or by performing rotation and translation in one or more directions. At block 2014, the system may apply a local stiffness transformation to each tooth in the library mesh (e.g., to one tooth, to multiple teeth, or to all teeth, either independently or in groups of teeth). At block 2016, the system may optionally apply a gingival apex from the patient's dentition to the library dentition. At block 2018, the system may output a positioned library mesh, such as one in which the library's teeth have been manipulated globally and locally to better conform to the patient and aesthetic 3D curve.
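The global rigid transformation at block 2012 could, for example, be estimated as a least-squares (Kabsch-style) fit between corresponding landmarks on the library mesh and the patient mesh. The sketch below is one possible formulation under that assumption and is not the only way such an alignment could be computed.

import numpy as np

def rigid_align(library_pts, patient_pts):
    """Least-squares rotation R and translation t mapping library landmarks onto
    corresponding patient landmarks (Kabsch algorithm)."""
    lib_c = library_pts.mean(axis=0)
    pat_c = patient_pts.mean(axis=0)
    H = (library_pts - lib_c).T @ (patient_pts - pat_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pat_c - R @ lib_c
    return R, t

def apply_transform(mesh_vertices, R, t):
    """Apply the rigid transform to every vertex of the library mesh."""
    return mesh_vertices @ R.T + t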
The initial fitting according to the aesthetic 3D curve may not result in an ideal positioning of the teeth. Thus, fig. 21 depicts an example process 2100 for optimizing the static positioning of a patient's teeth, which may be run on a computer system, according to some embodiments. At block 2102, a double helix may be calculated, and at block 2104, teeth may be positioned on the double helix. The double helix may be calculated at least in part by determining various arcs along the outer edge, inner edge, or other anatomically relevant portions of the patient's teeth. For example, the system may determine a centering arc and a fitting arc. At block 2106, the system may determine the static overbite (vertical overlap) and overjet (anterior-posterior overlap). At block 2108, the system may determine a bite vertical distance. At blocks 2110 and 2112, the system may determine the points of contact and the contact relationships, respectively. The steps in fig. 21 may be performed in a different order. More or fewer steps may be included in processes consistent with the present disclosure.
Double helix determination and positioning
In some embodiments, the initial double helix may be based on information about the patient, such as captured data about the positioning of the patient's teeth. For example, data about a patient's teeth may be used to generate an initial aesthetic arc. In some embodiments, the data may have metadata associated with it. For example, the metadata may indicate the buccal surface of the patient's teeth, which may be used to form an aesthetic line.
In some embodiments, the initial aesthetic arc may be used for diagnostics, for developing treatment plans, and the like. A second double helix may be calculated based at least in part on an aesthetic line, which may be, for example, an arbitrary line, a manually designed line, or an automatically calculated line based on, for example, facial scan data, pictures, or the like.
In some embodiments, the facing region may be described by a geometric shape (e.g., surface), such as a double helix. In some embodiments, the geometry may facilitate positioning, modification, or both, of one or more teeth. In some embodiments, the geometry may be modeled based at least in part on recorded data specific to the patient. In some embodiments, the geometry may be based on manipulated patient data, such as data that has been manipulated to achieve a desired aesthetic result, functional result, or both.
In some embodiments, the patient-specific data may relate to one or more reference planes, such as an orbit plane, a condyle slope, etc., of the skull of the patient, for example. In some embodiments, the patient-specific data may include photographs, facial scans, radiographs (e.g., transverse radiographs), CBCT images, and the like.
In some embodiments, the geometry may be determined at least in part by the occlusal cover. The data related to the occlusal cover may include data related to the posterior portion, the sagittal plane, or both. In some embodiments, the geometry data may define the architecture of the upper arch, the morphology of the upper teeth, or both. In some embodiments, the geometry for the upper arch, the upper teeth, or both, may affect the lower arch, may be complementary to the occlusal cover, or both. In some cases, the occlusal cover may be defined for the mandibular teeth. The occlusal cover may be a solid including a Spee curve (curve of Spee) and a curve of Wilson. The occlusal cover may be calculated taking into account the condyle points, the tangent points, and the points of the distal edges of the canines. For example, additional details can be found in U.S. patent No. 9,922,454 B2, entitled "METHOD FOR DESIGNING AN ORTHODONTIC APPLIANCE," the contents of which are incorporated herein by reference in their entirety.
In some cases, the patient may lack teeth, and the geometry may define a plate or surface to which the teeth may be optimally applied based on, for example, metadata of the teeth (e.g., metadata of artificial teeth).
In some embodiments, determining the geometry may include constructing one or more arcs. For example, in some embodiments, any combination of one or more of an aesthetic arc, a centering arc, and a fitting arc may be used to determine geometry, as described herein. In some embodiments, one or more arcs may have been previously determined, such as described above.
As discussed above, in some embodiments, a double helix may be calculated and teeth (e.g., the patient's own teeth, artificial teeth, or both) may be fitted to the double helix. As discussed above, the double helix structure may be defined at least in part by an aesthetic arc. In some embodiments, additional structural data about the patient may be used to calculate the double helix.
In some embodiments, a centering arc, a fitting arc, or both, may be used in combination with an aesthetic arc to define a double helix. Fig. 22 illustrates an example of points for defining an aesthetic arc of the maxillary teeth, according to some embodiments. In some embodiments, the aesthetic arc may be defined at least in part by the incisal edges, the canine edges, the buccal cusps of the premolars, and/or the buccal cusps of the molars. In some embodiments, the centering arc may be an arc describing a center or other point on the surface of the tooth. An example of points defining the centering arc of the maxillary teeth is shown in fig. 23. The centering arc may pass through the palatal cusps, median ridges, and/or distal ridges of, for example, the incisors and/or canines. The fitting arc for the maxillary teeth may describe the inner boundary of the maxillary teeth, as shown in fig. 24. In some embodiments, the fitting arc may be determined from the marginal ridges of the incisors, canines, premolars, and molars. In some embodiments, the fitting arc may take into account molar pits (e.g., the fitting arc may be aligned with the molar pits). In some embodiments, the fitting arc may be determined from the incisal edges, the canine tips, the vestibular cusps of the premolars and molars, or any combination of these features. It will be appreciated that different arcs may be used for similar purposes, but the arcs are preferably related to the structure of the teeth such that the arcs are anatomically relevant and have a consistent logical structure.
Fig. 25 is an example illustration showing an aesthetic arc (white circles with a black outline), a centering arc (black circles with a white outline), and a fitting arc (black crosses with a white outline) for the maxillary teeth.
In some embodiments, one or more arcs may be used to at least partially define the shape, position, or both, of the teeth. For example, in front or rear views, the three arcs may form an inverted "V" shape at the cusps. In some embodiments, the arc may take into account one or more future positions of one or more teeth. In some embodiments, the arc may be determined by considering the segments individually, but this is not necessary. In some cases, segments may be considered in groups or as a whole when determining arcs.
In some embodiments, the arcs may be determined for the maxillary teeth, such as described above. In some embodiments, arcs may be determined for the mandibular teeth. Fig. 26 to 29 show example arcs for the mandibular teeth. Fig. 26 illustrates a fitting arc for the mandibular teeth according to some embodiments. As shown in fig. 26, the fitting arc may pass through the incisor and canine edges, the buccal cusps of the premolars, and/or the buccal cusps of the molars. Fig. 27 illustrates a centering arc for the mandibular teeth according to some embodiments. Fig. 28 illustrates a guide arc for the mandibular teeth according to some embodiments. The guide arc may pass through, for example, the lingual cusps of the premolars and molars.
In some embodiments, segmentation, point identification, and the like may be accomplished partially or fully manually if the patient lacks teeth or if the existing teeth deviate substantially from the desired placement. For example, the AI/ML model may fail to identify relevant features when the patient lacks teeth or when the deviation from the expected position, orientation, or both is too pronounced. In some embodiments, the segments may be orthogonal to one or more features, such as the aesthetic vestibular arc. However, the segments need not be orthogonal to the arc. For example, some segments, such as those for the canines, may not be orthogonal to the aesthetic vestibular arc. In some embodiments, segments may be created and may be separated from each other. The segments may have a distance corresponding to the average length of the teeth. For example, a segment for a molar may have a distance or depth of about 8 mm.
In some embodiments, if the patient has enough teeth, such as all teeth, substantially all teeth, a majority of teeth, or a minority of teeth, the metadata points may be projected onto an aesthetic arc. In some embodiments, the metadata point projections may be made using an AI/ML model. In some embodiments, a guide segment having a length x, and forming an angle α relative to the orbital plane, may be modeled. The guide segment may run from the aesthetic arc to the fitting arc, and the distance x may be the distance from the aesthetic arc to the fitting arc for a particular tooth. In some embodiments, particular distances and angles may be associated with different types of teeth, such as indicated in the following table. The distance and angle may vary for different condylar slopes. For example, the following table may correspond to a condylar slope of 50°.
Tooth type | Inclination (α, degrees) | Distance (x, mm)
Central incisor | 57.2 ± 9.7 | 3.4 ± 0.8
Lateral incisor | 53.6 ± 10.5 | 3.1 ± 0.9
Canine | 47.7 ± 8.1 | 3.5 ± 1.1
First premolar | 30.7 ± 9.7 | 3.3 ± 0.4
Second premolar | 20.7 ± 9.7 | 3.2 ± 0.6
First molar | 12.0 ± 2.7 | 3.4 ± 0.8
Second molar | 8.7 ± 9.7 | 2.7 ± 0.9
Fig. 29 is an example illustration of the inclination and distance for different teeth corresponding to the table above. As shown in fig. 30, the teeth may have progressively varying inclinations. Fig. 31A illustrates the tilt angles associated with various teeth in the mouth of a patient. Fig. 31B illustrates a double helix according to some embodiments.
Fig. 32 is a cross-sectional view showing the inclination α of an example tooth (e.g., molar) with respect to the orbital plane.
Fig. 33A and 33B show examples of the teeth of the upper jaw and the lower jaw being fitted. As shown in fig. 33A, the aesthetic arc position may be fixed in place for the maxillary teeth. In fig. 33B, the tooth has been rotated to achieve the desired tilt angle α. The corresponding mandibular teeth may be manipulated to maintain proper alignment with the positioned maxillary teeth.
In some embodiments, the double helix may be formed, at least in part, by: obtaining an outer arc (e.g., an aesthetic arc) that can define the outer limits of the double helix; creating, from the projected point for each segment (which may correspond to, for example, a tooth), a segment having a length x and an angle α; defining one or more intermediate points at the ends of the segments; and determining an innermost point for each segment, the innermost point being based at least in part on statistical data representing an average tooth width, a projection of corresponding tooth metadata points, or both. In some embodiments, an AI/ML model can be used to determine one or more points to form the double helix structure. In some embodiments, the practitioner may make manual adjustments to the double helix.
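A simplified sketch of the enumerated construction is given below. The aesthetic (outer) arc points, inward unit directions, per-tooth lengths x and angles α, and tooth widths are assumed inputs, and the orbital plane is taken as the horizontal plane purely for illustration; the sign convention for the vertical offset is likewise an assumption.

import numpy as np

def build_double_helix(outer_pts, inward_dirs, x, alpha_deg, tooth_width):
    """Sketch of the enumerated construction.  outer_pts are the projected points on the
    aesthetic arc, inward_dirs are unit vectors pointing toward the inside of the arch in
    the (assumed horizontal) orbital plane, x and alpha_deg come from a per-tooth table,
    and tooth_width approximates the innermost offset."""
    alpha = np.radians(alpha_deg)
    up = np.array([0.0, 0.0, 1.0])        # assumed normal of the orbital plane
    # Guide segment of length x inclined by alpha relative to the orbital plane.
    seg = (x * np.cos(alpha))[:, None] * inward_dirs - (x * np.sin(alpha))[:, None] * up
    intermediate_pts = outer_pts + seg    # end points of the guide segments
    # Step further inward by the (average) tooth width to obtain the innermost points.
    innermost_pts = intermediate_pts + tooth_width[:, None] * inward_dirs
    return intermediate_pts, innermost_pts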
Fig. 35A to 35C show the relationship between the aesthetic arc, the fitting arc, the centering arc, the overbite, and the overjet. Fig. 35A shows a view of the maxillary teeth with the aesthetic arc (dashed line, hollow circles), the fitting arc (solid line, crosses), and the centering arc (solid circles). Fig. 35B shows a cross section across the segment AB in fig. 35A. As shown in fig. 35B, there may be a distance x between the aesthetic arc and the fitting arc, a distance d between the aesthetic arc and the centering arc, and a vertical distance z between the aesthetic arc and the centering arc. Returning to fig. 35A, the distance x at the incisors may define the overjet value. Fig. 35C shows that the value z, when measured at the incisors, may define the overbite value.
In some embodiments, the fitting arc may be determined by the system using a table, such as the table above. For example, after determining the aesthetic arc, a fitting arc may be constructed, with the points of the fitting arc having distances from corresponding points (e.g., buccal points) on the aesthetic arc, such as defined in the table above or similar tables. The fitting arc may be positioned relative to the aesthetic arc such that a line segment drawn between a point on the aesthetic arc and a point on the fitting arc has an angle relative to the orbital plane as indicated above.
In some embodiments, the centering arc may be determined based at least in part on the aesthetic arc and/or the fitting arc. In some embodiments, the centering arc may be located at an average distance d of about 6 mm from the corresponding buccal point for the premolars and/or molars. For the anterior teeth (e.g., canines and incisors), the centering arc may correspond to an overjet of about 4 mm. In some embodiments, the foregoing distances may be modified manually, automatically, or semi-automatically, depending on the patient and the treatment needs. Depending on the personalized value, the centering point may be higher or lower, i.e., closer to or farther from the orbital plane, by a distance z. For example, the mesial palatal cusp of tooth 26 may be 0.8 mm lower than the point on the aesthetic arc corresponding to tooth 26. Example z-positions of anatomical points corresponding to the centering arc relative to the aesthetic arc are given in the table below, with positive values indicating that the point characterizing the centering arc is below the corresponding point defining the aesthetic arc.
Fig. 34 illustrates an example of incisor alignment according to some embodiments. In some embodiments, the cutting edges may be positioned along the aesthetic arc, and the incisors may be angled to achieve the desired tilt angle. The mandibular incisors may then be placed with the desired overbite, which may at least partially determine the vertical positioning of the incisors. The mandibular incisors may then be oriented to preserve the desired overjet.
In some embodiments, the calculation of the double helix may take the occlusal cover into account. The occlusal cover may include a Spee curve (curve of Spee) defining a curvature starting from the edges of the mandibular incisors and extending along the mandibular occlusal plane to the condyles. In some embodiments, the buccal cusps of the mandibular teeth may be manipulated to conform to the Spee curve, which may constrain the overall fit of the teeth. In some embodiments, the positions of the cutting edges and condyle points may be fixed, and the remaining positions may be adjusted by altering the concavity of the curve such that the occlusal surfaces of the maxillary and mandibular first molars are aligned.
Static optimization
The system may be configured to perform static optimization on the mandibular and maxillary libraries. This may be done before or after positioning the library, but it may be advantageous to perform static optimization after alignment. Referring again to fig. 21, at block 2106, the system may determine the optimal overbite and overjet. The system may be configured to take the initial positioning values of the teeth and calculate a transformation to apply to achieve the desired overjet and overbite values. To achieve the desired overbite, overjet, or both, the algorithm may be configured to move the mandible. The movement of the mandible may be based on simulated motion data or on recorded motion of the patient's jaw. The system may be configured to determine the overbite value by calculating the average vertical difference between the maxillary and mandibular incisors. Similarly, the system may calculate the overjet value by determining the average horizontal difference between the maxillary and mandibular cutting edges.
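A minimal sketch of this averaging is shown below, assuming matched arrays of maxillary and mandibular incisal (cutting) edge points and assuming that the vertical axis is z and the anterior-posterior axis is y; both axis conventions are assumptions made only for illustration.

import numpy as np

def overbite_overjet(upper_incisal_pts, lower_incisal_pts):
    """Average vertical difference (overbite) and average horizontal difference (overjet)
    between matched maxillary and mandibular incisal-edge points."""
    diff = upper_incisal_pts - lower_incisal_pts
    overbite = float(np.mean(diff[:, 2]))   # vertical overlap along the assumed z axis
    overjet = float(np.mean(diff[:, 1]))    # anterior-posterior overlap along the assumed y axis
    return overbite, overjet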
At block 2108, the system may determine an optimal bite vertical distance. Given the optimal overbite value, the optimal overjet value, the positioned library (e.g., the maxillary and mandibular dentition meshes), and the centric relation, the system may determine a mandibular dentition transformation to apply to achieve the optimal bite vertical distance. For example, the system may find frames in the capture of the patient's jaw movements that correspond to the optimal overbite and/or overjet values. The system can then transform the mandibular dentition mesh to the optimal overbite and/or overjet position. In some cases, the system may determine that the overbite and/or overjet are acceptable and may not select a new position, while in other embodiments the positioning may be altered to increase or decrease the overbite and/or overjet. The overbite and overjet analysis may be performed when a mandibular library is placed, or it may be performed after positioning the mandibular teeth. In some cases, it may be advantageous to perform the overbite and overjet analysis when placing a mandibular library, for example in a prosthetic workflow. In other circumstances, it may be preferable to perform the overbite and overjet analysis after positioning the mandibular teeth.
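One possible way to search the captured jaw-motion frames for the pose closest to target overbite and overjet values is sketched below, reusing the overbite_overjet helper from the previous sketch; the per-frame transform format and the scoring weights are illustrative assumptions rather than features of the present disclosure.

import numpy as np

def best_frame(frames, lower_mesh, upper_incisal_pts, lower_incisal_idx,
               target_overbite, target_overjet, w_ob=1.0, w_oj=1.0):
    """Return the captured frame whose resulting overbite and overjet are closest to the
    targets.  Each frame is assumed to be a dict with a 3x3 rotation 'R' and a translation
    't' applied to the mandibular mesh."""
    best, best_score = None, np.inf
    for frame in frames:
        moved = lower_mesh @ frame["R"].T + frame["t"]            # mandibular mesh in this frame
        ob, oj = overbite_overjet(upper_incisal_pts, moved[lower_incisal_idx])
        score = w_ob * abs(ob - target_overbite) + w_oj * abs(oj - target_overjet)
        if score < best_score:
            best, best_score = frame, score
    return best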
Dynamic assessment
In some embodiments, dynamic characteristics may be considered. For example, in some embodiments, movement relative to a reference, such as movement relative to the orbital plane, may be considered. In some embodiments, the dynamic characteristic information may come from movement of the patient. In some embodiments, the dynamic characteristic information may come from simulated movements. In some embodiments, the simulation of the movement of the patient may be performed by modeling the movement around the posterior condylar points.
It may be important to preserve functionality (e.g., speaking, eating, etc.) and to ensure that the positioning of the teeth does not result in uneven or premature wear of the tooth surfaces. For example, it may be important that the surfaces of the teeth be aligned so that functionality (e.g., chewing) is not compromised and that the positioning of the teeth is appropriate throughout the patient's range of motion. Thus, it may be advantageous to determine the contacts between the maxillary and mandibular teeth. At block 2110, the system may determine the points of contact from the maxillary dental mesh, the mandibular dental mesh, and the capture of the movement of the patient's jaw. For each frame in the animation, or for a subset of frames in the capture of the patient's movements, the system may determine the points of contact between the teeth. For example, the method described in U.S. patent No. 10,582,992, the contents of which are incorporated herein by reference in their entirety and for all purposes, may be used.
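As a simple stand-in for per-frame contact detection (the method of the above-referenced patent is not reproduced here), vertices of the two meshes could be tested for proximity; the contact threshold below is an arbitrary assumed value.

import numpy as np
from scipy.spatial import cKDTree

def contact_points(upper_vertices, lower_vertices, threshold=0.05):
    """Return pairs of vertex indices (upper, lower) closer than `threshold` mm,
    used as a simple proxy for per-frame contact detection."""
    tree = cKDTree(upper_vertices)
    dist, idx = tree.query(lower_vertices)        # nearest upper vertex for each lower vertex
    close = np.nonzero(dist < threshold)[0]
    return [(int(idx[i]), int(i)) for i in close]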
In some embodiments, the number of frames in the capture of the patient's movement may be reduced in order to speed up the calculation of the contact points. For example, a frame may be discarded if the maxillary and mandibular dentition meshes are too far apart, such as if the distance between the central vertex of the maxilla and the central vertex of the mandible is greater than a threshold. For example, a frame may be discarded if the distance is greater than about 5 mm, greater than about 8.5 mm, greater than about 10 mm, or any other larger or smaller separation that reduces the number of frames while preserving sufficient information.
In some embodiments, a frame may be discarded if the movement from one frame to the next is below a threshold. For example, if the distance between the central vertices of the maxillary and mandibular dentition meshes changes by less than about 0.005 mm, at least one of the frames may be discarded. In some embodiments, the data set may be further reduced by, for example, keeping only a portion of the remaining frames. For example, in some embodiments, the system may keep one of every eight frames, one of every ten frames, and so on. The system may then calculate the points of contact between the mandibular and maxillary dentitions from the reduced data set.
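A sketch of these frame-reduction heuristics is given below, assuming each frame provides the central vertices of the maxillary and mandibular meshes; the thresholds mirror the example values given above.

import numpy as np

def reduce_frames(frames, max_gap=8.5, min_motion=0.005, keep_every=8):
    """Discard frames where the jaws are too far apart, discard frames with negligible
    motion relative to the previously kept frame, then subsample the remainder.  Each
    frame is assumed to be a dict with 'upper_center' and 'lower_center' 3D points."""
    kept, prev_sep = [], None
    for frame in frames:
        sep = np.linalg.norm(np.asarray(frame["upper_center"]) -
                             np.asarray(frame["lower_center"]))
        if sep > max_gap:                  # jaws too far apart; no contact is possible
            continue
        if prev_sep is not None and abs(sep - prev_sep) < min_motion:
            continue                       # essentially no movement since the last kept frame
        kept.append(frame)
        prev_sep = sep
    return kept[::keep_every]              # e.g., keep one of every eight remaining frames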
The calculation of the double helix at block 2102 may result in repositioning of the upper teeth at block 2104. Repositioning of the upper teeth may enable determination of the overbite and/or overjet at block 2106 by repositioning the lower incisors with respect to the upper incisors. At block 2108, a vertical dimension of occlusion (VDO) may be determined relative to the overbite. After modifying the positioning and the VDO of the teeth, the system may be used to automatically, semi-automatically, or manually tune the positioning (e.g., orientation) and shape of the teeth at block 2110 to obtain optimal contacts in the resting state. Advantageously, the system may thereby achieve optimization of functional tooth positioning.
At block 2112, the system may calculate the contact relationships of the maxillary and mandibular dentition meshes based on the maxillary dentition mesh, the mandibular dentition mesh, the capture of the movement of the patient, the contact points, and the semantic segmentation of the maxillary and mandibular dentition meshes. For each animation frame having a point of contact (e.g., the frames retained at block 2110), the system may determine a contact vertex in the maxilla and the mandible for each point where a tooth in the mandibular dentition mesh contacts a tooth in the maxillary dentition mesh. The system may calculate one or more distances for each contact point and may store the information in a table, database, spreadsheet, array, or the like. The system may calculate the contact relationship between each unique pair of teeth over successive frames. The system may track the evolution of the separation between each unique pair of teeth over time by calculating, for one or more frames, the minimum distance between the two closest vertices of each unique pair of teeth (one on each tooth). The contact relationship may alternatively or additionally be characterized by a single minimum distance between two teeth.
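The per-pair tracking described above could be sketched as follows, assuming each tooth is available as its own vertex array after semantic segmentation; the data layout is an assumption made only for illustration.

import numpy as np
from scipy.spatial import cKDTree
from collections import defaultdict

def contact_relationships(frames_upper_teeth, frames_lower_teeth):
    """For each frame, record the minimum distance between every (upper, lower) tooth pair.
    frames_*_teeth are lists (one entry per frame) of dicts mapping a tooth label to its
    vertex array, as produced by a semantic segmentation step."""
    history = defaultdict(list)       # (upper_label, lower_label) -> [min distance per frame]
    for upper, lower in zip(frames_upper_teeth, frames_lower_teeth):
        for u_label, u_verts in upper.items():
            tree = cKDTree(u_verts)
            for l_label, l_verts in lower.items():
                d, _ = tree.query(l_verts)
                history[(u_label, l_label)].append(float(d.min()))
    return history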
Fig. 36 is a flowchart illustrating an overview of an example process for planning orthodontic and/or prosthetic procedures consistent with the present disclosure. At block 3602, the system may collect data available for treatment planning, such as dental impressions, facial scans, photographs, and the like. At block 3604, the system may be configured to prepare the patient data, which may include performing transformations on the data or otherwise modifying the data for treatment planning. At block 3606, the system may determine an initial positioning of one or more arches and teeth. As shown in fig. 36, the system may determine an aesthetic arc at block 3606, which may include projecting control points, defining an initial curve, distorting the curve using the control points, and distorting anatomical points to fit the distorted curve. The system may determine a centering arc and/or a fitting arc, which may be related to the aesthetic arc, anatomical points on the patient's teeth, and so forth. At block 3608, the system may calculate a double helix structure based at least in part on the determined arcs, adjust the double helix (which may be done manually, automatically, or semi-automatically), and calculate tooth positions. At block 3610, the system may perform a static optimization, which may include adjusting the relative positioning of the teeth and/or adjusting various properties of the teeth. For example, the static optimization may include altering the size, shape, or both, of one or more prosthetic teeth. At block 3612, the system may perform a dynamic assessment as described above, which may include consideration of the contact relationships between the mandibular and maxillary teeth.
Computer system
FIG. 38 is a block diagram depicting an embodiment of a computer hardware system configured to run software for implementing one or more embodiments disclosed herein.
In some embodiments, the systems, processes, and methods described herein are implemented using a computing system, such as the computing system shown in fig. 38. Example computer system 3802 communicates with one or more computing systems 3820 and/or one or more data sources 3822 via one or more networks 3818. While fig. 38 illustrates an embodiment of a computing system 3802, it should be appreciated that the functionality provided for in the components and modules of the computer system 3802 may be combined into fewer components and modules or further separated into additional components and modules.
Computer system 3802 may include a module 3814 to perform the functions, methods, acts, and/or processes described herein. The module 3814 is executed on the computer system 3802 by the central processing unit 3806 discussed further below.
In general, the term "module" as used herein refers to logic embodied in hardware or firmware, or a set of software instructions having entry and exit points. The modules are written in a programming language such as JAVA, C or c++, python or the like. The software modules may be compiled and linked into an executable program, installed in a dynamically linked library, or may be written in an interpreted language such as BASIC, PERL, LUA or Python. The software modules may be invoked from other modules or from themselves, and/or may be invoked in response to a detected event or interrupt. Modules implemented in hardware include connected logic units such as gates and flip-flops and/or may include programmable units such as programmable gate arrays or processors.
In general, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules, regardless of the physical organization or storage of the logical modules. Modules are executed by one or more computing systems and may be stored on or within any suitable computer-readable medium, or implemented in whole or in part within specially designed hardware or firmware. While the above-described methods, calculations, processes, or analyses may be facilitated through the use of a computer, not all calculations, analyses, and/or optimizations require the use of a computer system. Moreover, in some embodiments, the process blocks described herein may be altered, rearranged, combined, and/or omitted.
Computer system 3802 includes one or more processing units (CPUs) 3806, which may include a microprocessor. Computer system 3802 further includes physical memory 3810, such as Random Access Memory (RAM) for temporarily storing information and Read Only Memory (ROM) for permanently storing information, and mass storage 3804, such as a backup memory, hard drive, rotating magnetic disk, Solid State Disk (SSD), flash memory, Phase Change Memory (PCM), 3D XPoint memory, magnetic disk, or optical media storage. Alternatively, the mass storage device may be implemented in an array of servers. Typically, the components of computer system 3802 are connected to the computer using a standards-based bus system. The bus system may be implemented using a variety of protocols, such as Peripheral Component Interconnect (PCI), Micro Channel, SCSI, Industry Standard Architecture (ISA), and Extended ISA (EISA) architectures.
The computer system 3802 includes one or more input/output (I/O) devices and interfaces 3812, such as a keyboard, mouse, touch pad, and printer. The I/O devices and interfaces 3812 may include one or more display devices, such as a monitor, that allow data to be visually presented to a user. More specifically, for example, a display device provides a presentation of a GUI as application software data, and a multimedia presentation. The I/O devices and interfaces 3812 may also provide a communication interface with various external devices. For example, the computer system 3802 may include one or more multimedia devices 3808, such as speakers, a graphics card, a graphics accelerator, and a microphone.
Computer system 3802 can run on a variety of computing devices, such as servers, Windows servers, Structured Query Language (SQL) servers, Unix servers, personal computers, laptops, and the like. In other embodiments, computer system 3802 may run on a clustered computer system, a mainframe computer system, and/or other computing system adapted to control and/or communicate with a large database, perform bulk transaction processing, and generate reports from the large database. The computing system 3802 is typically controlled and coordinated by operating system software such as Windows XP, Windows Vista, Windows 7, Windows 8, Windows 10, Windows 11, Windows Server, Unix, Linux (and variants thereof such as Debian, Linux Mint, Fedora, and Red Hat), SunOS, Solaris, BlackBerry OS, z/OS, iOS, macOS, or other operating systems, including proprietary operating systems. The operating system controls and schedules computer processes for execution, performs memory management, provides file systems, networking and I/O services, and provides user interfaces, such as Graphical User Interfaces (GUIs), among others.
Computer system 3802 shown in fig. 38 is coupled to a network 3818, such as a LAN, WAN, or the internet, via a communications link 3816 (wired, wireless, or a combination thereof). The network 3818 communicates with various computing devices and/or other electronic devices. The network 3818 communicates with one or more computing systems 3820 and one or more data sources 3822. The module 3814 may access or be accessed by the computing system 3820 and/or the data source 3822 via a network enabled user access point. The connections may be direct physical connections, virtual connections, and other connection types. A network enabled user access point may include a browser module that presents data using text, graphics, audio, video, and other media and allows interaction with the data via the network 3818.
Access to the modules 3814 of the computer system 3802 by the computing system 3820 and/or the data source 3822 may be through a network enabled user access point, such as a personal computer, cellular telephone, smart phone, laptop computer, tablet computer, electronic reader device, audio player, or another device capable of connecting to the network 3818 of the computing system 3820 or the data source 3822. Such devices may have browser modules implemented as modules that render data using text, graphics, audio, video, and other media and allow interaction with the data via the network 3818.
The output module may be implemented on an all-points-addressable display, such as a Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), plasma display, or other type and/or combination of displays. The output module may be implemented to communicate with the input device 3812 and may also include software with an appropriate interface that allows a user to access data through the use of programmatic screen elements such as menus, windows, dialog boxes, toolbars, and controls (e.g., radio buttons, check boxes, slide bars, etc.). Further, the output module may be in communication with a set of input and output devices to receive signals from a user.
The input device may include a keyboard, a roller ball, a pen and stylus, a mouse, a trackball, a voice recognition system, or pre-specified switches or buttons. The output device may include a speaker, a display screen, a printer, or a voice synthesizer. Additionally, the touch screen may act as a hybrid input/output device. In another embodiment, the user may interact with the system more directly, such as through a system terminal connected to the score generator, rather than communicating via the Internet, WAN, or LAN or similar network.
In some embodiments, system 3802 may include a physical or logical connection that has been established between a remote microprocessor and a mainframe computer for the explicit purpose of uploading, downloading, or viewing interactive data and databases on-line in real-time. The remote microprocessor may be operated by an entity operating the computer system 3802, including a client server system or a host server system, and/or may be operated by one or more of the data sources 3822 and/or one or more of the computing systems 3820. In some embodiments, terminal emulation software can be used on the microprocessor to participate in micro-mainframe links.
In some embodiments, the computing system 3820 internal to the entity that operates the computer system 3802 may access the module 3814 internally as an application or process run by the CPU 3806.
In some embodiments, one or more features of the systems, methods, and devices described herein may utilize URLs and/or cookies to store and/or transmit data or user information, for example. A Uniform Resource Locator (URL) may contain a web address and/or a reference to a web page resource stored on a database and/or server. The URL may specify the location of a computer and/or a resource on a computer network. The URL may include a mechanism to retrieve the network resource. The source of the web resource may receive the URL, identify the location of the web resource, and transmit the web resource back to the requestor. The URL may be converted to an IP address and a Domain Name System (DNS) may look up the URL and its corresponding IP address. The URL may refer to a web page, file transfer, email, database access, and other applications. The URL may include a sequence of characters that identify a path, domain name, file extension, hostname, query, fragment, scheme, protocol identifier, port number, username, password, flag, object, resource name, and/or the like. The systems disclosed herein may generate, receive, transmit, apply, parse, serialize, visualize, and/or perform actions on URLs.
Cookies, also known as HTTP cookies, web cookies, internet cookies, and browser cookies, may contain data sent from a website and/or stored on a user's computer. This data may be stored by the user's web browser as the user browses. Cookies may contain useful information for websites to remember previous browsing information, such as shopping carts on online stores, clicks on buttons, login information, and/or records of web pages or web resources accessed in the past. Cookies may also contain information entered by the user, such as name, address, password, credit card information, etc. Cookies may also perform computer functions. For example, an application (e.g., a web browser) may use an authentication cookie to identify whether a user has logged in (e.g., to a website). The cookie data may be encrypted to provide security for the consumer. Tracking cookies may be used to compile historical browsing histories of individuals. The systems disclosed herein may generate and use cookies to access data of individuals. The system may also generate and use JSON web tokens to store reliability information, HTTP authentication as an authentication protocol, tracking IP addresses of sessions or identification information, URLs, and the like.
The computing system 3802 may include one or more internal and/or external data sources (e.g., data source 3822). In some embodiments, one or more of the data stores and data sources described above may use, for example, Sybase, Oracle, CodeBase, DB2, PostgreSQL, SQL servers, and the like, as well as other types of databases such as NoSQL databases (e.g., Couchbase, Cassandra, or MongoDB), flat file databases, entity-relationship databases, object-oriented databases (e.g., InterSystems Caché), cloud-based databases (e.g., Amazon RDS, Azure SQL, Microsoft Cosmos DB, Azure Database for MySQL, Azure Database for MariaDB, Azure Cache for Redis, Azure Managed Instance for Apache Cassandra, Google Bare Metal Solution for Oracle on Google Cloud, Google Cloud SQL, Google Cloud Spanner, Google Cloud Bigtable, Google Firestore, Google Firebase Realtime Database, Google Memorystore, MongoDB Atlas, Amazon DynamoDB, Amazon Redshift, Amazon DocumentDB, Amazon Keyspaces, and the like), or other relational or non-relational databases.
The computer system 3802 may also access one or more databases 3822. Database 3822 may be stored in a database or data repository. Computer system 3802 may access one or more databases 3822 via network 3818, or may access databases or data stores directly via I/O devices and interface 3812. A data store that stores one or more databases 3822 may reside within the computer system 3802.
Further embodiments
In the foregoing specification, the system and process have been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the embodiments disclosed herein. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Indeed, while the systems and processes have been disclosed in the context of certain embodiments and examples, those skilled in the art will understand that various embodiments of the systems and processes extend beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the systems and processes and obvious modifications and equivalents thereof. Moreover, while several variations of embodiments of the systems and processes have been shown and described in detail, other modifications within the scope of the present disclosure will be apparent to those skilled in the art based upon the present disclosure. Furthermore, it is contemplated that various combinations or sub-combinations of the specific features and aspects of the embodiments can be made and still fall within the scope of the present disclosure. It should be understood that various features and aspects of the disclosed embodiments can be combined with or substituted for one another in order to form varying modes of the disclosed embodiments of the systems and processes. Any of the methods disclosed herein do not have to be performed in the order recited. Accordingly, it is intended that the scope of the systems and processes disclosed herein should not be limited by the particular embodiments described above.
It will be appreciated that the systems and methods of the present disclosure each have several innovative aspects, none of which are solely responsible for their desirable attributes disclosed herein or are required thereof. The various features and processes described above may be used independently of each other or in various combinations. All possible combinations and sub-combinations are contemplated as falling within the scope of the present disclosure.
Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Furthermore, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination. For each embodiment, no single feature or group of features is necessary or essential.
It will be further understood that conditional terms, such as "can," "could," "might," or "may," "for example," and the like, as used herein are generally intended to convey that certain embodiments include certain features, elements, and/or steps, and other embodiments do not include certain features, elements, and/or steps, unless expressly stated otherwise or otherwise understood in the context of the use. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required by one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous and are used in an open-ended fashion, and do not exclude additional elements, features, acts, operations, etc. In addition, the term "or" is used in its inclusive sense (rather than in its exclusive sense) so that, for example, when used in connection with a list of elements, the term "or" means one, some, or all of the elements in the list. In addition, the articles "a," "an," and "the" as used in this disclosure and the appended claims should be construed to mean "one or more" or "at least one" unless otherwise indicated. Similarly, although operations may be depicted in the drawings in a particular order, it should be recognized that such operations are not necessarily performed in the particular order shown or in sequential order, or that all illustrated operations are performed, to achieve desirable results. Furthermore, the figures may schematically depict one or more exemplary processes in the form of a flow chart. However, other operations not depicted may be incorporated into the example methods and processes schematically illustrated. For example, one or more additional operations may be performed before, after, concurrently with, or between any of the illustrated operations. In addition, the operations may be rearranged or reordered in other embodiments. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. In addition, other examples are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.
Further, while the methods and apparatus described herein are susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the examples are not to be limited to the particular forms or methods disclosed, but, on the contrary, the examples are to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the various embodiments described and the appended claims. Furthermore, the disclosure herein in connection with any particular feature, aspect, method, property, characteristic, quality, attribute, element, or the like of an embodiment or example may be used with all other embodiments or examples set forth herein. Any of the methods disclosed herein do not have to be performed in the order recited. The methods disclosed herein involve certain actions taken by a practitioner; however, the method may also include any third party indication of these actions, whether explicit or implicit. The scope of the disclosure herein also encompasses any and all overlaps, sub-ranges, and combinations thereof. Language such as "up to", "at least", "greater than", "less than", "between …" and the like includes the recited numbers. Terms such as "about" or "approximately" that precede a number include the number and should be construed on a case-by-case basis (e.g., as accurate as possible in the case, such as ± 5%, ± 10%, ± 15%, etc.). For example, "about 3.5mm" includes "3.5mm". The phrase preceding the term, such as "substantially" includes the recited phrase and should be interpreted on a case-by-case basis (e.g., as reasonably possible in the case). For example, "substantially constant" includes "constant". Unless otherwise indicated, all measurements are under standard conditions, including temperature and pressure.
As used herein, a phrase referring to "at least one of" a list of items refers to any combination of those items, including single members. As an example, "at least one of A, B or C" is intended to cover: A; B; C; A and B; A and C; B and C; and A, B and C. Unless specifically stated otherwise, conjunctive language such as the phrase "at least one of X, Y and Z" is to be understood in the context as used in general to convey that an item, term, etc. may be at least one of X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one X, at least one Y, and at least one Z to all be present. The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the devices and methods disclosed herein.
Thus, the claims are not intended to be limited to the embodiments shown herein but are to be accorded the widest scope consistent with the disclosure, principles and novel features disclosed herein.
Example clause
Examples of embodiments of the present disclosure may be described in view of the following example clauses. The features set forth in the example implementations below may be combined with additional features disclosed herein. Furthermore, the present disclosure includes additional inventive combinations of features that are not specifically recited in the example embodiments below and that do not necessarily include the same features as the particular embodiments below. For brevity, the example embodiments below do not identify every inventive aspect of the disclosure. The following example implementations are not intended to identify key features or essential features of any of the subject matter described herein. Any of the example clauses or any features of the example clauses below may be combined with any one or more other example clauses or features of the example clauses or other features of the present disclosure.
Clause 1. A computer-implemented method for dental treatment planning, comprising: receiving, by a computing system, patient data associated with a patient; determining, by the computing system, at least one arc corresponding to an anatomical point of a tooth library; determining, by the computing system, a double helix based on the at least one arc, the double helix to be used to fit a library of teeth; determining, by the computing system, a position of a tooth of the tooth library on the double helix; and optimizing teeth of the library of teeth by the computing system.
Clause 2. The method of clause 1, wherein the patient data comprises dental data.
Clause 3 the method of clause 1, wherein the patient data comprises morphometric data.
Clause 4. The method of clause 1, wherein determining the at least one arc comprises: the patient data is provided to an AI model trained to identify anatomical points of teeth of the tooth library.
Clause 5 the method of clause 1, wherein determining the double helix comprises providing the at least one arc to an AI model configured to determine the double helix based at least in part on the at least one arc.
Clause 6. The method of clause 1, wherein optimizing the teeth of the library of teeth comprises determining the position, rotation, or both, of each tooth of the library of teeth using an AI model configured to optimize functional and aesthetic positioning of the teeth.
Clause 7. The method of clause 1, further comprising performing, by the computing system, a dynamic assessment of the position of the teeth of the dental library.
Clause 8. The method of clause 1, wherein determining at least one arc comprises determining an aesthetic arc, and wherein determining the aesthetic arc comprises: projecting, by the computing system, one or more control points onto an image of the patient; defining, by the computing system, an initial curve based at least in part on the one or more control points; determining a final curve by modifying at least one control point by the computing system; and determining, by the computing system, a location of one or more anatomical points based at least in part on the final curve, the location of one or more anatomical points at least in part defining the aesthetic arc.
Clause 9. The method of clause 1, wherein determining the at least one arc comprises determining an aesthetic arc, a centering arc, and a fitting arc.
Clause 10. The method of clause 9, wherein determining the at least one arc further comprises determining a guide arc associated with the mandibular teeth.
Clause 11. The method of clause 1, wherein optimizing the teeth of the library of teeth comprises adjusting the relative positioning of one or more teeth in the library of teeth.
Clause 12. The method of clause 11, wherein adjusting the relative positioning comprises adjusting an overbite value and an overjet value.
Clause 13 the method of clause 1, wherein optimizing the teeth of the library of teeth comprises adjusting any combination of one or more of the size, shape, or rotation of at least one tooth of the library of teeth.
The method of clause 1, wherein the library of teeth comprises a library of teeth of the patient, and wherein the method further comprises: identifying, by the computing system, one or more teeth of the library of teeth; and labeling, by the computing system, one or more anatomical points of each of the one or more teeth of the library of teeth.
Clause 15 the method of clause 1, wherein the library of teeth comprises a library of artificial teeth, and wherein the method further comprises: a dental library is selected by the computing system from a plurality of prosthetic dental libraries based at least in part on the captured patient data.
Clause 16 the method of clause 7, wherein optimizing the teeth of the dental library comprises determining points of contact between the patient's maxillary teeth and the patient's mandibular teeth, wherein performing the dynamic assessment comprises determining a relationship of contact between the patient's maxillary teeth and the patient's mandibular teeth during movement of the patient's jaw.
Clause 17. A system for dental treatment planning, comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the system to: receiving patient data associated with a patient; determining at least one arc, the arc corresponding to an anatomical point of a tooth of the library of teeth; determining a double helix based on the at least one arc, the double helix to be used to fit a dental library; determining the position of teeth of the dental library on the double helix; and optimizing teeth of the dental library.
Clause 18 the system of clause 17, wherein the patient data comprises dental data.
Clause 19 the system of clause 17, wherein the patient data comprises morphometric data.
Clause 20 the system of clause 17, wherein determining the at least one arc comprises: the patient data is provided to an AI model trained to identify anatomical points of teeth of the tooth library.
Clause 21 the system of clause 17, wherein determining the double helix comprises providing the at least one arc to an AI model configured to determine the double helix based at least in part on the at least one arc.
Clause 22 the system of clause 17, wherein optimizing the teeth of the library of teeth comprises determining the position, rotation, or both, of each tooth of the library of teeth using an AI model configured to optimize functional and aesthetic positioning of the teeth.
Clause 23. The system of clause 17, wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: perform a dynamic assessment of the position of the teeth of the library of teeth.
Clause 24. The system of clause 17, wherein determining the at least one arc comprises determining an aesthetic arc, and wherein determining the aesthetic arc comprises: projecting one or more control points onto an image of the patient; defining an initial curve based at least in part on the one or more control points; defining a final curve by modifying at least one of the one or more control points; and determining a location of one or more anatomical points based at least in part on the final curve, the location of the one or more anatomical points at least in part defining the aesthetic arc.
Clause 25. The system of clause 17, wherein determining the at least one arc comprises determining an aesthetic arc, a centering arc, and a fitting arc.
Clause 26. The system of clause 25, wherein determining the at least one arc further comprises determining a guide arc associated with the mandibular teeth.
Clause 27. The system of clause 17, wherein optimizing the teeth of the library of teeth comprises adjusting the relative positioning of one or more teeth in the library of teeth.
Clause 28. The system of clause 27, wherein adjusting the relative positioning comprises adjusting an overbite value and an overjet value.
Clause 29. The system of clause 17, wherein optimizing the teeth of the library of teeth comprises adjusting any combination of one or more of the size, shape, or rotation of at least one tooth of the library of teeth.
Clause 30. The system of clause 17, wherein the tooth library comprises a library of teeth of the patient, and wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: identify one or more teeth of the library of teeth; and label one or more anatomical points of each of the one or more teeth of the library of teeth.
Clause 31. The system of clause 17, wherein the library of teeth comprises a library of artificial teeth, and wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: select the library of artificial teeth from a plurality of artificial tooth libraries based at least in part on the patient data.
Clause 32. The system of clause 23, wherein optimizing the teeth of the library of teeth comprises determining points of contact between the patient's maxillary teeth and the patient's mandibular teeth, and wherein performing the dynamic assessment comprises determining a contact relationship between the patient's maxillary teeth and the patient's mandibular teeth during movement of the patient's jaw.
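For readers who want a concrete picture of the aesthetic-arc procedure recited in clause 24 (and the corresponding method claim 8) — projecting control points onto a patient image, defining an initial curve, refining it into a final curve, and locating anatomical points on that curve — the following is a minimal sketch. The disclosure does not specify a curve family or any implementation; the quadratic Bézier curve, the arc-length sampling, and every function and variable name below are illustrative assumptions only.

```python
import numpy as np

def bezier_curve(control_points, num_samples=100):
    """Evaluate a quadratic Bezier curve from three 2-D control points (pixel coordinates)."""
    p0, p1, p2 = control_points
    t = np.linspace(0.0, 1.0, num_samples)[:, None]
    # B(t) = (1 - t)^2 * P0 + 2 * (1 - t) * t * P1 + t^2 * P2
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def points_at_fractions(curve, fractions):
    """Return curve points at the given fractions of total arc length (approximate)."""
    seg = np.linalg.norm(np.diff(curve, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s = s / s[-1]
    idx = np.clip(np.searchsorted(s, fractions), 0, len(curve) - 1)
    return curve[idx]

# Control points projected onto the patient's smile photograph (hypothetical pixel values).
initial_controls = np.array([[120.0, 300.0], [250.0, 345.0], [380.0, 300.0]])
initial_curve = bezier_curve(initial_controls)

# "Final" curve: the middle control point is nudged, e.g. after clinician review.
final_controls = initial_controls.copy()
final_controls[1, 1] -= 15.0
final_curve = bezier_curve(final_controls)

# Read off candidate anatomical point locations (e.g. incisal-edge landmarks) from the final curve.
landmarks = points_at_fractions(final_curve, [0.2, 0.4, 0.6, 0.8])
print(landmarks)
```

In this sketch the refinement step of clause 24 is reduced to nudging a single control point; a production system could equally fit splines with more control points or optimize them against AI-detected landmarks, which the disclosure leaves open.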

Claims (32)

1. A computer-implemented method for dental treatment planning, comprising:
receiving, by a computing system, patient data associated with a patient;
determining, by the computing system, at least one arc corresponding to an anatomical point of a tooth library;
determining, by the computing system, a double helix based on the at least one arc, the double helix to be used to fit the tooth library;
determining, by the computing system, a position of a tooth of the tooth library on the double helix; and
optimizing, by the computing system, teeth of the tooth library.
2. The method of claim 1, wherein the patient data comprises dental data.
3. The method of claim 1, wherein the patient data comprises morphometric data.
4. The method of claim 1, wherein determining at least one arc comprises:
providing the patient data to an AI model trained to identify anatomical points of teeth of the tooth library.
5. The method of claim 1, wherein determining a double helix comprises providing the at least one arc to an AI model configured to determine the double helix based at least in part on the at least one arc.
6. The method of claim 1, wherein optimizing the teeth of the library of teeth comprises determining a position, a rotation, or both, of each tooth of the library of teeth using an AI model configured to optimize functional and aesthetic positioning of the teeth.
7. The method of claim 1, further comprising performing, by the computing system, a dynamic assessment of a position of a tooth of the tooth library.
8. The method of claim 1, wherein determining at least one arc comprises determining an aesthetic arc, and wherein determining the aesthetic arc comprises:
projecting, by the computing system, one or more control points onto an image of the patient;
defining, by the computing system, an initial curve based at least in part on the one or more control points;
determining, by the computing system, a final curve by modifying at least one of the one or more control points; and
determining, by the computing system, a location of one or more anatomical points based at least in part on the final curve, the location of the one or more anatomical points at least in part defining the aesthetic arc.
9. The method of claim 1, wherein determining the at least one arc comprises determining an aesthetic arc, a centering arc, and a fitting arc.
10. The method of claim 9, wherein determining the at least one arc further comprises determining a guide arc associated with a mandibular tooth.
11. The method of claim 1, wherein optimizing teeth of the library of teeth comprises adjusting relative positioning of one or more teeth in the library of teeth.
12. The method of claim 11, wherein adjusting the relative positioning comprises adjusting an overbite value and an overjet value.
13. The method of claim 1, wherein optimizing teeth of the library of teeth comprises adjusting any combination of one or more of a size, shape, or rotation of at least one tooth of the library of teeth.
14. The method of claim 1, wherein the library of teeth comprises a library of teeth of the patient, and wherein the method further comprises:
identifying, by the computing system, one or more teeth of the library of teeth; and
labeling, by the computing system, one or more anatomical points of each of the one or more teeth of the library of teeth.
15. The method of claim 1, wherein the library of teeth comprises a library of artificial teeth, and wherein the method further comprises:
selecting, by the computing system, the library of artificial teeth from a plurality of artificial tooth libraries based at least in part on the captured patient data.
16. The method of claim 7, wherein optimizing the teeth of the tooth library comprises determining points of contact between the patient's maxillary teeth and the patient's mandibular teeth, and wherein performing the dynamic assessment comprises determining a contact relationship between the patient's maxillary teeth and the patient's mandibular teeth during movement of the patient's jaw.
17. A system for dental treatment planning, comprising:
a computer-readable storage medium having program instructions embodied therewith; and
one or more processors configured to execute the program instructions to cause the system to:
receive patient data associated with a patient;
determine at least one arc, the arc corresponding to an anatomical point of a tooth of a tooth library;
determine a double helix based on the at least one arc, the double helix to be used to fit the tooth library;
determine positions of teeth of the tooth library on the double helix; and
optimize teeth of the tooth library.
18. The system of claim 17, wherein the patient data comprises dental data.
19. The system of claim 17, wherein the patient data comprises morphometric data.
20. The system of claim 17, wherein determining at least one arc comprises:
providing the patient data to an AI model trained to identify anatomical points of teeth of the tooth library.
21. The system of claim 17, wherein determining a double helix comprises providing the at least one arc to an AI model configured to determine the double helix based at least in part on the at least one arc.
22. The system of claim 17, wherein optimizing the teeth of the library of teeth comprises determining a position, a rotation, or both, of each tooth of the library of teeth using an AI model configured to optimize functional and aesthetic positioning of the teeth.
23. The system of claim 17, wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to: perform a dynamic assessment of the position of the teeth of the library of teeth.
24. The system of claim 17, wherein determining at least one arc comprises determining an aesthetic arc, and wherein determining the aesthetic arc comprises:
projecting one or more control points onto an image of the patient;
defining an initial curve based at least in part on the one or more control points;
defining a final curve by modifying at least one of the one or more control points; and
determining a location of one or more anatomical points based at least in part on the final curve, the location of the one or more anatomical points at least in part defining the aesthetic arc.
25. The system of claim 17, wherein determining the at least one arc comprises determining an aesthetic arc, a centering arc, and a fitting arc.
26. The system of claim 25, wherein determining the at least one arc further comprises determining a guide arc associated with a mandibular tooth.
27. The system of claim 17, wherein optimizing teeth of the library of teeth comprises adjusting relative positioning of one or more teeth in the library of teeth.
28. The system of claim 27, wherein adjusting the relative positioning comprises adjusting an overbite value and an overjet value.
29. The system of claim 17, wherein optimizing teeth of the library of teeth comprises adjusting any combination of one or more of a size, shape, or rotation of at least one tooth of the library of teeth.
30. The system of claim 17, wherein the tooth library comprises a library of teeth of the patient, and wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to:
identify one or more teeth of the library of teeth; and
label one or more anatomical points of each of the one or more teeth of the library of teeth.
31. The system of claim 17, wherein the library of teeth comprises a library of artificial teeth, and wherein the computer-readable storage medium has instructions embodied therewith that, when executed by the one or more processors, cause the system to:
select the library of artificial teeth from a plurality of artificial tooth libraries based at least in part on the patient data.
32. The system of claim 23, wherein optimizing the teeth of the library of teeth comprises determining points of contact between the patient's maxillary teeth and the patient's mandibular teeth, and wherein performing the dynamic assessment comprises determining a contact relationship between the patient's maxillary teeth and the patient's mandibular teeth during movement of the patient's jaw.
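As a rough illustration of how the double-helix placement of independent claims 1 and 17 might be realized in code — parameterizing two offset helical strands, distributing library teeth along one strand by mesiodistal width, and then adjusting overbite and overjet values as in claims 12 and 28 — consider the sketch below. The claims do not disclose a specific parameterization; the helix parameters, tooth widths, offset directions, and all names here are invented for illustration only.

```python
import numpy as np

def double_helix(radius, pitch, phase_offset, t):
    """Two offset helical strands (e.g. maxillary and mandibular guides), each as an (N, 3) array."""
    strand_a = np.stack([radius * np.cos(t), radius * np.sin(t), pitch * t], axis=1)
    strand_b = np.stack([radius * np.cos(t + phase_offset),
                         radius * np.sin(t + phase_offset),
                         pitch * t], axis=1)
    return strand_a, strand_b

def place_teeth_along(strand, mesiodistal_widths_mm):
    """Space one tooth centre per tooth along the strand, using cumulative mesiodistal widths."""
    seg = np.linalg.norm(np.diff(strand, axis=0), axis=1)
    arc_len = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.cumsum(mesiodistal_widths_mm) - np.asarray(mesiodistal_widths_mm) / 2.0
    idx = np.clip(np.searchsorted(arc_len, targets), 0, len(strand) - 1)
    return strand[idx]

def adjust_overbite_overjet(upper_centre, overjet_mm, overbite_mm):
    """Shift an upper-tooth centre forward by the overjet and down by the overbite (illustrative axes)."""
    return upper_centre + np.array([overjet_mm, 0.0, -overbite_mm])

t = np.linspace(0.0, np.pi, 200)                      # half a turn spanning one dental arch
upper_strand, lower_strand = double_helix(radius=30.0, pitch=2.0, phase_offset=0.15, t=t)

widths_mm = [8.5, 6.5, 7.5, 7.0, 6.8, 10.0]           # hypothetical widths for six teeth in one quadrant
upper_centres = place_teeth_along(upper_strand, widths_mm)
adjusted_centres = np.array([adjust_overbite_overjet(c, overjet_mm=2.0, overbite_mm=2.5)
                             for c in upper_centres])
print(adjusted_centres)
```

The same placement routine could be run for the opposing strand, after which the contact and dynamic-assessment steps of claims 7, 16, 23, and 32 would evaluate how the two placed arches meet during simulated jaw movement; the disclosure leaves the specific contact test and optimization strategy open.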
CN202280075745.7A 2021-09-16 2022-09-15 Systems, devices, and methods for tooth positioning Pending CN118235209A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US63/245,072 2021-09-16
US63/364,102 2022-05-03

Publications (1)

Publication Number Publication Date
CN118235209A 2024-06-21

Similar Documents

Publication Publication Date Title
EP3952782B1 (en) Visual presentation of gingival line generated based on 3d tooth model
US10945813B2 (en) Providing a simulated outcome of dental treatment on a patient
US11672629B2 (en) Photo realistic rendering of smile image after treatment
US11717380B2 (en) Automated 2D/3D integration and lip spline autoplacement
US20210118132A1 (en) Artificial Intelligence System For Orthodontic Measurement, Treatment Planning, And Risk Assessment
WO2018140159A1 (en) Adaptive orthodontic treatment
US20230132201A1 (en) Systems and methods for orthodontic and restorative treatment planning
US11833007B1 (en) System and a method for adjusting an orthodontic treatment plan
US20230172691A1 (en) Systems and methods for determining an orthodontic treatment
US11399917B1 (en) Systems and methods for determining an orthodontic treatment
CN118235209A (en) Systems, devices, and methods for tooth positioning
WO2023041986A1 (en) Systems, devices, and methods for tooth positioning
JP7405809B2 (en) Estimation device, estimation method, and estimation program
WO2023203385A1 (en) Systems, methods, and devices for facial and oral static and dynamic analysis
JP2024029381A (en) Data generation device, data generation method and data generation program
JP2023058939A (en) Estimation device, estimation method, and estimation program

Legal Events

Date Code Title Description
PB01 Publication