US20120070803A1 - Prosthesis manipulation in dental prosthesis design - Google Patents
- Publication number: US20120070803A1 (application US 12/884,618)
- Authority: United States
- Prior art keywords: prosthesis, teeth, tooth, virtual, computer
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61C13/0004—Computer-assisted sizing or machining of dental prostheses
- A61C11/00—Dental articulators, i.e. for simulating movement of the temporo-mandibular joints; articulation forms or mouldings
- A61C13/26—Dentures without palates; partial dentures, e.g. bridges
- A61C5/77—Methods or devices for making crowns
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for simulation or modelling of medical disorders
- G16H30/40—ICT specially adapted for the handling or processing of medical images, e.g. editing
Definitions
- The present application generally relates to dental planning, and more particularly to prosthesis manipulation in dental prosthesis design.
- The use of computer systems to design dental prostheses has increased in recent years.
- These computer systems allow a dentist, dental technician, or other operator to design dental prostheses for individual patients.
- Individual prosthesis designs are often called “situations,” “dental plans,” or “prosthetic plans.”
- Operators using the computer systems can design plans based on a library of tooth shapes and positions, patient data, and available equipment and hardware.
- Current systems may provide sets of 3D models of prosthetic teeth or crowns as part of a library. This library of teeth may be used to help design prosthetic teeth for a patient using 3D graphics or CAD software.
- Current systems do not, however, allow flexibility in designing dental prostheses. The systems limit, for example, what a dentist or other operator can do with libraries of 3D models of prosthetic teeth.
- The techniques, systems, methods, and computer-readable storage media described herein overcome some of the shortcomings of the prior art and provide for prosthesis manipulation in dental prosthesis design.
- Embodiments herein include techniques, methods, systems, devices, and computer-readable media for prosthesis manipulation in dental prosthesis design. These can include presenting, via a computer-implemented interface, a virtual multi-tooth prosthesis, said virtual multi-tooth prosthesis comprising two or more 3D models of individual teeth, said virtual multi-tooth prosthesis being presented relative to a 3D representation of a multi-tooth area of a patient's mouth that is to be reconstructed, said computer-implemented interface running on one or more computing devices.
- A command to manipulate a subset of the teeth in the virtual multi-tooth prosthesis may be received from an operator, via the computer-implemented interface.
- One or more parameters for the shape of the virtual multi-tooth prosthesis may be modified based on the manipulation command. The one or more parameters may be related to the subset of teeth. Production data related to the virtual multi-tooth prosthesis may be generated.
- Some embodiments for prosthesis manipulation in dental prosthesis design include presenting, via a computer-implemented interface, a virtual prosthesis, where the virtual prosthesis can be presented relative to a 3D representation of an area of a patient's mouth that is to be reconstructed.
- The 3D representation of the area of the patient's mouth that is to be reconstructed can have a corresponding antagonist area of teeth.
- One or more prosthesis manipulation commands may be received from an operator, via the computer-implemented interface.
- The virtual prosthesis may be modified based on the prosthesis manipulation command and based on an occlusion of the prosthesis relative to the corresponding antagonist area of teeth.
- Production data related to the virtual prosthesis may be generated.
- FIGS. 1A and 1B illustrate two interfaces for occlusion estimation in dental prosthesis design.
- FIG. 2 illustrates an example system for occlusion estimation in dental prosthesis design.
- FIGS. 3A and 3B illustrate two example methods for occlusion estimation in dental prosthesis design.
- FIG. 4 illustrates a third interface for occlusion estimation in dental prosthesis design.
- FIG. 5 illustrates a fourth interface for occlusion estimation in dental prosthesis design.
- FIG. 6 illustrates a fifth interface for occlusion estimation in dental prosthesis design.
- FIG. 7 illustrates a sixth interface for occlusion estimation in dental prosthesis design.
- FIG. 8 illustrates a seventh interface for occlusion estimation in dental prosthesis design.
- FIGS. 9A, 9B, and 9C illustrate two sets of candidate contact points for occlusion estimation in dental prosthesis design.
- FIG. 10 illustrates an eighth interface for occlusion estimation in dental prosthesis design.
- FIG. 11 illustrates a ninth interface for occlusion estimation in dental prosthesis design.
- FIG. 12 illustrates a first interface for prosthesis manipulation in dental prosthesis design.
- FIGS. 13A and 13B illustrate two example methods for prosthesis manipulation in dental prosthesis design.
- FIGS. 14A and 14B illustrate two interfaces for prosthesis manipulation in dental prosthesis design.
- FIGS. 15A and 15B illustrate two schematics for prosthesis manipulation in dental prosthesis design.
- FIG. 16 depicts an abstract representation of the relative placement of contact points and the center of gravity.
- Initial placement of occluding teeth is defined by the relative placement of the upper and lower teeth during intraoral scanning, by the scanning of the occluding teeth's physical models, by using a scanned checkbite, and/or by an operator manipulating one or both of the 3D models associated with the occluding teeth in order to obtain an initial relative placement.
- The techniques may include finding a first contact point between the two 3D models (e.g., in the gravity direction).
- The first contact point can be used as a pivot in a motion simulation, such as a six-degree-of-freedom motion simulation, a constrained rigid body motion simulation, a free-fall simulation, etc.
- The techniques may proceed by simulating the motion of one of the 3D models with respect to the other, where the pivot point is used to restrict the rotation. For example, if the first contact point had been between the cusp of one tooth and the fissure of its antagonist tooth, then the two 3D models remain together at that point, and that point can act as a pivot as one of the 3D models rotates with respect to the other around it. The simulated rotation continues until one or more contact points are detected. The contact points are detected at each step of the simulation by a collision detection engine. That is, once one of the 3D models has rotated onto the other 3D model and the corresponding contact points are determined with enough precision, that step of the simulation is terminated.
- An attempt to improve the precision of the determined contact points may include, once one or more contact points are found, refining the previous simulation step with smaller and smaller step sizes (e.g., simulating motion over smaller amounts of time) to reduce any interpenetration of the two 3D models.
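The step-size refinement described above can be viewed as a bisection on the simulated time of the last step. The following is an illustrative sketch only, not the patented implementation; `penetration_depth` is a hypothetical query returning how deeply the two 3D models interpenetrate at a given simulated time.

```python
def refine_contact_time(penetration_depth, t_free, t_hit, tol=1e-6):
    """Shrink the last simulation step: at t_free there is no
    interpenetration, at t_hit there is. Bisect until the bracket
    (and thus any remaining interpenetration) is within tol."""
    while t_hit - t_free > tol:
        t_mid = 0.5 * (t_free + t_hit)
        if penetration_depth(t_mid) > 0.0:
            t_hit = t_mid    # still interpenetrating: step back in time
        else:
            t_free = t_mid   # not yet touching: advance
    return t_free
```

In practice the penetration query would come from the collision detection engine mentioned above; here it is left abstract.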
- A motion simulation, such as a free fall, may be performed until a contact point is determined. In the case in which more than one contact point has been determined, a check may be performed to determine which contact point(s) to use from the set of discovered contact points.
- Another motion simulation step may proceed using some or all of the candidate contact points (e.g., one or two of the candidate contact points may be used).
- A subsequent motion simulation step using two candidate contact points may include using the two candidate contact points to define an axis of rotation in the simulated motion. The process of determining new candidate contact points continues until predetermined stopping criteria are met. Examples of stopping criteria are discussed more below.
- The 3D model of the teeth that is “moving” (e.g., the upper teeth) may be simulated with respect to the other 3D model (e.g., the lower teeth).
- The center of gravity can be determined in numerous ways. For example, the center of gravity may be determined by assigning a weight to each of the triangles, vertices, pixels, or voxels that form the 3D model and determining a center of gravity based on those assigned weights.
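As one possible reading of the triangle-weighting variant above, each triangle can be weighted by its area; uniform surface density is an assumption here, since the paragraph leaves the weighting scheme open.

```python
import math

def center_of_gravity(vertices, triangles):
    """Area-weighted average of triangle centroids over a triangular
    mesh, assuming a uniform surface density (an approximation)."""
    total_area = 0.0
    cog = [0.0, 0.0, 0.0]
    for i, j, k in triangles:
        a, b, c = vertices[i], vertices[j], vertices[k]
        # edge vectors; the cross product's length is twice the area
        u = [b[n] - a[n] for n in range(3)]
        v = [c[n] - a[n] for n in range(3)]
        cross = [u[1]*v[2] - u[2]*v[1],
                 u[2]*v[0] - u[0]*v[2],
                 u[0]*v[1] - u[1]*v[0]]
        area = 0.5 * math.sqrt(sum(x * x for x in cross))
        centroid = [(a[n] + b[n] + c[n]) / 3.0 for n in range(3)]
        total_area += area
        for n in range(3):
            cog[n] += area * centroid[n]
    return [x / total_area for x in cog]
```

A volumetric (voxel-based) weighting would follow the same accumulation pattern with voxel centers in place of triangle centroids.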
- The predetermined stopping criteria may be met once there are three contact points that define a triangle that encompasses the center of gravity.
- The simulation may then be terminated and the occlusion may be estimated based on those three contact points; those three contact points may define the placement of one of the 3D models with respect to the other. For example, if the top set of teeth is treated as the moving set of teeth and a first contact point is determined between the top set of teeth and the lower set of teeth, then the first contact point may be used as a pivot point. The simulation may continue until subsequent contact points are determined that define a triangle that includes or encompasses the center of gravity of the top set of teeth.
- FIG. 1A illustrates an interface 100 that includes an overlaid representation portion 110 as well as a global selection portion 111.
- The overlaid representation portion 110 may display a lower teeth model 120 and an upper teeth model 130.
- The lower teeth model 120 may be represented as an opaque 3D model, and the upper teeth model 130 may be represented as a transparent or semitransparent 3D model.
- The global selection portion 111 may have buttons that allow either or both of the lower teeth model 120 and the upper teeth model 130 to be displayed transparently, or that provide other global manipulation functions.
- The interface 100 may also include a distance map portion 112.
- The distance map portion 112 may show the distance between the lower teeth model 120 and the upper teeth model 130 as a contour graph, a color-coded graph, a shaded graph, and/or the like. This distance map may also be displayed, in some embodiments, as a texture map on the models 120 and/or 130 shown in the overlaid representation portion 110.
- In FIG. 1A, the interface 100 shows the lower teeth model 120 and the upper teeth model 130 before the occlusion estimation has occurred.
- FIG. 1B shows the model 120 and the model 130 after the occlusion estimation has occurred. As depicted in FIG. 1B, a first contact point may be determined between the two models.
- That contact point can be used as a pivot to determine a subsequent set of one or more candidate contact points.
- The determination of candidate contact points may continue until three candidate contact points 141, 140, and 142 are determined that define a triangle 199 that encompasses or includes the center of gravity 150.
- FIG. 2 illustrates an example system 200 for occlusion estimation and/or prosthesis manipulation in dental prosthesis design.
- The system 200 may include one or more computers 210 coupled to one or more displays 220 and one or more input devices 230.
- An operator 240, who may be a dentist, dental technician, or other person, may plan dental prostheses using system 200 by manipulating the one or more input devices 230, such as a keyboard and/or a mouse.
- The operator 240 may view the dental plan and other related dental plan data on the display 220.
- The display 220 may include two or more display regions or portions, each of which displays a different view of the dental plan.
- The display 220 may show a semi-realistic 3D rendering of the dental plan, a localized abstraction of the dental plan, and/or a cross-sectional representation of the dental plan.
- Each of these displays or portions may be linked internally within a program and/or using data on computer 210 .
- A program running on a computer 210 may have a single internal representation of the dental plan in memory, and the internal representation may be displayed in two or more abstract or semi-realistic manners on display 220.
- The operator 240 may be able to perform a command, such as select, move, manipulate, or make transparent, opaque, or invisible, on a particular substructure in the dental plan.
- The operator 240 may be able to perform this command by manipulating the input device 230, such as by clicking with a mouse on a particular region of one of the abstract or semi-realistic versions of the dental plan displayed on the display 220.
- The computer 210 may include one or more processors, one or more memories, and one or more communication mechanisms. In some embodiments, more than one computer may be used to execute the modules, methods, blocks, and processes discussed herein. Additionally, the modules and processes herein may each run on one or multiple processors, on one or more computers; or the modules herein may run on dedicated hardware.
- The input devices 230 may include one or more keyboards (one-handed or two-handed), mice, touch screens, voice commands and associated hardware, gesture recognition, or any other means of providing communication between the operator 240 and the computer 210.
- The display 220 may be a two-dimensional (“2D”) or 3D display and may be based on any technology, such as LCD, CRT, plasma, projection, etc.
- The communication among the various components of system 200 may be accomplished via any appropriate coupling, including USB, VGA cables, coaxial cables, FireWire, serial cables, parallel cables, SCSI cables, IDE cables, SATA cables, wireless connections based on 802.11 or Bluetooth, or any other wired or wireless connection(s).
- One or more of the components in system 200 may also be combined into a single unit or module. In some embodiments, all of the electronic components of system 200 are included in a single physical unit or module.
- FIGS. 3A and 3B illustrate two techniques for occlusion estimation in dental prosthesis design.
- The technique may include motion simulation of one 3D model of teeth with respect to another 3D model of teeth.
- The motion simulation may include, in various embodiments, a six-degrees-of-freedom rigid body motion simulation, a free-fall simulation, a rigid body simulation with one or a combination of constraints, or another motion simulation.
- The techniques can proceed by having either the upper or the lower set of teeth be the 3D model that “moves” and the other be stationary. Alternatively, both models could move with respect to each other.
- The first step in occlusion estimation may include determining, as a first contact point, the point on the lower 3D model which is closest, in the direction of gravity, to the upper 3D model. Once that initial contact point is determined, other candidate contact points between the upper 3D teeth model and the lower 3D teeth model may be determined using simulated motion until one or more predetermined stopping criteria are met. In various embodiments, the initial contact point may be determined by finding the closest point between the first and second 3D models in the gravity direction. In subsequent steps of the motion simulation, candidate contact points (possibly including the initial contact point) may be found. After assessing the appropriateness of the candidate contact points, each candidate contact point (possibly including the initial contact point) may or may not be chosen for use in subsequent steps of the motion simulation.
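The "closest in the direction of gravity" step can be sketched as follows. This simplification represents each model as height samples over a shared (x, y) grid rather than as a full mesh, which is an assumption made here purely for illustration; the data layout is hypothetical.

```python
def first_contact_in_gravity(lower_heights, upper_heights):
    """Given height samples of the lower and upper models over a
    shared (x, y) grid, find the cell with the smallest vertical gap.
    Dropping the upper model by that gap brings the two surfaces into
    first contact at that cell, in the gravity (z) direction."""
    best_cell, best_gap = None, float('inf')
    for cell, z_upper in upper_heights.items():
        if cell in lower_heights:
            gap = z_upper - lower_heights[cell]
            if gap < best_gap:
                best_cell, best_gap = cell, gap
    return best_cell, best_gap
```

With full triangular meshes the same idea holds, but the per-cell gap becomes a ray cast or numerical closest-point query along the gravity direction.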
- In some embodiments, a particular contact point may be used instead of the initial contact point.
- The first contact point may not be used in subsequent steps of the simulation and, similarly, may not end up in the final set of contact points that are used to define the occlusion between the first and second 3D models.
- Determining whether two contact points are on opposite sides of the center of gravity may include defining a bisector or bisection plane through the center of gravity that splits the first 3D model into two segments, for example, a left segment and a right segment, and, optionally, splits the second 3D model into two segments, for example, a left segment and a right segment.
- If the first 3D model of teeth includes all of the teeth in the lower jaw of a patient and the center of gravity is along the center line of the jaw, then the teeth on the left side of the mouth and the teeth on the right side of the mouth may be in different sections.
- Determining whether there are two contact points on opposite sides of the center of gravity may include determining whether there are contact points in the two different sections of the mouth (the left section and the right section). As another example, consider FIG. 16. If two contact points define a line segment 1610, that is, from one contact point 1640 to another contact point 1641, and the line segment 1610 is part of a line L 1620, then determining whether a center of gravity is between the two contact points may include determining whether the closest point on line L 1620 to the center of gravity is between the two contact points 1640 and 1641, that is, on the line segment 1610 defined by the two contact points 1640 and 1641. For an example center of gravity 1650, the closest point on the line segment 1610 is point 1650A.
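The segment test described above reduces to projecting the center of gravity onto the line through the two contact points and checking the projection parameter. A minimal sketch (illustrative only, with points as 3-tuples):

```python
def cog_between_contacts(p1, p2, cog):
    """Project the center of gravity onto the line through contact
    points p1 and p2; return True when the closest point lies on the
    segment between them, i.e. the CoG is 'between' the contacts."""
    d = [p2[i] - p1[i] for i in range(3)]   # segment direction
    w = [cog[i] - p1[i] for i in range(3)]
    t = sum(w[i] * d[i] for i in range(3)) / sum(x * x for x in d)
    return 0.0 <= t <= 1.0                   # parameter along the segment
```

Here t = 0 corresponds to contact point p1 and t = 1 to p2, so values inside [0, 1] mean the closest point falls on the segment 1610 of FIG. 16.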
- Checking whether the center of gravity is between two contact points can include projecting the contact points onto the rotation plane (e.g., a plane whose normal is the rotation axis and that includes the center of gravity). Various embodiments can then determine whether the projected points are on opposite sides of a certain line, defined by the projection of the gravity force vector onto the rotation plane and going through the center of gravity. If the two projected points are on opposite sides of that line, then they are on opposite sides of the center of gravity. There are numerous other ways to determine whether the center of gravity is between the two contact points, and these are considered within the scope of the embodiments herein.
- Determining whether the center of gravity is within a triangle that is defined by three contact points may include projecting the triangle that is defined by the three contact points onto the occlusal plane and projecting the center of gravity onto the occlusal plane. If the center of gravity projected onto the occlusal plane lies within the triangle defined by the three contact points, then it may be considered that the center of gravity is within the triangle defined by the three contact points. As above, numerous other methods of determining whether the center of gravity is within the triangle defined by the three contact points may be used and are considered part of the embodiments herein.
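The triangle criterion above can be sketched as a 2D sign test after projection. For simplicity this sketch assumes the occlusal plane is the xy-plane, so "projection" just drops the z coordinate; in general the points would first be transformed into occlusal-plane coordinates.

```python
def cog_inside_contact_triangle(p0, p1, p2, cog):
    """Project the three contact points and the center of gravity onto
    the occlusal plane (taken here as the xy-plane) and test whether
    the projected CoG lies inside the projected triangle."""
    def side(a, b, p):
        # sign of the 2D cross product: which side of edge a->b is p on
        return (b[0]-a[0]) * (p[1]-a[1]) - (b[1]-a[1]) * (p[0]-a[0])
    s0, s1, s2 = side(p0, p1, cog), side(p1, p2, cog), side(p2, p0, cog)
    has_neg = (s0 < 0) or (s1 < 0) or (s2 < 0)
    has_pos = (s0 > 0) or (s1 > 0) or (s2 > 0)
    return not (has_neg and has_pos)  # all same sign: inside or on an edge
```

When this returns True for the three current contact points and the moving model's center of gravity, the stopping criterion of the simulation is met.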
- The techniques described herein may include changing the state (e.g., position, rotation, scaling, shape, etc.) of one or more 3D models based on the contact with the antagonist teeth. For example, if a crown or bridge is being designed and there are multiple units (e.g., teeth) in the crown or bridge, then each unit in the bridge or crown may be rotated, scaled, or otherwise changed in order to provide at least one contact point with the antagonist. After the relative placement of the occluding sets of teeth is determined based on the contact points, or after new states for one or more 3D teeth models are determined based on the contact points, designing the prosthesis may continue and/or production data for the prosthesis may be generated based on the 3D model of the prosthesis.
- A first contact point is determined in the direction of gravity based on the initial positions of the 3D models of occluding teeth.
- The initial position may be defined based on the known relative positions of the first 3D model and the second 3D model.
- The initial position may be known because a scanning procedure to obtain the first 3D model of the teeth (e.g., the lower set of teeth) and the second 3D model of the teeth (e.g., the upper set of teeth) may have been performed, and the initial placement may be defined by the relative placement of the first 3D model and the second 3D model during the scanning procedure.
- The initial relative placement of the first 3D model with respect to the second 3D model of the teeth may also be known based on a scanned check bite. That is, if the second 3D model of the teeth is determined at least in part based on the scanned check bite, then the first 3D model of the teeth may be surface matched to the check bite, and that check bite may provide the initial placement of the two sets of teeth. Additionally, as noted above, an operator may manipulate the relative placement of the first 3D model and the second 3D model before performing the occlusion estimation.
- Determining the first contact point between occluding teeth in the direction of gravity based on the initial position may also include an initial determination of the gravity direction.
- The gravity direction may be determined in any of numerous ways, including having it be predefined based on the scanning procedure and the like. Additionally, the gravity direction may be perpendicular to an occlusal plane of the first and/or the second 3D model of teeth. The occlusal plane of the two 3D models may be known ahead of time, or it may be determined in any number of ways.
- A planar object, such as a rectangle, may be “dropped” onto the first 3D model; once it has come to rest on the first 3D model, that rectangular object may define the occlusal plane.
- “Dropping” the rectangular object onto the 3D model may be accomplished using, e.g., the simulated motion described with respect to FIG. 3A or FIG. 3B , or any other appropriate technique.
- The normal of the planar rectangular object may be used to define the direction of gravity.
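Once the rectangle has settled, its normal is obtainable directly from its corners. A minimal sketch (the corner naming is illustrative; the settling itself would come from the motion simulation described above):

```python
import math

def gravity_from_plane(corner0, corner1, corner2):
    """Unit normal of the settled planar rectangle, computed as the
    normalized cross product of two of its edges; this normal can
    serve as the gravity direction (up to sign)."""
    u = [corner1[i] - corner0[i] for i in range(3)]
    v = [corner2[i] - corner0[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]
```

The sign of the normal (toward or away from the teeth) would be chosen so that "down" points from the upper model toward the lower one.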
- In some embodiments, the distance between the first and second 3D models is determined in a direction other than the direction of gravity.
- For example, the overall closest point between two triangular meshes representing the first and second 3D models may be determined, or the closest point between the two 3D models in a direction other than gravity may be determined and used as the closest point between the 3D models.
- Determining the first contact point between the occluding sets of teeth in the direction of gravity in block 310 may be performed based on any appropriate calculation. For example, it may be determined by performing a numerical calculation of the closest point between the two 3D models in the direction of gravity. In some embodiments, the closest point between the two 3D models may be determined by simulating free fall of one of the 3D models with respect to the other 3D model. For example, based on the initial position, one of the 3D models may be “dropped” onto the other 3D model. The first contact point between the two 3D models when dropped may be the closest point between the two 3D models.
- One 3D model may then, optionally, be moved toward the other in the direction of gravity so that the closest point between the two 3D models becomes the contact point between the two 3D models.
- The two 3D models may also, optionally, be moved toward each other in the gravity direction instead of moving one 3D model and keeping the other fixed.
- Motion simulation may be used to determine subsequent candidate contact points. After the first contact point is determined, it is used as a constraint on the simulated motion. That is, that contact point will remain in contact for the duration of that step of the simulated motion. The simulated motion will result in the moving 3D model pivoting around that contact point until one or more other contact points are determined. In some embodiments, it is possible that one or more contact points may be lost, perhaps due to numerical error. If one or more contact points are lost due to numerical error, the simulation may continue. For example, the moving 3D model could fall in the gravity direction until at least one contact point is found.
- Determining a contact point may include using any appropriate type of collision detection. For example, if the first and second 3D models are each represented as a triangular mesh, then contact points may be determined by looking for collisions between the two triangular meshes. Further, in some embodiments, if it is determined that two particular triangles, one in each 3D model's triangular mesh, intersect, then the actual point or edge of intersection may be used (e.g., if it is known); if there is merely an indication that the two triangles intersect, then the contact points may be estimated as the centers of the two triangles. Numerous other collision detection techniques may be used to determine contact points and are encompassed by embodiments herein.
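The fallback estimate mentioned above, when the collision engine only flags that two triangles intersect, is simply the pair of triangle centers. A sketch of that estimate (triangles as 3-tuples of 3D points):

```python
def estimated_contact_points(tri_a, tri_b):
    """When only an intersection flag is available for two triangles,
    estimate the contact points as the centers (centroids) of the two
    intersecting triangles, one per mesh."""
    def center(tri):
        return [sum(p[i] for p in tri) / 3.0 for i in range(3)]
    return center(tri_a), center(tri_b)
```

For small triangles relative to the tooth surfaces, the centroid error is bounded by the triangle size, which is why refining the simulation step (as described earlier) also tightens this estimate.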
- Checking the stopping criteria may include determining whether two contact points in the set of candidate contact points are on opposite sides of the center of gravity. Another check of stopping criteria may include determining whether there are three contact points which define a triangle that includes the center of gravity of the moving 3D model.
- The previous set of candidate contact points may have already included contact points 940 and 941 in FIG. 9A, and those two contact points 940 and 941 and an additional contact point 942 may have been determined. Since contact points 940, 941, and 942 do not form a triangle that encompasses the center of gravity 950, a determination may be made as to which of the candidate contact points 940, 941, and/or 942 to use in subsequent motion simulations.
- The three contact points define three axes of rotation 961, 960, and 962.
- These axes of rotation may be used in a motion simulation to determine whether the other contact points should be included in the subsequent step of the motion simulation.
- The contact points 940, 941 may have an axis of rotation 960 associated with them.
- As shown in FIG. 9C, if axis of rotation 960 is used during a motion simulation, the normal force 998 applied to the moving object at the other candidate contact point 942 will be against the force 999 associated with the simulated rotation about axis 960.
- The point 942, being on the same side of the center of gravity 950, would normally rotate during motion simulation. Yet the contact point 942 is already in contact with the other 3D model, and therefore further rotation (or dropping) will not be possible.
- Axis of rotation 960 will therefore be excluded from consideration as the proper axis of rotation, and the set including both candidate contact points 940, 941 will not be used in the subsequent step of the simulation. If, on the other hand, the simulation were performed with axis of rotation 961, which bridges candidate contact points 941, 942, then as the moving 3D model is in motion, the normal force on candidate contact point 940 will create a moment in the direction of rotation. Therefore, the axis of rotation 961 between candidate contact points 941, 942 is a proper axis of rotation, and candidate contact points 941, 942 will be used in the subsequent step of the motion simulation.
- Candidate axes of rotation 960-965 will all be evaluated, and an axis will be eliminated when a normal force on one of the candidate contact points 940-943 is against the axis of rotation. Only the candidate axis of rotation 961 will have no candidate contact points whose normal forces create moments in the opposite direction of rotation. Therefore, the candidate contact points 941 and 943 will be used in the subsequent simulation step.
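One way to formalize the axis-elimination test above is a moment-sign check: the gravity torque about a candidate axis gives the driving rotation, and any other contact whose (upward) normal force produces a moment of opposite sign blocks that rotation. This is a sketch under stated assumptions (normal forces approximated as directly opposing gravity), not the patented method.

```python
def valid_rotation_axis(p_i, p_j, other_contacts, cog,
                        gravity=(0.0, 0.0, -1.0)):
    """Return True when the axis through contact points p_i and p_j is
    a proper rotation axis: no other candidate contact resists the
    gravity-driven rotation with an opposing moment."""
    def sub(a, b): return [a[k] - b[k] for k in range(3)]
    def dot(a, b): return sum(a[k] * b[k] for k in range(3))
    def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]]
    axis = sub(p_j, p_i)
    normal = [-g for g in gravity]  # contact normal approximated as anti-gravity
    # moment of gravity (acting at the CoG) about the candidate axis
    drive = dot(axis, cross(sub(cog, p_i), gravity))
    for p in other_contacts:
        # moment of the normal force at the other contact about the axis
        resist = dot(axis, cross(sub(p, p_i), normal))
        if drive * resist < 0:  # opposing moment: this contact blocks rotation
            return False
    return True
```

In the FIG. 9C situation, a contact on the same side of the center of gravity as the driving torque fails this check, which matches the exclusion of axis 960 described above.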
- The relative placement of the occluding sets of teeth may be determined based on the contact points.
- The relative placements of occluding teeth may be known based on the contact points, and no further calculation may be needed to determine the relative placements.
- Determining the relative placements of the occluding sets of teeth may include recording a matrix, quaternion, or other transformation of the 3D models of occluding teeth after the contact points have been determined.
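The matrix option above can be sketched as a standard homogeneous 4x4 transform recording the final rotation and translation of the moving model (a quaternion plus offset would carry the same information):

```python
def placement_matrix(rotation3x3, translation):
    """Record the final relative placement as a homogeneous 4x4
    transform: upper-left 3x3 is the rotation, last column the
    translation of the moving 3D model."""
    m = [[0.0] * 4 for _ in range(4)]
    for r in range(3):
        for c in range(3):
            m[r][c] = rotation3x3[r][c]
        m[r][3] = translation[r]
    m[3][3] = 1.0
    return m

def apply_placement(m, point):
    """Apply the recorded transform to a model vertex (3-tuple)."""
    return [sum(m[r][c] * (point[c] if c < 3 else 1.0) for c in range(4))
            for r in range(3)]
```

Storing the transform, rather than rewriting the mesh vertices, keeps the original scan data intact while still defining the occluded placement.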
- The contact points may define the relative placement of the two 3D models with respect to one another.
- The two 3D models may be translated or rotated, or a transformation between the two 3D models may be stored.
- The design of the prosthesis may then be continued, or data may be generated for production of the prosthesis.
- Designing dental prostheses may be performed using any appropriate system, methods, or techniques, such as those described in U.S. patent application Ser. No. 12/703,601, filed Feb. 10, 2010, entitled Dental Prosthetics Manipulation, Selection, and Planning, which is hereby incorporated by reference in its entirety for all purposes.
- FIG. 3B illustrates another method 301 of occlusion estimation for dental prosthesis design.
- One or more 3D models of prosthetic teeth and the 3D model of their antagonist may be received in block 311.
- An antagonist 730 may be received, in addition to 3D models of individual prosthetic teeth 770, 771, and 772. Together these may be used to design a crown or a bridge that is defined by the 3D models of the prosthetic teeth 770, 771, and 772.
- An initial position of the one or more 3D models of prosthetic teeth and the 3D model of their antagonist may be received or determined. For example, an operator may initially place the teeth with respect to the antagonist, or the relative placement of the teeth and the antagonist may be algorithmically determined or known based on the scanning procedure used to obtain the 3D models (described elsewhere herein).
- Contact points between each of the one or more 3D models of prosthetic teeth and the antagonist may be determined in block 321. Determining the contact points between the one or more 3D models and the antagonist may include rotating about an axis, simulating motion, manipulating the size, translation, rotation, or orientation of the 3D models until a contact point is determined, and the like.
- block 321 may, in some embodiments, include determining contact points 740 , 741 , and 742 by rotating the 3D models 770 , 771 , and 772 and/or simulating motion of the 3D models 770 , 771 , and 772 .
- 3D models 770 , 771 , and 772 may have a shared axis 755 .
- these first contact points, along with shared axis 755 may define the axes about which to rotate each of 3D models 770 , 771 , and 772 (e.g., the axis about which to rotate a 3D model may be defined as an axis through the contact point, parallel to shared axis 755 ).
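The rotation described above, about an axis through a contact point and parallel to a shared axis, can be sketched with Rodrigues' rotation formula. This is an illustrative Python sketch, not the patent's implementation; the function name and argument conventions are assumptions.

```python
import math

def rotate_about_axis(point, axis_point, axis_dir, angle_rad):
    """Rotate `point` about the line through `axis_point` with direction
    `axis_dir` by `angle_rad`, using Rodrigues' rotation formula."""
    # Normalize the axis direction.
    n = math.sqrt(sum(c * c for c in axis_dir))
    k = tuple(c / n for c in axis_dir)
    # Work relative to a point on the axis, so the axis need not pass
    # through the origin (it passes through the contact point instead).
    v = tuple(p - a for p, a in zip(point, axis_point))
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    kxv = (k[1] * v[2] - k[2] * v[1],
           k[2] * v[0] - k[0] * v[2],
           k[0] * v[1] - k[1] * v[0])
    kdotv = sum(ki * vi for ki, vi in zip(k, v))
    rotated = tuple(v[i] * c + kxv[i] * s + k[i] * kdotv * (1.0 - c)
                    for i in range(3))
    return tuple(r + a for r, a in zip(rotated, axis_point))

# A point 1 mm from a contact point at (5, 0, 0), rotated 90 degrees
# about an axis through the contact point, parallel to the y-axis:
print(rotate_about_axis((6.0, 0.0, 0.0), (5.0, 0.0, 0.0),
                        (0.0, 1.0, 0.0), math.pi / 2))  # (5.0, 0.0, -1.0)
```

Applying this to every vertex of a tooth model rotates the whole model about the contact-point axis, which is the motion constrained by the shared axis in the text.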
- Simulated motion can continue as part of block 321 until two contact points are determined that are on opposite sides of the center of gravity, which is assessed in block 330 (e.g., similar to the method 300 ). After the one or more contact points have been determined in block 321 , then, in block 330 , stopping criteria may be checked.
- the stopping criteria may include the determination of a single contact point or the determination of multiple contact points as described above with respect to method 300 .
- multiple contact points may be determined for each of 3D models 770 , 771 , and 772 .
- two or more contact points may be determined for each 3D model 770 , 771 , and 772 that represents a posterior tooth, while only a single or first contact point is determined for each 3D model that represents an anterior tooth.
- each 3D model 770 , 771 , and 772 that represents anterior teeth may be translated in the gravity direction in order to find the closest contact point (in block 321 ) and this may meet the stopping criteria (block 330 ) for that 3D model.
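As a simplified sketch of translating a tooth in the gravity direction until it first touches the antagonist, the two surfaces below are reduced to height samples over shared (x, y) columns. This column-sampling representation, and the function name, are assumptions for illustration only; a real system would work on full meshes.

```python
def drop_to_first_contact(tooth_lower, antagonist_upper):
    """Translate a tooth straight down (the gravity direction) until it
    first touches the antagonist.

    Both surfaces are dicts mapping an (x, y) sample column to a z
    height: the underside of the tooth and the top of the antagonist.
    Returns the signed vertical translation (negative = downward).
    """
    # Vertical clearance in each shared column; the smallest clearance
    # is where the first contact point will occur.
    clearances = [tooth_lower[xy] - antagonist_upper[xy]
                  for xy in tooth_lower if xy in antagonist_upper]
    return -min(clearances)

# A tooth underside hovering above an antagonist cusp at column (1, 0):
tooth = {(0, 0): 5.0, (1, 0): 4.0, (2, 0): 5.0}
antagonist = {(0, 0): 1.0, (1, 0): 3.5, (2, 0): 1.0}
print(drop_to_first_contact(tooth, antagonist))  # -0.5
```

The column with the smallest clearance (here, the cusp at (1, 0)) identifies the first contact point, which may then satisfy the stopping criterion for an anterior tooth.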
- the one or more 3D models of prosthetic teeth may be expanded or contracted until there is a single contact point (or multiple contact points). This expansion or contraction can continue until the stopping criteria are met (e.g., determining the requisite number of contact points). The expansion or contraction may also be followed by motion simulation.
- each of the one or more individual prosthetic teeth will have a separate simulated motion (e.g., without the constraints of the axis 755 shown in FIG. 7 ).
- the separate simulated motion of each of the 3D models (e.g., 3D models 770 , 771 , and 772 ) may be performed in a manner similar to that described with respect to method 300 .
- the new state for the one or more 3D models based on the contact point may be determined.
- the new state may be the new position, rotation, orientation, size, and/or shape of the one or more 3D models of the prosthetic teeth.
- the operator may continue designing the prosthesis or may generate production data for the prosthesis (block 350 ).
- a method may commence by performing a motion simulation, thereby skipping block 310 and proceeding directly to block 320 .
- the first contact point will be determined by free fall of one 3D model onto the other and then subsequent contact points may be determined in block 320 until the predetermined stopping criteria are met in block 330 .
- the relative placement of the occluding sets of teeth based on the contact points may be determined in block 340 .
- the stopping criteria may include there being no further movement in the motion simulation.
- the simulation of motion may continue until the two 3D models are in a static position, one with respect to the other.
- Various other embodiments are also considered within the scope herein.
- FIG. 4 illustrates an interface 400 with an overlaid representation portion 410 that depicts a lower teeth model 420 and an upper teeth model 430 .
- FIG. 5 illustrates an interface 500 with an overlaid representation portion 510 . Having three contact points on an approximately linear set of upper and lower teeth models (e.g., 520 , 530 ) may cause the 3D models to “fall” in a way that is undesirable or anatomically impossible.
- FIG. 6 illustrates an interface 600 with an overlaid representation portion 610 in which an upper teeth model 630 dropped onto a lower teeth model 620 using motion simulation.
- the stopping criterion (or criteria) used to determine the relative placement of the lower teeth model 620 and the upper teeth model 630 may include determining two candidate contact points 640 and 641 that are on opposite sides of the center of gravity 650 .
- this two-point stopping criterion may produce better results than a three-point stopping criterion.
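A minimal sketch of the two-point stopping criterion, assuming gravity along -z and checking only the x offsets of the contact points from the center of gravity (a one-dimensional simplification; the function name is hypothetical):

```python
def stopping_criterion_met(contact_points, center_of_gravity):
    """Return True when at least two contact points straddle the center
    of gravity, i.e. lie on opposite sides of it along the horizontal
    x direction, so the model can rest without tipping further.

    Points are (x, y, z) tuples; gravity is assumed along -z, so only
    the horizontal offset from the center of gravity matters here.
    """
    offsets = [p[0] - center_of_gravity[0] for p in contact_points]
    return any(a < 0 < b or b < 0 < a
               for i, a in enumerate(offsets)
               for b in offsets[i + 1:])

# Two contacts on opposite sides of a center of gravity at x = 0:
print(stopping_criterion_met([(-1.0, 0.0, 0.0), (2.0, 0.0, 0.0)],
                             (0.0, 0.0, 1.0)))  # True
```

With a single contact point, or with all contacts on one side of the center of gravity, the function returns False and the motion simulation would continue.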
- FIGS. 7 and 8 depict multiple 3D models of individual prosthetic teeth being moved with respect to an antagonist.
- FIG. 7 is described above.
- FIG. 8 illustrates an interface 800 that includes an overlaid representation portion 810 .
- the overlaid representation portion 810 illustrates the movement of 3D models of individual prosthetic teeth 870 , 871 , and 872 with respect to an antagonist 830 .
- the interface also shows a lower teeth model 820 .
- Interface 800 also illustrates a shared axis of rotation 855 for the 3D models of the individual prosthetic teeth 870 , 871 , and 872 .
- performing motion simulation of the individual 3D models of the prosthetic teeth 870 , 871 , and 872 may include allowing those prosthetic teeth to rotate about axes parallel to axis 855 (as described above with respect to shared axis 755 ) in the direction corresponding to gravity until contact points 840 , 841 , and 842 are determined.
- two contact points will be determined for each tooth, e.g., one on each side of the gravity center for the tooth (described above). In other embodiments, not depicted in FIG. 8 , each of the individual prosthetic teeth may have simulated motion performed, may be scaled, translated, rotated, or otherwise modified until contact points are determined, or any other appropriate technique may be used. Further, in some embodiments, collision detection or other techniques may be used to ensure that neighboring teeth do not overlap or otherwise have intersecting volumes. Examples of this are described elsewhere herein.
- an interface 1000 can have an overlaid representation portion 1010 , a global selection portion 1011 , and a distance map portion 1012 , all on a single interface 1000 . It is also possible, as depicted in FIG. 11 , that two separate sub-interfaces 1100 and 1101 may be used.
- the distance map portion 1120 may be on interface portion 1101 and the overlaid representation portion 1110 and global selection portion 1111 may be on interface portion 1100 .
- These various interface portions may be shown on separate screens, on separate displays or in separate windows. Other configurations of the various portions on various displays or in various windows may also be used.
- FIG. 12 depicts an interface 1200 which has an overlaid representation portion 1210 .
- in the overlaid representation portion 1210 there is a 3D model of lower teeth 1220 , which is shown as opaque, as well as 3D models of prosthetic teeth 1270 , 1271 , and 1272 .
- manipulation handles 1280 , 1281 , 1282 , 1290 and 1291 are also depicted in FIG. 12 .
- These manipulation handles may provide a number of ways to manipulate the individual 3D models of the prosthetic teeth 1270 , 1271 and 1272 with respect to one another, with respect to the model of the lower teeth 1220 , and/or with respect to a virtual multi-tooth prosthesis encompassing the 3D models of prosthetic teeth 1270 , 1271 and 1272 . That is, if there were a 3D model of a virtual multi-tooth prosthesis that included the 3D prosthetic teeth 1270 , 1271 and 1272 , then the manipulation points 1280 , 1281 , 1282 , 1290 and 1291 may allow the models for 3D prosthetic teeth 1270 , 1271 and 1272 to be manipulated with respect to the virtual multi-tooth prosthesis.
- Examples of manipulations that may be available via the manipulators 1280 , 1281 , 1282 , 1290 and 1291 include scaling, translating, rotating, etc.
- if manipulator 1290 is moved (e.g., to the left), the 3D model of prosthetic tooth 1270 may decrease in size (e.g., be scaled smaller), and the 3D model of prosthetic tooth 1271 may increase in size (e.g., be scaled larger). This is depicted in FIG. 14B , in which the manipulator 1490 has been moved to the left with respect to the location of manipulator 1290 in FIG. 12 .
- the 3D model of a prosthetic tooth 1470 has decreased in size as compared with 3D model of prosthetic tooth 1270 in FIG. 12 .
- the 3D model of the prosthetic tooth 1471 has increased in size as compared with 3D model of prosthetic tooth 1271 in FIG. 12 .
- the tooth associated with that manipulator may be translated with respect to the other teeth or with respect to the virtual multi-tooth prosthesis.
- manipulator 1481 has been moved up in screen space with respect to where it was in FIG. 12 . Therefore, the 3D model of the prosthetic tooth 1471 has been translated up in screen space with respect to the other teeth for the virtual multi-tooth prosthesis.
- other manipulators and manipulations are also possible and considered part of the scope of embodiments discussed herein.
- other types of manipulations of the teeth are also possible.
- the manipulations may also include surface deformations, and the like.
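The kinds of per-tooth manipulations described above (translating one tooth relative to the rest of the prosthesis, or scaling a pair of neighbors in opposite directions, as a manipulator between two teeth might do) can be modeled roughly as follows. The class and method names are illustrative assumptions, not the patent's data structures.

```python
from dataclasses import dataclass, field

@dataclass
class Tooth:
    name: str
    position: tuple  # (x, y, z) within the prosthesis
    scale: float = 1.0

@dataclass
class MultiToothProsthesis:
    teeth: list = field(default_factory=list)

    def translate_tooth(self, name, delta):
        """Translate one tooth relative to the rest of the prosthesis."""
        for tooth in self.teeth:
            if tooth.name == name:
                tooth.position = tuple(p + d for p, d in zip(tooth.position, delta))

    def scale_pair(self, name_a, name_b, factor):
        """Scale one tooth up and its neighbor down by the same factor,
        as a manipulator sitting between two teeth might do."""
        for tooth in self.teeth:
            if tooth.name == name_a:
                tooth.scale *= factor
            elif tooth.name == name_b:
                tooth.scale /= factor

bridge = MultiToothProsthesis([Tooth("1270", (0.0, 0.0, 0.0)),
                               Tooth("1271", (8.0, 0.0, 0.0)),
                               Tooth("1272", (16.0, 0.0, 0.0))])
bridge.translate_tooth("1271", (0.0, 1.5, 0.0))  # move one tooth "up"
bridge.scale_pair("1271", "1270", 1.1)           # grow 1271, shrink 1270
```

Note that each manipulation affects only the named subset of teeth; any resulting gaps or overlaps with neighbors would be corrected afterward, as described below.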
- FIG. 13A depicts a method 1300 for prosthesis manipulation in dental prosthesis design.
- a virtual multi-tooth prosthesis is presented relative to an area to be reconstructed.
- a virtual multi-tooth prosthesis including 3D models of prosthetic teeth 1270 , 1271 and 1272 is presented relative to both the underlying portion that is to be reconstructed, as represented by lower teeth model 1220 , as well as with respect to its antagonist teeth (not shown in FIG. 12 ).
- a manipulation command is received, said manipulation command relating to a subset of the teeth in the virtual prosthesis.
- the phrase “subset of the teeth in the virtual prosthesis” includes its customary and ordinary meaning, including meaning that the subset is fewer than all of the teeth in the virtual prosthesis, including one tooth in the virtual prosthesis.
- a manipulation command may be received related to a single 3D model of a prosthetic tooth 1270 or with respect to multiple 3D models of prosthetic teeth 1270 , 1271 , and 1272 .
- a command relating only to a single 3D model of a tooth 1270 may be a translation manipulation indicated by the movement of manipulator 1280 .
- This manipulation command may affect only the 3D model of prosthetic tooth 1270 ; as discussed more below, however, it may also affect, perhaps to a lesser extent, the position, scaling, placement, etc. of the other 3D models of prosthetic teeth 1271 and 1272 .
- the manipulation of manipulator 1290 , which indicates a scaling of the 3D models of prosthetic teeth 1270 and 1271 relative to the virtual multi-tooth prosthesis and/or relative to one another, may affect those teeth and, perhaps to a lesser extent, the other 3D model(s) of prosthetic teeth, such as 1272 .
- block 1330 includes modifying the prosthesis based on the received manipulation commands.
- Modifying the teeth based on the received manipulation commands may include any appropriate action. For example, if the manipulation command is meant to translate a single 3D model of a tooth in the lingual or buccal direction, then that tooth may be translated in the lingual or buccal direction with respect to the other teeth and/or with respect to the virtual prosthesis. If the command received requires scaling of two or more of the teeth with respect to one another or with respect to the virtual multi-tooth prosthesis, then the 3D models of those teeth may be scaled appropriately. That is, in some embodiments, one of the teeth may be scaled to increase its size and the other may be scaled to decrease its size. Scaling the teeth with respect to one another in this way may prevent large gaps from forming in the multi-tooth prosthesis and/or prevent overlap between neighboring teeth.
- Modifying the prosthesis based on the received manipulations may include performing the requested action and, in some embodiments, performing additional actions or calculations in order to align or place all of the 3D models of teeth in the prosthesis and/or reduce gaps (or correct overlaps) between neighboring teeth. For example, in some embodiments, when neighboring teeth are scaled or translated, a gap may form between two teeth, as depicted in FIG. 15A . The gap depicted in FIG. 15A may have resulted, for example, from a scaling of the 3D model of prosthetic tooth 1570 with respect to the 3D model of prosthetic tooth 1571 , or it may have resulted from translating the 3D model of prosthetic tooth 1570 and/or the 3D model of prosthetic tooth 1571 with respect to one another.
- the techniques may include calculating the relative placement of all of the 3D models of prosthetic teeth in the virtual multi-tooth prosthesis after each manipulation command (or possibly after a series of manipulation commands). For example, after the manipulation command is received, all of the 3D models of the prosthetic teeth may be placed next to one another in the area to be reconstructed using bounding volumes as a first approximation. After the initial placement, the gaps (or overlaps) of the 3D models of the prosthetic teeth may be reduced or eliminated using the techniques described elsewhere herein. For example, looking to FIG. 14B , the 3D models of prosthetic teeth 1470 , 1471 , and 1472 are bounded by bounding boxes 1475 , 1476 , and 1477 (shown as rectangles in FIG. 14B , even though they may be rectangular 3D boxes). These bounding boxes 1475 , 1476 , and 1477 are used to align the 3D models of prosthetic teeth 1470 , 1471 , and 1472 over the area to be reconstructed on the patient (represented as part of the 3D model of the lower teeth 1420 ). In some embodiments, the bounding boxes 1475 , 1476 , and 1477 are scaled, translated, and/or otherwise aligned such that the bounding boxes, together, fill the entire area to be reconstructed.
- the gaps (or overlaps) between neighboring teeth models 1470 , 1471 , and 1472 can be corrected or approximately corrected by, e.g., scaling and/or translating each tooth (as described with respect to FIGS. 15A and 15B and elsewhere herein).
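A one-dimensional sketch of the bounding-box alignment step: the teeth's bounding-box widths are scaled so that, placed side by side, they exactly fill the mesiodistal span of the area to be reconstructed. The function name and the proportional-scaling rule are assumptions for illustration.

```python
def fit_bounding_boxes(widths, span_start, span_end):
    """Place teeth side by side along the mesiodistal direction by
    scaling their bounding-box widths so that, together, they exactly
    fill the area to be reconstructed. Returns (start, end) intervals,
    one per tooth."""
    total = sum(widths)
    scale = (span_end - span_start) / total
    intervals, cursor = [], span_start
    for w in widths:
        intervals.append((cursor, cursor + w * scale))
        cursor += w * scale
    return intervals

# Three library teeth, 9 + 10 + 9 mm wide, fitted into a 21 mm span:
print(fit_bounding_boxes([9.0, 10.0, 9.0], 0.0, 21.0))
# [(0.0, 6.75), (6.75, 14.25), (14.25, 21.0)]
```

Each 3D tooth model would then be scaled and translated so that its bounding box matches its assigned interval, before the finer gap/overlap correction described below.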
- only the affected tooth or teeth may be manipulated, thereby leaving unchanged the placement, scale, and rotation of one or more of the 3D models of teeth in the virtual multi-tooth prosthesis.
- if the manipulator 1490 in FIG. 14B is manipulated, then this may affect only the scale of the 3D models of prosthetic teeth 1470 and 1471 .
- 3D model of prosthetic tooth 1472 may remain unchanged in position, scale, and/or rotation. After the two 3D models of teeth 1470 and 1471 have been scaled, translated, rotated, etc., any gap or overlap between them may be reduced or eliminated in the manner described with respect to FIGS. 15A and 15B and elsewhere herein.
- an initial placement or alignment of the 3D models of prosthetic teeth 1470 , 1471 , and 1472 can be obtained from any appropriate means, such as by referencing an alignment stored in a dental prosthesis library.
- the techniques may use bounding volumes or bounding boxes, not only for initial alignment of the 3D models of prosthetic teeth, but also for attempting to ensure that 3D models of neighboring teeth do not intersect or overlap in terms of volume.
- when a tooth model is manipulated, its bounding box may be used as a first approximation to ensure that the 3D models of neighboring prosthetic teeth do not intersect or overlap in terms of volume.
- the bounding boxes may also be used for rotation or any other manipulation of one or more teeth in the virtual multi-tooth prosthesis.
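The first-approximation intersection test between two axis-aligned bounding boxes can be sketched as follows (illustrative only; a real system might follow this coarse test with an exact mesh-level check):

```python
def boxes_overlap(box_a, box_b):
    """First-approximation collision test between two axis-aligned
    bounding boxes, each given as ((min_x, min_y, min_z),
    (max_x, max_y, max_z)). True if their volumes intersect."""
    (amin, amax), (bmin, bmax) = box_a, box_b
    # Boxes intersect only if their extents overlap on every axis.
    return all(amin[i] < bmax[i] and bmin[i] < amax[i] for i in range(3))

tooth_box = ((0.0, 0.0, 0.0), (8.0, 7.0, 10.0))
neighbor_box = ((7.5, 0.0, 0.0), (15.0, 7.0, 10.0))  # 0.5 mm overlap in x
print(boxes_overlap(tooth_box, neighbor_box))  # True
```

With strict comparisons, boxes that merely touch at a face are reported as non-overlapping, which matches the goal of letting neighboring teeth meet without intersecting volumes.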
- FIGS. 15A and 15B depict two neighboring teeth 1570 and 1571 and their respective bounding boxes 1590 and 1591 .
- a gap may exist between the neighboring teeth.
- the gap is closed in order to increase the aesthetic appeal and/or function of the virtual multi-tooth prosthesis.
- the smallest distance between the two models may be determined, and the gap between the two models 1570 and 1571 , as illustrated by 1595 and 1596 , may be measured.
- the models 1570 and 1571 may then be scaled and/or translated to close the gap represented by 1595 and 1596 .
- each model may be scaled in the direction of the other model.
- in order to close the gap between 3D models 1570 and 1571 , the models may be scaled so that each of 3D models 1570 and 1571 covers half of the total distance (e.g., distance 1595 summed with distance 1596 ).
- the two 3D models 1570 and 1571 , after scaling, are depicted in FIG. 15B . In FIG. 15B , the two models touch or nearly touch at point 1597 .
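The half-the-distance scaling rule can be sketched numerically. Widths here are measured along the mesiodistal axis, and the function name is a hypothetical stand-in for the CAD operation that would scale the actual 3D models.

```python
def close_gap_by_scaling(width_a, width_b, gap):
    """Scale two neighboring teeth toward one another so that each
    covers half of the gap between them, leaving the teeth touching.
    Returns the per-tooth scale factors along the mesiodistal axis."""
    half = gap / 2.0
    # Each tooth is widened by half the gap; the scale factor is what
    # would be applied to that tooth's 3D model along this axis.
    scale_a = (width_a + half) / width_a
    scale_b = (width_b + half) / width_b
    return scale_a, scale_b

# An 8 mm and a 10 mm tooth separated by a 0.6 mm gap:
sa, sb = close_gap_by_scaling(8.0, 10.0, 0.6)
print(round(sa, 4), round(sb, 4))  # 1.0375 1.03
```

After scaling, the combined width equals the original widths plus the gap, so the two models meet at (approximately) the midpoint of the former gap. The same formula with a negative `gap` shrinks overlapping neighbors apart.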
- in order to close or approximately close a gap between neighboring teeth, each 3D model may be scaled to bring the closest points between the two 3D models 1570 and 1571 to the border of the two bounding boxes.
- the model 1570 may be scaled by an amount sufficient to close the previous gap 1595 and bring the 3D model 1570 to the border of the two bounding boxes (and the same may be the case for model 1571 ).
- Other methods and techniques of closing the gaps between teeth are also possible and are considered within the scope of the embodiments herein.
- the gap between neighboring teeth may not be closed in a virtual multi-tooth prosthesis, or a connector may be used to span the space between the neighboring teeth (not depicted in FIGS. 15A and 15B ).
- the neighboring teeth have a gap (represented by distances 1595 and 1596 ).
- the techniques described herein can include removing or reducing an overlap of neighboring teeth (not shown in FIGS. 15A and 15B ).
- the neighboring teeth may be scaled (e.g., scaled to be smaller) and/or translated in order to remove the overlap between the neighboring teeth.
- the virtual multi-tooth prosthesis may be modified based on occlusion with the antagonist. Modifying a prosthesis based on occlusion with antagonist teeth is described elsewhere herein.
- the prosthesis may be treated as a rigid object, occlusion of that rigid object may be calculated with respect to the antagonist, and the entire virtual multi-tooth prosthesis may move as a single structure.
- each individual 3D model of a tooth may be modified separately based on its own occlusion with the antagonist.
- gaps may form between neighboring teeth. That is, after the occlusion is estimated and the individual 3D models of teeth have been moved with respect to one another, then a gap (or an overlap) may result between neighboring 3D models of teeth. Closing the gap (or avoiding the overlap) between neighboring teeth is described above. In the situation and embodiments in which the virtual multi-tooth prosthesis is moved, as a rigid body, it is unlikely or impossible that an additional gap or overlap will be introduced between neighboring teeth and therefore there may be no gap or overlap to correct between neighboring teeth.
- the virtual multi-tooth prosthesis may be presented relative to the area to be reconstructed in block 1310 . Additionally, when the operator is ready to continue designing the multi-tooth prosthesis, the operator may continue to other steps not depicted in method 1300 . Additionally, when the operator is ready to produce the multi-tooth prosthesis, the manufacturing data may be produced as part of block 1350 .
- FIG. 13B depicts another method 1301 for prosthesis manipulation in dental prosthesis design.
- when a command is received to translate, scale, rotate, or otherwise manipulate an individual prosthetic tooth in the virtual prosthesis (block 1321 ), the 3D model for that prosthetic tooth may be manipulated based on that command, and occlusion may be estimated for either that individual tooth or for the virtual prosthesis as a whole (block 1331 ). For example, turning to FIG. 13B :
- a 3D model of a virtual prosthesis is presented relative to the area to be reconstructed. This is described generally with respect to block 1310 .
- manipulation commands are received which are related to all or a portion of the prosthesis. The types of commands that may be received are described with respect to block 1320 .
- the prosthesis is modified based on the manipulation commands and based on the occlusion with antagonist teeth. Manipulation of the teeth is described above with respect to block 1330 and modifying the prosthesis based on occlusion with antagonist teeth is described above with respect to block 1340 .
- the prosthesis may be again displayed relative to the area to be reconstructed in block 1311 . Additionally, once the operator is happy with the virtual prosthesis or is ready to produce the prosthesis, the operator may continue to other steps in prosthesis design (not pictured) or may produce manufacturing data for the prosthesis (block 1350 ).
- the 3D model of the prosthetic tooth represents the outer surface of the prosthetic tooth.
- the inner portion of the 3D model of the prosthetic tooth may be associated with an implant, a prepared tooth, a gum surface, etc.—and may have an inner 3D surface designed to mate with the implant, prepared tooth, gum surface, etc.
- the 3D model of a prosthetic tooth is manipulated (block 1330 or block 1331 ) and/or modified based on occlusion (block 1340 or block 1331 ), then only the outer surface is manipulated or modified and the inner surface (which is mated with an implant, prepared tooth, gum surface, etc) may not be modified.
- manipulating or modifying the exterior surface of a tooth may not change how the tooth is mated with an underlying surface.
- computer 210 , display 220 , and/or input device 230 may each be separate computing devices, applications, or processes, may run as part of the same computing devices, applications, or processes, or one or more may be combined to run as part of one application or process, and/or each may be part of or run on one or more computing devices.
- Computing devices may include a bus or other communication mechanism for communicating information, and a processor coupled with the bus for processing information.
- the computing devices may have a main memory, such as a random access memory or other dynamic storage device, coupled to the bus. The main memory may be used to store instructions and temporary variables.
- the computing devices may also include a read-only memory or other static storage device coupled to the bus for storing static information and instructions.
- the computer systems may also be coupled to a display, such as a CRT or LCD monitor.
- Input devices may also be coupled to the computing devices. These input devices may include a mouse, a trackball, or cursor direction keys.
- Each computing device may be implemented using one or more physical computers, processors, embedded devices, or computer systems or a combination or portions thereof.
- the instructions executed by the computing device may also be read in from a computer-readable storage medium.
- the computer-readable storage medium may be a CD, DVD, optical or magnetic disk, laserdisc, carrier wave, or any other medium that is readable by the computing device.
- hardwired circuitry may be used in place of or in combination with software instructions executed by the processor. Communication among modules, systems, devices, and elements may be over direct or switched connections, and wired or wireless networks or connections, via directly connected wires, or any other appropriate communication mechanism.
- the communication among modules, systems, devices, and elements may include handshaking, notifications, coordination, encapsulation, encryption, headers, such as routing or error detecting headers, or any other appropriate communication protocol or attribute.
- Communication may also include messages related to HTTP, HTTPS, FTP, TCP, IP, ebMS OASIS/ebXML, secure sockets, VPN, encrypted or unencrypted pipes, MIME, SMTP, MIME Multipart/Related Content-type, SQL, etc.
- Any appropriate 3D graphics processing may be used for displaying or rendering, including processing based on OpenGL, Direct3D, Java 3D, etc.
- Whole, partial, or modified 3D graphics packages may also be used, such packages including 3DS Max, SolidWorks, Maya, Form Z, Cybermotion 3D, or any others.
- various parts of the needed rendering may occur on traditional or specialized graphics hardware.
- the rendering may also occur on the general CPU, on programmable hardware, on a separate processor, be distributed over multiple processors, over multiple dedicated graphics cards, or using any other appropriate combination of hardware or technique.
- Conditional language used herein such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements, and/or states are included or are to be performed in any particular embodiment.
- All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors, such as those computer systems described above.
- the code modules may be stored in any type of computer-readable storage medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware.
Description
- 1. Field
- The present application generally relates to dental planning, and more particularly to prosthesis manipulation in dental prosthesis design.
- 2. Description of Related Technology
- The use of computer systems to design dental prostheses has increased in recent years. The computer systems allow a dentist, dental technician, or other operator to design dental prostheses for individual patients. Individual prosthesis designs are often called “situations,” “dental plans,” or “prosthetic plans.” Operators using the computer systems can design plans based on a library of the teeth shapes and positions, patient data, and available equipment and hardware.
- When designing dental prostheses, current systems may provide sets of 3D models of prosthetic teeth or crowns as part of a library. This library of teeth may be used to help design prosthetic teeth for a patient using 3D graphics or CAD software. Current systems do not, however, allow full flexibility in designing dental prostheses. The systems limit, for example, what a dentist or other operator can do with libraries of 3D models of prosthetic teeth. The techniques, systems, methods, and computer-readable storage media described herein overcome some of the shortcomings of the prior art and provide for prosthesis manipulation in dental prosthesis design.
- Presented herein are methods, systems, devices, and computer-readable media for prosthesis manipulation in dental prosthesis design. This summary in no way limits the invention herein, but instead is provided to summarize a few of the embodiments.
- Embodiments herein include techniques, methods, systems, devices, and computer-readable media for prosthesis manipulation in dental prosthesis design. These can include presenting, via a computer-implemented interface, a virtual multi-tooth prosthesis, said virtual multi-tooth prosthesis comprising two or more 3D models of individual teeth, said virtual multi-tooth prosthesis being presented relative to a 3D representation of a multi-tooth area of a patient's mouth that is to be reconstructed, said computer-implemented interface running on one or more computing devices. A command to manipulate a subset of the teeth in the virtual multi-tooth prosthesis may be received from an operator, via the computer-implemented interface. One or more parameters for the shape of the virtual multi-tooth prosthesis may be modified based on the manipulation command. The one or more parameters may be related to the subset of teeth. Production data related to the virtual multi-tooth prosthesis may be generated.
- Some embodiments for prosthesis manipulation in dental prosthesis design include presenting, via a computer-implemented interface, a virtual prosthesis, where the virtual prosthesis can be presented relative to a 3D representation of an area of a patient's mouth that is to be reconstructed. The 3D representation of the area of the patient's mouth that is to be reconstructed can have a corresponding antagonist area of teeth. One or more prosthesis manipulation commands may be received from an operator, via the computer-implemented interface. For each prosthesis manipulation command of the one or more prosthesis manipulation commands, the virtual prosthesis may be modified based on the prosthesis manipulation command and based on an occlusion of the prosthesis relative to the corresponding antagonist area of teeth. In addition, production data related to the virtual prosthesis may be generated.
- Numerous other embodiments are described throughout herein.
- For purposes of summarizing the invention and the advantages achieved over the prior art, certain objects and advantages of the invention are described herein. Of course, it is to be understood that not necessarily all such objects or advantages need to be achieved in accordance with any particular embodiment. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught or suggested herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
- All of these embodiments are intended to be within the scope of the invention herein disclosed. These and other embodiments will become readily apparent to those skilled in the art from the following detailed description having reference to the attached figures, the invention not being limited to any particular disclosed embodiment(s).
-
FIGS. 1A and 1B illustrate two interfaces for occlusion estimation in dental prosthesis design. -
FIG. 2 illustrates an example system for occlusion estimation in dental prosthesis design. -
FIGS. 3A and 3B illustrate two example methods for occlusion estimation in dental prosthesis design. -
FIG. 4 illustrates a third interface for occlusion estimation in dental prosthesis design. -
FIG. 5 illustrates a fourth interface for occlusion estimation in dental prosthesis design. -
FIG. 6 illustrates a fifth interface for occlusion estimation in dental prosthesis design. -
FIG. 7 illustrates a sixth interface for occlusion estimation in dental prosthesis design. -
FIG. 8 illustrates a seventh interface for occlusion estimation in dental prosthesis design. -
FIGS. 9A , 9B, and 9C illustrate two sets of candidate contact points for occlusion estimation in dental prosthesis design. -
FIG. 10 illustrates an eighth interface for occlusion estimation in dental prosthesis design. -
FIG. 11 illustrates a ninth interface for occlusion estimation in dental prosthesis design. -
FIG. 12 illustrates a first interface for prosthesis manipulation in dental prosthesis design. -
FIGS. 13A and 13B illustrate two example methods for prosthesis manipulation in dental prosthesis design. -
FIGS. 14A and 14B illustrate two interfaces for prosthesis manipulation in dental prosthesis design. -
FIGS. 15A and 15B illustrate two schematics for prosthesis manipulation in dental prosthesis design. -
FIG. 16 depicts an abstract representation of the relative placement of contact points and the center of gravity. - Various embodiments for occlusion estimation in dental prosthesis design are described herein. Some embodiments provide improved occlusion estimation over current systems. In some embodiments, initial placement of occluding teeth (before estimating occlusion) is defined by the relative placement of the upper and lower teeth during intraoral scanning, by the relative placement during scanning of the occluding teeth's physical models, by using a scanned checkbite, and/or by an operator manipulating one or both of the 3D models associated with the occluding teeth in order to obtain an initial relative placement. After an initial relative placement has been defined, the techniques may include finding a first contact point between the two 3D models (e.g., in the gravity direction). This can be done by determining, using a distance calculation for example, the closest points between the two 3D models. Another method for finding the first contact point between the two 3D models could be simulating one of the two 3D models ‘falling’ onto the other. After the first contact point has been determined (and if it has not already been accomplished), one of the 3D models can be translated in the gravity direction in order to bring the two 3D models together at that first contact point. The first contact point can be used as a pivot in a motion simulation, such as a six-degrees-of-freedom motion simulation, a constrained rigid body motion simulation, a free-fall simulation, etc.
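As a rough illustration of this first step (all function names, the vertex-list data layout, and the alignment tolerance are hypothetical; a real system would operate on triangle meshes with proper closest-point queries), the following sketch finds the smallest gap in the gravity direction between roughly aligned vertex pairs and translates the upper model down to close that gap:

```python
# Hypothetical sketch: each model is a list of (x, y, z) vertices and
# gravity points in the -z direction.

def first_contact(upper, lower, xy_tol=0.5):
    """Return (gap, upper_vertex, lower_vertex) for the smallest vertical
    gap between vertex pairs that roughly align in x/y."""
    best = (float("inf"), None, None)
    for ux, uy, uz in upper:
        for lx, ly, lz in lower:
            if abs(ux - lx) <= xy_tol and abs(uy - ly) <= xy_tol:
                gap = uz - lz
                if gap < best[0]:
                    best = (gap, (ux, uy, uz), (lx, ly, lz))
    return best

def drop_to_contact(upper, lower):
    """Translate the upper model in the gravity direction so that the
    closest pair becomes the first contact point (the pivot)."""
    gap, _, pivot = first_contact(upper, lower)
    return [(x, y, z - gap) for x, y, z in upper], pivot

upper = [(0.0, 0.0, 2.0), (1.0, 0.0, 3.0)]
lower = [(0.0, 0.0, 0.5), (1.0, 0.0, 1.0)]
moved, pivot = drop_to_contact(upper, lower)
print(pivot)  # (0.0, 0.0, 0.5): the first contact point on the lower model
```

The ‘falling’ alternative described above would reach the same configuration by simulating the translation incrementally rather than computing the gap in closed form.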
- Once the pivot point is determined, the techniques may proceed by simulating the motion of one of the 3D models with respect to the other, where the pivot point is used to restrict the rotation. For example, if the first contact point had been between the cusp of one tooth and the fissure of its antagonist tooth, then the two 3D models remain together at that point, and that point can act as a pivot as one of the 3D models rotates with respect to the other around that point. The simulated rotation continues until one or more contact point(s) are detected. The contact points are detected at each step of the simulation by a collision detection engine. That is, once one of the 3D models has rotated onto the other 3D model and the corresponding contact points are determined with enough precision, that step of the simulation is terminated. If only one contact point is found, this contact point is used as a pivot in the subsequent step, regardless of whether or not it is the same contact point as in the previous step (‘losing’ a contact point could be caused, for example, by numerical error or imprecision). In various embodiments, improving the precision of the determined contact points may include, once one or more contact points are found, refining the previous simulation step with smaller and smaller step sizes (e.g., simulating motion over smaller amounts of time) to reduce any interpenetration of the two 3D models.
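The step-size refinement described above can be sketched in two dimensions as follows; the pivot location, the tracked cusp point, and the plane z = 0 standing in for the antagonist surface are all illustrative assumptions, not the disclosed implementation:

```python
import math

# Illustrative 2D sketch: the pivot and a tracked cusp point live in the
# x-z plane, and the plane z = 0 stands in for the antagonist surface.

def rotate_about(pivot, point, angle):
    """Rotate `point` around `pivot` in the x-z plane by `angle` radians."""
    px, pz = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + c * px + s * pz, pivot[1] - s * px + c * pz)

def settle(pivot, point, step=0.2, tol=1e-6):
    """Rotate toward contact, halving the step whenever the point would
    interpenetrate (z < 0), until the remaining gap is below `tol`."""
    while point[1] > tol:
        candidate = rotate_about(pivot, point, step)
        if candidate[1] < 0.0:   # interpenetration: refine the step size
            step *= 0.5
        else:
            point = candidate
    return point

contact = settle(pivot=(0.0, 0.0), point=(1.0, 0.8))
print(round(contact[1], 4))  # 0.0: the cusp comes to rest on the surface
```

Halving the step only on overshoot mirrors the idea of re-running the previous simulation step over smaller amounts of time until interpenetration is negligible.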
- If no contact point is found, that is, if the contact point or contact points in the previous step are lost (for instance due to numerical imprecision), a motion simulation, such as a free fall, may be performed until a contact point is determined. If more than one contact point has been determined, a check may be performed to determine what contact point(s) to use from the set of discovered contact points. Another motion simulation step may proceed using some or all of the candidate contact points (e.g., one or two of the candidate contact points may be used). A subsequent motion simulation step using two candidate contact points may include using the two candidate contact points to define an axis of rotation in the simulated motion. The process of determining new candidate contact points will continue until predetermined stopping criteria are met. Examples of stopping criteria are discussed in more detail below.
- In some embodiments, the 3D model of the teeth that is “moving” (e.g., the upper teeth) with respect to the other 3D model (e.g., the lower teeth) will have associated with it a center of gravity. The center of gravity can be determined in numerous ways. For example, the center of gravity may be determined by assigning a weight to each of the triangles, vertices, pixels, or voxels that form the 3D model and determining the center of gravity based on those assigned weights. In some embodiments, the predetermined stopping criteria may be met once there are three contact points that define a triangle that encompasses the center of gravity. Once this stopping criterion is met, the simulation may be terminated and the occlusion may be estimated based on those three contact points; those three contact points may define the placement of one of the 3D models with respect to another 3D model. For example, if the top set of teeth is seen as the moving set of teeth and a first contact point is determined between the top set of teeth and the lower set of teeth, then the first contact point may be used as a pivot point. The simulation may continue until subsequent contact points are determined that define a triangle that includes or encompasses the center of gravity of the top set of teeth.
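One of the weighting schemes mentioned above, an area-weighted average of triangle centroids, could be sketched as follows (a surface-based approximation with an illustrative mesh; volume- or vertex-based weighting are other options):

```python
# Illustrative sketch: weight each triangle of the mesh by its area and
# average the triangle centroids to approximate a center of gravity.

def triangle_area_and_centroid(a, b, c):
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    cross = [ab[1] * ac[2] - ab[2] * ac[1],
             ab[2] * ac[0] - ab[0] * ac[2],
             ab[0] * ac[1] - ab[1] * ac[0]]
    area = 0.5 * sum(x * x for x in cross) ** 0.5
    centroid = [(a[i] + b[i] + c[i]) / 3.0 for i in range(3)]
    return area, centroid

def center_of_gravity(triangles):
    total, acc = 0.0, [0.0, 0.0, 0.0]
    for tri in triangles:
        area, cen = triangle_area_and_centroid(*tri)
        total += area
        for i in range(3):
            acc[i] += area * cen[i]
    return [v / total for v in acc]

# Two triangles forming a unit square in the z = 0 plane:
square = [((0, 0, 0), (1, 0, 0), (1, 1, 0)),
          ((0, 0, 0), (1, 1, 0), (0, 1, 0))]
print([round(v, 6) for v in center_of_gravity(square)])  # [0.5, 0.5, 0.0]
```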
- In some embodiments, there may be a single stopping criterion that is met when there are two contact points on opposite sides of the center of gravity. For example, if a crown and its antagonist are the subjects of the occlusion estimation, then candidate contact points may be determined until there are two contact points that span or are on opposite sides of the center of gravity of the moving body. In some embodiments, the stopping criterion may be met when the force normals acting on the moving body are such that no additional rotation is possible in the motion simulation.
- Turning now to
FIG. 1A, we see an interface 100 that includes an overlaid representation portion 110 as well as a global selection portion 111. The overlaid representation portion 110 may display a lower teeth model 120 and an upper teeth model 130. The lower teeth model 120 may be represented as an opaque 3D model and the upper teeth model 130 may be represented as a transparent or semitransparent 3D model. The global selection portion may have buttons that allow either or both of the lower teeth model 120 and upper teeth model 130 to be displayed transparently or to provide other global manipulation functions. As depicted in FIG. 1B, an interface 100 may also include a distance map portion 112. The distance map portion 112 may show the distance between the lower teeth model 120 and the upper teeth model 130 as a contour graph, a color-coded graph, a shaded graph, and/or the like. This distance map may also be displayed, in some embodiments, as a texture map on the models 120 and/or 130 shown in the overlaid representation portion 110. Returning again to FIG. 1A, the interface 100 shows the lower teeth model 120 and the upper teeth model 130 before the occlusion estimation has occurred. FIG. 1B, on the other hand, shows the model 120 and the model 130 after the occlusion estimation has occurred. As depicted in FIG. 1B, after the first contact point has been determined between the lower teeth model 120 and the upper teeth model 130, that contact point can be used as a pivot to determine a subsequent set of one or more candidate contact points. The determination of candidate contact points may continue until three candidate contact points 141, 140, 142 are determined that define a triangle 199 that encompasses or includes the center of gravity 150. -
FIG. 2 illustrates an example system 200 for occlusion estimation and/or prosthesis manipulation in dental prosthesis design. The system 200 may include one or more computers 210 coupled to one or more displays 220, and one or more input devices 230. An operator 240, who may be a dentist, dental technician, or other person, may plan dental prostheses using system 200 by manipulating the one or more input devices 230, such as a keyboard and/or a mouse. In some embodiments, while working on the dental plan, the operator 240 may view the dental plan and other related dental plan data on the display 220. The display 220 may include two or more display regions or portions, each of which displays a different view of the dental plan. For example, in some embodiments, the display 220 may show a semi-realistic 3D rendering of the dental plan, a localized abstraction of the dental plan, and/or a cross-sectional representation of the dental plan. Each of these displays or portions may be linked internally within a program and/or using data on computer 210. For example, a program running on a computer 210 may have a single internal representation of the dental plan in memory and the internal representation may be displayed in two or more abstract or semi-realistic manners on display 220. - In some embodiments, the
operator 240 may be able to perform a command, such as select, move, manipulate, or make transparent, opaque, or invisible, on a particular substructure in the dental plan. The operator 240 may be able to perform this command by manipulating the input device 230, such as clicking with a mouse on a particular region of one of the abstract or semi-realistic versions of the dental plan displayed on the display 220. - In various embodiments, the
computer 210 may include one or more processors, one or more memories, and one or more communication mechanisms. In some embodiments, more than one computer may be used to execute the modules, methods, blocks, and processes discussed herein. Additionally, the modules and processes herein may each run on one or multiple processors, on one or more computers; or the modules herein may run on dedicated hardware. The input devices 230 may include one or more keyboards (one-handed or two-handed), mice, touch screens, voice commands and associated hardware, gesture recognition, or any other means of providing communication between the operator 240 and the computer 210. The display 220 may be a two-dimensional (“2D”) or 3D display and may be based on any technology, such as LCD, CRT, plasma, projection, etc. - The communication among the various components of
system 200 may be accomplished via any appropriate coupling, including USB, VGA cables, coaxial cables, FireWire, serial cables, parallel cables, SCSI cables, IDE cables, SATA cables, wireless based on 802.11 or Bluetooth, or any other wired or wireless connection(s). One or more of the components in system 200 may also be combined into a single unit or module. In some embodiments, all of the electronic components of system 200 are included in a single physical unit or module. -
FIGS. 3A and 3B illustrate two techniques for occlusion estimation in dental prosthesis design. In estimating the occlusion, the technique may include motion simulation of one 3D model of teeth with respect to another 3D model of teeth. The motion simulation may include, in various embodiments, a six-degrees-of-freedom rigid body motion simulation, free-fall simulation, a rigid body simulation with one or a combination of constraints, or other motion simulation. The techniques can proceed by having either the upper or lower set of teeth be the 3D model that “moves” and the other be stationary. Alternatively, both models could move with respect to each other. The first step in occlusion estimation, in some embodiments, may include determining, as a first contact point, the point on the lower 3D model which is closest, in the direction of gravity, to the upper 3D model. Once that initial contact point is determined, other candidate contact points between the upper 3D teeth model and lower 3D teeth model may be determined using simulated motion until one or more predetermined stopping criteria are met. In various embodiments, the initial contact point may be determined by finding the closest point between the first and second 3D models in the gravity direction. In subsequent steps of the motion simulation, candidate contact points (possibly including the initial contact point) may be found. After assessing the appropriateness of the candidate contact points, each candidate contact point (possibly including the initial contact point) may or may not be chosen for use in subsequent steps of the motion simulation. For example, if a particular contact point is determined, and it is between the initial contact point (assuming that the initial contact point was again found to be a contact point in this step of the simulation) and the center of gravity, the particular contact point may be used instead of the initial contact point.
In this way, the first contact point may not be used in subsequent steps of the simulation and, similarly, may not end up in the final set of contact points that are used to define the occlusion between the first and second 3D models. - In various embodiments, determining whether two contact points are on the opposite sides of the center of gravity may include defining a bisector or bisection plane through the center of gravity that splits the first 3D model into two segments, for example, a left segment and a right segment, and, optionally, splits the second 3D model into two segments, for example, the left segment and the right segment. For example, if the first 3D model of teeth includes all of the teeth in the lower jaw of a patient and the center of gravity is along the center line of the jaw, then the teeth on the left side of the mouth and the teeth in the right side of the mouth may be in different sections. Determining whether there are two contact points on the opposite sides of the center of gravity may include determining whether there are contact points in the two different sections of the mouth (the left section and the right section). As another example, consider
FIG. 16. If two contact points define a line segment 1610, that is, a segment from one contact point 1640 to another contact point 1641, and the line segment 1610 is part of a line L 1620, then determining whether a center of gravity is between the two contact points may include determining whether the closest point on line L 1620 to the center of gravity lies on the line segment 1610 defined by the two contact points 1640, 1641. For example, for the center of gravity 1650, the closest point on the line segment 1610 is point 1650A. Since 1650A is between the two contact points 1640, 1641, the center of gravity 1650 is considered “between” the two contact points. If, on the other hand, a center of gravity 1651 has a closest point 1651A on line L 1620 that is not on line segment 1610, then the center of gravity is not considered to be “between” the two contact points 1640, 1641. - As another example, in some embodiments, when motion simulation is about a rotation axis (described elsewhere herein), checking to see whether the center of gravity is between two contact points can include projecting the contact points onto the rotation plane (e.g., a plane whose normal is the rotation axis and that includes the gravity center on the plane). Various embodiments can then determine whether the projected points are on each side of a certain line defined by the projection of the gravity force vector onto the rotation plane and going through the gravity center. If the two are on opposite sides of the certain line, then they are on opposite sides of the center of gravity. There are numerous other ways to determine whether the center of gravity is between the two contact points, and these are considered within the scope of the embodiments herein.
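The segment test illustrated by FIG. 16 can be sketched as a projection onto the line through the two contact points, with the center of gravity counted as “between” them when the projection parameter falls inside the segment (names and coordinates below are illustrative):

```python
# Illustrative sketch of the FIG. 16 test: project the center of gravity
# onto the line through the two contact points; it is "between" them
# when the projection parameter t falls inside [0, 1].

def between_contacts(p1, p2, cog):
    d = [p2[i] - p1[i] for i in range(3)]          # segment direction
    v = [cog[i] - p1[i] for i in range(3)]
    t = sum(v[i] * d[i] for i in range(3)) / sum(x * x for x in d)
    return 0.0 <= t <= 1.0

p1, p2 = (0, 0, 0), (4, 0, 0)                      # like points 1640 and 1641
print(between_contacts(p1, p2, (2, 3, 0)))         # True  (like 1650)
print(between_contacts(p1, p2, (6, 3, 0)))         # False (like 1651)
```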
- Determining whether the center of gravity is within a triangle that is defined by three contact points may include projecting the triangle that is defined by the three contact points onto the occlusal plane and projecting the center of gravity onto the occlusal plane. If the center of gravity projected onto the occlusal plane lies within the triangle defined by the three contact points, then it may be considered that the center of gravity is within the triangle defined by the three contact points. As above, numerous other methods of determining whether the center of gravity is within the triangle defined by the three contact points may be used and are considered part of the embodiments herein.
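This projection test can be sketched as follows, taking the occlusal plane to be z = 0 for simplicity and using a standard same-side point-in-triangle check (an assumption for illustration, not the patent's specific construction):

```python
# Illustrative sketch: with the occlusal plane taken as z = 0, the
# projection just drops the z coordinate; then apply a same-side
# point-in-triangle test to the projected center of gravity.

def side(a, b, p):
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def cog_in_contact_triangle(c1, c2, c3, cog):
    a, b, c, p = [(v[0], v[1]) for v in (c1, c2, c3, cog)]
    s1, s2, s3 = side(a, b, p), side(b, c, p), side(c, a, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

tri = ((0, 0, 1), (4, 0, 2), (0, 4, 1))            # three contact points
print(cog_in_contact_triangle(*tri, (1, 1, 5)))    # True: criterion met
print(cog_in_contact_triangle(*tri, (5, 5, 5)))    # False: keep simulating
```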
- In various embodiments, the techniques described herein may include changing the state (e.g., position, rotation, scaling, shape, etc.) of one or more 3D models based on the contact with the antagonist teeth. For example, if a crown or bridge is being designed and there are multiple units (e.g., teeth) in the crown or bridge, then each unit in the bridge or crown may be rotated, scaled, or otherwise changed in order to provide at least one contact point with the antagonist. After the relative placement of the occluding sets of teeth is determined based on the contact points, or after new states for one or more 3D teeth models are determined based on the contact points, designing the prosthesis may continue and/or production data for the prosthesis may be generated based on the 3D model of the prosthesis.
- Turning now to
FIG. 3A, which depicts a method 300 for occlusion estimation in dental prosthesis design, in block 310 a first contact point is determined in the direction of gravity based on the initial positions of the 3D models of occluding teeth. As noted above, the initial position may be defined based on the known relative positions for the first 3D model and the second 3D model. For example, the initial position may be known because a scanning procedure to obtain the first 3D model of the teeth (e.g., the lower set of teeth) and the second 3D model of the teeth (e.g., the upper set of teeth) may have been performed and the initial placement may be defined by the relative placement of the first 3D model and the second 3D model during the scanning procedure. This may happen if both 3D models are placed in known relation to each other during the scanning procedure or if each of them is placed relative to some fixed coordinate system during the scanning procedure. In some embodiments, the initial relative placement of the first 3D model with respect to the second 3D model of the teeth may be known based on a scanned check bite. That is, if the second 3D model of the teeth is determined at least in part based on the scanned check bite, then the first 3D model of the teeth may be surface matched to the check bite and that check bite may provide the initial placement of the two sets of teeth. Additionally, as noted above, an operator may manipulate the relative placement of the first 3D model and the second 3D model before performing the occlusion estimation. - Returning again to block 310, determining the first contact point between occluding teeth in the direction of gravity based on the initial position may also include an initial determination of the gravity direction. The gravity direction may be determined in any of numerous ways, including having it be predefined based on the scanning procedure and the like.
Additionally, the gravity direction may be perpendicular to an occlusal plane of the first and/or the second 3D model of teeth. The occlusal plane of the two 3D models may be known ahead of time or it may be determined in any number of ways. For example, if a planar object such as a rectangle is “dropped” onto, e.g., the first 3D model (or vice versa), then that rectangular object, once it has come to rest on the first 3D model, may define the occlusal plane. “Dropping” the rectangular object onto the 3D model may be accomplished using, e.g., the simulated motion described with respect to
FIG. 3A or FIG. 3B, or any other appropriate technique. The normal of the planar rectangular object may be used to define the direction of gravity. In various embodiments, the distance between the first and second 3D models is determined in a direction other than the direction of gravity. For example, the overall closest point between two triangular meshes representing the first and second 3D models may be determined, or the closest point between the two 3D models in a direction other than gravity may be determined and used as the closest point between the 3D models. - Determining the first contact point between the occluding sets of teeth in the direction of gravity in
block 310 may be performed based on any appropriate calculation. For example, it may be determined by performing a numerical calculation of the closest point between the two 3D models in the direction of gravity. In some embodiments, the closest point between the two 3D models may be determined by simulating free fall of one of the 3D models with respect to the other 3D model. For example, based on the initial position, one of the 3D models may be “dropped” onto the other 3D model. The first contact point between the two 3D models when dropped may be the closest point between the two 3D models. One 3D model may then, optionally, in some embodiments, be moved toward the other in the direction of gravity so that the closest point between the two 3D models would be the contact point between the two 3D models. Similarly, in some embodiments, the two 3D models may then, optionally, be moved toward each other in the gravity direction instead of moving one 3D model and keeping one fixed. - In
block 320, motion simulation may be used to determine subsequent candidate contact points. After the first contact point is determined, it is used as a constraint on the simulated motion. That is, that contact point will remain in contact for the duration of that step of the simulated motion. The simulated motion will result in the moving 3D model pivoting around that contact point until one or more other contact points are determined. In some embodiments, it is possible that one or more contact points may be lost, perhaps due to numerical error. If one or more contact points are lost due to numerical error, the simulation may continue. For example, the moving 3D model could fall in the gravity direction until at least one contact point is found. - Determining a contact point may include using any particular type of collision detection. For example, if the first and second 3D models each are represented as a triangular mesh, then contact points may be determined by looking for collisions between the two triangular meshes. Further, in some embodiments, if it is determined that two particular triangles, one in each 3D model's triangular mesh, intersect, then the actual point or edge of intersection may be used (e.g., if it is known) or if there is merely an indication that the two triangles intersect, then the contact points may be estimated as the centers of the two triangles. Numerous other collision detection techniques may be used to determine contact points and are encompassed by embodiments herein.
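The fallback of estimating a contact point from triangle centers when only an intersection flag is available might look like the following sketch; the centroid-distance `triangles_touch` test is a crude, hypothetical stand-in for a real triangle-triangle intersection routine:

```python
# Illustrative sketch (not the patent's implementation): `triangles_touch`
# is a crude centroid-distance proxy standing in for a collision engine
# that only reports whether two triangles intersect.

def centroid(tri):
    return tuple(sum(v[i] for v in tri) / 3.0 for i in range(3))

def triangles_touch(t1, t2, tol=0.25):
    c1, c2 = centroid(t1), centroid(t2)
    dist = sum((c1[i] - c2[i]) ** 2 for i in range(3)) ** 0.5
    return dist <= tol

def estimated_contact(t1, t2):
    """When only an intersection flag is known, estimate the contact
    point from the centers of the two triangles."""
    if not triangles_touch(t1, t2):
        return None
    c1, c2 = centroid(t1), centroid(t2)
    return tuple((c1[i] + c2[i]) / 2.0 for i in range(3))

lower_tri = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
upper_tri = ((0, 0, 0.1), (1, 0, 0.1), (0, 1, 0.1))
print(estimated_contact(lower_tri, upper_tri))  # midway between the centers
```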
- Once the candidate contact points are determined, a check will be made in
block 330 to determine whether the stopping criteria have been met. Checking the stopping criteria may include determining whether two contact points in the set of candidate contact points are on opposite sides of the center of gravity. Another check of stopping criteria may include determining whether there are three contact points which define a triangle that includes the center of gravity of the moving 3D model. - If the stopping criteria are not met, then a determination may be made as to which contact points to use in subsequent steps of the motion simulation (executed in block 320). For example, consider
FIGS. 9A and 9B. The previous set of candidate contact points may have already included contact points 940 and 941, depicted in FIG. 9A, and in the current step an additional contact point 942 may have been determined. Since contact points 940, 941, and 942 do not form a triangle that encompasses the center of gravity 950, a determination may be made as to which of the candidate contact points 940, 941, and/or 942 to use in subsequent motion simulations. The three contact points define three axes of rotation, each pair of candidate contact points having an axis of rotation, such as axis of rotation 960, associated with it. Turning to FIG. 9C, if axis of rotation 960 is used during a motion simulation, the normal force 998 applied on the moving object during the motion simulation at the other candidate contact point 942 will be against the force 999 associated with the simulated rotation 960. The point 942, being on the same side of the center of gravity 950, would normally rotate during motion simulation. Yet, the contact point 942 is already in contact with the other 3D model and therefore further rotation (or dropping) will not be possible. As such, axis of rotation 960 will be excluded from consideration as the proper axis of rotation, and the set including both candidate contact points 940, 941 will not be used in the subsequent step of the simulation. If, on the other hand, the simulation were performed with axis of rotation 961, which bridges candidate contact points 941, 942, then as the moving 3D model is in motion, the normal force on candidate contact point 940 will create a moment in the direction of rotation. Therefore, the axis of rotation 961 between candidate contact points 941, 942 is a proper axis of rotation, and candidate contact points 941, 942 will be used in the subsequent step of the motion simulation. - As another example, in
FIG. 9B, if there are four candidate contact points 940, 941, 942, and 943, then there may be six candidate axes of rotation 960-965. Performing a similar analysis as that described above, the other candidate axes of rotation may be excluded; only axis of rotation 961 will have no candidate contact points whose force normals create moments in the opposite direction of rotation. Therefore, the candidate contact points 941 and 943 will be used in the subsequent simulation step. - If the stopping criteria are met in
block 330, then in block 340 the relative placement of the occluding sets of teeth may be determined based on the contact points. In some embodiments, the relative placements of occluding teeth may be known based on the contact points and no further calculation may be needed to determine the relative placements. In various embodiments, determining the relative placements of the occluding sets of teeth may include recording a matrix, quaternion, or other transformation of the 3D models of occluding teeth after the contact points have been determined. The contact points may define the relative placement of the two 3D models with respect to one another. The two 3D models may be translated, rotated, or a transformation between the two 3D models may be stored. In block 350, the design of the prosthesis may be continued or data may be generated for production of the prosthesis. Designing dental prostheses may be performed using any appropriate system, methods, or techniques, such as those described in U.S. patent application Ser. No. 12/703,601, filed Feb. 10, 2010, entitled Dental Prosthetics Manipulation, Selection, and Planning, which is hereby incorporated by reference in its entirety for all purposes. -
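The axis-selection rule illustrated by FIGS. 9A-9C can be sketched, after projecting onto the occlusal plane, as follows: a candidate axis through two contact points is usable only if no other contact point lies on the same side of that axis as the center of gravity. This is one interpretation of the moment/normal-force check above, and the coordinates are illustrative, loosely following FIG. 9A:

```python
from itertools import combinations

# Illustrative 2D sketch in the occlusal plane: a contact point on the
# same side of a candidate axis as the center of gravity would be driven
# into the antagonist and block the rotation, so that axis is excluded.

def side(a, b, p):
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def usable_axes(contacts, cog):
    axes = []
    for a, b in combinations(contacts, 2):
        others = [p for p in contacts if p not in (a, b)]
        if all(side(a, b, p) * side(a, b, cog) <= 0 for p in others):
            axes.append((a, b))
    return axes

contacts = [(0.0, 0.0), (4.0, 0.0), (3.0, 2.0)]   # like points 940, 941, 942
cog = (3.5, 1.5)                                  # like center of gravity 950
print(usable_axes(contacts, cog))  # only the axis through the last two points
```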
FIG. 3B illustrates another method 301 of occlusion estimation for dental prosthesis design. In method 301, one or more 3D models of prosthetic teeth and the 3D model of their antagonist may be received in block 311. For example, looking to FIG. 7, an antagonist 730 may be received, in addition to 3D models of individual prosthetic teeth. Also in block 311, an initial position of the one or more 3D models of prosthetic teeth and the 3D model of their antagonist may be received or determined. For example, an operator may initially place the teeth with respect to the antagonist, or the relative placement of the teeth and the antagonist may be algorithmically determined or known based on the scanning procedure used to obtain the 3D models (described elsewhere herein). - After the 3D models have been received in
block 311, contact points between each of the one or more 3D models of prosthetic teeth and the antagonist may be determined in block 321. Determining the contact points between the one or more 3D models and the antagonist may include rotating about an axis, simulating motion, manipulating the size, translation, rotation, or orientation of the 3D models until a contact point is determined, and the like. Returning again to FIG. 7, block 321 may, in some embodiments, include determining contact points between each of the 3D models of the individual prosthetic teeth and the antagonist 730. After a first contact point is determined for each of the 3D models, the shared axis 755 may define the axis about which to rotate each of the 3D models, and each 3D model may continue to be rotated in block 321 until two contact points are determined that are on opposite sides of the center of gravity, which is assessed in block 330 (e.g., similar to the method 300). After the one or more contact points have been determined in block 321, then, in block 330, stopping criteria may be checked. - The stopping criteria may include the determination of a single contact point or the determination of multiple contact points as described above with respect to
method 300. As discussed above, in some embodiments, multiple contact points may be determined for each of the 3D models. In method 301, if two or more contact points are determined for each of the 3D models of prosthetic teeth, each 3D model may be placed relative to the antagonist based on its own contact points. - In some embodiments, the one or more 3D models of prosthetic teeth may be expanded or contracted until there is a single contact point (or multiple contact points). This expansion or contraction can continue until the stopping criteria are met (e.g., determining the requisite number of contact points). The expansion or contraction may also be followed by motion simulation. In some embodiments, each of the individual one or more prosthetic teeth will have a separate simulated motion (e.g., without the constraints of the
axis 755 shown in FIG. 7). The separate simulated motion of each of the 3D models of the prosthetic teeth may proceed in a manner similar to the simulated motion described above with respect to method 300. - After the predetermined stopping criteria have been met as determined in
block 330, then, in block 341, the new state for the one or more 3D models based on the contact point may be determined. The new state may be the new position, rotation, orientation, size, and/or shape of the one or more 3D models of the prosthetic teeth. After the new state has been determined in block 341, then, optionally, the operator may continue designing the prosthesis or may generate production data for the prosthesis (block 350). -
methods block 310 and proceeding directly to block 320. In such a simulation, the first contact point will be determined by free fall of one 3D model onto the other and then subsequent contact points may be determined inblock 320 until the predetermined stopping criteria are met inblock 330. Then the relative placement on the occluding sets of teeth based on the contact points may be determined inblock 340. In various embodiments, the stopping criteria may include there being no further movement in the motion simulation. For example, the simulation of motion may continue until the two 3D models are in a static position, one with respect to the other. Various other embodiments are also considered within the scope herein. -
FIG. 4 illustrates an interface 400 with an overlaid representation portion 410 that depicts a lower teeth model 420 and an upper teeth model 430. As depicted in the figure, there may initially be a gap between the lower teeth model 420 and the upper teeth model 430. In some embodiments, as depicted in FIG. 5, which illustrates an interface 500 with an overlaid representation portion 510, having three contact points on an approximately linear set of upper and lower teeth models (e.g., 520, 530) may cause the 3D models to “fall” in a way that is undesirable or anatomically impossible. As depicted in FIG. 5, 3D model 530 has fallen onto 3D model 520 and has tilted in a way that would not be possible given the constraints of the human jaw. In such situations, it may be desirable to have a stopping criterion that includes looking for two candidate contact points that are on opposite sides of a center of gravity. FIG. 6 illustrates an interface 600 with an overlaid representation portion 610 in which an upper teeth model 630 was dropped onto a lower teeth model 620 using motion simulation. In FIG. 6, the stopping criterion used to determine the relative placement of the lower teeth model 620 and the upper teeth model 630 may include determining two candidate contact points 640 and 641 that are on opposite sides of the center of gravity 650. In comparing FIGS. 5 and 6, it may be seen that, in some circumstances, use of this two-point stopping criterion may produce better results than a three-point stopping criterion. -
FIGS. 7 and 8 depict multiple 3D models of individual prosthetic teeth being moved with respect to an antagonist. FIG. 7 is described above. FIG. 8 illustrates an interface 800 that includes an overlaid representation portion 810. The overlaid representation portion 810 illustrates the movement of 3D models of individual prosthetic teeth with respect to an antagonist 830. The interface also shows a lower teeth model 820. Interface 800 also illustrates a shared axis of rotation 855 for the 3D models of the individual prosthetic teeth. In embodiments not depicted in FIG. 8, if there is no axis of rotation 855, then each of the individual prosthetic teeth may have simulated motion performed, may be scaled, translated, rotated, or otherwise modified until contact points are determined, or any other appropriate technique may be used. Further, in some embodiments, collision detection or other techniques may be used to ensure that neighboring teeth do not overlap or otherwise have intersecting volumes. Examples of this are described elsewhere herein. - Various of the embodiments herein show interfaces of a certain configuration. Other configurations of interfaces are also possible. Turning to
FIG. 10, it is possible that an interface 1000 can have an overlaid representation portion 1010, a global selection portion 1011, and a distance map portion 1012, all on a single interface 1000. It is also possible, as depicted in FIG. 11, that two separate sub-interfaces 1100 and 1101 may be used: the distance map portion 1120 may be on interface portion 1101, and the overlaid representation portion 1110 and global selection portion 1111 may be on interface portion 1100. These various interface portions may be shown on separate screens, on separate displays, or in separate windows. Other configurations of the various portions on various displays or in various windows may also be used. - As discussed above, when designing a virtual multi-tooth prosthesis, the operator may move 3D models of individual prosthetic teeth independently of one another. Consider, for example,
FIG. 12. FIG. 12 depicts an interface 1200 which has an overlaid representation portion 1210. In the overlaid representation portion 1210, there is a 3D model of lower teeth 1220, which is shown as opaque, as well as 3D models of prosthetic teeth. Also depicted in FIG. 12 are manipulation handles, which may allow the operator to manipulate the 3D models of prosthetic teeth with respect to each other, with respect to the lower teeth 1220, and/or with respect to a virtual multi-tooth prosthesis encompassing the 3D models of prosthetic teeth. The manipulators may allow the operator to translate, scale, or otherwise modify the 3D prosthetic teeth. For example, if the operator were to select manipulator 1290 and were to shift it left (in the orientation shown in FIG. 12), then the 3D model of prosthetic tooth 1270 may decrease in size (e.g., be scaled smaller), and the 3D model of prosthetic tooth 1271 may increase in size (e.g., be scaled larger). This is depicted in FIG. 14B, in which the manipulator 1490 has been moved to the left with respect to the location of manipulator 1290 in FIG. 12, and the 3D model of a prosthetic tooth 1470 has decreased in size as compared with the 3D model of prosthetic tooth 1270 in FIG. 12. The 3D model of the prosthetic tooth 1471 has increased in size as compared with the 3D model of prosthetic tooth 1271 in FIG. 12. Returning again to FIG. 12, if a different manipulator is moved by the operator, for example, manipulator 1281, then the tooth associated with that manipulator may be translated with respect to the other teeth or with respect to the virtual multi-tooth prosthesis. Looking again to FIG. 14B, we see that manipulator 1481 has been moved up in screen space with respect to where it was in FIG. 12. Therefore, the 3D model of the prosthetic tooth 1471 has been translated up in screen space with respect to the other teeth of the virtual multi-tooth prosthesis. - Other manipulators and manipulations are also possible and considered part of the scope of embodiments discussed herein. In various embodiments, other types of manipulations of the teeth are also possible. For example, there may be a manipulator (not depicted in
FIG. 12) that would allow the operator to rotate an individual prosthetic tooth. -
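The shared-manipulator behaviour described for FIGS. 12 and 14B (shifting manipulator 1290 shrinks one neighbouring tooth while growing the other) can be sketched as a complementary rescaling of two widths. The helper below is hypothetical and works on abstract widths rather than 3D models:

```python
def apply_boundary_manipulator(width_a, width_b, shift):
    """Move the shared boundary between two neighbouring prosthetic teeth.

    A positive `shift` grows tooth A and shrinks tooth B by the same
    amount (and vice versa), so the combined width of the pair, and hence
    the span of the multi-tooth prosthesis, is preserved.
    """
    new_a, new_b = width_a + shift, width_b - shift
    if new_a <= 0 or new_b <= 0:
        raise ValueError("shift would reduce a tooth to zero width")
    return new_a, new_b
```

For example, `apply_boundary_manipulator(8.0, 10.0, -2.0)` shrinks the first width to 6.0 and grows the second to 12.0, keeping the total span at 18.0.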
FIG. 13A depicts a method 1300 for prosthesis manipulation in dental prosthesis design. In block 1310, a virtual multi-tooth prosthesis is presented relative to an area to be reconstructed. For example, looking to FIG. 12, a virtual multi-tooth prosthesis including 3D models of prosthetic teeth may be displayed with respect to the lower teeth model 1220, as well as with respect to its antagonist teeth (not shown in FIG. 12). - In
block 1320, a manipulation command is received, said manipulation command relating to a subset of the teeth in the virtual prosthesis. As used herein, the phrase “subset of the teeth in the virtual prosthesis” includes its customary and ordinary meaning, including meaning that the subset is fewer than all of the teeth in the virtual prosthesis, including one tooth in the virtual prosthesis. For example, looking again to FIG. 12, a manipulation command may be received related to a single 3D model of a prosthetic tooth 1270 or with respect to multiple models of prosthetic teeth. For example, the manipulation command related to tooth 1270 may be a translation manipulation indicated by the movement of manipulator 1280. This manipulation command may affect only the 3D model of prosthetic tooth 1270; as discussed more below, it may also affect, perhaps to a lesser extent, the position, scaling, placement, etc. of other 3D models of prosthetic teeth. Another example may be the movement of manipulator 1290, which indicates a scaling of the 3D models of prosthetic teeth 1270 and 1271, but not of the 3D model of prosthetic tooth 1272. - Returning again to
FIG. 13A, block 1330 includes modifying the prosthesis based on the received manipulation commands. Modifying the teeth based on the received manipulation commands may include any appropriate action. For example, if the manipulation command is meant to translate a single 3D model of a tooth in the lingual or buccal direction, then that tooth may be translated in the lingual or buccal direction with respect to the other teeth and/or with respect to the virtual prosthesis. If the command received requires scaling of two or more of the teeth with respect to one another or with respect to the virtual multi-tooth prosthesis, then the 3D models of those teeth may be scaled appropriately. That is, in some embodiments, one of the teeth may be scaled to increase its size and the other may be scaled to decrease its size. Scaling the teeth with respect to one another in this way may prevent large gaps from forming in the multi-tooth prosthesis and/or prevent overlap between neighboring teeth. - Modifying the prosthesis based on the received manipulations (in block 1330) may include performing the requested action and, in some embodiments, performing additional actions or calculations in order to align or place all of the 3D models of teeth in the prosthesis and/or reduce gaps (or correct overlaps) between neighboring teeth. For example, in some embodiments, when neighboring teeth are scaled or translated, a gap may form between two teeth, as depicted in
FIG. 15A. The gap depicted in FIG. 15A may have resulted, for example, from a scaling of the 3D model of prosthetic tooth 1570 with respect to the 3D model of prosthetic tooth 1571, or it may have resulted from translating the 3D model of prosthetic tooth 1570 and/or the 3D model of prosthetic tooth 1571 with respect to one another. - In some embodiments, the techniques may include calculating the relative placement of all of the 3D models of prosthetic teeth in the virtual multi-tooth prosthesis after each manipulation command (or possibly after a series of manipulation commands). For example, after the manipulation command is received, all of the 3D models of the prosthetic teeth may be placed next to one another in the area to be reconstructed using bounding volumes as a first approximation. After the initial placement, the gaps (or overlaps) of the 3D models of the prosthetic teeth may be reduced or eliminated using the techniques described elsewhere herein. For example, looking to
FIG. 14A, we see that the 3D models of prosthetic teeth have bounding boxes (shown in two dimensions as rectangles in FIG. 14B, even though they may be rectilinear cubes). These bounding boxes may be used as a first approximation of the relative placement of the 3D models of prosthetic teeth: the bounding boxes may be placed next to one another in the area to be reconstructed, with the teeth placed inside them. After this initial placement of the bounding boxes, the relative placement of the teeth models may be refined (e.g., using the techniques described with respect to FIGS. 15A and 15B and elsewhere herein). - In other embodiments, after one or more manipulation commands are received, only the affected tooth or teeth may be manipulated, thereby leaving unchanged the placement, scale, and rotation of one or more of the 3D models of teeth in the virtual multi-tooth prosthesis. For example, if the
manipulator 1490 in FIG. 14B is manipulated, then this may only affect the scale of the 3D models of prosthetic teeth 1470 and 1471, and the 3D model of prosthetic tooth 1472 may remain unchanged in position, scale, and/or rotation. After the two 3D models of teeth 1470 and 1471 have been manipulated, any gap or overlap between neighboring teeth may be corrected using the techniques described with respect to FIGS. 15A and 15B and elsewhere herein. - In some embodiments, an initial placement or alignment of the 3D models of
prosthetic teeth may be determined using bounding volumes or bounding boxes. - In some embodiments, the techniques may use bounding volumes or bounding boxes, not only for initial alignment of the 3D models of prosthetic teeth, but also for attempting to ensure that 3D models of neighboring teeth do not intersect or overlap in terms of volume. As neighboring teeth are scaled, for example, the bounding box can be used as a first approximation to ensure that neighboring teeth do not intersect or overlap in terms of volume. Similarly, when one of the 3D models of prosthetic teeth is translated, the bounding boxes may be used as a first approximation in order to ensure that the 3D models of prosthetic teeth do not intersect or overlap in terms of volume. The bounding boxes may also be used for rotation or any other manipulation of one or more teeth in the virtual multi-tooth prosthesis.
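The bounding-box bookkeeping described above can be sketched with axis-aligned boxes: boxes are abutted along a straight stand-in for the arch as a first approximation, and a simple per-axis test flags intersecting volumes. This is a sketch under those simplifying assumptions, with hypothetical names, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box of a 3D tooth model (min and max corners)."""
    mins: tuple
    maxs: tuple

def boxes_overlap(a: Box, b: Box) -> bool:
    """Two axis-aligned boxes intersect iff their extents overlap on every
    axis; a cheap first check that neighbouring teeth do not share volume."""
    return all(a.mins[i] < b.maxs[i] and b.mins[i] < a.maxs[i] for i in range(3))

def place_in_row(widths, start=0.0):
    """First-approximation placement: one box per tooth width, abutted side
    by side along x, so neighbouring boxes touch but never intersect."""
    boxes, x = [], start
    for w in widths:
        boxes.append(Box((x, 0.0, 0.0), (x + w, 10.0, 10.0)))
        x += w
    return boxes
```

Because the overlap test uses strict inequalities, abutting boxes produced by `place_in_row` do not count as intersecting.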
-
FIGS. 15A and 15B depict two neighboring teeth 1570 and 1571 and their respective bounding boxes. As depicted in FIG. 15A, after a first relative placement of the 3D models of the prosthetic teeth has been computed, a gap may exist between the neighboring teeth. In some embodiments, the gap is closed in order to increase the aesthetic appeal and/or function of the virtual multi-tooth prosthesis. In some embodiments, the smallest distance between the two models may be determined and the gap, as illustrated by 1595 and 1596, between the two models may be closed. For example, each of the two 3D models may be moved toward the other in order to close the total gap between the 3D models (e.g., distance 1595 summed with distance 1596). The two 3D models may then touch or nearly touch, as depicted in FIG. 15B. In FIG. 15B, the two models touch or nearly touch at point 1597. - In some embodiments, in order to close or approximately close a gap between neighboring teeth, each 3D model may be scaled to bring the closest points between the two
3D models model 1570 may be scaled by an amount to close theprevious gap 1595 to bring the3D model 1570 to the border of the two bounding boxes (and the same may be the case for model 1571). Other methods and techniques of closing the gaps between teeth are also possible and are considered within the scope of the embodiments herein. Further, in some embodiments, the gap between neighboring teeth may not be closed in a virtual multi-tooth prosthesis. Or a connector may be used to span the space between the neighboring teeth (not depicted inFIGS. 15A and 15B ). - In the example of
FIGS. 15A and 15B, the neighboring teeth have a gap (represented by distances 1595 and 1596). The techniques described herein can include removing or reducing an overlap of neighboring teeth (not shown in FIGS. 15A and 15B). For example, the neighboring teeth may be scaled (e.g., scaled to be smaller) and/or translated in order to remove the overlap between the neighboring teeth. - After modifying the virtual multi-tooth prosthesis based on the received manipulation commands in
block 1330, then optionally, in block 1340, the virtual multi-tooth prosthesis may be modified based on occlusion with the antagonist. Modifying a prosthesis based on occlusion with antagonist teeth is described elsewhere herein. In some embodiments, once the virtual multi-tooth prosthesis has been modified, the prosthesis may be treated as a rigid object, occlusion of that rigid object may be calculated with respect to the antagonist, and the entire virtual multi-tooth prosthesis may move as a single structure. In other embodiments, each individual 3D model of a tooth may be modified separately based on its own occlusion with the antagonist. These two techniques are described elsewhere herein. - As described above with respect to block 1330 and with respect to
FIGS. 15A and 15B, if individual 3D models of teeth are modified in the virtual multi-tooth prosthesis based on occlusion with the antagonist (block 1340), then gaps (or overlaps) may form between neighboring teeth. That is, after the occlusion is estimated and the individual 3D models of teeth have been moved with respect to one another, a gap (or an overlap) may result between neighboring 3D models of teeth. Closing the gap (or avoiding the overlap) between neighboring teeth is described above. In the situations and embodiments in which the virtual multi-tooth prosthesis is moved as a rigid body, it is unlikely or impossible that an additional gap or overlap will be introduced between neighboring teeth, and therefore there may be no gap or overlap to correct. - After performing
block 1330 and/or block 1340, then, optionally, the virtual multi-tooth prosthesis may be presented relative to the area to be reconstructed in block 1310. Additionally, when the operator is ready to continue designing the multi-tooth prosthesis, the operator may continue to other steps not depicted in method 1300. Additionally, when the operator is ready to produce the multi-tooth prosthesis, the manufacturing data may be produced as part of block 1350. -
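The gap closing described above for FIGS. 15A and 15B (finding the smallest distance between two neighbouring models and moving each toward the other so they touch or nearly touch) can be sketched over point clouds. This is a minimal NumPy version that splits the gap evenly between the two models, with a hypothetical function name, not the patent's implementation:

```python
import numpy as np

def close_gap(model_a, model_b):
    """Translate two neighbouring tooth models (N x 3 and M x 3 point
    clouds) toward each other along the line joining their closest points,
    each by half of the smallest separating distance, so that afterwards
    the models touch or nearly touch."""
    diffs = model_a[:, None, :] - model_b[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)            # all pairwise distances
    i, j = np.unravel_index(np.argmin(dists), dists.shape)
    gap = dists[i, j]
    if gap == 0.0:                                   # already touching
        return model_a, model_b
    direction = (model_b[j] - model_a[i]) / gap      # unit vector from A toward B
    return model_a + direction * (gap / 2), model_b - direction * (gap / 2)
```

Two single-point "models" at x = 0 and x = 4 meet in the middle at x = 2 after the call.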
FIG. 13B depicts another method 1301 for prosthesis manipulation in dental prosthesis design. In some embodiments, if a command is received to translate, scale, rotate, or otherwise manipulate an individual prosthetic tooth in the virtual prosthesis (block 1321), then the 3D model for that prosthetic tooth may be manipulated based on that command and occlusion may be estimated for either that individual tooth or for the virtual prosthesis as a whole (block 1331). For example, turning to FIG. 12, each time a 3D model of a prosthetic tooth is manipulated, occlusion may be estimated for that tooth or for the virtual prosthesis as a whole. - Returning again to
FIG. 13B, in block 1311, a 3D model of a virtual prosthesis, possibly containing individual 3D models of individual prosthetic teeth, is presented relative to the area to be reconstructed. This is described generally with respect to block 1310. In block 1321, manipulation commands are received which are related to all or a portion of the prosthesis. The types of commands that may be received are described with respect to block 1320. - In block 1331, the prosthesis is modified based on the manipulation commands and based on the occlusion with antagonist teeth. Manipulation of the teeth is described above with respect to block 1330 and modifying the prosthesis based on occlusion with antagonist teeth is described above with respect to block 1340. After the prosthesis has been modified, both based on the manipulation command and based on the occlusion with antagonist teeth, the prosthesis may be again displayed relative to the area to be reconstructed in
block 1311. Additionally, once the operator is satisfied with the virtual prosthesis or is ready to produce the prosthesis, the operator may continue to other steps in prosthesis design (not pictured) or may produce manufacturing data for the prosthesis (block 1350). - Other methods and techniques may be used. Further, other blocks may be added to each of
methods 1300 and 1301, and blocks may be removed from, combined within, or reordered in each method. - The processes and systems described herein may be performed on or encompass various types of hardware, such as computing devices. In some embodiments,
computer 210, display 220, and/or input device 230 may each be separate computing devices, applications, or processes, or may run as part of the same computing devices, applications, or processes; one or more may be combined to run as part of one application or process, and/or each may be part of or run on one or more computing devices. Computing devices may include a bus or other communication mechanism for communicating information, and a processor coupled with the bus for processing information. The computing devices may have a main memory, such as a random access memory or other dynamic storage device, coupled to the bus. The main memory may be used to store instructions and temporary variables. The computing devices may also include a read-only memory or other static storage device coupled to the bus for storing static information and instructions. The computer systems may also be coupled to a display, such as a CRT or LCD monitor. Input devices may also be coupled to the computing devices. These input devices may include a mouse, a trackball, or cursor direction keys. - Each computing device may be implemented using one or more physical computers, processors, embedded devices, or computer systems or a combination or portions thereof. The instructions executed by the computing device may also be read in from a computer-readable storage medium. The computer-readable storage medium may be a CD, DVD, optical or magnetic disk, laserdisc, carrier wave, or any other medium that is readable by the computing device. In some embodiments, hardwired circuitry may be used in place of or in combination with software instructions executed by the processor. Communication among modules, systems, devices, and elements may be over direct or switched connections, and wired or wireless networks or connections, via directly connected wires, or any other appropriate communication mechanism.
The communication among modules, systems, devices, and elements may include handshaking, notifications, coordination, encapsulation, encryption, headers, such as routing or error detecting headers, or any other appropriate communication protocol or attribute. Communication may also include messages related to HTTP, HTTPS, FTP, TCP, IP, ebMS OASIS/ebXML, secure sockets, VPN, encrypted or unencrypted pipes, MIME, SMTP, MIME Multipart/Related Content-type, SQL, etc.
- Any appropriate 3D graphics processing may be used for displaying or rendering, including processing based on OpenGL, Direct3D, Java 3D, etc. Whole, partial, or modified 3D graphics packages may also be used, such packages including 3DS Max, SolidWorks, Maya, Form Z, Cybermotion 3D, or any others. In some embodiments, various parts of the needed rendering may occur on traditional or specialized graphics hardware. The rendering may also occur on the general CPU, on programmable hardware, on a separate processor, be distributed over multiple processors, over multiple dedicated graphics cards, or using any other appropriate combination of hardware or technique.
- As will be apparent, the features and attributes of the specific embodiments disclosed above may be combined in different ways to form additional embodiments, all of which fall within the scope of the present disclosure.
- Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements, and/or states are included or are to be performed in any particular embodiment.
- Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
- All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors, such as those computer systems described above. The code modules may be stored in any type of computer-readable storage medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware.
- It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Claims (19)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/884,618 US8594820B2 (en) | 2010-09-17 | 2010-09-17 | Prosthesis manipulation in dental prosthesis design |
PCT/IB2011/003158 WO2012035444A2 (en) | 2010-09-17 | 2011-09-13 | Prosthesis manipulation in dental prosthesis design |
CN201180044529.8A CN103108604B (en) | 2010-09-17 | 2011-09-13 | Dummy operation in dental prosthesis design |
DK11813556.5T DK2615998T3 (en) | 2010-09-17 | 2011-09-13 | Prosthetic manipulation in denture design |
EP11813556.5A EP2615998B1 (en) | 2010-09-17 | 2011-09-13 | Prosthesis manipulation in dental prosthesis design |
JP2013528789A JP6073227B2 (en) | 2010-09-17 | 2011-09-13 | Prosthesis operation in dental prosthesis design |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/884,618 US8594820B2 (en) | 2010-09-17 | 2010-09-17 | Prosthesis manipulation in dental prosthesis design |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120070803A1 true US20120070803A1 (en) | 2012-03-22 |
US8594820B2 US8594820B2 (en) | 2013-11-26 |
Family
ID=45531890
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/884,618 Active 2031-05-01 US8594820B2 (en) | 2010-09-17 | 2010-09-17 | Prosthesis manipulation in dental prosthesis design |
Country Status (6)
Country | Link |
---|---|
US (1) | US8594820B2 (en) |
EP (1) | EP2615998B1 (en) |
JP (1) | JP6073227B2 (en) |
CN (1) | CN103108604B (en) |
DK (1) | DK2615998T3 (en) |
WO (1) | WO2012035444A2 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110196653A1 (en) * | 2010-02-10 | 2011-08-11 | Nobel Biocare Services Ag | Dental data planning |
US20120115107A1 (en) * | 2010-11-04 | 2012-05-10 | Adams Bruce W | System and method for automated manufacturing of dental orthotics |
US20130297275A1 (en) * | 2012-05-02 | 2013-11-07 | Mark Sanchez | Systems and methods for consolidated management and distribution of orthodontic care data, including an interactive three-dimensional tooth chart model |
WO2015117973A1 (en) * | 2014-02-04 | 2015-08-13 | Sirona Dental Systems Gmbh | Method for the computer-aided editing of a digital 3d model |
US9179988B2 (en) | 2010-05-25 | 2015-11-10 | Biocad Medical, Inc. | Dental prosthesis connector design |
US9226806B2 (en) | 2010-09-17 | 2016-01-05 | Biocad Medical, Inc. | Occlusion estimation in dental prosthesis design |
US9801699B2 (en) | 2013-03-14 | 2017-10-31 | Devin Okay | Paired templates for placing dental implants and enhancing registration for denture prosthetics attached to the implants |
US20170364659A1 (en) * | 2014-12-31 | 2017-12-21 | Osstemimplant Co., Ltd. | Method for dental implant planning, apparatus for same, and recording medium having same recorded thereon |
KR20190071952A (en) * | 2017-12-15 | 2019-06-25 | 주식회사 디디에스 | method for designing Virtual prosthesis |
US10856957B2 (en) * | 2018-05-03 | 2020-12-08 | Dentsply Sirona Inc. | Methods of three-dimensional printing for fabricating a dental appliance |
US11185394B2 (en) | 2013-08-26 | 2021-11-30 | James R. Glidewell Dental Ceramics, Inc. | Computer-implemented dental restoration design |
US20220061957A1 (en) * | 2020-08-31 | 2022-03-03 | James R. Glidewell Dental Ceramics, Inc. | Automatic bite setting |
CN114342002A (en) * | 2019-09-05 | 2022-04-12 | 登士柏希罗纳有限公司 | Method, system and apparatus for customizing an on-the-fly automated design of a dental object |
US20220254511A1 (en) * | 2019-02-20 | 2022-08-11 | Osstemimplant Co., Ltd. | Dental formula information input method, dental formula information input device, and recording medium |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102013203449A1 (en) | 2013-02-28 | 2014-08-28 | Sirona Dental Systems Gmbh | Method and device for controlling a computer program by means of an intraoral scanner |
US10166091B2 (en) | 2014-02-21 | 2019-01-01 | Trispera Dental Inc. | Augmented reality dental design method and system |
US10327867B2 (en) | 2015-02-25 | 2019-06-25 | James R. Glidewell Dental Ceramics, Inc. | Arch form placement for dental restoration design |
ES2738982T3 (en) | 2015-03-19 | 2020-01-28 | Nobel Biocare Services Ag | Object segmentation in image data through the use of channel detection |
WO2017116033A1 (en) * | 2015-12-28 | 2017-07-06 | 오스템임플란트 주식회사 | Method for dental implant planning, apparatus for same, and recording medium having same recorded thereon |
US10820970B2 (en) * | 2017-04-07 | 2020-11-03 | 3M Innovative Properties Company | Method of making a dental restoration |
KR101984028B1 (en) | 2017-11-06 | 2019-06-04 | 주식회사 디디에스 | Method and system for design dental prosthesis based on arch lines |
KR101974636B1 (en) * | 2018-04-18 | 2019-05-03 | 오스템임플란트 주식회사 | Dental implant planning method, apparatus and recording medium thereof |
KR102138919B1 (en) * | 2019-04-05 | 2020-07-30 | 오스템임플란트 주식회사 | Method for adjusting prosthetic parameter and prosthetic CAD apparatus therefor |
KR102138922B1 (en) * | 2019-04-25 | 2020-07-28 | 오스템임플란트 주식회사 | Method for calculating contact distance to peripheral teeth of prosthesis using contact direction interface in designing prosthesis and prosthetic CAD apparatus therefor |
KR102292875B1 (en) * | 2019-08-08 | 2021-08-24 | 오스템임플란트 주식회사 | Method for designing dental guide and apparatus thereof |
BR102022006437A2 (en) * | 2022-04-04 | 2023-10-17 | Ruy Teichert Filho | TOOTH MORPHOLOGY SELECTION SYSTEM AND METHOD |
CN115192227B (en) * | 2022-07-08 | 2024-07-30 | 先临三维科技股份有限公司 | Tooth standard frame adjusting method, device, equipment and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5027281A (en) * | 1989-06-09 | 1991-06-25 | Regents Of The University Of Minnesota | Method and apparatus for scanning and recording of coordinates describing three dimensional objects of complex and unique geometry |
US20060063135A1 (en) * | 2002-11-11 | 2006-03-23 | Albert Mehl | Method for producing denture parts or for tooth restoration using electronic dental representations |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4663720A (en) | 1984-02-21 | 1987-05-05 | Francois Duret | Method of and apparatus for making a prosthesis, especially a dental prosthesis |
JP3672966B2 (en) * | 1995-04-14 | 2005-07-20 | 株式会社ユニスン | Method and apparatus for creating dental prediction model |
JPH09206320A (en) * | 1996-02-02 | 1997-08-12 | Technol Res Assoc Of Medical & Welfare Apparatus | Plate denture design supporting device |
JPH09220237A (en) * | 1996-02-19 | 1997-08-26 | Shiyuukai | Manufacture of dental plate |
JPH1075963A (en) | 1996-09-06 | 1998-03-24 | Nikon Corp | Method for designing dental prosthetic appliance model and medium recording program for executing the method |
US5975893A (en) | 1997-06-20 | 1999-11-02 | Align Technology, Inc. | Method and system for incrementally moving teeth |
US6705863B2 (en) | 1997-06-20 | 2004-03-16 | Align Technology, Inc. | Attachment devices and methods for a dental appliance |
CA2351420A1 (en) | 1998-11-30 | 2000-06-08 | Loc X. Phan | Attachment devices and methods for a dental appliance |
EP2266492A3 (en) * | 1999-12-29 | 2012-12-26 | Ormco Corporation | Method and apparatus for forming a custom orthodontic appliance |
US6463344B1 (en) | 2000-02-17 | 2002-10-08 | Align Technology, Inc. | Efficient data representation of teeth model |
US6947038B1 (en) | 2000-04-27 | 2005-09-20 | Align Technology, Inc. | Systems and methods for generating an appliance with tie points |
US7471821B2 (en) | 2000-04-28 | 2008-12-30 | Orametrix, Inc. | Method and apparatus for registering a known digital object to scanned 3-D model |
US7080979B2 (en) | 2001-04-13 | 2006-07-25 | Orametrix, Inc. | Method and workstation for generating virtual tooth models from three-dimensional tooth data |
US6767208B2 (en) | 2002-01-10 | 2004-07-27 | Align Technology, Inc. | System and method for positioning teeth |
EP1509158B1 (en) | 2002-05-31 | 2009-04-01 | Ormco Corporation | Providing custom orthodontic treament with appliance components from inventory |
US7252509B2 (en) | 2002-09-17 | 2007-08-07 | Orametrix, Inc. | Tooth templates for bracket positioning and other uses |
DE10312848A1 (en) | 2003-03-21 | 2004-10-07 | Sirona Dental Systems Gmbh | Database, tooth model and tooth replacement part, built up from digitized images of real teeth |
US7228191B2 (en) | 2003-05-02 | 2007-06-05 | Geodigm Corporation | Method and apparatus for constructing crowns, bridges and implants for dental use |
US7731495B2 (en) | 2005-12-20 | 2010-06-08 | 3M Innovative Properties Company | User interface having cross section control tool for digital orthodontics |
KR100795645B1 (en) * | 2007-08-17 | 2008-01-17 | 김정한 | Method for manufacturing the one body abutment of implant |
-
2010
- 2010-09-17 US US12/884,618 patent/US8594820B2/en active Active
-
2011
- 2011-09-13 WO PCT/IB2011/003158 patent/WO2012035444A2/en active Application Filing
- 2011-09-13 DK DK11813556.5T patent/DK2615998T3/en active
- 2011-09-13 CN CN201180044529.8A patent/CN103108604B/en active Active
- 2011-09-13 JP JP2013528789A patent/JP6073227B2/en active Active
- 2011-09-13 EP EP11813556.5A patent/EP2615998B1/en active Active
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9934360B2 (en) | 2010-02-10 | 2018-04-03 | Biocad Medical, Inc. | Dental data planning |
US20110196653A1 (en) * | 2010-02-10 | 2011-08-11 | Nobel Biocare Services Ag | Dental data planning |
US9179988B2 (en) | 2010-05-25 | 2015-11-10 | Biocad Medical, Inc. | Dental prosthesis connector design |
US9226806B2 (en) | 2010-09-17 | 2016-01-05 | Biocad Medical, Inc. | Occlusion estimation in dental prosthesis design |
US20160151132A1 (en) * | 2010-09-17 | 2016-06-02 | Biocad Medical, Inc. | Occlusion estimation in dental prosthesis design |
US10891403B2 (en) * | 2010-09-17 | 2021-01-12 | Biocad Medical, Inc. | Occlusion estimation in dental prosthesis design |
US20120115107A1 (en) * | 2010-11-04 | 2012-05-10 | Adams Bruce W | System and method for automated manufacturing of dental orthotics |
US20130297275A1 (en) * | 2012-05-02 | 2013-11-07 | Mark Sanchez | Systems and methods for consolidated management and distribution of orthodontic care data, including an interactive three-dimensional tooth chart model |
US9510918B2 (en) * | 2012-05-02 | 2016-12-06 | Cogent Design, Inc. | Systems and methods for consolidated management and distribution of orthodontic care data, including an interactive three-dimensional tooth chart model |
US9801699B2 (en) | 2013-03-14 | 2017-10-31 | Devin Okay | Paired templates for placing dental implants and enhancing registration for denture prosthetics attached to the implants |
US11185394B2 (en) | 2013-08-26 | 2021-11-30 | James R. Glidewell Dental Ceramics, Inc. | Computer-implemented dental restoration design |
WO2015117973A1 (en) * | 2014-02-04 | 2015-08-13 | Sirona Dental Systems Gmbh | Method for the computer-aided editing of a digital 3d model |
EP3241522A4 (en) * | 2014-12-31 | 2018-10-03 | Osstemimplant Co., Ltd. | Method for dental implant planning, apparatus for same, and recording medium having same recorded thereon |
US20170364659A1 (en) * | 2014-12-31 | 2017-12-21 | Osstemimplant Co., Ltd. | Method for dental implant planning, apparatus for same, and recording medium having same recorded thereon |
US11195623B2 (en) * | 2014-12-31 | 2021-12-07 | Osstemimplant Co., Ltd. | Method for dental implant planning, apparatus for same, and recording medium having same recorded thereon |
KR20190071952A (en) * | 2017-12-15 | 2019-06-25 | 주식회사 디디에스 | method for designing Virtual prosthesis |
KR102004449B1 (en) | 2017-12-15 | 2019-07-26 | 주식회사 디디에스 | method for designing Virtual prosthesis |
US10856957B2 (en) * | 2018-05-03 | 2020-12-08 | Dentsply Sirona Inc. | Methods of three-dimensional printing for fabricating a dental appliance |
US20220254511A1 (en) * | 2019-02-20 | 2022-08-11 | Osstemimplant Co., Ltd. | Dental formula information input method, dental formula information input device, and recording medium |
US12009108B2 (en) * | 2019-02-20 | 2024-06-11 | Osstemimplant Co., Ltd. | Dental formula information input method, dental formula information input device, and recording medium |
CN114342002A (en) * | 2019-09-05 | 2022-04-12 | 登士柏希罗纳有限公司 | Method, system and apparatus for customizing an on-the-fly automated design of a dental object |
US20220061957A1 (en) * | 2020-08-31 | 2022-03-03 | James R. Glidewell Dental Ceramics, Inc. | Automatic bite setting |
Also Published As
Publication number | Publication date |
---|---|
DK2615998T3 (en) | 2019-06-24 |
CN103108604A (en) | 2013-05-15 |
EP2615998B1 (en) | 2019-05-15 |
WO2012035444A3 (en) | 2012-06-14 |
US8594820B2 (en) | 2013-11-26 |
JP6073227B2 (en) | 2017-02-01 |
CN103108604B (en) | 2016-01-13 |
WO2012035444A2 (en) | 2012-03-22 |
EP2615998A2 (en) | 2013-07-24 |
JP2013537077A (en) | 2013-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8594820B2 (en) | Prosthesis manipulation in dental prosthesis design | |
US10891403B2 (en) | Occlusion estimation in dental prosthesis design | |
US10952817B1 (en) | Systems and methods for determining orthodontic treatments | |
EP2615999B1 (en) | Adjusting dental prostheses based on soft tissue | |
US10314674B2 (en) | Dental prosthetics manipulation, selection, and planning | |
US10631954B1 (en) | Systems and methods for determining orthodontic treatments | |
US20150248538A1 (en) | 3d modelling of bodies | |
CA2798664C (en) | Dental prosthesis connector design | |
US20230397972A1 (en) | Method and device for processing three-dimensional oral cavity model and computer-readable recording medium | |
KR102463390B1 (en) | Data processing method and data processing apparatus | |
KR20210017370A (en) | Method for designing dental guide and apparatus thereof | |
EP2550615A2 (en) | Improving dental prosthesis robustness |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BIOCAD MEDICAL, INC., CANADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MANAI, MYRIAM;GIASSON, DAVID;HANG, BIN;REEL/FRAME:025677/0890
Effective date: 20101213
Owner name: NOBEL BIOCARE SERVICES AG, SWITZERLAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROBERTS, ADAM;REEL/FRAME:025678/0646
Effective date: 20101019
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8
|
AS | Assignment |
Owner name: SERVICES NOBEL BIOCARE PROCERA INC. / NOBEL BIOCARE PROCERA SERVICES INC., CANADA
Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:BIOCAD MEDICAL INC.;SERVICES NOBEL BIOCARE PROCERA INC. / NOBEL BIOCARE PROCERA SERVICES INC.;REEL/FRAME:064425/0928
Effective date: 20190101
|
AS | Assignment |
Owner name: NOBEL BIOCARE CANADA INC., CANADA
Free format text: MERGER;ASSIGNOR:SERVICES NOBEL BIOCARE PROCERA INC. / NOBEL BIOCARE PROCERA SERVICES INC.;REEL/FRAME:064670/0892
Effective date: 20190101