US20180240270A1 - Method for recording individual three-dimensional optical images to form a global image of a tooth situation - Google Patents
- Publication number
- US20180240270A1
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C11/00—Dental articulators, i.e. for simulating movement of the temporo-mandibular joints; Articulation forms or mouldings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
- A61C9/004—Means or methods for taking digitized impressions
- A61C9/0046—Data acquisition means or methods
- A61C9/0053—Optical means or methods, e.g. scanning the teeth by a laser or light beam
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G06T7/0032—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/344—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/32—Image data format
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/016—Exploded view
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2008—Assembling, disassembling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- the invention relates to a method for recording individual three-dimensional optical images to form a global image of a tooth situation comprising an upper jaw and lower jaw, wherein a first 3D model of a first subsection of the upper jaw, and a second 3D model of a second subsection of the lower jaw, are generated from the individual images.
- a number of registration methods are already known from the prior art in which individual sequentially-recorded, three-dimensional optical images are combined into a global image. Corresponding areas in the individual images are detected and overlapped. These areas are also termed overlapping areas.
- registration errors can arise during registration that are caused by excessively small overlapping areas, by imaging flaws, or by faulty registration algorithms. As a consequence of these registration errors, the generated global image deviates from the actual dimensions of the imaged object. The registration error increases with the length of the registration chain formed from the individual images.
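The growth of the registration error with chain length can be illustrated with a minimal 1-D sketch; the step size and per-step bias below are hypothetical values, not taken from the patent:

```python
def chained_error(n_images, step=5.0, bias=0.02):
    """Chain n_images frame-to-frame registrations in one dimension.

    Each pairwise registration estimates a 5.0 mm camera advance with a
    small systematic error (a hypothetical 0.02 mm bias per step).  The
    global position is the sum of the pairwise estimates, so the deviation
    from the true position grows with the chain length."""
    true_position = n_images * step
    estimated = sum(step + bias for _ in range(n_images))
    return abs(estimated - true_position)

short_drift = chained_error(5)    # drift after a short registration chain
long_drift = chained_error(50)    # drift after a ten times longer chain
```

With these numbers the drift scales linearly with the number of chained images, which is why the methods below anchor the two jaw models to each other with lateral images and contact patterns instead of relying on the chain alone.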
- this registration error has the disadvantage that it can lead to faulty planning of the dental prosthesis to be produced. The object of the present invention is hence to provide a registration method for determining a positional relationship between the individual images that reduces the registration error in order to produce a fitting dental prosthesis using the generated global image.
- the invention relates to a method for recording individual three-dimensional optical images to form a global image of a tooth situation comprising an upper jaw and a lower jaw.
- a first 3D model of a first subsection of the upper jaw and a second 3D model of a second subsection of the lower jaw are generated from the individual images.
- a geometric positional relationship is determined between the first 3D model and the second 3D model.
- the positional relationship is determined using a lateral image and/or using a contact pattern.
- the lateral image has an image area that at least partially comprises the first subsection of the upper jaw, and at least partially comprises the second subsection of the lower jaw.
- the contact pattern comprises a plurality of contact areas between the upper jaw and lower jaw.
- the contact pattern is measured using occlusion paper by placing the occlusion paper between the upper jaw and lower jaw, and then bringing the upper jaw and lower jaw into a closed-bite position. Then, after the occlusion paper has been removed, the individual images are measured in order to record the contact areas of the contact pattern on the upper jaw and lower jaw. Then the generated first 3D model and the generated second 3D model are analyzed by means of a computer in order to determine the position of the contact areas.
- the optical measurement can for example be performed by a dental camera based on a strip projection method, a confocal optical method, or a color strip projection method.
- a pattern consisting of a number of colored strips is projected onto the object. Then the depth coordinates for the measured points are determined, and a 3D model of the object is generated.
- the colored strips can be clearly identified by their color. Four colored strips and three color transitions can, for example, be used for color-coding the colored strips.
- the colored strips can for example be generated using a slide.
- a different strip projection method can also be used in which the strips are coded using different optical properties such as intensity, color, polarization, coherence, phase, contrast, location or propagation time.
- the strip width of such strip projection methods can for example be 130 μm in the measured volume of the object to be measured.
- a so-called confocal chromatic triangulation method can also be used for the measurement in which the concepts of confocal measurement and triangulation measurement are combined with each other.
- the basic idea consists of coloring the surface of an object so that the height coordinate can be directly inferred from a color.
- the colors are generated by a spectral splitting of the projected light, and each wavelength is focused on its own height coordinate.
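The wavelength-to-height mapping of the confocal chromatic method can be sketched with a toy linear calibration; the spectral range and measuring depth below are assumed for illustration only, and a real scanner would use a measured calibration curve:

```python
def height_from_wavelength(wl_nm, wl_min=450.0, wl_max=650.0, depth_mm=15.0):
    """Infer the height coordinate from the detected peak wavelength.

    Assumes a linear chromatic dispersion: the shortest wavelength focuses
    at z = 0 mm and the longest at z = depth_mm.  All numbers are
    illustrative, not taken from the patent."""
    fraction = (wl_nm - wl_min) / (wl_max - wl_min)
    return fraction * depth_mm

z_mid = height_from_wavelength(550.0)   # a wavelength in mid-spectrum
```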
- the hand-held digital camera is moved relative to the dental object such as a lower jaw or an upper jaw during which the three-dimensional optical images are generated at regular intervals in time.
- the individual images can for example be generated at a cyclical frequency between 10 Hz and 20 Hz. Then the individual images are recorded using a computer and combined into a global image.
- in the first step, a subsection of the upper jaw or the entire upper jaw is thus measured, and the first 3D model is generated therefrom.
- in the second step, the second subsection of the lower jaw or the entire lower jaw is measured, and the second 3D model is generated by registration.
- the first 3D model and/or the second 3D model can, however, be distorted by the registration error, or by a calibration error in comparison to the actual dimensions of the measured object.
- the calibration error can for example be caused by faulty settings of camera parameters of the dental camera.
- the relevant camera parameters are the distance between the camera and the object being measured, the angle of incidence, as well as the grid interval of a grid to generate a strip pattern.
- the camera parameters can also be based on a pinhole camera model, wherein a distinction is drawn between intrinsic and extrinsic parameters.
- Possible intrinsic parameters are for example the focal length of the camera, the pixel coordinates of the image center, and distortion parameters.
- the extrinsic parameters can comprise the rotation and translation between the camera and projector.
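As a sketch of how the intrinsic parameters act together, a minimal ideal pinhole projection can be written as follows; the focal length and image-center values are made up for illustration and distortion is ignored:

```python
def project(point, f_px, cx, cy):
    """Project a camera-frame 3-D point (x, y, z) onto the image plane of
    an ideal pinhole camera (no distortion).  f_px is the focal length in
    pixel units and (cx, cy) are the pixel coordinates of the image
    center."""
    x, y, z = point
    return (f_px * x / z + cx, f_px * y / z + cy)

# illustrative values only
u, v = project((10.0, 5.0, 100.0), 20.0, 320.0, 240.0)
```

A calibration error in the sense above corresponds to using wrong values for these parameters, which scales or shifts every reconstructed point.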
- the registration error can for example be caused by the following factors: An excessively small overlapping area, insufficient waviness of the object surface in the overlapping area, insufficient roughness of the object surface in the overlapping area, an insufficient number of characteristic geometries in the overlapping area such as dental protuberances or fissures, and/or defective image quality in the overlapping area.
- the registration is faulty, for example, when the dental camera moves too quickly relative to the object, so that the overlapping area becomes too small. Another reason could be that the autofocus of the camera is not set correctly, so that the object is imaged indistinctly and the image quality is insufficient.
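The relation between camera speed, frame rate and overlap can be made concrete with a rough sketch; the field of view and the 50 % threshold are assumed values, while the 15 Hz frame rate lies in the 10 Hz to 20 Hz range named above:

```python
def overlap_ok(frame_rate_hz, speed_mm_s, fov_mm, min_overlap=0.5):
    """Estimate the overlap fraction of consecutive images: at frame rate
    f and sweep speed v the camera advances v / f between images, so a
    field of view of length L overlaps by 1 - v / (f * L).  The 50 %
    threshold is an illustrative assumption."""
    displacement = speed_mm_s / frame_rate_hz
    overlap = 1.0 - displacement / fov_mm
    return overlap >= min_overlap, overlap

# 16 mm field of view assumed; 30 mm/s sweep at 15 Hz
ok, frac = overlap_ok(15.0, 30.0, 16.0)
```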
- the geometric positional relationship between the first 3D model and the second 3D model is hence determined using the lateral image and/or using the contact pattern.
- the lateral image can for example be made from a labial direction, or a buccal direction, or at a slight angle thereto. It is key for the lateral image to comprise at least parts of the first subsection of the upper jaw and the second subsection of the lower jaw.
- the lateral image thereby makes it possible to determine the positional relationship between the first 3D model and the second 3D model, aided by characteristic structures of the upper jaw and lower jaw, or with the help of markers placed on the teeth.
- additional lateral images can also be created from different directions in order to check the positional relationship of different areas of the 3D models.
- the lateral image, or the buccal image can for example be taken in the area of teeth 14 , 44 or 24 , 34 , respectively, according to the FDI notation.
- the contact pattern can for example be measured using an occlusion paper, wherein the contact areas or contact points between the upper jaw and lower jaw are recorded in a closed-bite position. The precise positional relationship between the first 3D model of the upper jaw and the second 3D model of lower jaw can then be determined with the aid of these contact areas.
- the occlusion paper makes it possible to measure the proximal contact or contact areas of the contact pattern between the upper jaw and lower jaw in the closed-bite position.
- the positional relationship between the first 3D model and the second 3D model can then be determined with the assistance of a computer, while using the generated contact pattern. This is done by simulating the contact areas at which the upper jaw comes into contact with the lower jaw in the closed-bite position, using the measured contact pattern and the geometries of the first 3D model and second 3D model. As a result of this simulation, it is determined in the next step precisely where the virtual contact areas are arranged on the first 3D model and the second 3D model. Subsequently, the positional relationship between the first 3D model and second 3D model is established by overlapping the corresponding virtual contact areas of the two 3D models, or by minimizing the distance between the contact areas.
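The step of overlapping corresponding virtual contact areas, or minimizing their distances, amounts to a least-squares rigid fit. The patent does not specify an algorithm; the following is a minimal 2-D closed-form sketch with invented point coordinates:

```python
import math

def rigid_fit_2d(src, dst):
    """Closed-form 2-D least-squares rigid fit (rotation + translation)
    taking point set src onto dst, e.g. virtual contact areas on one 3D
    model onto the corresponding areas on the other model."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(q[0] for q in dst) / n
    cdy = sum(q[1] for q in dst) / n
    # cross and dot products of the centered point sets give the angle
    cross = sum((p[0] - csx) * (q[1] - cdy) - (p[1] - csy) * (q[0] - cdx)
                for p, q in zip(src, dst))
    dot = sum((p[0] - csx) * (q[0] - cdx) + (p[1] - csy) * (q[1] - cdy)
              for p, q in zip(src, dst))
    theta = math.atan2(cross, dot)
    c, s = math.cos(theta), math.sin(theta)
    return theta, cdx - (c * csx - s * csy), cdy - (s * csx + c * csy)

def move(params, p):
    """Apply the fitted rotation and translation to a point."""
    theta, tx, ty = params
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

# hypothetical contact-area centroids and a known displacement to recover
src = [(0.0, 0.0), (10.0, 0.0), (0.0, 6.0), (8.0, 5.0)]
dst = [move((0.3, 2.0, -1.0), p) for p in src]
fit = rigid_fit_2d(src, dst)
```

With noise-free correspondences the fit recovers the displacement exactly; with real, noisy contact areas it returns the least-squares optimum.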
- An advantage of this method is that a check of the generated 3D models is easily enabled using the lateral image and/or using the contact pattern. This minimizes the registration error and/or the calibration error.
- additional lateral images can be used to determine the positional relationship.
- the lateral images can be taken from a number of directions, such as for tooth pairs 14 - 44 , 11 - 41 and 24 - 34 according to the FDI notation.
- the lateral images can be taken from a labial or buccal direction, or from an oblique direction that is at a maximum angle of 30° to the labial direction.
- the lateral image can advantageously be searched for a first surface structure from the first 3D model, and for a second surface structure from the second 3D model, wherein the positional relationship between the first 3D model and the second 3D model is determined with the aid of the arrangement of the first surface structure relative to the second surface structure in the lateral image.
- the first surface structure and second surface structure can be characteristic structures of the upper jaw or the lower jaw such as certain teeth, gingiva structures or marks applied to the teeth.
- the precise position of these surface structures is located by means of a pattern recognition method, and the positional relationship between the first 3D model and second 3D model is derived therefrom.
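The patent does not detail the pattern recognition method; as a stand-in, a toy 1-D template match over an intensity profile shows the idea of locating a known structure (all values invented):

```python
def best_match(profile, template):
    """Locate a structure in a 1-D intensity profile by minimizing the
    sum of squared differences against a template."""
    best_i, best_err = 0, float("inf")
    for i in range(len(profile) - len(template) + 1):
        err = sum((profile[i + j] - t) ** 2 for j, t in enumerate(template))
        if err < best_err:
            best_i, best_err = i, err
    return best_i

# a bright tooth-like feature (2, 5, 2) hidden at index 7
pos = best_match([0, 0, 1, 3, 1, 0, 0, 2, 5, 2, 0], [2, 5, 2])
```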
- the contact areas of the contact pattern can depict local correspondences between the first 3D model and the second 3D model.
- the contact areas hence arise from the contact between the tooth protuberances and tooth fissures in the upper jaw and lower jaw in the closed-bite position and accordingly correlate to local correspondences which enable the positional relationship between the two 3D models to be determined.
- the first 3D model and/or the second 3D model can be deformed or aligned with use of the determined geometric positional relationship in order to minimize a first deviation between first virtual contact areas on the first 3D model and second virtual contact areas on the second 3D model, and/or to minimize a second deviation between the arrangement of a first virtual surface structure of the first 3D model relative to a second virtual surface structure of the second 3D model and the arrangement of the corresponding first surface structure of the upper jaw relative to the corresponding second surface structure of the lower jaw from the lateral image.
- the registration error and/or calibration error can cause the first 3D model and second 3D model to be distorted or shifted in comparison to the actual dimensions of the measured object. This leads to the first deviation between contact areas in comparison to the contact pattern, and to the second deviation of the virtual surface structures in comparison to the lateral image.
- the first 3D model and/or the second 3D model are distorted or deformed such that the contact conditions arising from the contact pattern and the conditions arising from a lateral image, or from a number of lateral images, are satisfied.
- the first deviation and second deviation are thereby minimized as much as possible.
- a least-squares method, minimizing the sum of the squared errors, can be used for this.
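As a minimal illustration of minimizing the sum of error squares, assume a single 1-D shift parameter d and two sets of residuals (one from the contact pattern, one from the lateral images, all values hypothetical); for this simple case the optimum is the mean residual:

```python
def sse(residuals, d):
    """Sum of squared errors of the residuals for a candidate shift d."""
    return sum((r - d) ** 2 for r in residuals)

def best_shift(contact_residuals, lateral_residuals):
    """Shift minimizing the combined sum of squared errors over both
    residual sets; setting the derivative of sse to zero gives the mean
    of all residuals."""
    r = list(contact_residuals) + list(lateral_residuals)
    return sum(r) / len(r)

d_opt = best_shift([0.8, 1.2], [1.0])   # hypothetical residuals in mm
```

The real method minimizes over many deformation parameters at once, but the structure of the objective, a sum of squared contact and lateral-image deviations, is the same.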
- the first 3D model or second 3D model can be deformed by means of a deformation method in which the respective 3D model is divided into different sections which are connected to each other by simulated springs.
- the simulated spring forces of the springs cause the deformation to be evenly distributed to these sections, and the respective 3D model is flexibly deformed.
- These sections can for example be the individual optical three-dimensional images which were combined into the respective 3D model.
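The spring model can be sketched in one dimension: sections of a model joined by equal springs, with a correction imposed at one end. Repeated local relaxation spreads the deformation evenly over the sections; the section count and correction value below are made-up numbers:

```python
def relax_chain(n_sections, end_correction, sweeps=2000):
    """Distribute a correction imposed on the last section over a chain
    of sections connected by equal simulated springs.  Section 0 is held
    at 0 and the last section at end_correction; each relaxation sweep
    moves every free section to the mean of its neighbors, which
    converges to a linear ramp, i.e. every spring is stretched by the
    same amount."""
    u = [0.0] * n_sections
    u[-1] = end_correction
    for _ in range(sweeps):
        for i in range(1, n_sections - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1])
    return u

ramp = relax_chain(5, 1.0)   # five sections, 1.0 mm correction at the end
```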
- the first 3D model and/or the second 3D model can be deformed with the aid of the determined geometric positional relationship, such that the first virtual contact areas on the first 3D model correspond with the second virtual contact areas on the second 3D model, and/or such that the first virtual surface structure of the first 3D model is arranged relative to the second virtual surface structure of the second 3D model in the same way as the corresponding first surface structure of the upper jaw is arranged relative to the corresponding second surface structure of the lower jaw in the lateral image.
- the measuring error arising from the registration error and calibration error is thereby completely corrected.
- if the second 3D model has a higher measuring precision than the first 3D model, it can remain unchanged, and the first 3D model can be deformed to adapt to the second 3D model.
- conversely, if the first 3D model has a higher measuring precision than the second 3D model, it can remain unchanged, and the second 3D model can be deformed to adapt to the first 3D model.
- the individual images of the first 3D model and second 3D model can be recorded using a global registration.
- each image to be recorded is registered both with the previous image and with another previously recorded image in order to lessen the registration error.
- the image to be recorded, the previous image, and the additional image have common overlapping areas.
- the additional image can be the image before the previous one, or an even earlier image in the image series.
- the registration error is reduced by means of the global registration in that each image is compared not just with the previous image, but also with additional images with which it shares common overlapping areas.
- contradictions in the positional relationship can arise that are ascribable to registration error and/or calibration error.
- an average of the positional relationships can be calculated from the registrations with the previous image and the other images, so that the contradictions mutually cancel and the global registration error is minimized.
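Averaging contradictory pairwise estimates can be sketched directly; the 1-D positions below are invented:

```python
def fuse_estimates(estimates):
    """Average the position estimates of one image obtained by
    registering it against several earlier images; independent errors
    partly cancel, reducing the global registration error compared with
    using the previous image alone."""
    return sum(estimates) / len(estimates)

# four registrations of the same image against earlier images
fused = fuse_estimates([10.2, 9.9, 10.05, 9.85])
```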
- FIG. 1 shows a sketch to illustrate the existing method for registration
- FIG. 2 shows a sketch to illustrate the determination of the geometric positional relationship between the 3D models
- FIG. 3 shows the upper jaw and lower jaw from the occlusal direction after marking the contact areas
- FIG. 4 shows the first 3D model of the upper jaw and the second 3D model of the lower jaw with a registration error
- FIG. 5 shows the result of the correction from FIG. 4 .
- FIG. 1 shows a sketch to illustrate the present method for recording individual three-dimensional optical images 1 to form a global image of a tooth situation comprising an upper jaw 2 and a lower jaw 3 .
- a digital camera 4 that is based on a strip projection method or a confocal optical method is moved in a first step along a first direction of movement 5 around the upper jaw 2 in order to measure a first 3D model 6 , and then in a second step along a second direction of movement 7 around the lower jaw 3 in order to measure a second 3D model 8 .
- the three-dimensional optical images are generated at regular intervals in time.
- the individual images can for example be generated at a cyclical frequency between 10 Hz and 20 Hz.
- the individual images 1 are recorded with each other using the overlapping areas 9 that are depicted in dashed lines and combined into the first 3D model 6 and second 3D model 8 .
- a computer 10 records the measured data of the digital camera 4 , calculates the individual images 1 , records the individual images 1 , and combines the individual images into the first 3D model 6 and second 3D model 8 .
- the user has the option of moving and rotating the first 3D model 6 and second 3D model 8 by means of a cursor using input means such as a keyboard 11 and a mouse 12 in order to change the direction of observation.
- the first 3D model 6 and the second 3D model 8 can comprise the entire upper jaw or lower jaw, or only a subsection.
- FIG. 2 shows a sketch to illustrate the determination of the geometric positional relationship between the 3D models 6 and 8 .
- additional lateral images 20 , 21 and 22 are generated from a first image direction 23 , a second image direction 24 and a third image direction 25 .
- the lateral images 20 and 22 are hence taken from a buccal direction in the area of tooth pairs 16 - 46 and 26 - 36 according to the FDI notation.
- the second lateral image 21 is hence made from a buccal direction in the area of tooth pairs 11 - 41 and 21 - 31 according to the FDI notation.
- the lateral images 20 , 21 and 22 are searched using a pattern recognition method by means of the computer 10 from FIG. 1 .
- the first surface structure or second surface structure can be an individual tooth 26 which in the present case corresponds to tooth 14 according to the FDI notation, a group of teeth 27 that are depicted with dashed lines, and teeth 24 , 25 and 26 according to the FDI notation, or a characteristic structure 28 of the gingiva.
- the positional relationship between the first 3D model 6 and the second 3D model 8 can be determined by using the arrangement of the first surface structure 26 , 27 relative to the second surface structure 28 in lateral images 20 , 21 and 22 .
- the different lateral images 20 , 21 and 22 may provide different, contradictory positional relationships. These contradictions can be used to correct the first 3D model 6 and/or second 3D model 8 in order to generate an error-free global image.
- an occlusion paper 29 can be placed between the upper jaw 2 and lower jaw 3 . Then the upper jaw 2 and lower jaw 3 are brought into the depicted closed-bite position, wherein a colored layer of the occlusion paper 29 colors certain contact areas between the upper jaw 2 and lower jaw 3 . As in the depicted instance, the occlusion paper 29 can consist of a single sheet or of several strips that are clamped between the upper jaw 2 and lower jaw 3 . After the contact areas have been marked with the occlusion paper 29 , the upper jaw 2 and lower jaw 3 are measured as depicted in FIG. 1 , wherein the first 3D model 6 and the second 3D model 8 are generated with the marked contact areas.
- FIG. 3 shows the upper jaw 2 and lower jaw 3 from the occlusal direction after marking the contact areas 30 using the occlusion paper.
- the first contact areas 31 on the upper jaw 2 correspond to the second contact areas 32 on the lower jaw 3 .
- the first contact areas 31 and corresponding second contact areas 32 constitute local correspondences that enable the geometric positional relationships to be determined between the first 3D model of the upper jaw 2 and the second 3D model of the lower jaw 3 .
- FIG. 4 shows a sketch of the first 3D model 6 of the upper jaw 2 and the second 3D model 8 of the lower jaw 3 , wherein the second 3D model 8 significantly deviates from the first 3D model in one area 40 due to a registration error and/or a calibration error.
- the first 3D model 6 was brought into correspondence with the second 3D model 8 .
- the arrows designate the imaging directions 23 , 24 and 25 for the lateral images.
- the first contact areas 31 of the first 3D model 6 hence deviate significantly in the first area 40 from the second contact areas 32 of the second 3D model 8 , whereas the first contact areas 31 are brought into correspondence with the second contact areas 32 in the second area 41 .
- the lateral images taken from the imaging directions 23 and 24 can also be used to determine the positional relationship and to overlap the two 3D models 6 and 8 in the area 41 .
- the second 3D model 8 is deformed along a deformation direction 42 such that a first deviation between the first contact areas on the first 3D model 6 and second contact areas 32 on the second 3D model 8 is minimized.
- the lateral image taken from the imaging direction 25 can also be used, wherein a second deviation between the arrangement of a first virtual surface structure 27 of the first 3D model 6 relative to a second virtual surface structure 28 of the second 3D model 8 is minimized by matching the arrangement of the corresponding surface structures 27 and 28 in this lateral image.
- the least squares method can be used, for example.
- the first 3D model 6 has a greater measuring precision, such that the second 3D model 8 , which is subject to the registration error, is adapted to the first 3D model 6 along the deformation direction 42 .
- the second 3D model 8 can remain unchanged, and the first 3D model 6 can be adapted thereto.
- Conditions from the lateral images taken from directions 23 , 24 and 25 , as well as the conditions from the differences between the contact areas 31 and 32 , are hence used in order to correct the registration error and/or calibration error by means of a minimization method.
- FIG. 5 shows the result of the correction from FIG. 4 , wherein the first contact areas 31 of the first 3D model 6 and the second contact areas 32 of the second 3D model 8 are caused to overlap both in the first area 40 as well as in the second area 41 .
- the result of the method is hence a global image 50 comprising the first 3D model 6 and the second 3D model 8 , which were adapted to each other by means of the contact areas 31 and 32 , as well as by means of the lateral images taken from directions 23 , 24 and 25 , in order to eliminate the registration error.
Abstract
The invention relates to a method for recording individual three-dimensional optical images to form a global image of a tooth situation comprising an upper jaw and a lower jaw. A first 3D model of a first subsection of the upper jaw and a second 3D model of a second subsection of the lower jaw are produced from the individual images. Subsequently, a geometric positional relationship between the first 3D model and the second 3D model is determined, said positional relationship being determined by using a lateral image and/or using a contact pattern. Said lateral image comprises an image area which comprises at least part of the first subsection of the upper jaw and at least part of the second subsection of the lower jaw. Said contact pattern comprises several contact areas between the upper jaw and the lower jaw. Said contact pattern is measured by means of an occlusion paper.
Description
- This application is a continuation of application Ser. No. 14/421,219 filed Feb. 12, 2015, the entire content of which is hereby incorporated by reference for all purposes.
- The invention relates to a method for recording individual three-dimensional optical images to form a global image of a tooth situation comprising an upper jaw and lower jaw, wherein a first 3D model of a first subsection of the upper jaw, and a second 3D model of a second subsection of the lower jaw, are generated from the individual images.
- A number of registration methods are already known from the prior art in which individual sequentially-recorded, three-dimensional optical images are combined into a global image. Corresponding areas in the individual images are detected and overlapped. These areas are also termed overlapping areas. However, registration errors can arise during registration that are caused by excessively small overlapping areas from imaging flaws or from faulty registration algorithms. As a consequence of these registration errors, the generated global image deviates from the actual dimensions of the imaged object. This registration error increases as the length of a registration chain from the individual images increases.
- This has the disadvantage that the registration error can lead to faulty planning of the dental prosthesis to be produced.
- The object of the present invention is hence to provide a registration method for determining a positional relationship between the individual images that reduces the registration error in order to produce a fitting dental prosthesis using to the generated global image.
- The invention relates to a method for recording individual three-dimensional optical images to form a global image of a tooth situation comprising an upper jaw and a lower jaw. In so doing, a first 3D model of a first subsection of the upper jaw and a second 3D model of a second subsection of the lower jaw are generated from the individual images. Then a geometric positional relationship is determined between the first 3D model and the second 3D model. The positional relationship is determined using a lateral image and/or using a contact pattern. The lateral image has an image area that at least partially comprises the first subsection of the upper jaw, and at least partially comprises the second subsection of the lower jaw. The contact pattern comprises a plurality of contact areas between the upper jaw and lower jaw. The contact pattern is measured using occlusion paper by placing the occlusion paper between the upper jaw and lower jaw, and then bringing the upper jaw and lower jaw into a closed-bite position. Then, after the occlusion paper has been removed, the individual images are measured in order to record the contact areas of the contact pattern on the upper jaw and lower jaw. Then the generated first 3D model and the generated second 3D model are analyzed by means of a computer in order to determine the position of the contact areas.
- The optical measurement can for example be performed by a dental camera based on a strip projection method, a confocal optical method, or a color strip projection method.
- In the color strip projection method, a pattern consisting of a number of colored strips is projected onto the object. Then the depth coordinates for the measured points are determined, and a 3D model of the object is generated. The colored strips can be clearly identified by their color. Four colored strips and three color transitions can, for example, be used for color-coding the colored strips. The colored strips can for example be generated using a slide.
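The depth computation behind such a triangulation-based strip projection is not spelled out in the text. As a rough illustration, under an idealized geometry the height change follows from the lateral shift of a projected stripe and the projection angle; the function name and the simplified single-angle geometry below are illustrative assumptions, not the patent's calibration model:

```python
import math

def depth_from_stripe_shift(shift_mm: float, triangulation_angle_deg: float) -> float:
    """Simplified triangulation: a stripe projected at an angle onto the
    surface appears shifted sideways in the camera image, and the shift is
    proportional to the local height of the surface.
    (Idealized geometry; a real scanner uses a full calibration model.)"""
    return shift_mm / math.tan(math.radians(triangulation_angle_deg))

# A stripe shifted by 0.1 mm under a 45 degree projection angle
# corresponds to a height change of 0.1 mm.
print(round(depth_from_stripe_shift(0.1, 45.0), 6))  # -> 0.1
```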
- For the optical measurement, a different strip projection method can also be used in which the strips are coded using different optical properties such as intensity, color, polarization, coherence, phase, contrast, location or propagation time.
- The strip width of such strip projection methods can for example be 130 μm in the measured volume of the object to be measured.
- A so-called confocal chromatic triangulation method can also be used for the measurement in which the concepts of confocal measurement and triangulation measurement are combined with each other. The basic idea consists of coloring the surface of an object so that the height coordinate can be directly inferred from a color. The colors are generated by a spectral splitting of the projected light, and each wavelength is focused on its own height coordinate.
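The text gives no numeric wavelength-to-height mapping. A minimal sketch of the final lookup step, assuming a pre-measured calibration table (the table values are invented for illustration):

```python
def height_from_wavelength(wavelength_nm, calibration):
    """Linearly interpolate a height from a (wavelength -> height)
    calibration table, as a chromatic-confocal sensor would after
    detecting the wavelength of maximal reflected intensity.
    `calibration` is a sorted list of (wavelength_nm, height_mm) pairs."""
    for (w0, h0), (w1, h1) in zip(calibration, calibration[1:]):
        if w0 <= wavelength_nm <= w1:
            t = (wavelength_nm - w0) / (w1 - w0)
            return h0 + t * (h1 - h0)
    raise ValueError("wavelength outside calibrated range")

cal = [(450.0, 0.0), (550.0, 5.0), (650.0, 10.0)]
print(height_from_wavelength(500.0, cal))  # halfway between 450 and 550 -> 2.5
```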
- During the measurement, the hand-held dental camera is moved relative to the dental object, such as a lower jaw or an upper jaw, while the three-dimensional optical images are generated at regular intervals in time. The individual images can for example be generated at a cyclical frequency between 10 Hz and 20 Hz. Then the individual images are recorded using a computer and combined into a global image.
- In the first step, a subsection of the upper jaw or the entire upper jaw is thus measured, and the first 3D model is generated therefrom. In the second step, the second subsection of the lower jaw or the entire lower jaw is measured, and the second 3D model is generated by registration. The first 3D model and/or the second 3D model can, however, be distorted by the registration error, or by a calibration error, in comparison to the actual dimensions of the measured object. The calibration error can for example be caused by faulty settings of camera parameters of the dental camera. With a dental camera based on the strip projection method, the relevant camera parameters are the distance between the camera and the object being measured, the angle of incidence, and the grid interval of the grid used to generate the strip pattern.
- The camera parameters can also be based on a pinhole camera model, wherein a distinction is drawn between intrinsic and extrinsic parameters. Possible intrinsic parameters are for example the focal length of the camera, the pixel coordinates of the image center, and distortion parameters. The extrinsic parameters can comprise the rotation and translation between the camera and the projector.
- The registration error can for example be caused by the following factors: An excessively small overlapping area, insufficient waviness of the object surface in the overlapping area, insufficient roughness of the object surface in the overlapping area, an insufficient number of characteristic geometries in the overlapping area such as dental protuberances or fissures, and/or defective image quality in the overlapping area. The registration is for example faulty in those instances when the dental camera moves too quickly in relation to the object, which causes the size of the overlapping area to be insufficient. Another reason could be that the autofocus of the digital camera is not sharply set, thus causing the object to be indistinctly imaged such that the quality of the image is insufficient. An additional reason could be that movable objects such as the tongue of the patient or a finger of the treating dentist are recorded during measurement. Consequently, the overlapping areas of the images do not correspond. The cited reasons could hence lead to a distorted first 3D model of the upper jaw and/or a distorted second 3D model of the lower jaw in comparison to the object.
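The failure causes listed above can be turned into a simple pre-check that flags unreliable frame pairs before registration. The thresholds and inputs below are illustrative assumptions, not values from the patent:

```python
def registration_reliable(overlap_fraction, sharpness, n_features,
                          min_overlap=0.3, min_sharpness=0.5, min_features=3):
    """Heuristic pre-check mirroring the failure causes in the text:
    reject a frame pair whose overlap is too small, whose image is too
    blurred, or which shows too few characteristic geometries
    (protuberances, fissures). All thresholds are illustrative."""
    reasons = []
    if overlap_fraction < min_overlap:
        reasons.append("overlap too small (camera moved too fast?)")
    if sharpness < min_sharpness:
        reasons.append("image too blurred (autofocus not set sharply?)")
    if n_features < min_features:
        reasons.append("too few characteristic geometries in overlap")
    return (len(reasons) == 0, reasons)

ok, why = registration_reliable(overlap_fraction=0.1, sharpness=0.8, n_features=5)
print(ok, why)  # rejected because of the small overlap
```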
- The geometric positional relationship between the first 3D model and the second 3D model is hence determined using the lateral image and/or using the contact pattern.
- The lateral image can for example be made from a labial direction, or a buccal direction, or at a slight angle thereto. It is key for the lateral image to comprise at least parts of the first subsection of the upper jaw and of the second subsection of the lower jaw. The lateral image thereby makes it possible to determine the positional relationship between the first 3D model and the second 3D model, aided by characteristic structures of the upper jaw and lower jaw, or with the help of markers placed on the teeth. In addition to the first lateral image, additional lateral images can also be created from different directions in order to check the positional relationship of different areas of the 3D models. The lateral image, or the buccal image, can for example be taken in the area of the teeth.
- The contact pattern can for example be measured using an occlusion paper, wherein the contact areas or contact points between the upper jaw and lower jaw are recorded in a closed-bite position. The precise positional relationship between the first 3D model of the upper jaw and the second 3D model of the lower jaw can then be determined with the aid of these contact areas.
- The occlusion paper makes it possible to measure the proximal contacts or contact areas of the contact pattern between the upper jaw and lower jaw in the closed-bite position. The positional relationship between the first 3D model and the second 3D model can then be determined with the assistance of a computer, using the generated contact pattern. This is done by simulating the contact areas at which the upper jaw comes into contact with the lower jaw in the closed-bite position, using the measured contact pattern and the geometries of the first 3D model and second 3D model. As a result of this simulation, it is determined in the next step precisely where the virtual contact areas are arranged on the first 3D model and the second 3D model. Subsequently, the positional relationship between the first 3D model and second 3D model is established by overlapping the corresponding virtual contact areas of the two 3D models, or by minimizing the distance between the contact areas.
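One straightforward way to realize the "minimizing the distance between the contact areas" step is a closest-point search with a distance threshold. The brute-force sketch below (threshold and point data invented for illustration) shows the idea:

```python
def dist(p, q):
    """Euclidean distance between two 3D points given as tuples."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def virtual_contact_pairs(upper_pts, lower_pts, max_dist=0.05):
    """Brute-force search for point pairs between the two models that lie
    closer than `max_dist` -- a simple stand-in for simulating occlusal
    contact. Points are (x, y, z) tuples; the threshold is illustrative."""
    pairs = []
    for p in upper_pts:
        best = min(lower_pts, key=lambda q: dist(p, q))
        if dist(p, best) <= max_dist:
            pairs.append((p, best))
    return pairs

upper = [(0.0, 0.0, 1.00), (1.0, 0.0, 1.02), (2.0, 0.0, 1.50)]
lower = [(0.0, 0.0, 1.04), (1.0, 0.0, 1.00), (2.0, 0.0, 1.00)]
print(len(virtual_contact_pairs(upper, lower)))  # first two points touch -> 2
```

A production implementation would use a spatial index (k-d tree) instead of the quadratic scan.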
- An advantage of this method is that a check of the generated 3D models is easily enabled using the lateral image and/or using the contact pattern. This minimizes the registration error and/or the calibration error.
- Advantageously, additional lateral images can be used to determine the positional relationship.
- This makes it possible to check the positional relationship of primary areas of the upper jaw or lower jaw. The lateral images can be taken from a number of directions, such as for tooth pairs 14-44, 11-41 and 24-34 according to the FDI notation. The lateral images can be taken from a labial or buccal direction, or from an oblique direction that is at a maximum angle of 30° to the labial direction.
- To determine the positional relationship using a pattern recognition method, the lateral image can advantageously be searched for a first surface structure from the first 3D model and for a second surface structure from the second 3D model, wherein the positional relationship between the first 3D model and the second 3D model is determined with the aid of the arrangement of the first surface structure relative to the second surface structure in the lateral image.
- The first surface structure and second surface structure can be characteristic structures of the upper jaw or the lower jaw such as certain teeth, gingiva structures or marks applied to the teeth. The precise position of these surface structures is determined by means of the pattern recognition method, and the positional relationship between the first 3D model and second 3D model is determined therefrom.
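A basic pattern-recognition step of this kind can be sketched with normalized cross-correlation template matching. This is a generic technique chosen for illustration, not necessarily the method used in the patent; the image and template data are invented:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length value lists."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return num / den if den else 0.0

def match_template(image, template):
    """Locate a small structure (e.g. a marked tooth) in a lateral image
    by exhaustive normalized cross-correlation. `image` and `template`
    are 2D lists of grayscale values; returns the (row, col) of the best
    match."""
    th, tw = len(template), len(template[0])
    best, best_pos = -2.0, (0, 0)
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [image[r + i][c + j] for i in range(th) for j in range(tw)]
            flat = [template[i][j] for i in range(th) for j in range(tw)]
            score = ncc(patch, flat)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

img = [[0, 0, 0, 0],
       [0, 9, 1, 0],
       [0, 1, 9, 0],
       [0, 0, 0, 0]]
tpl = [[9, 1],
       [1, 9]]
print(match_template(img, tpl))  # -> (1, 1)
```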
- Advantageously, the contact areas of the contact pattern can depict local correspondences between the first 3D model and the second 3D model.
- The contact areas hence arise from the contact between the tooth protuberances and tooth fissures in the upper jaw and lower jaw in the closed-bite position and accordingly correlate to local correspondences which enable the positional relationship between the two 3D models to be determined.
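Given such paired contact areas, a standard way (not specified in the patent) to turn local correspondences into a positional relationship is a least-squares rigid fit. A 2D pure-Python sketch follows; the 3D case would typically use the SVD-based Kabsch algorithm:

```python
import math

def rigid_fit_2d(src, dst):
    """Least-squares rigid transform (rotation + translation) mapping the
    corresponding points `src` onto `dst` -- the classic way to derive a
    positional relationship from local correspondences. 2D for brevity."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Optimal rotation angle from cross/dot sums of the centered points.
    num = sum((sx - csx) * (dy - cdy) - (sy - csy) * (dx - cdx)
              for (sx, sy), (dx, dy) in zip(src, dst))
    den = sum((sx - csx) * (dx - cdx) + (sy - csy) * (dy - cdy)
              for (sx, sy), (dx, dy) in zip(src, dst))
    theta = math.atan2(num, den)
    tx = cdx - (math.cos(theta) * csx - math.sin(theta) * csy)
    ty = cdy - (math.sin(theta) * csx + math.cos(theta) * csy)
    return theta, tx, ty

# Points rotated by 90 degrees and shifted by (1, 0) are recovered exactly.
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [(1.0, 0.0), (1.0, 1.0), (0.0, 0.0)]
theta, tx, ty = rigid_fit_2d(src, dst)
print(round(math.degrees(theta)))  # -> 90
```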
- Advantageously, the first 3D model and/or the second 3D model can be deformed or aligned with use of the determined geometric positional relationship in order to minimize a first deviation between first virtual contact areas on the first 3D model and second virtual contact areas on the second 3D model, and/or to minimize a second deviation between the arrangement of a first virtual surface structure of the first 3D model relative to a second virtual surface structure of the second 3D model and the arrangement of the corresponding first surface structure of the upper jaw relative to the corresponding second surface structure of the lower jaw from the lateral image.
- The registration error and/or calibration error can cause the first 3D model and second 3D model to be distorted or shifted in comparison to the actual dimensions of the measured object. This leads to the first deviation of the contact areas in comparison to the contact pattern, and to the second deviation of the virtual surface structures in comparison to the lateral image. To correct this measuring error, the first 3D model and/or the second 3D model are deformed such that the contact conditions arising from the contact pattern and the conditions arising from a lateral image, or from a number of lateral images, are satisfied. The first deviation and the second deviation are thereby minimized as far as possible; for example, the method of least squares can be used for this. The first 3D model or second 3D model can be deformed by means of a deformation method in which the respective 3D model is divided into different sections that are connected to each other by simulated springs. The simulated spring forces cause the deformation to be evenly distributed over these sections, and the respective 3D model is flexibly deformed. These sections can for example be the individual three-dimensional optical images which were combined into the respective 3D model.
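The spring-coupled deformation can be illustrated in one dimension: sections pinned by measured constraints stay fixed, while the spring forces spread the correction evenly across the remaining sections. The relaxation scheme, stiffness, and data below are illustrative assumptions, not the patent's solver:

```python
def relax_sections(positions, corrections, stiffness=0.5, iters=200):
    """Distribute sparse corrections over a chain of model sections
    connected by simulated springs: each iteration pulls every section
    toward the average of its neighbours (spring force), while sections
    with a measured correction are held at their target. 1D sketch."""
    pos = list(positions)
    for _ in range(iters):
        new = pos[:]
        for i in range(len(pos)):
            neigh = []
            if i > 0:
                neigh.append(pos[i - 1])
            if i < len(pos) - 1:
                neigh.append(pos[i + 1])
            new[i] = pos[i] + stiffness * (sum(neigh) / len(neigh) - pos[i])
            if i in corrections:  # hard constraint from contact/lateral data
                new[i] = corrections[i]
        pos = new
    return pos

# Five sections; the ends are pinned by measured constraints at 0 and 4;
# the middle relaxes into an evenly distributed (linear) deformation.
print([round(p, 3) for p in relax_sections([0, 0, 0, 0, 0], {0: 0.0, 4: 4.0})])
# -> [0.0, 1.0, 2.0, 3.0, 4.0]
```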
- Advantageously, the first 3D model and/or the second 3D model can be deformed with the aid of the determined geometric positional relationship such that the first virtual contact areas on the first 3D model correspond with the second virtual contact areas on the second 3D model, and/or such that the first virtual surface structure of the first 3D model is arranged relative to the second virtual surface structure of the second 3D model in the same way as the corresponding first surface structure of the upper jaw is arranged relative to the corresponding second surface structure of the lower jaw in the lateral image.
- The measuring error arising from the registration error and calibration error is thereby completely corrected.
- Advantageously, the second 3D model that has a higher measuring precision than the first 3D model can remain unchanged, and the first 3D model can be deformed to adapt to the second 3D model.
- A check is performed of whether the criteria for successful registration are satisfied. It can thereby be ascertained which of the two 3D models has the smaller registration error and hence the greater measuring precision. Subsequently, the more precise 3D model remains unchanged, and the more imprecise 3D model is adapted thereto. As a result, the deviation between the 3D models and the actual dimensions of the object is extremely small after the adaptation.
- Advantageously, the first 3D model that has a higher measuring precision than the second 3D model can remain unchanged, and the second 3D model can be deformed to adapt to the first 3D model.
- Advantageously, the individual images of the first 3D model and second 3D model can be recorded using a global registration. In the global registration, each image to be recorded is registered both with the previous image and with another, already recorded image in order to lessen the registration error. The image to be recorded, the previous image, and the additional image have common overlapping areas. The additional image can be the image before the previous one, or an image taken before that in the image series.
- The registration error is reduced by means of the global registration in that each image is not just compared with the previous image, but rather also with additional images with which it possesses common overlapping areas. In the registration of the particular image with the previous image and other images taken before then, contradictions in the positional relationship can arise that are ascribable to registration error and/or calibration error. To determine the actual positional relationships, an average of the positional relationships can be calculated from the registration of the previous image and other images to mutually cancel the contradictions and minimize the global registration error.
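Averaging the positional relationships obtained from several registrations of the same frame can be sketched for 2D rigid poses; averaging the angle via unit vectors keeps the result correct across the -pi/pi wrap. This is a hedged illustration of the averaging idea, not the patent's algorithm:

```python
import math

def average_poses(poses):
    """Average several 2D pose estimates (theta, tx, ty) of the same
    frame, obtained by registering it against the previous image and
    against earlier images, so that contradictions cancel out. Angles
    are averaged via unit vectors; translations componentwise."""
    n = len(poses)
    sx = sum(math.cos(t) for t, _, _ in poses)
    sy = sum(math.sin(t) for t, _, _ in poses)
    theta = math.atan2(sy, sx)
    tx = sum(p[1] for p in poses) / n
    ty = sum(p[2] for p in poses) / n
    return theta, tx, ty

# Two slightly contradictory registrations of the same frame:
estimates = [(math.radians(10), 1.0, 0.0), (math.radians(14), 1.2, 0.2)]
theta, tx, ty = average_poses(estimates)
print(round(math.degrees(theta), 1), round(tx, 2), round(ty, 2))  # -> 12.0 1.1 0.1
```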
- The invention will be explained with reference to the drawings. In the figures:
-
FIG. 1 shows a sketch to illustrate the existing method for registration, -
FIG. 2 shows a sketch to illustrate the determination of the geometric positional relationship between the 3D models, -
FIG. 3 shows the upper jaw and lower jaw from the occlusal direction after marking the contact areas, -
FIG. 4 shows the first 3D model of the upper jaw and the second 3D model of the lower jaw with a registration error, -
FIG. 5 shows the result of the correction from FIG. 4. -
FIG. 1 shows a sketch to illustrate the present method for recording individual three-dimensional optical images 1 to form a global image of a tooth situation comprising an upper jaw 2 and a lower jaw 3. During the measurement, a dental camera 4 that is based on a strip projection method or a confocal optical method is moved in a first step along a first direction of movement 5 around the upper jaw 2 in order to measure a first 3D model 6, and then in a second step along a second direction of movement 7 around the lower jaw 3 in order to measure a second 3D model 8. During the measurement, the three-dimensional optical images are generated at regular intervals in time. The individual images can for example be generated at a cyclical frequency between 10 Hz and 20 Hz. Then the individual images 1 are recorded with each other using the overlapping areas 9 that are depicted in dashed lines and combined into the first 3D model 6 and second 3D model 8. - A
computer 10 records the measured data of the dental camera 4, calculates the individual images 1, records the individual images 1, and combines the individual images into the first 3D model 6 and second 3D model 8. The user has the option of moving and rotating the first 3D model 6 and second 3D model 8 by means of a cursor 13 using input means such as a keyboard 11 and a mouse 12 in order to change the direction of observation. - The
first 3D model 6 and second 3D model 8 can comprise the entire upper jaw or lower jaw, or only a subsection. - To generate the global image of the tooth situation, it is then necessary to determine a geometric positional relationship between the
first 3D model 6 and the second 3D model 8. -
FIG. 2 shows a sketch to illustrate the determination of the geometric positional relationship between the 3D models 6 and 8. In addition to the individual images 1 made using the dental camera 4 from FIG. 1, additional lateral images 20, 21 and 22 are made from a first image direction 23, a second image direction 24 and a third image direction 25. The lateral image 21 is hence made from a buccal direction in the area of tooth pairs 11-41 and 21-31 according to the FDI notation. Then the lateral images 20, 21 and 22 are searched by means of the computer 10 from FIG. 1 for a first surface structure of the first 3D model and for a second surface structure from the second 3D model. The first surface structure or second surface structure can be an individual tooth 26, which in the present case corresponds to tooth 14 according to the FDI notation, a group of teeth 27 that are depicted with dashed lines, or a characteristic structure 28 of the gingiva. The positional relationship between the first 3D model 6 and the second 3D model 8 can be determined by using the arrangement of the first surface structure 26, 27 relative to the second surface structure 28 in the lateral images 20, 21 and 22. If the first 3D model 6 and/or the second 3D model 8 are distorted by a registration error or calibration error, the different lateral images can be used for the correction of the first 3D model 6 and/or second 3D model 8 in order to generate an error-free global image. - In addition to the lateral images, an
occlusion paper 29 can be placed between the upper jaw 2 and lower jaw 3. Then the upper jaw 2 and lower jaw 3 are brought into the depicted closed-bite position, wherein a colored layer of the occlusion paper 29 colors certain contact areas between the upper jaw 2 and lower jaw 3. As in the depicted instance, the occlusion paper 29 can consist of an individual sheet or several strips that are clamped between the upper jaw 2 and lower jaw 3. After the contact areas have been marked with the occlusion paper 29, the upper jaw 2 and lower jaw 3 are measured as depicted in FIG. 1, wherein the first 3D model 6 and the second 3D model 8 are generated with the marked contact areas. -
FIG. 3 shows the upper jaw 2 and lower jaw 3 from the occlusal direction after marking the contact areas 30 using the occlusion paper. The first contact areas 31 on the upper jaw 2 correspond to the second contact areas 32 on the lower jaw 3. The first contact areas 31 and corresponding second contact areas 32 constitute local correspondences that enable the geometric positional relationships to be determined between the first 3D model of the upper jaw 2 and the second 3D model of the lower jaw 3. -
FIG. 4 shows a sketch of the first 3D model 6 of the upper jaw 2 and the second 3D model 8 of the lower jaw 3, wherein the second 3D model 8 significantly deviates from the first 3D model in one area 40 due to a registration error and/or a calibration error. In one area 41, the first 3D model 6 was brought into correspondence with the second 3D model 8. The arrows designate the imaging directions of the lateral images. The first contact areas 31 hence deviate significantly in the first area 40 from the second contact areas 32 of the second 3D model 8, wherein the first contact areas 31 are brought into correspondence with the second contact areas 32 in the second area 41. In addition to the contact areas 31 and 32, the lateral images were used to bring the 3D models into correspondence in the area 41. - To correct the registration error and/or the calibration error, the
second 3D model 8 is deformed along a deformation direction 42 such that a first deviation between the first contact areas 31 on the first 3D model 6 and the second contact areas 32 on the second 3D model 8 is minimized. As an additional criterion for the correction, the lateral image 25 can also be used, wherein a second deviation between the arrangement of a first virtual surface structure 27 of the first 3D model 6 relative to a second virtual surface structure 28 of the second 3D model 8 is minimized by arranging the corresponding surface structures according to the lateral image 25. In this optimization process, the least squares method can be used, for example. - In the present case, the first 3D model has a greater measuring precision such that the
second 3D model 8 subject to the registration error is adapted to the first 3D model 6 along the deformation direction 42. - Alternately, the
second 3D model 8 can remain unchanged, and the first 3D model 6 can be adapted thereto. - Conditions from the
lateral images 20, 21 and 22 and from the contact areas 31 and 32 can be taken into account simultaneously in the correction. -
FIG. 5 shows the result of the correction from FIG. 4, wherein the first contact areas 31 of the first 3D model 6 and the second contact areas 32 of the second 3D model 8 are caused to overlap both in the first area 40 as well as in the second area 41. The result of the method is hence a global image 50 comprising the first 3D model 6 and the second 3D model 8, which were adapted to each other by means of the contact areas 31, 32 and the lateral images. -
-
- 1 Individual three-dimensional optical images
- 2 Upper jaw
- 3 Lower jaw
- 4 Dental camera
- 5 First direction of movement
- 6 First 3D model
- 7 Second direction of movement
- 8 Second 3D model
- 9 Overlapping areas
- 10 Computer
- 11 Keyboard
- 12 Mouse
- 13 Cursor
- 20-22 Lateral images
- 23 First image direction
- 24 Second image direction
- 25 Third image direction
- 26 Individual tooth
- 27 Group of teeth
- 28 Characteristic structure of the gingiva
- 29 Occlusion paper
- 30 Markings of the contact areas
- 31 First contact areas
- 32 Second contact areas
- 40 First area
- 41 Second area
- 42 Deformation direction
- 50 Global image
Claims (5)
1. A method, comprising the steps of:
generating a first 3D model of a first subsection of an upper jaw from a portion of a plurality of three-dimensional optical images;
generating a second 3D model of a second subsection of a lower jaw from another portion of the plurality of three-dimensional optical images;
determining a geometric positional relationship between the first 3D model and the second 3D model based on
i. a lateral three-dimensional optical image or
ii. a lateral three-dimensional optical image and a contact pattern,
wherein the lateral three-dimensional optical image has an image area which at least partially comprises the first subsection of the upper jaw and at least partially comprises the second subsection of the lower jaw,
wherein the contact pattern comprises a plurality of contact areas between the upper jaw and lower jaw, and
deforming the first 3D model and/or the second 3D model based on the determined geometric positional relationship, wherein the plurality of contact areas respectively correspond to a plurality of local correspondences between the first 3D model and the second 3D model.
2. The method according to claim 1, wherein, in the determining, a plurality of lateral images are used to determine the geometric positional relationship.
3. A method, comprising the steps of:
generating a first 3D model of a first subsection of an upper jaw from a portion of a plurality of three-dimensional optical images;
generating a second 3D model of a second subsection of a lower jaw from another portion of the plurality of three-dimensional optical images;
determining a geometric positional relationship between the first 3D model and the second 3D model based on
i. a lateral three-dimensional optical image or
ii. a lateral three-dimensional optical image and a contact pattern,
wherein the lateral three-dimensional optical image has an image area which at least partially comprises the first subsection of the upper jaw and at least partially comprises the second subsection of the lower jaw,
wherein the contact pattern comprises a plurality of contact areas between the upper jaw and lower jaw, and
adjusting the first 3D model and/or the second 3D model such that first virtual contact areas on the first 3D model correspond with second virtual contact areas on the second 3D model and/or such that a first virtual surface structure of the first 3D model is arranged relative to a second virtual surface structure of the second 3D model.
4. The method according to claim 3, wherein the adjusting further comprises:
deforming the first 3D model and/or the second 3D model along a deformation direction such that a first deviation between the first virtual contact areas on the first 3D model and second virtual contact areas on the second 3D model is minimized, and/or
deforming the first 3D model and/or the second 3D model such that a second deviation between an arrangement of a first virtual surface structure of the first 3D model relative to a second virtual surface structure of the second 3D model is minimized, such that the first virtual surface structure of the first 3D model and the second virtual surface structure of the second 3D model are arranged on the lateral three-dimensional optical image.
5. The method according to claim 4, wherein the minimization is achieved using a least squares method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/957,060 US20180240270A1 (en) | 2012-08-14 | 2018-04-19 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012214470.6 | 2012-08-14 | ||
DE102012214470.6A DE102012214470B4 (en) | 2012-08-14 | 2012-08-14 | Method for registering individual three-dimensional optical images to form an overall image of a tooth situation |
US14/421,219 US9978172B2 (en) | 2012-08-14 | 2013-08-14 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
PCT/EP2013/066993 WO2014027024A2 (en) | 2012-08-14 | 2013-08-14 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
US15/957,060 US20180240270A1 (en) | 2012-08-14 | 2018-04-19 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/421,219 Continuation US9978172B2 (en) | 2012-08-14 | 2013-08-14 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
PCT/EP2013/066993 Continuation WO2014027024A2 (en) | 2012-08-14 | 2013-08-14 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180240270A1 true US20180240270A1 (en) | 2018-08-23 |
Family
ID=49212739
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/421,219 Active US9978172B2 (en) | 2012-08-14 | 2013-08-14 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
US15/957,060 Abandoned US20180240270A1 (en) | 2012-08-14 | 2018-04-19 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/421,219 Active US9978172B2 (en) | 2012-08-14 | 2013-08-14 | Method for recording individual three-dimensional optical images to form a global image of a tooth situation |
Country Status (5)
Country | Link |
---|---|
US (2) | US9978172B2 (en) |
EP (1) | EP2884942B1 (en) |
JP (1) | JP6278962B2 (en) |
DE (1) | DE102012214470B4 (en) |
WO (1) | WO2014027024A2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014207667A1 (en) * | 2014-04-23 | 2015-10-29 | Sirona Dental Systems Gmbh | Method for carrying out an optical three-dimensional recording |
GB201408031D0 (en) * | 2014-05-07 | 2014-06-18 | Univ Leeds | A dental model scanner |
AT518148A1 (en) * | 2016-01-14 | 2017-07-15 | Heinrich Steger | Method for creating a digital dentition model |
WO2017207454A1 (en) * | 2016-05-30 | 2017-12-07 | 3Shape A/S | Predicting the development of a dental condition |
DE102016213399A1 (en) * | 2016-07-21 | 2018-01-25 | Sirona Dental Systems Gmbh | Surveying system and method for measuring an implant-implant situation |
DE102017124580B3 (en) | 2017-10-20 | 2019-01-31 | Sicat Gmbh & Co. Kg | Method for detecting and visualizing tooth positions under the influence of biting forces |
DE102017127128A1 (en) * | 2017-11-17 | 2019-05-23 | Gustav Gerstenkamp | Method for virtual modeling of a dental arch |
JP6803876B2 (en) * | 2018-07-03 | 2020-12-23 | 株式会社モリタ製作所 | Intraoral three-dimensional measurement method and handy scanner |
US10849723B1 (en) | 2019-05-07 | 2020-12-01 | Sdc U.S. Smilepay Spv | Scanning device |
US20220061957A1 (en) * | 2020-08-31 | 2022-03-03 | James R. Glidewell Dental Ceramics, Inc. | Automatic bite setting |
US11759296B2 (en) * | 2021-08-03 | 2023-09-19 | Ningbo Shenlai Medical Technology Co., Ltd. | Method for generating a digital data set representing a target tooth arrangement |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3380553B2 (en) * | 1992-11-09 | 2003-02-24 | オルムコ コーポレイション | Custom orthodontic appliance forming method and apparatus |
JPH09206320A (en) * | 1996-02-02 | 1997-08-12 | Technol Res Assoc Of Medical & Welfare Apparatus | Plate denture design supporting device |
US6152731A (en) * | 1997-09-22 | 2000-11-28 | 3M Innovative Properties Company | Methods for use in dental articulation |
US6648640B2 (en) * | 1999-11-30 | 2003-11-18 | Ora Metrix, Inc. | Interactive orthodontic care system based on intra-oral scanning of teeth |
US6582229B1 (en) | 2000-04-25 | 2003-06-24 | Align Technology, Inc. | Methods for modeling bite registration |
DE10252298B3 (en) * | 2002-11-11 | 2004-08-19 | Mehl, Albert, Prof. Dr. Dr. | Process for the production of tooth replacement parts or tooth restorations using electronic tooth representations |
DE102004038136B4 (en) * | 2004-07-08 | 2019-06-13 | Sirona Dental Systems Gmbh | Method of constructing the surface of a three-dimensional data dental prosthesis |
US20080176182A1 (en) * | 2006-10-05 | 2008-07-24 | Bruce Willard Hultgren | System and method for electronically modeling jaw articulation |
ATE446723T1 (en) * | 2006-11-28 | 2009-11-15 | Degudent Gmbh | METHOD FOR PRODUCING A DENTAL RESTORATION |
JP5046238B2 (en) * | 2007-02-27 | 2012-10-10 | 株式会社モリタ製作所 | X-ray CT imaging image display method, X-ray CT image display device, X-ray CT imaging device |
JP4869199B2 (en) * | 2007-09-28 | 2012-02-08 | 富士フイルム株式会社 | Radiography equipment |
EP2229914B1 (en) * | 2009-03-20 | 2018-05-30 | Nobel Biocare Services AG | System and method for aligning virtual models |
KR101001678B1 (en) * | 2009-07-15 | 2010-12-15 | 전남대학교산학협력단 | Method for acquiring 3-dimensional dentition image |
US8896592B2 (en) * | 2009-08-21 | 2014-11-25 | Align Technology, Inc. | Digital dental modeling |
-
2012
- 2012-08-14 DE DE102012214470.6A patent/DE102012214470B4/en active Active
-
2013
- 2013-08-14 JP JP2015526979A patent/JP6278962B2/en active Active
- 2013-08-14 EP EP13763194.1A patent/EP2884942B1/en active Active
- 2013-08-14 WO PCT/EP2013/066993 patent/WO2014027024A2/en active Application Filing
- 2013-08-14 US US14/421,219 patent/US9978172B2/en active Active
-
2018
- 2018-04-19 US US15/957,060 patent/US20180240270A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
DE102012214470B4 (en) | 2021-12-30 |
JP2015530137A (en) | 2015-10-15 |
WO2014027024A2 (en) | 2014-02-20 |
JP6278962B2 (en) | 2018-02-14 |
US9978172B2 (en) | 2018-05-22 |
US20150235412A1 (en) | 2015-08-20 |
EP2884942B1 (en) | 2018-06-27 |
DE102012214470A1 (en) | 2014-02-20 |
WO2014027024A3 (en) | 2014-07-31 |
EP2884942A2 (en) | 2015-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180240270A1 (en) | Method for recording individual three-dimensional optical images to form a global image of a tooth situation | |
JP5346033B2 (en) | Method for optically measuring the three-dimensional shape of an object | |
JP5689681B2 (en) | Non-contact probe | |
KR102236297B1 (en) | Calibration device and method for calibrating a dental camera | |
US11020208B2 (en) | System, device, and method for intraoral scanning accuracy | |
US7806687B2 (en) | Occlusal state correction-supporting apparatus, program, and recording medium | |
US9462993B2 (en) | Method and reference model for checking a measuring system | |
CN108759669B (en) | Indoor self-positioning three-dimensional scanning method and system | |
US20060120582A1 (en) | Method for determining dental alignment using radiographs | |
US20140104406A1 (en) | Method for the optical three-dimensional measurement of a dental object | |
CN107941145A (en) | System and method for the quality testing based on measurement | |
US8662890B2 (en) | Method for manufacturing a dental restoration | |
CN110462681B (en) | Multiple surfaces for physical-to-image/image-to-physical registration and image verification | |
JP2005509877A (en) | Computer vision system calibration method and system | |
KR20180093939A (en) | How to calibrate an X-ray image | |
JP2005509879A (en) | Method for determining corresponding points in 3D measurement | |
US10080636B2 (en) | Method for measuring a dental situation | |
JP5213783B2 (en) | Laser processing apparatus and laser processing method | |
JP2015524724A (en) | Method for recording individual three-dimensional optical images of dental objects | |
WO2020012707A1 (en) | Three-dimensional measurement device and method | |
KR20210023431A (en) | Position tracking system using a plurality of cameras and method for position tracking using the same | |
Knyaz | Image-based 3D reconstruction and analysis for orthodontia | |
JP5786999B2 (en) | Three-dimensional shape measuring device, calibration method for three-dimensional shape measuring device | |
KR20180040316A (en) | 3D optical scanner | |
JP2005229111A (en) | Method of presuming at least one part arrangement position on substrate, and equipment executing the method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |