EP3376989A1 - Verfahren zur visualisierung einer zahnsituation - Google Patents
Verfahren zur Visualisierung einer Zahnsituation
- Publication number
- EP3376989A1 (application EP16815727.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data set
- face
- tooth
- data record
- facial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 53
- 230000001815 facial effect Effects 0.000 claims abstract description 34
- 238000012800 visualization Methods 0.000 claims abstract description 16
- 210000000214 mouth Anatomy 0.000 claims description 12
- 238000001514 detection method Methods 0.000 claims description 5
- 230000002596 correlated effect Effects 0.000 claims description 4
- 230000008569 process Effects 0.000 claims description 2
- 210000001738 temporomandibular joint Anatomy 0.000 claims description 2
- 210000005036 nerve Anatomy 0.000 claims 1
- 230000000694 effects Effects 0.000 description 4
- 230000008859 change Effects 0.000 description 2
- 210000003128 head Anatomy 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 230000003190 augmentative effect Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000000875 corresponding effect Effects 0.000 description 1
- 210000004513 dentition Anatomy 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 210000000887 face Anatomy 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 210000004373 mandible Anatomy 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 230000036346 tooth eruption Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/24—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1176—Recognition of faces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4538—Evaluating a particular part of the muscoloskeletal system or a particular medical condition
- A61B5/4542—Evaluating the mouth, e.g. the jaw
- A61B5/4547—Evaluating teeth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C13/00—Dental prostheses; Making same
- A61C13/0003—Making bridge-work, inlays, implants or the like
- A61C13/0004—Computer-assisted sizing or machining of dental prostheses
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
- A61C9/004—Means or methods for taking digitized impressions
- A61C9/0046—Data acquisition means or methods
- A61C9/0053—Optical means or methods, e.g. scanning the teeth by a laser or light beam
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2074—Interface software
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/368—Correlation of different images or relation of image positions in respect to the body changing the image on a display according to the operator's position
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
- A61B2576/02—Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
Definitions
- the invention relates to a method for visualizing a dental situation, wherein a virtual dental data record of the dental situation is present and wherein an image of a patient's face is recorded and stored as a facial data set.
- Previous methods for visualizing a dental situation, for example for visualizing the effect of a planned tooth replacement in the face of a patient, usually show a static or at best rotatable image of the patient's face, into which a virtual tooth situation is inserted with correct orientation and positioning.
- a method for visualizing a tooth situation is known, for example, from DE 20 2008 014 344 U1, DE 10 2013 102 421 A1 or US 2012/0095732 A1.
- Methods for automatic facial feature point detection are known, for example, from D. Vukadinovic and M. Pantic, "Automatic Facial Feature Point Detection Using Gabor Feature Based Boosted Classifiers" (2005), IEEE International Conference on Systems, Man and Cybernetics, from M. Dantone et al., "Real-time Facial Feature Detection Using Conditional Regression Forests" (2012), IEEE Conference on Computer Vision and Pattern Recognition, or from US 2010/030578 A1.
- The object of the present invention is to further develop and improve the state of the art.
Presentation of the invention
- An object of the invention is a method for visualizing a dental situation in which an image of a patient's face is recorded and saved as a facial data set, wherein feature points within the facial data set are automatically detected by means of a face recognition method, wherein an oral space lying between the lips is automatically recognized on the basis of the feature points, wherein, likewise on the basis of the feature points, a position of the face in the facial data set and a three-dimensional direction of an orientation of the face in the facial data set are automatically detected, wherein the tooth data set is aligned according to the three-dimensional direction of the face orientation and positioned according to the position of the face relative to the facial data set, and wherein the oral space within the facial data set is partially overlaid and/or superimposed and/or replaced by the tooth data set and the result is displayed as a visualization data set.
- the tooth situation can be any tooth situation: single or multiple teeth, or an image of a complete dentition.
- it may be a purely virtual planned model, or a data record of a planned denture generated from a physical model by means of a scan.
- It can also be a data record of the actual tooth situation generated for example by means of a camera.
- the latter may, for example, have been supplemented or manipulated with tooth replacement parts, for instance to reproduce the planned result of an orthodontic treatment.
- the feature points, also referred to as facial feature points (FFPs) or biometric feature points, are points that can be recognized automatically within a face, i.e. by means of a computer and without user interaction, in the course of a two-dimensional or three-dimensional face recognition process, e.g. the tip of the nose, the corners of the mouth, or the eyes.
- An advantage of the method according to the invention is that a planned dental situation can be visualized within the patient's face, for the patient, a treating physician, or a dental technician.
- the combination of up-to-date reality data with virtual data is often referred to as augmented reality.
- the effect of the planned dental situation is illustrated in particular by complete superimposition or replacement of the current dental situation by the planned dental situation.
- a planned dental situation and a current dental situation can also be displayed simultaneously. For example, the planned dental situation and the current dental situation, i.e. the area of the facial data set located within the oral cavity, can be colored differently and rendered transparent. As a result, for example during the preparation of the current dental situation, the correspondence with or deviation from a planned dental situation can be checked.
- the facial data set is two-dimensional or three-dimensional.
- a two-dimensional data set can be generated simply, with a simple device such as a camera, while with a three-dimensional data set a better quality of the visualization data set can be achieved.
- the tooth data set is three-dimensional.
- the tooth data set can be easily combined with the face data set in an orientation adapted to the orientation of the face in the face data set or displayed in the context of the visualization data record.
- the method is carried out repeatedly over a period of time.
- the repetitions may be carried out automatically, e.g. at constant time intervals. Alternatively, a trigger mechanism for the repetitions could be provided, which could lead to varying time intervals.
- the time intervals amount to a maximum of 42 ms, as a result of which a video rate of at least roughly 24 images per second is achieved.
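The 42 ms interval corresponds to a video-like refresh rate, which one line of arithmetic makes explicit (a minimal sketch, not part of the patent text):

```python
# Refresh-rate arithmetic for the repetition interval discussed above.
interval_ms = 42                 # maximum time between two repetitions
fps = 1000 / interval_ms         # resulting frames per second
print(round(fps, 1))             # 23.8 -> a video-like rate of roughly 24 fps
```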
- At least four feature points are automatically detected for automatic detection of the oral cavity to ensure correct detection.
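As an illustration of why a handful of lip feature points suffices, the oral space can be bounded by the extremes of those points. This toy sketch is not the patented detection method; the function name and coordinates are invented:

```python
# Toy oral-space detection: bound the region spanned by lip feature points.
def oral_space_bbox(points):
    """points: list of (x, y) lip feature points; returns (x0, y0, x1, y1)."""
    if len(points) < 4:
        # mirrors the idea that at least four feature points are needed
        # for a reliable detection of the oral cavity
        raise ValueError("at least four feature points are required")
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# two mouth corners plus upper and lower lip points (invented coordinates)
lips = [(40, 60), (80, 60), (60, 55), (60, 68)]
print(oral_space_bbox(lips))  # (40, 55, 80, 68)
```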
- according to a further development, at least one three-dimensional additional data set is aligned according to the three-dimensional direction of the orientation of the face and positioned according to the position of the face relative to the visualization data set, wherein the face data set and/or the tooth data set in the visualization data set is partially overlaid and/or superimposed and/or replaced by the additional data set.
- the additional data set is, for example, a representation of a tooth root segmented from a volume data record and correlated in position to a current tooth situation, a representation of a jawbone, a representation of a temporomandibular joint segmented from a magnetic resonance tomogram and correlated in position to a current tooth situation, or a representation of a drill or a borehole.
- for planning or monitoring, for example, a planned course of a borehole or a current position of a drill determined by means of a sensor can be displayed in the visualization data record.
- FIG. 1 shows a schematic representation of a first embodiment of a method according to the invention,
- FIG. 2 shows a schematic view of a digital data set of a denture designed for the care of a patient,
- FIG. 3 shows a schematic view of a two-dimensional digital data record of a patient's face,
- FIGS. 4A, B show schematic views of a facial data set with feature points for determining the orientation of the face within the facial data set, and
- FIG. 5 shows a schematic view of a visualization of a planned denture.
- In FIG. 1, a first embodiment of a method according to the invention is outlined.
- a three-dimensional tooth data set 1 of a dental prosthesis designed for a patient is provided.
- An example of a three-dimensional tooth data set 1 is sketched in FIG. 2.
- in a method step S2, a facial data set 2 is provided by taking a two-dimensional image of the patient's face, or a series of facial data sets 2 is provided by taking a series of two-dimensional images at short time intervals, a face data set 2 being generated from each image of the series.
- the recording of one or more two-dimensional images can be done with a camera, for example.
- the face may e.g. be filmed by a camera.
- a facial data set 2 is shown by way of example in FIG. 3.
- by means of a face recognition method, feature points 3, i.e. characteristic points of the face such as the corners of the mouth, are detected within the face data set 2 in a method step S3. On the basis of the identified feature points 3, the mouth or the lips in the facial data set 2 are detected in a method step S4. As illustrated in FIGS. 4A and 4B with reference to two differently aligned faces, a three-dimensional direction 5 of the orientation of the patient's face in the facial data set 2 is determined on the basis of the feature points 3 in a method step S5. For a face photographed frontally, for example, the direction 5 of the alignment is perpendicular to the image plane of the facial data set 2 and points out of the image plane, whereas for a face photographed in profile the direction 5 lies within the image plane. In method step S5, the three-dimensional orientation of the face is thus deduced from the two-dimensional image of the face.
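The deduction of a 3D orientation from 2D feature points can be hinted at with a deliberately simplified yaw estimate based on the horizontal asymmetry of the eye corners around the nose tip. Real systems fit a full 3D face model; the function name and coordinates below are invented for illustration:

```python
import math

def estimate_yaw(left_eye, right_eye, nose_tip):
    """(x, y) image points -> yaw in degrees (0 = frontal view)."""
    d_left = nose_tip[0] - left_eye[0]    # half-face width, left side
    d_right = right_eye[0] - nose_tip[0]  # half-face width, right side
    # a turned head foreshortens one side; the ratio encodes the angle
    return math.degrees(math.atan2(d_left - d_right, d_left + d_right))

print(estimate_yaw((30, 50), (70, 50), (50, 60)))       # 0.0 for a frontal face
print(estimate_yaw((30, 50), (70, 50), (55, 60)) > 0)   # True: head is turned
```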
- in a method step S6, an area located between the lips is segmented and cut out as oral space 4 within the facial data set 2.
- in method step S7, the tooth data set 1 is rotated and positioned relative to the image plane of the face data set 2 such that the alignment of the tooth data set 1 matches the determined direction 5 of the orientation of the face and its position matches the position of the oral space 4.
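Under the simplifying assumption that the face orientation reduces to a single yaw angle, the rotation and positioning of the tooth data set can be sketched as follows (a hedged illustration, not the patented implementation; all names are invented):

```python
import math

def align_tooth_points(vertices, yaw_deg, mouth_center):
    """Rotate 3D tooth vertices about the vertical axis by yaw_deg and
    translate them to the (x, y) position of the oral space."""
    a = math.radians(yaw_deg)
    c, s = math.cos(a), math.sin(a)
    out = []
    for x, y, z in vertices:
        xr, zr = c * x + s * z, -s * x + c * z   # yaw rotation
        out.append((xr + mouth_center[0], y + mouth_center[1], zr))
    return out

aligned = align_tooth_points([(1.0, 0.0, 0.0)], 90, (10.0, 20.0))
print([(round(x, 6), round(y, 6), round(z, 6)) for x, y, z in aligned])
# [(10.0, 20.0, -1.0)]
```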
- the relative orientation of the denture to the face is defined for this purpose in method step S8, for example based on a relationship of an alignment of the tooth data set 1 within the image plane to a direction of the orientation of the face within the face data set 2.
- the relationship to the orientation of the face in the face data set can, for example, be established on the basis of existing remaining or replaced teeth.
- Corresponding data are stored, so that the alignment of the tooth data set 1 in method step S7 takes place automatically on the basis of the stored data and the determined orientation 5 of the face within the face data set 2.
- Method step S8 therefore does not have to be executed again for each newly generated facial data set and is therefore shown in dashed lines in FIG. 1. It is also possible to perform the alignment according to method step S8 once by hand.
- the relative alignment of the tooth data set 1 thus produced to the direction 5 of the orientation of the face within the facial data set 2 is stored, so that subsequently, when the orientation of the face changes and the direction 5 changes accordingly, the tooth data set can be positioned automatically.
- in a method step S9, the aligned three-dimensional tooth data set 1 is displayed behind the face data set.
- the data set composed of the face data set 2 and the tooth data set 1 with cut-out oral space 4 and shown diagrammatically in FIG. 5 is referred to herein as a visualization data record.
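The composition into a visualization data record can be pictured as a masked merge: pixels inside the segmented oral space come from the rendered tooth data set, everything else from the facial data set. A minimal 1-D sketch with invented pixel values:

```python
def composite(face_px, tooth_px, mouth_mask):
    """Replace masked facial pixels with tooth pixels."""
    return [t if m else f for f, t, m in zip(face_px, tooth_px, mouth_mask)]

face  = [9, 9, 9, 9, 9]        # facial data set (grayscale row)
teeth = [1, 1, 1, 1, 1]        # rendered tooth data set
mask  = [0, 0, 1, 1, 0]        # cut-out oral space in the middle
print(composite(face, teeth, mask))  # [9, 9, 1, 1, 9]
```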
- according to a further development, the tooth data set 1 is reworked in a method step S10.
- typically, the provided tooth data set 1 is a data record generated in a CAD method and intended for a CAM method, that is to say for a computer-aided production process.
- a data set to be used for computer-aided production only has to contain information regarding the shape, that is to say the position of the surface.
- for the visualization, however, it is advantageous if the tooth data set 1 has the most realistic possible appearance.
- the tooth data set 1 provided according to method step S1 is revised accordingly in the optional method step S10. If the face data set is repeatedly provided at regular time intervals according to method step S2 and at least method steps S3, S4, S5, S6, S7 and S9 are executed for each new face data set, then the display corresponds to a virtual mirror, or the constantly regenerated visualization data record corresponds to a virtual mirror image. The patient can move the head and/or open and close the mouth in order to view the effect of the planned tooth replacement from different perspectives.
- if the oral cavity 4 of the facial data set 2 is not cut out but only detected and partially overlaid by the tooth data set 1, a deviation of a current dental situation from a planned dental situation can be illustrated.
- for this purpose, the oral cavity 4 and the dental data set 1 can be rendered partially transparent and displayed simultaneously, for example colored differently.
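The simultaneous, partially transparent display can be realized as an alpha blend of the two layers. The sketch below blends single grayscale values and is only illustrative; a real implementation blends per-pixel colors:

```python
def blend(current, planned, alpha=0.5):
    """Alpha-blend a current-situation pixel with a planned-situation pixel."""
    return alpha * planned + (1 - alpha) * current

print(blend(100, 200))        # 150.0 -> both layers visible at half opacity
print(blend(100, 200, 0.0))   # 100.0 -> only the current situation
```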
- according to a further development, an additional data record can be integrated into the visualization data record by aligning the additional data record in accordance with the determined direction 5 of the orientation of the face and the position of the face in the face data set 2 and overlaying it on the face data set 2 and/or the tooth data set 1.
- in this way, additional information, for example about the jawbone, the temporomandibular joint, tooth roots, the location of a planned borehole, or the position and orientation of a drill during its use, can be graphically displayed.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Surgery (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Epidemiology (AREA)
- Biophysics (AREA)
- Theoretical Computer Science (AREA)
- Dentistry (AREA)
- Primary Health Care (AREA)
- Radiology & Medical Imaging (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physical Education & Sports Medicine (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Rheumatology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015222782.0A DE102015222782A1 (de) | 2015-11-18 | 2015-11-18 | Verfahren zur Visualisierung einer Zahnsituation |
PCT/EP2016/077934 WO2017085160A1 (de) | 2015-11-18 | 2016-11-17 | Verfahren zur visualisierung einer zahnsituation |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3376989A1 true EP3376989A1 (de) | 2018-09-26 |
Family
ID=57588948
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16815727.9A Pending EP3376989A1 (de) | 2015-11-18 | 2016-11-17 | Verfahren zur visualisierung einer zahnsituation |
Country Status (6)
Country | Link |
---|---|
US (1) | US10980422B2 (de) |
EP (1) | EP3376989A1 (de) |
JP (1) | JP6857178B2 (de) |
CN (1) | CN108289722B (de) |
DE (1) | DE102015222782A1 (de) |
WO (1) | WO2017085160A1 (de) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
CN109717966B (zh) * | 2017-10-27 | 2021-04-30 | 华硕电脑股份有限公司 | 用于牙齿整形的影像仿真方法及其影像仿真装置 |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
EP3530232B1 (de) | 2018-02-21 | 2021-04-28 | Ivoclar Vivadent AG | Verfahren zur ausrichtung eines dreidimensionalen modells eines gebisses eines patienten auf ein bild des gesichtes des patienten, das mit einer kamera aufgenommen wurde |
BG67204B1 (bg) * | 2018-05-14 | 2020-12-15 | Schayder Jordao Adriano | Система за производство на триизмерни дигитални зъбни модели |
CN109124805B (zh) * | 2018-07-13 | 2021-02-26 | 四川大学 | 数字化镜像cad/cam临时牙的制作方法 |
WO2020068681A1 (en) * | 2018-09-24 | 2020-04-02 | Surgical Theater Inc. | 360 vr volumetric media editor |
EP3632369B1 (de) | 2018-10-02 | 2022-02-09 | SIRONA Dental Systems GmbH | Verfahren zur integration von fotografischen gesichtsbildern und/oder filmen einer person in die planung von odontologischen und/oder kosmetischen zahnbehandlungen und/oder zur herstellung von zahnersätzen für die besagte person |
USD896254S1 (en) * | 2018-10-30 | 2020-09-15 | Perfect Mobile Corp. | Display screen with graphical user interface |
EP3689287B1 (de) | 2019-01-30 | 2022-07-27 | DENTSPLY SIRONA Inc. | System zum vorschlagen und visualisieren von zahnbehandlungen |
CN110459083B (zh) * | 2019-08-22 | 2020-08-04 | 北京众绘虚拟现实技术研究院有限公司 | 一种视觉-触觉融合的增强现实口腔手术技能训练模拟器 |
DE102019126111A1 (de) * | 2019-09-27 | 2021-04-01 | Urban Technology GmbH | Verfahren, Computerprogrammprodukt und Simulationssystem zur Erstellung und Ausgabe eines dreidimensionalen Modells eines Gebisses |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
CN111568397A (zh) * | 2020-04-21 | 2020-08-25 | 上海上实龙创智慧能源科技股份有限公司 | 一种人体健康体征数据采集系统及方法 |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
US20230380934A1 (en) * | 2020-10-29 | 2023-11-30 | Medit Corp. | Image processing method and device using the same |
KR102512838B1 (ko) * | 2020-10-29 | 2023-03-22 | 주식회사 메디트 | 이미지 처리 방법 및 이를 사용한 장치 |
KR20230163182A (ko) * | 2022-05-23 | 2023-11-30 | 주식회사 메가젠임플란트 | 인공지능을 적용한 3차원 얼굴스캔 자동매칭장치 및 그 장치의 구동방법, 그리고 매체에 저장된 컴퓨터프로그램 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7234937B2 (en) * | 1999-11-30 | 2007-06-26 | Orametrix, Inc. | Unified workstation for virtual craniofacial diagnosis, treatment planning and therapeutics |
US7717708B2 (en) * | 2001-04-13 | 2010-05-18 | Orametrix, Inc. | Method and system for integrated orthodontic treatment planning using unified workstation |
JP4093273B2 (ja) * | 2006-03-13 | 2008-06-04 | オムロン株式会社 | 特徴点検出装置、特徴点検出方法および特徴点検出プログラム |
GB0707454D0 (en) * | 2007-04-18 | 2007-05-23 | Materialise Dental Nv | Computer-assisted creation of a custom tooth set-up using facial analysis |
CN102036616B (zh) * | 2008-03-21 | 2015-05-13 | 高桥淳 | 三维数字放大镜手术支持系统 |
GB2458388A (en) | 2008-03-21 | 2009-09-23 | Dressbot Inc | A collaborative online shopping environment, virtual mall, store, etc. in which payments may be shared, products recommended and users modelled. |
KR100926978B1 (ko) * | 2008-04-08 | 2009-11-17 | 삼성전자주식회사 | 영상 수집 제어 방법 및 장치 |
US8092215B2 (en) | 2008-05-23 | 2012-01-10 | Align Technology, Inc. | Smile designer |
US8108778B2 (en) | 2008-09-30 | 2012-01-31 | Yahoo! Inc. | System and method for context enhanced mapping within a user interface |
US8088551B2 (en) | 2008-10-09 | 2012-01-03 | Micron Technology, Inc. | Methods of utilizing block copolymer to form patterns |
WO2010042990A1 (en) | 2008-10-16 | 2010-04-22 | Seeing Machines Limited | Online marketing of facial products using real-time face tracking |
DE202008014344U1 (de) | 2008-10-28 | 2010-03-25 | Edinger-Strobl, Verena, Mag. DDr. | Gebissbild-Simulationsvorrichtung |
CA2755555C (en) * | 2009-03-20 | 2018-09-11 | 3Shape A/S | System and method for effective planning, visualization, and optimization of dental restorations |
JP4794678B1 (ja) * | 2010-05-24 | 2011-10-19 | 株式会社ソニー・コンピュータエンタテインメント | 映像処理装置、映像処理方法、および映像通信システム |
JP2013531531A (ja) | 2010-06-29 | 2013-08-08 | 3シェイプ アー/エス | 2d画像配置 |
WO2012090211A1 (en) | 2010-12-29 | 2012-07-05 | Santiago Jeevan Kumar | Augmented reality computer model facebow system for use in dentistry |
KR101223937B1 (ko) * | 2011-02-22 | 2013-01-21 | 주식회사 모르페우스 | 안면보정 이미지 제공방법 및 그 시스템 |
JP2013236749A (ja) * | 2012-05-15 | 2013-11-28 | Denso Corp | 歯科インプラント手術支援装置 |
DE102012110491A1 (de) | 2012-11-02 | 2014-05-08 | Carsten Dursteler | Verfahren und Vorrichtung zur kosmetischen Zahnanalyse und Zahnberatung |
WO2014135695A1 (en) | 2013-03-08 | 2014-09-12 | 3Shape A/S | Visualising a 3d dental restoration on a 2d image |
DE102013102421A1 (de) | 2013-03-11 | 2014-09-11 | Polymetric GmbH | Verfahren zur Überlagerung von digitalisierten Darstellungen und Referenzmarkereinrichtung |
JP5883816B2 (ja) * | 2013-03-11 | 2016-03-15 | 株式会社ミウラ | 顎変形症術後顔貌予測方法及びシステム |
US9378576B2 (en) * | 2013-06-07 | 2016-06-28 | Faceshift Ag | Online modeling for real-time facial animation |
JP6635929B2 (ja) * | 2014-02-21 | 2020-01-29 | トリスペラ デンタル インコーポレイテッド | 拡張現実歯科設計方法およびシステム |
US9808326B2 (en) * | 2014-03-18 | 2017-11-07 | President And Fellows Of Harvard College | 3D dentofacial system and method |
- 2015
  - 2015-11-18 DE DE102015222782.0A patent/DE102015222782A1/de not_active Withdrawn
- 2016
  - 2016-11-17 EP EP16815727.9A patent/EP3376989A1/de active Pending
  - 2016-11-17 US US15/775,601 patent/US10980422B2/en active Active
  - 2016-11-17 CN CN201680067682.5A patent/CN108289722B/zh active Active
  - 2016-11-17 WO PCT/EP2016/077934 patent/WO2017085160A1/de active Application Filing
  - 2016-11-17 JP JP2018520504A patent/JP6857178B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
WO2017085160A1 (de) | 2017-05-26 |
JP2018534050A (ja) | 2018-11-22 |
DE102015222782A1 (de) | 2017-05-18 |
US20180249912A1 (en) | 2018-09-06 |
CN108289722A (zh) | 2018-07-17 |
JP6857178B2 (ja) | 2021-04-14 |
CN108289722B (zh) | 2021-08-27 |
US10980422B2 (en) | 2021-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017085160A1 (de) | 2017-05-26 | Verfahren zur visualisierung einer zahnsituation |
EP3589232B1 (de) | | Verfahren zur konstruktion einer restauration |
EP3134027B1 (de) | | Verfahren zur durchführung einer optischen dreidimensionalen aufnahme |
EP2337498B1 (de) | | Verfahren zur erstellung einer dentalen 3d-röntgenaufnahme |
EP2922491B1 (de) | | Verfahren zur planung einer dentalen behandlung |
EP2914203A1 (de) | | Verfahren und vorrichtung zur kosmetischen zahnanalyse |
DE69837353T2 (de) | | System zum Aufbringen einer Vorrichtung auf eine Zahnoberfläche |
EP0741994A1 (de) | | Verfahren zur Darstellung des Kiefers |
EP3618758B1 (de) | | Verfahren zur konstruktion eines dentalen bauteils |
EP2914201B1 (de) | | Verfahren zur ermittlung mindestens eines relevanten einzelbildes eines dentalen objekts |
EP2978384B1 (de) | | Verfahren zur planung einer wurzelbehandlung eines patienten |
WO2014139944A1 (de) | | Verfahren zur überlagerung von digitalisierten darstellungen und referenzmarkereinrichtung |
DE102014102111B4 (de) | | Verfahren zur Visualisierung zahnmedizinisch relevanter anatomischer Relationen und/oder Strukturen |
EP3682453B1 (de) | | Verfahren zur ermittlung und visualisierung von zahnbewegungen und geplanten zahnumstellungen |
EP3618759B1 (de) | | Verfahren zur ermittlung von daten für die herstellung von zahnersatz |
DE102019126111A1 (de) | | Verfahren, Computerprogrammprodukt und Simulationssystem zur Erstellung und Ausgabe eines dreidimensionalen Modells eines Gebisses |
EP3454776B1 (de) | | Computerimplementiertes verfahren zur festlegung einer zahnaufstellung |
EP3195826B1 (de) | | Verfahren zum erstellen eines digitalen gebissmodells |
DE102018204098A1 (de) | | Bildausgabeverfahren während einer dentalen Anwendung und Bildausgabevorrichtung |
EP3636202B1 (de) | | Dentales fuehrungssystem zur zahnpraeparation |
EP3599443A1 (de) | | Verfahren zum optischen erfassen der oberflächengeometrie von zahnfleisch |
DE19923978A1 (de) | | Verfahren zur computergestützten patientenspezifischen Darstellung und Planung zahnärztlicher und/oder zahnprothetischer Arbeiten |
WO2013021022A2 (de) | | Datenbank und verfahren zur erzeugung einer virtuellen dentalen objektdarstellung aus einer aufnahme |
DE102013203888B4 (de) | | Computer-implementiertes Verfahren zum Festlegen der Anbringungspositionen einer Mehrzahl von Angriffselementen für einen Drahtbogen einer zahnmedizinischen Apparatur auf zugeordneten Zähnen eines Patienten sowie Anzeige mit linearer Nebeneinanderanordnung der Zähne |
DE102013109484A1 (de) | | Verfahren zum Einpassen eines Zahnersatzes in eine Reparaturstelle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20180604 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20200204 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| TPAC | Observations filed by third parties | Free format text: ORIGINAL CODE: EPIDOSNTIPA |
| P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230524 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |