EP3319508A1 - System and method for scanning anatomical structures and for displaying a scan result - Google Patents
- Publication number
- EP3319508A1 (application EP16748268.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- scanner
- screen
- scanning
- anatomical structures
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6846—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive
- A61B5/6886—Monitoring or controlling distance between sensor and tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1077—Measuring of profiles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4538—Evaluating a particular part of the muscoloskeletal system or a particular medical condition
- A61B5/4542—Evaluating the mouth, e.g. the jaw
- A61B5/4547—Evaluating teeth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/743—Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00216—Electrical control of surgical instruments with eye tracking or head position tracking control
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
Definitions
- the present invention relates to a system for scanning anatomical structures, such as teeth or the jaw of patients, to provide an optical,
- the present invention particularly relates to
- the invention presents the scan results in a natural way and thus simplifies the handling of the scanner.
- the present invention can be used in the field of augmented reality, but differs
- Mini / BMW is planning a kind
- in endoscopy, instruments need to be navigated precisely to a specific point, which augmented reality can simplify.
- the invention presented here differs from other augmented reality applications also in the way in which the position of the artificial data relative to reality is determined. This positioning is done very accurately and in a simple way via the position of the scanner and the scan result.
- US 2007/0172101 A1 describes a similar system. According to this invention, a
- the two-dimensional image is from an intraoral camera.
- the three-dimensional model is the combination of the three-dimensional images of the
- a single scan may be imperfect in certain situations. For example, a scan may be incomplete (contain holes), faulty, or noisy in certain hard-to-measure areas. Such imperfections may be caused by the
- the method of joining is usually one
- matching against the model typically uses a variant of the ICP (Iterative Closest Point) method.
- the new scan is added to the already scanned data, that is, positioned so that the new scan data attaches to the existing scan data with minimal error. It then has to be checked regularly whether this attachment was successful. This is assessed by criteria such as
- the found minimum error or
- the degree of overlap, which should be large enough. If such
- criteria are met, the new scan becomes part of the model; otherwise it must be discarded, at least for the moment.
- the user has moved the scanner too fast, so the new scan does not overlap with the model or overlaps too little.
- the user moves the scanner so that the object to be scanned is outside the scanning range of the scanner.
- the scan data contains too many perturbations, so the ICP error is too large.
- the scanned structure is not rigidly connected and has deformed, so the new scan does not match the model.
- such parts are, for example, the tongue or the cheek, which can move independently of the rigid teeth.
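The ICP-based appending described above can be sketched in a few lines. The following is a minimal, generic point-to-point ICP step with brute-force nearest-neighbour correspondences and a Kabsch/SVD rigid fit; it illustrates the technique, not the patent's actual implementation, and the toy data is invented:

```python
# One point-to-point ICP-style alignment step (minimal illustrative sketch).
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # avoid a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp_step(new_scan, model):
    """Match each new-scan point to its nearest model point, then align rigidly."""
    d2 = ((new_scan[:, None, :] - model[None, :, :]) ** 2).sum(-1)
    corr = model[d2.argmin(axis=1)]            # nearest-neighbour correspondences
    R, t = best_rigid_transform(new_scan, corr)
    aligned = new_scan @ R.T + t
    rms = np.sqrt(((aligned - corr) ** 2).sum(-1).mean())
    return aligned, rms

# Toy data: the "new scan" is a slightly shifted copy of part of the model.
model = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [0.5, 0.5, 1]])
new_scan = model[:4] + np.array([0.05, -0.03, 0.02])
aligned, rms = icp_step(new_scan, model)
```

The residual `rms` after the step plays the role of the "found minimum error" criterion above: if it stays too large, the appended scan would be discarded.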
- the scanner can only be used to some extent as a tool to display the virtual model in the correct position relative to the anatomical structure; it lacks the information where the current scan is relative to the model.
- the scan and the scanned model cannot be viewed simultaneously.
- the scanner will project
- WO 2004/100067 A2 describes a system that has a
- WO 2015/181454 A1 describes a system that has a
- augmented reality glasses allow the calculation of a sufficiently accurate point cloud of anatomical structures for precise blending.
- US 2012/0056993 A1, WO 2010/067267 A1 and WO 2014/120909 A1 are further documents describing systems for displaying scan results. It is the object of the present invention to provide a system for scanning anatomical structures and displaying the scan result, and an associated method, for which
- the presentation of the scan result should allow easier handling of the scanner.
- this object is achieved by a system as it is defined in
- a system for scanning anatomical structures and displaying the scan result, comprising: an intraoral scanner which captures an image of the anatomical structures; a
- detection unit which determines the spatial position of the scanner; and
- a computing unit which, during scanning, links the scanner with the screen and the detection unit and generates a scan result based on the intraorally captured image of the anatomical structures and which, during
- scan pauses, estimates position, orientation and scaling of the anatomical structures and an appropriate image of the anatomical structures as
- this results in 3D video-enabled surface imaging for real-time superimposition of reality with the captured 3D data, using the acquisition device to determine the
- the anatomical structure can in particular be teeth
- the properties needed for matching are selected from a variety of properties during the successful addition of scans to the model.
- the screen is included in augmented reality goggles so as to be located directly in the field of view of the scanning person, the detection unit being provided in the immediate vicinity of the screen and ideally rigidly connected thereto.
- a dentist can thus easily examine the teeth of a patient without having to move his gaze back and forth between the patient's mouth and a screen.
- the detection unit and the screen may be rigidly attached to the head of the scanning person.
- the system further comprises an eye camera that detects movements of the eyes and/or the head of the scanning person relative to the detection unit and the screen, wherein the computing unit adapts the scan result displayed on the screen to the detected movements.
- Recalibration of the screen can be avoided and the usable angle range can be increased.
- the system also includes position sensors that detect movements of the scanning person, and position sensors on the scanner that detect the movements of the scanner.
- the processing unit takes the detected movements into account when generating the scan result displayed on the
- screen. The position sensors also make the representation of a scan result displayed on the screen even more accurate than without them, and they increase the robustness of the
- the computing unit performs transformations between coordinate systems. These include a coordinate system of the intraorally captured image, a coordinate system of the scanner, a coordinate system of the detection unit and a coordinate system of the
- screen. The coordinate systems may further comprise a respective coordinate system of the screen for the left and the right eye of the scanning person and a respective coordinate system for the left and the right eye of the scanning person.
- the screen further displays clues as to successful scanning, so that an unsuccessful scan is immediately recognizable.
- the computing unit processes the image of the anatomical structures captured intraorally by the
- scanner such that it is displayed on the screen at the head of the scanner.
- This makes it possible to use the scanner as a virtual mirror.
- virtual in this context means that there is no conventional optical mirror, but that the captured image is displayed at the
- scanner head and is only visible through the screen.
- the mirror is not physically present; only through the display on the screen is the
- the detection unit may be, for example, a simple 3D camera which detects the three-dimensional position of the scanner, or even just a 2D camera, with which under certain conditions the
- the scanner is provided with optical
- position marks. The position marks make it possible for the
- detection unit to locate the scanner in a particularly simple manner, even with a common 2D camera as detection unit, by determining the distance between the position marks with the 2D camera and inferring the position of the scanner from it.
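One plausible reading of this distance estimate is a simple pinhole-camera relation between the known physical separation of the position marks and their apparent separation in the 2D image; both the model and all numbers below are assumptions for illustration, not values from the patent:

```python
# Sketch: distance to the scanner from two position marks, pinhole camera model.
def scanner_distance(mark_separation_m, pixel_separation_px, focal_px):
    """Distance Z from camera to scanner: Z = f * B / b (pinhole projection)."""
    return focal_px * mark_separation_m / pixel_separation_px

# Hypothetical numbers: marks 4 cm apart appear 80 px apart, focal length 800 px.
z = scanner_distance(0.04, 80.0, 800.0)
```

The closer the scanner is to the camera, the larger the pixel separation of the marks, so the apparent mark spacing encodes depth even in a single 2D image.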
- transformations between the coordinate systems, and displaying the generated scan result on the screen. By means of the method according to the invention, it is possible to improve the representation of a scan result on a screen and to facilitate the further use of the representation.
- a scan result is formed by merging individual intraorally acquired images of the anatomical structures.
- if the anatomical structures are teeth, an entire jaw can be readily visualized.
- the respective position of the scanner is tracked and detected by means of the extraoral detection unit. This makes it possible to display a scan result together with a representation of the position of the scanner in an extraorally acquired image.
- the respective position of the scanner is tracked using position marks on the scanner.
- position marks on the scanner enable an exact determination of the scanner's position and thus precise tracking of the scanner.
- the computing unit estimates position, orientation and scaling of the anatomical structures by an invasive or a non-invasive method, wherein the invasive method involves attaching markers or another tracking sensor to the scanned
- anatomical structure, and the non-invasive
- method comprises a pairing of optical two-dimensional or three-dimensional properties of the anatomical structure. For better estimation of position,
- orientation and scaling of the anatomical structures, a learning phase takes place during the successful appending of scans to a model, which helps to display the virtual model during scan pauses; transformations between the virtual model and the features on or close to the anatomical structures are known or can be learned during a successful scan, and two- or three-dimensional
- properties can be selected so as to make the virtual model stable and robust during scan pauses
- the intraorally acquired image of the anatomical structures and / or the scan result relative to the position of the scanner is displayed on the screen.
- a comprehensive and accurate representation of a scan result is possible.
- FIG. 1 shows a system according to the present invention
- Figure 2 is a display of a screen of the system according to the present invention
- Figure 3 is an overview of the imaging chain of the system according to the present invention
- FIG. 4 shows a coordinate system of a detected scan according to the present invention
- FIG. 5 shows a coordinate system of the scanner according to the present invention
- FIG. 6 shows a coordinate system of the detection unit according to the present invention
- Figure 7 illustrates the coordinate systems of the screen for the left and the right eye of a scanning person according to the present invention
- FIG. 8 Coordinate systems of a viewer as
- FIG. 9 shows a calibration according to the present invention
- Figure 10 illustrates a correspondence between real teeth and a virtual model of the teeth with position marks on the scanner according to the present invention
- FIG. 11 shows a cross-linked, standardized scan instruction for achieving reproducible scan results according to the present invention
- FIG. 12 shows superimposed markings which disappear after successful detection, according to the present invention
- Figure 13 illustrates a blended reference to gaps in the model or a low dot density according to the present invention
- Figure 14 shows a display of an image of the intraoral scanner at the head of the scanner according to the present invention.
- Figure 1 shows a system in which a screen 40 is placed in the form of augmented reality glasses directly into the field of view of a person scanning. Real teeth as an anatomical structure and a scan result are visible to a scanning person at a glance.
- the screen 40 fades virtual content into the real field of view in a display unit (augmented reality).
- Augmented reality goggles will soon be commercially available.
- the present invention can be used, for example, with a
- a detection unit 20 for the spatial position of the scanner is a 3D depth camera with an associated 2D color image.
- the screen 40 and also the computer unit 30 are integrated in the glasses.
- the system has as a central element the screen 40, on which scan results and notes are visible.
- the screen 40 allows the user, the scanning person, to see teeth as a scan and scan results simultaneously and superimposed on each other.
- the screen 40 may be implemented in various ways. For example, in a semi-transparent screen as a first example, a virtual content of the
- the virtual content is superimposed on a video of the environment. This video is shot from a natural perspective.
- the virtual content can also be projected directly onto the retina of a person to be scanned.
- any combination of the first example and the second example is conceivable:
- the viewed environment can also be displayed semi-transparently, and/or the cross-fading can be done by not fading in the same content on the two screens in front of the eyes. It is also conceivable that the degree of cross-fading is set individually by each user.
- the system further comprises a detection unit 20 which detects the spatial position of the intraoral scanner 10 and which is arranged in the immediate vicinity of the screen 40 and connected as rigidly as possible to the screen 40.
- a two- or optionally a three-dimensional camera is often integrated in such glasses. This camera may function as the detection unit 20 and capture the scene from a similar viewpoint as that of the user of the scanner 10, the scanning person.
- the detection unit 20 is used to detect the spatial position of the scanner 10 and the
- intraoral scan result at a particular location relative to the intraoral scanner 10. It makes
- sense, for example, to blend the real anatomical structures with the virtual scan result.
- the system may also include an eye camera 21 that detects movements of the eyes and/or the head of the scanning person relative to the detection unit 20 and the screen 40. For example, if the distance between the head and the screen 40 changes, the display may also need to be changed.
- the system may also include optional position sensors 22 which detect movements of the user and help stably display the contents of the screen 40.
- the system has a scanner 10, which is designed as a two- and / or three-dimensional, digital detection device. This can be carried out in various ways.
- the two-dimensional image can be carried out in various ways.
- the three-dimensional model can be captured using triangulation under structured illumination, stereo cameras, confocal, time-of-flight, or other principles.
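Two of these principles reduce to simple distance formulas, sketched below with illustrative numbers; the patent gives no parameters, so the focal length, baseline and timing values are assumptions:

```python
# Illustrative distance formulas behind two 3D capture principles.
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_seconds):
    """Time-of-flight: light travels to the surface and back, so halve the path."""
    return C * round_trip_seconds / 2.0

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Stereo / structured-light triangulation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

z_tof = tof_depth(2.0e-10)                  # a 200 ps round trip, roughly 3 cm
z_stereo = stereo_depth(700.0, 0.02, 35.0)  # 2 cm baseline, 35 px disparity
```

In practice intraoral scanners need sub-millimetre accuracy, which is why triangulation-based and confocal principles dominate at these short working distances.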
- the system according to the invention has in particular a computing unit 30 which connects the scanner 10, the screen 40 and the
- detection unit 20 to each other.
- the computing unit 30 determines the superimposition of the scan result and reality taking into account the viewing angle of the user. To make the
- transformation chain between the 3D data and the user's visual field possible, there are different coordinate systems in the system according to the invention.
- a coordinate system shown in Figure 4 is determined by the detected scan, the result of the intraoral scanner 10. Another coordinate system is the
- coordinate system of the scanner 10, shown in Figure 5. Another coordinate system is the
- coordinate system of the detection unit (hereinafter also referred to as overview camera 20), which detects the position of the scanner 10 and is shown in Figure 6.
- another coordinate system is the coordinate system of the room shown in Figure 7, as seen by the user or scanning person, such as a dentist: one system for each eye, i.e. two coordinate systems.
- a transformation concerns a transformation of the
- Another transformation concerns a transformation of the reality to the scanner 10, which is known during scanning.
- the scanner 10 creates the relationship between the reality and the digital model with typically very high accuracy.
- Another transformation relates to a transformation of the scanner 10 to the overview camera 20.
- the location of the
- scanner 10 can be detected by easily recognizable position marks 11 on the scanner 10. These position marks 11 may also be any illustrations of known texture, such as a company logo.
- optical position marks may be, for example, known textures and can be tracked according to "M. Ozuysal et al.: Feature Harvesting for Tracking-by-Detection".
- Another transformation relates to a transformation of overview camera 20 to the screen 40.
- this correspondence can be stably determined in advance, for example, by a fixed geometry of the overview camera 20 relative to the screen 40.
- another transformation concerns a transformation from the eye of the scanning person to the screen 40. This correspondence varies from person to person and must be determined by a personal calibration before a first use.
- this imaging chain can be used, for example, with
- Transformations are calculated in homogeneous coordinates.
- the mapping of x to x' can be expressed in linear algebra in matrix notation as x' = A x, where A represents the mapping matrix.
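The chain of coordinate-system mappings can be sketched by composing 4x4 homogeneous matrices; the poses below (scan to scanner, scanner to overview camera, camera to screen) are hypothetical placeholders, not values from the patent:

```python
# Composing the homogeneous transformation chain x' = A x with 4x4 matrices.
import numpy as np

def translation(tx, ty, tz):
    A = np.eye(4)
    A[:3, 3] = (tx, ty, tz)
    return A

def rotation_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    A = np.eye(4)
    A[:2, :2] = [[c, -s], [s, c]]
    return A

# Hypothetical poses for each link of the chain.
scan_to_scanner   = translation(0.0, 0.0, 0.05)
scanner_to_camera = rotation_z(np.pi / 2) @ translation(0.1, 0.0, 0.0)
camera_to_screen  = translation(0.0, -0.02, 0.0)

A = camera_to_screen @ scanner_to_camera @ scan_to_scanner   # full chain
x = np.array([0.0, 0.0, 0.0, 1.0])   # a scan point in homogeneous coordinates
x_screen = A @ x
```

Because each link is a plain matrix, the whole chain collapses into a single matrix A that can be applied to every scan point per frame, which is what makes the real-time superimposition tractable.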
- the scope of a calibration can be limited, for example, to a mean solid angle of the field of view. If the
- geometry to be displayed leaves this solid angle, the overlay can be hidden.
- the transformation chain makes it possible to transfer the movements of the teeth, as the anatomical structure of a patient, into the coordinate system of a respective eye of the user, and to bring the 3D geometries detected by the scanner 10 to display on the screen 40 in the correct position with respect to the image naturally seen by the eye. For example, the scanner 10 may be used to calibrate the entire system (especially eye-to-screen).
- This position is preferably determined by:
- Teeth of the upper jaw of a patient are firmly attached to his head.
- the observation of facial features, such as the eyes, may indicate a
- faulty observations may be incorporated into a model of the movement of the teeth as anatomical structure, such as a Kalman filter, allowing the computing unit to calculate a stable estimate of the position, similar to a position estimate of a car during a GPS signal loss.
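A minimal one-dimensional Kalman filter illustrates how such noisy position observations can be smoothed into a stable estimate; the constant-position model and the process/measurement noise values are arbitrary illustration choices, not parameters from the patent:

```python
# Minimal 1D Kalman filter for smoothing noisy position observations.
def kalman_1d(measurements, q=1e-4, r=0.04):
    x, p = measurements[0], 1.0      # initial state estimate and uncertainty
    out = []
    for z in measurements:
        p = p + q                    # predict: uncertainty grows over time
        k = p / (p + r)              # Kalman gain: trust in the new measurement
        x = x + k * (z - x)          # update the estimate with the measurement
        p = (1 - k) * p
        out.append(x)
    return out

# Noisy observations of a tooth position that is really at 2.0 (toy data).
zs = [2.1, 1.9, 2.05, 1.95, 2.02, 1.98]
est = kalman_1d(zs)
```

During a brief loss of observations, the predict step alone can carry the estimate forward, which mirrors the GPS-outage analogy in the text.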
- scan protocols can be displayed during routine scanning. For example, according to "Ender and Mehl: Influence of Scanning Strategies on the Accuracy of Digital Intraoral Scanning Systems, International Journal of Computerized Dentistry",
- scan protocols can increase the accuracy of the model. Such protocols are usually taught to users in a training phase.
- the proposed system makes it possible to propose and display such scanning protocols during scanning. These instructions can easily be followed because the correspondence between the scanner and the model is obvious.
- the scanning protocols need not be permanently fixed in advance, but can also be proposed interactively in order to
- scan instructions can be given, for example, by arrows which show the scanning direction, as can be seen in Figure 11. It is also conceivable that numbered
- landmarks are displayed as help. These landmarks disappear as soon as the region has been successfully scanned. This possibility can be seen in Figure 12.
- the quality of a scan can be virtually blended in, such as by a particular color scheme, as can be seen in Figure 13.
- the quality of the scan is bad if, for example, gaps are found in the model. These regions can then be
- visualized. The scanner 10 acts as a virtual
- mirror: the view is virtually mirrored at the head of the scanner 10, as shown in Figure 14.
- the mirror is virtual - so it can be positioned as desired, such as according to Figure 14 directly on the scanner 10 or also next to the scanner 10.
- the mirror can also, if necessary, work as a magnifying glass or magnifying concave mirror.
- Another virtual magnifier can be added to enlarge critical regions.
- the virtual mirror can also function as an electronic dental mirror of the model or real teeth.
- the image of the displayed scan is displayed by introducing artificial lighting.
- the corresponding virtual light source is preferably positioned where a dentist as a scanning person
- to be able to position the light source correctly, the relevant positions must be known: the
- overview camera 20 is rigidly connected to the head of the dentist, and the position of the
- teeth of the patient is known from scanning.
- a representation must be done in real time, ideally with a maximum latency of 0.1 seconds. Higher latencies do not generally limit the applicability of the method, but the user is
- a presentation with a short latency requires a correspondingly powerful computing hardware.
- the computer hardware can be distributed.
- a CPU and a GPU may be located between the screen 40 and the detection unit 20. Further CPUs and GPUs can be located in the
- computing unit 30, which connects the screen 40, the detection unit 20 and the scanner 10.
- Objects in the immediate vicinity of the teeth must be segmented. This segmentation can be done, for example, by comparing the optical flow of the virtually overlaid real teeth with that of the virtual model: the flow of the overlaid 2D view must match the flow of the estimated virtual model. Where it does not, these moving perturbations are displayed spatially in front of the virtual model.
- If the intraoral scanner 10 provides an RGB image, so that the virtual model can be colored accordingly, the color deviations between the real teeth and the model can also be used for segmentation: where the color values do not match, the corresponding regions are displayed in front of the virtual model.
- This option can also be turned off, for example to show gaps in the model behind the head of the scanner 10.
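The color-based segmentation described above can be sketched as a per-pixel comparison between the camera image and the rendered, colored model: pixels that deviate strongly are treated as occluding objects to be shown in front of the virtual model. A minimal sketch, where the threshold value is an assumption:

```python
import numpy as np

def segment_by_color_deviation(real_rgb, model_rgb, threshold=30.0):
    """Boolean mask of pixels whose colour deviates from the rendered model.

    True pixels are candidate occluders to display in front of the model.
    The threshold (in 8-bit colour units) is an illustrative assumption.
    """
    diff = real_rgb.astype(np.float32) - model_rgb.astype(np.float32)
    dist = np.linalg.norm(diff, axis=-1)  # per-pixel Euclidean colour distance
    return dist > threshold

model = np.full((4, 4, 3), 200, dtype=np.uint8)  # uniform "tooth" colour
real = model.copy()
real[1:3, 1:3] = (60, 40, 40)                    # a dark occluding object
mask = segment_by_color_deviation(real, model)
```

The mask marks exactly the occluded 2×2 region; in a real pipeline such a mask would gate which pixels show the camera image instead of the virtual model.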
- Position, orientation and scaling of the anatomical structures are estimated by the computing unit 30. One can distinguish invasive and non-invasive methods, with non-invasive methods being preferred.
- An invasive method is to attach markers or another tracking sensor to the anatomical structure to be scanned.
- A non-invasive method is the matching of optical 2D or 3D properties of the camera images with corresponding properties of the virtual model. These 2D or 3D properties may be a sparse set of conspicuous local points, or a number of larger regions densely distributed over the object.
- Local 2D properties include local color differences on anatomical structures; in the described application, typically tooth-gum transitions.
- Local 3D properties include local differences in shape on anatomical structures, which can be described, for example, with curvature measures.
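One common curvature measure for such local 3D properties is the "surface variation" of a local point neighbourhood: the smallest-eigenvalue share of the local covariance, which is zero for a flat patch and grows with curvature. A minimal sketch under that assumption (the patent does not name a specific measure):

```python
import numpy as np

def surface_variation(points):
    """Curvature proxy: smallest eigenvalue share of the local covariance.

    ~0 for a perfectly flat neighbourhood, larger for curved regions.
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals = np.linalg.eigvalsh(cov)  # ascending order
    return eigvals[0] / eigvals.sum()

# Flat patch vs. a curved (paraboloid) patch on the same 5x5 grid
g = np.linspace(-1, 1, 5)
xx, yy = np.meshgrid(g, g)
flat = np.stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)], axis=1)
curved = np.stack([xx.ravel(), yy.ravel(), (xx**2 + yy**2).ravel()], axis=1)
```

Evaluating `surface_variation` on small neighbourhoods of a scanned surface yields a per-point descriptor that distinguishes flat from curved regions.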
- Local 2D or 3D properties can also be examined for their spatial distribution, so that a descriptor can be formed from the color data or 3D data of the environment. The matching of the properties is easier the more similar these data are.
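Pairing sparse local properties of this kind typically reduces to nearest-neighbour matching of descriptors, often with a ratio test to discard ambiguous matches. A minimal sketch with made-up 2D descriptors (the ratio value is an assumption):

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour matching with a ratio test.

    A match (i, j) is kept only when the best candidate in desc_b is
    clearly closer than the second best, which suppresses ambiguous pairs.
    Requires at least two descriptors in desc_b.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

a = np.array([[0.0, 0.0], [5.0, 5.0], [9.0, 1.0]])
b = np.array([[0.1, 0.0], [5.0, 5.1], [4.0, 4.0]])
pairs = match_descriptors(a, b)
```

Here the third descriptor of `a` has two nearly equidistant candidates and is rejected, illustrating why matching is easier the more distinctive (dissimilar) the surrounding data are.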
- The head, or parts of the head, can also be tracked to determine the parameters of the teeth.
- Invasively, markers can be applied to the face, for example.
- Non-invasively, for example, certain facial features or a face mask can be tracked.
- A further possibility is an automated learning phase during scanning with the scanner 10, with the described method acting as supervisor when creating the image. While scanning, the data from the overview camera are continuously analyzed and evaluated, so that later, during scan breaks, the virtual model is blended as correctly as possible with the real anatomical structures. In this learning phase it is automatically recognized which of the properties tracked with the overview camera are suitable for estimating the position, orientation and scaling of the virtual model; these properties should, for example, enable the most stable tracking possible, and the most suitable ones are selected from a variety of candidates.
- The automated learning phase also makes it possible to relate properties that merely lie close to the scanned anatomical structures to the estimated parameters. For example, the relationship between the position, orientation and scaling of the virtual model and the position, orientation and scaling of a face mask (tracked with the overview camera 20) can be learned. If the scanned structures are temporarily not visible, the face mask may still be tracked and the anatomical structures may be displayed at the learned position, orientation and scaling relative to the face mask.
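The learned relation between tracked face-mask points and the virtual model can be represented as a rigid transformation; one standard way to estimate it from corresponding points is the Kabsch/Procrustes method. A minimal sketch — this is a common technique, not the patent's specific algorithm:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Kabsch/Procrustes: least-squares rotation R and translation t
    such that dst ≈ src @ R.T + t, from corresponding 3D points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                   # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - src_c @ R.T
    return R, t

rng = np.random.default_rng(0)
src = rng.standard_normal((10, 3))               # e.g. tracked face-mask points
# Ground-truth pose: 90° rotation about z plus a translation
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
dst = src @ Rz.T + t_true                        # corresponding model points
R_est, t_est = fit_rigid_transform(src, dst)
```

With noise-free correspondences the pose is recovered exactly; with noisy tracking data the same formula gives the least-squares rigid fit.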
- Alternatively, the virtual model can also be placed manually by the user over the true anatomical structures. In this manual calibration step, the relationship between the tracked properties and the position, orientation and scaling of the virtual model can then also be learned. The calibration step can be done once or repeatedly to obtain higher robustness.
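Repeating the calibration yields several estimates of the same transformation; one simple way to fuse them is to average the translations and take the chordal mean of the rotations (arithmetic mean projected back onto a rotation via SVD). A minimal sketch under that assumption:

```python
import numpy as np

def average_rigid_transforms(rotations, translations):
    """Fuse repeated calibrations: mean translation, and the chordal mean
    of the rotations (mean matrix projected back onto SO(3) via SVD)."""
    M = np.mean(rotations, axis=0)
    U, _, Vt = np.linalg.svd(M)
    d = np.sign(np.linalg.det(U @ Vt))
    R_avg = U @ np.diag([1.0, 1.0, d]) @ Vt      # nearest rotation to M
    t_avg = np.mean(translations, axis=0)
    return R_avg, t_avg

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Two noisy calibrations scattered symmetrically around the identity pose
Rs = [rot_z(np.radians(5)), rot_z(np.radians(-5))]
ts = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 2.0, 0.0])]
R_avg, t_avg = average_rigid_transforms(Rs, ts)
```

The symmetric ±5° errors cancel to the identity rotation, illustrating how repetition increases robustness against individual calibration errors.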
- In summary, the present invention provides a system and method for scanning anatomical structures and displaying the scan result which, while being easy to operate, provides an improved rendering of scan results.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Biophysics (AREA)
- Physics & Mathematics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Rheumatology (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Physical Education & Sports Medicine (AREA)
- Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
- Endoscopes (AREA)
- Processing Or Creating Images (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015212806.7A DE102015212806A1 (de) | 2015-07-08 | 2015-07-08 | System und Verfahren zum Scannen von anatomischen Strukturen und zum Darstellen eines Scanergebnisses |
PCT/EP2016/066259 WO2017005897A1 (de) | 2015-07-08 | 2016-07-08 | System und verfahren zum scannen von anatomischen strukturen und zum darstellen eines scanergebnisses |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3319508A1 true EP3319508A1 (de) | 2018-05-16 |
Family
ID=56615937
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16748268.6A Pending EP3319508A1 (de) | 2015-07-08 | 2016-07-08 | System und verfahren zum scannen von anatomischen strukturen und zum darstellen eines scanergebnisses |
Country Status (9)
Country | Link |
---|---|
US (1) | US11412993B2 (de) |
EP (1) | EP3319508A1 (de) |
JP (1) | JP6941060B2 (de) |
KR (1) | KR102657956B1 (de) |
CN (1) | CN107735016B (de) |
AU (1) | AU2016290620B2 (de) |
CA (1) | CA2991659A1 (de) |
DE (1) | DE102015212806A1 (de) |
WO (1) | WO2017005897A1 (de) |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11457998B2 (en) | 2016-07-29 | 2022-10-04 | Ivoclar Vivadent Ag | Recording device |
US10470645B2 (en) | 2017-05-22 | 2019-11-12 | Gustav Lo | Imaging system and method |
CN107644454B (zh) * | 2017-08-25 | 2020-02-18 | 北京奇禹科技有限公司 | 一种图像处理方法及装置 |
CN107909609B (zh) * | 2017-11-01 | 2019-09-20 | 欧阳聪星 | 一种图像处理方法及装置 |
DE102018204098A1 (de) * | 2018-03-16 | 2019-09-19 | Sirona Dental Systems Gmbh | Bildausgabeverfahren während einer dentalen Anwendung und Bildausgabevorrichtung |
CN108827151B (zh) * | 2018-06-22 | 2020-05-19 | 北京大学口腔医学院 | 数据配准方法及数据配准系统 |
EP3649919A1 (de) | 2018-11-12 | 2020-05-13 | Ivoclar Vivadent AG | Dentales bildaufnahmesystem |
CN109700531B (zh) * | 2018-12-17 | 2021-07-30 | 上海交通大学医学院附属第九人民医院 | 个体化下颌骨导航配准导板及其配准方法 |
CN109864829B (zh) * | 2019-01-28 | 2024-06-18 | 苏州佳世达光电有限公司 | 扫描系统及扫描方法 |
EP3689218B1 (de) * | 2019-01-30 | 2023-10-18 | DENTSPLY SIRONA Inc. | Verfahren und system zur führung eines intraoralen scans |
EP3960121A1 (de) * | 2019-01-30 | 2022-03-02 | DENTSPLY SIRONA Inc. | Verfahren und system zur dreidimensionalen bildgebung |
CN113424523B (zh) * | 2019-02-15 | 2023-10-27 | 株式会社美迪特 | 扫描过程再生方法 |
JP6936826B2 (ja) * | 2019-03-18 | 2021-09-22 | 株式会社モリタ製作所 | 画像処理装置、表示システム、画像処理方法、および画像処理プログラム |
CN109770857A (zh) * | 2019-03-22 | 2019-05-21 | 昆明医科大学附属口腔医院(云南省口腔医院) | 一种口腔拍照定位装置 |
KR102236486B1 (ko) * | 2019-04-26 | 2021-04-06 | 주식회사 메디씽큐 | 전자적 루페 장치 및 이를 이용한 진단 방법 |
US10849723B1 (en) | 2019-05-07 | 2020-12-01 | Sdc U.S. Smilepay Spv | Scanning device |
KR102313319B1 (ko) * | 2019-05-16 | 2021-10-15 | 서울대학교병원 | 증강현실 대장 내시경 시스템 및 이를 이용한 모니터링 방법 |
US10832486B1 (en) | 2019-07-17 | 2020-11-10 | Gustav Lo | Systems and methods for displaying augmented anatomical features |
TWI710357B (zh) * | 2020-01-17 | 2020-11-21 | 東昕精密科技股份有限公司 | Ar牙科贋復製造引導系統及其應用方法 |
CN113223140A (zh) * | 2020-01-20 | 2021-08-06 | 杭州朝厚信息科技有限公司 | 利用人工神经网络生成牙科正畸治疗效果的图像的方法 |
CA3169232A1 (en) | 2020-02-26 | 2021-09-02 | Pamela Sharon Oren-Artzi | Systems and methods for remote dental monitoring |
USD962437S1 (en) | 2020-05-14 | 2022-08-30 | Get-Grin Inc. | Dental scope |
CN112506343B (zh) * | 2020-12-07 | 2022-07-12 | 青岛农业大学 | 动物解剖学教学方法、系统、介质、计算机设备、终端 |
US11445165B1 (en) | 2021-02-19 | 2022-09-13 | Dentsply Sirona Inc. | Method, system and computer readable storage media for visualizing a magnified dental treatment site |
EP4079258A1 (de) * | 2021-04-23 | 2022-10-26 | DENTSPLY SIRONA Inc. | Zahnabtastung |
WO2024007117A1 (en) * | 2022-07-04 | 2024-01-11 | Braun Gmbh | Oral scanner system |
US20240046555A1 (en) * | 2022-08-08 | 2024-02-08 | Gustav Lo | Arcuate Imaging for Altered Reality Visualization |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014120909A1 (en) * | 2013-02-01 | 2014-08-07 | Sarment David | Apparatus, system and method for surgical navigation |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6503195B1 (en) | 1999-05-24 | 2003-01-07 | University Of North Carolina At Chapel Hill | Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction |
US7013191B2 (en) * | 1999-11-30 | 2006-03-14 | Orametrix, Inc. | Interactive orthodontic care system based on intra-oral scanning of teeth |
US7027642B2 (en) * | 2000-04-28 | 2006-04-11 | Orametrix, Inc. | Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects |
CA2524213A1 (en) * | 2003-04-30 | 2004-11-18 | D4D Technologies, L.P. | Intra-oral imaging system |
US7463757B2 (en) | 2003-12-09 | 2008-12-09 | Carestream Health, Inc. | Tooth locating within dental images |
US8194067B2 (en) * | 2004-02-04 | 2012-06-05 | 3M Innovative Properties Company | Planar guides to visually aid orthodontic appliance placement within a three-dimensional (3D) environment |
US7323951B2 (en) | 2005-07-13 | 2008-01-29 | John Mezzalinqua Associates, Inc. | Casing for CATV filter |
US11428937B2 (en) | 2005-10-07 | 2022-08-30 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
CA2632557C (en) | 2005-12-08 | 2016-06-14 | Peter S. Lovely | Infrared dental imaging |
US7840042B2 (en) | 2006-01-20 | 2010-11-23 | 3M Innovative Properties Company | Superposition for visualization of three-dimensional data acquisition |
US20070238981A1 (en) * | 2006-03-13 | 2007-10-11 | Bracco Imaging Spa | Methods and apparatuses for recording and reviewing surgical navigation processes |
WO2010067267A1 (en) * | 2008-12-09 | 2010-06-17 | Philips Intellectual Property & Standards Gmbh | Head-mounted wireless camera and display unit |
JP5476036B2 (ja) * | 2009-04-30 | 2014-04-23 | 国立大学法人大阪大学 | 網膜投影型ヘッドマウントディスプレイ装置を用いた手術ナビゲーションシステムおよびシミュレーションイメージの重ね合わせ方法 |
US20120056993A1 (en) * | 2010-09-08 | 2012-03-08 | Salman Luqman | Dental Field Visualization System with Improved Ergonomics |
US9237878B2 (en) | 2011-04-22 | 2016-01-19 | Mayo Foundation For Medical Education And Research | Generation and assessment of shear waves in elasticity imaging |
JP5935344B2 (ja) * | 2011-05-13 | 2016-06-15 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム、記録媒体、および、画像処理システム |
JP2013034764A (ja) * | 2011-08-10 | 2013-02-21 | Akira Takebayashi | サージカルガイド装置及びドリルの位置決め方法 |
US9606992B2 (en) * | 2011-09-30 | 2017-03-28 | Microsoft Technology Licensing, Llc | Personal audio/visual apparatus providing resource management |
US9517041B2 (en) * | 2011-12-05 | 2016-12-13 | Controlrad Systems Inc. | X-ray tube |
DE102012221374A1 (de) * | 2012-11-22 | 2014-05-22 | Sirona Dental Systems Gmbh | Verfahren zur Planung einer dentalen Behandlung |
US9135498B2 (en) * | 2012-12-14 | 2015-09-15 | Ormco Corporation | Integration of intra-oral imagery and volumetric imagery |
US9808148B2 (en) * | 2013-03-14 | 2017-11-07 | Jan Erich Sommers | Spatial 3D sterioscopic intraoral camera system |
CN103442169B (zh) * | 2013-07-29 | 2016-10-05 | 北京智谷睿拓技术服务有限公司 | 操纵图像采集设备的拍摄功能的方法和图像采集设备 |
CN103440626B (zh) * | 2013-08-16 | 2016-10-19 | 北京智谷睿拓技术服务有限公司 | 照明方法和照明系统 |
CN103886145B (zh) | 2014-03-12 | 2017-01-04 | 南京航空航天大学 | 牙齿预备体数字化模型设计方法 |
FR3021518A1 (fr) * | 2014-05-27 | 2015-12-04 | Francois Duret | Dispositif de visualisation pour faciliter la mesure et le diagnostic 3d par empreinte optique en dentisterie |
US10042937B2 (en) | 2014-07-30 | 2018-08-07 | Microsoft Technology Licensing, Llc | Adjusting search results based on overlapping work histories |
US10504386B2 (en) * | 2015-01-27 | 2019-12-10 | Align Technology, Inc. | Training method and system for oral-cavity-imaging-and-modeling equipment |
-
2015
- 2015-07-08 DE DE102015212806.7A patent/DE102015212806A1/de active Pending
-
2016
- 2016-07-08 WO PCT/EP2016/066259 patent/WO2017005897A1/de active Application Filing
- 2016-07-08 CA CA2991659A patent/CA2991659A1/en active Pending
- 2016-07-08 EP EP16748268.6A patent/EP3319508A1/de active Pending
- 2016-07-08 AU AU2016290620A patent/AU2016290620B2/en active Active
- 2016-07-08 CN CN201680040203.0A patent/CN107735016B/zh active Active
- 2016-07-08 JP JP2017567070A patent/JP6941060B2/ja active Active
- 2016-07-08 KR KR1020187000188A patent/KR102657956B1/ko active IP Right Grant
- 2016-07-08 US US15/742,556 patent/US11412993B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014120909A1 (en) * | 2013-02-01 | 2014-08-07 | Sarment David | Apparatus, system and method for surgical navigation |
Also Published As
Publication number | Publication date |
---|---|
AU2016290620A1 (en) | 2018-02-01 |
CA2991659A1 (en) | 2017-01-12 |
CN107735016A (zh) | 2018-02-23 |
JP2018527965A (ja) | 2018-09-27 |
US11412993B2 (en) | 2022-08-16 |
US20180192964A1 (en) | 2018-07-12 |
AU2016290620B2 (en) | 2021-03-25 |
CN107735016B (zh) | 2022-10-11 |
WO2017005897A1 (de) | 2017-01-12 |
JP6941060B2 (ja) | 2021-09-29 |
KR102657956B1 (ko) | 2024-04-15 |
KR20180027492A (ko) | 2018-03-14 |
DE102015212806A1 (de) | 2017-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017005897A1 (de) | System und verfahren zum scannen von anatomischen strukturen und zum darstellen eines scanergebnisses | |
DE102011078212B4 (de) | Verfahren und Vorrichtung zum Darstellen eines Objektes | |
EP0799434B1 (de) | Mikroskop, insbesondere stereomikroskop und verfahren zum überlagern zweier bilder | |
EP2082687B1 (de) | Überlagerte Darstellung von Aufnahmen | |
EP2289061B1 (de) | Ophthalmoskop-simulator | |
DE102007033486B4 (de) | Verfahren und System zur Vermischung eines virtuellen Datenmodells mit einem von einer Kamera oder einer Darstellungsvorrichtung generierten Abbild | |
DE102016106121A1 (de) | Verfahren und Vorrichtung zum Bestimmen von Parametern zur Brillenanpassung | |
DE112009000100T5 (de) | Navigieren zwischen Bildern eines Objekts im 3D-Raum | |
DE112004000902T5 (de) | Kalibrierung von tatsächlichen und virtuellen Ansichten | |
EP2937058A1 (de) | Kopfgetragene plattform zur integration von virtualität in die realität | |
DE112010006052T5 (de) | Verfahren zum Erzeugen stereoskopischer Ansichten von monoskopischen Endoskopbildern und Systeme, die diese verwenden | |
DE112017001315T5 (de) | Rechenvorrichtung zum Überblenden eines laparoskopischen Bildes und eines Ultraschallbildes | |
EP3635478A1 (de) | Verfahren, vorrichtungen und computerprogramm zum bestimmen eines nah-durchblickpunktes | |
DE69519623T2 (de) | Operations-mikroskope | |
DE69524785T2 (de) | Optische einrichtung zur anzeige eines virtuellen dreidimensionalen bildes bei überlagerung mit einem reellen objekt besonders für chirurgische anwendungen | |
WO2017220667A1 (de) | Verfahren und vorrichtung zur veränderung der affektiven visuellen information im gesichtsfeld eines benutzers | |
DE102020215559B4 (de) | Verfahren zum Betreiben eines Visualisierungssystems bei einer chirurgischen Anwendung und Visualisierungssystem für eine chirurgische Anwendung | |
DE102021206565A1 (de) | Darstellungsvorrichtung zur Anzeige einer graphischen Darstellung einer erweiterten Realität | |
CN116419715A (zh) | 用于牙科治疗系统的双向镜显示器 | |
DE102012100848B4 (de) | System und Verfahren zur stereoskopischen Darstellung von Aufnahmen eines Endoskops | |
DE102020126029A1 (de) | Chirurgisches Assistenzsystem und Darstellungsverfahren | |
EP3274753A1 (de) | Verfahren zum betreiben einer operationsmikroskopanordnung | |
EP4124283A1 (de) | Messverfahren und eine messvorrichtung | |
DE102018102252A1 (de) | Elektronisches Stereomikroskop oder Stereoendoskop |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20180208 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210415 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230524 |