WO2016200167A1 - Dispositif et procédé de guidage orthodontique (Orthodontic guidance device and method) - Google Patents

Orthodontic guidance device and method

Info

Publication number
WO2016200167A1
Authority
WO
WIPO (PCT)
Prior art keywords
tooth
orthodontic
image
bracket
guide
Prior art date
Application number
PCT/KR2016/006113
Other languages
English (en)
Korean (ko)
Inventor
김건태
허다솜
Original Assignee
(주) 시원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020150162406A external-priority patent/KR20160144891A/ko
Application filed by (주) 시원 filed Critical (주) 시원
Priority to KR1020177035025A priority Critical patent/KR102067089B1/ko
Publication of WO2016200167A1 publication Critical patent/WO2016200167A1/fr


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C19/00 Dental auxiliary appliances
    • A61C19/04 Measuring instruments specially adapted for dentistry
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C7/00 Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
    • A61C7/12 Brackets; Arch wires; Combinations thereof; Accessories therefor
    • A61C7/14 Brackets; Fixing brackets to teeth

Definitions

  • The present disclosure relates to an orthodontic guide device, and more particularly to a device that provides visual guidance and feedback to a medical practitioner during the orthodontic process of bonding brackets to the teeth.
  • Orthodontic treatment refers to a series of procedures that prevent or treat abnormalities in tooth alignment.
  • More broadly, orthodontic treatment means restoring oral tissues to healthy, normal function by correcting the various skeletal inconsistencies that can arise during growth, not merely straightening crooked teeth.
  • In children and adolescents, orthodontic treatment can be performed before jaw-bone growth is complete, allowing correction while guiding jaw growth.
  • Orthodontic treatment corrects the position of the teeth using fixed or removable braces.
  • Among these, bonding the bracket to the tooth with an adhesive is the method mainly used.
  • In one embodiment, an orthodontic guide device implemented by a computing terminal includes a processor that generates matching information between a tooth image and a guide indicating the bonding position and orientation of a bracket according to the orthodontic planning, and a controller that overlaps the guide on the tooth image based on the generated matching information.
  • In one embodiment, the bonding position and orientation of the bracket may be represented by an augmented-reality image that includes a cross mark.
  • In one embodiment, the bonding position and orientation of the bracket may be represented by an augmented-reality image rendered at the position in a 3D image where the bracket is to be bonded.
  • In one embodiment, the orthodontic guide device may further include a display that presents the tooth image with the guide overlapped, and the display may include a glasses-type wearable device equipped with a camera.
  • In another embodiment, the orthodontic guide device may further include a display that presents the tooth image with the guide overlapped, and the display may include a camera and an external display linked to the camera.
  • In one embodiment, the processor identifies a plurality of points associated with a tooth in the tooth image and generates the positions of the identified points as the matching information.
  • In one embodiment, the plurality of points are feature points included in at least one of the gum and the tooth.
  • In one embodiment, at least one of the plurality of points corresponds to a radiopaque material disposed on the gum or tooth.
  • In one embodiment, at least one of the plurality of points corresponds to a positive protrusion block or a negative depression disposed on the gum or tooth.
  • In one embodiment, at least one of the plurality of points corresponds to at least one of a tattoo point pre-colored on the gum or tooth and a color tape attached thereto.
  • In one embodiment, the orthodontic guide device implemented by a computing terminal includes a processor that calculates the error between the position of the bracket in the tooth image and the planned bonding position of the bracket according to the orthodontic planning, and a signal generator that generates a control signal providing a notification to the operator when the error is less than a threshold value.
  • In one embodiment, the notification includes a visual marker implemented in augmented reality and provided via a display. In another embodiment, the notification includes a sound effect provided to the operator.
  • In one embodiment, a photopolymerizer control device implemented by a computing terminal includes a processor that determines activation of the photopolymerizer based on the position of the bracket relative to the tooth, and a controller that, when the photopolymerizer is activated, controls the photopolymerizer to irradiate the bracket with light in response to a control signal.
  • In one embodiment, the processing unit determines activation of the photopolymerizer when the position of the bracket relative to the tooth meets a predetermined criterion.
  • For example, the processing unit may determine that the predetermined criterion is met when the bracket position and the planned position overlap by at least a threshold value.
  • In one embodiment, the control signal is generated by a switching module connected to the photopolymerizer by wire or wirelessly.
  • In one embodiment, the switching module is switched on and off by a foot switch and controls generation of the control signal according to that on/off state.
  • In one embodiment, the switching module generates the control signal in response to a predetermined voice signal.
  • In one embodiment, the control unit controls the photopolymerizer to irradiate the light by controlling at least one of the irradiation amount and the irradiation timing.
  • In one embodiment, a method of operating an orthodontic guide device implemented by a computing terminal includes generating matching information between a tooth image and a guide that includes the bonding position of a bracket according to the orthodontic planning, and overlapping the guide on the tooth image based on the generated matching information.
  • In one embodiment, a photopolymerizer control method implemented by a computing terminal includes determining, in a processing unit, activation of the photopolymerizer when the position of the bracket with respect to the tooth meets a predetermined criterion, and, in a control unit, when the photopolymerizer is activated, controlling the photopolymerizer to irradiate the bracket with light in response to a control signal.
  • In one embodiment, the orthodontic guide device implemented by a computing terminal includes a processor that generates matching information between a tooth image and a 3D image of the tooth including the root, and a controller that overlaps an image representing the root on the tooth image based on the generated matching information.
  • In one embodiment, the processor extracts the 3D image of the tooth from a CT image and identifies the tooth in the tooth image corresponding to the 3D image of the tooth to generate the matching information.
  • In one embodiment, the processor generates a screw placement position guide according to the orthodontic planning, and the controller overlaps the screw placement position guide on the tooth image.
  • FIG. 1 illustrates an orthodontic guide device according to an embodiment.
  • FIG. 2A illustrates a guide implemented as augmented reality on a display device by an orthodontic guide device according to an embodiment.
  • FIG. 2B illustrates an embodiment of bonding a bracket according to a guide.
  • FIG. 3A is a view for explaining an embodiment of matching a guide and a tooth image by displaying feature points on the palatal rugae.
  • FIG. 3B is a view for explaining an embodiment of matching a guide and a tooth image by displaying feature points on a tooth.
  • FIG. 3C is a view for explaining an embodiment of matching a guide and a tooth image using feature points contained in the information of a single tooth.
  • FIG. 3D shows the placement position of the root and an orthodontic screw being guided with augmented reality by the orthodontic guide device according to an embodiment.
  • FIG. 3E is a diagram for describing a process of identifying an object coordinate system, according to an exemplary embodiment.
  • FIG. 4 illustrates an apparatus for controlling a photopolymerizer according to an embodiment.
  • FIG. 5 is a flowchart illustrating a method of operating an orthodontic guide device according to an embodiment.
  • FIG. 6 is a flowchart illustrating a method of operating a photopolymerizer control apparatus according to an embodiment.
  • FIG. 1 illustrates a dental orthodontic guide device 100 according to one embodiment.
  • the orthodontic guide device 100 may include a processor 110 and a controller 120.
  • Apparatus 100 may be implemented at least temporarily by a computing terminal.
  • The computing terminal may be any type of electronic device, such as a personal computer, medical device, smartphone, tablet computer, or wearable device.
  • The processor 110 and the controller 120 may each be physical and/or logical elements included in such an electronic device.
  • For example, the processor 110 and the controller 120 may be implemented by dedicated hardware, or by general-purpose computing resources controlled by software or an operating system.
  • Moreover, the processor 110 and the controller 120 may be implemented together on one chip and thus may be physically indistinguishable.
  • The implementation of such hardware resources may change with technology development or design changes. Therefore, although the functions, operations, and structures of the processor 110 and the controller 120 are described as distinct, this division may be interpreted differently according to embodiments.
  • The processor 110 generates matching information between the tooth image and the guide that includes the bonding position of the bracket according to the orthodontic planning.
  • The controller 120 overlaps the guide on the tooth image based on the generated matching information.
  • First, a tooth image is input to the device 100.
  • The input tooth image is, for example but not limited to, an image of the teeth of a patient awaiting orthodontic treatment.
  • The tooth image may be an image of the patient's teeth photographed in real time while the device 100 operates.
  • The bracket for each tooth of this patient may be planned in advance.
  • Such planning may be generated automatically by software using medical image data of the patient's dental state, and/or may include a surgical plan created manually by a medical person with reference to such data.
  • For example, a medical practitioner takes an oral impression, builds a model, and obtains scan data, attaching radiopaque points at this stage if necessary. A CT scan is then performed, and the scan data and CT results are matched to plan the bonding position of the bracket for each tooth. While there are many ways of planning, it is understood throughout this specification that the planning specifies which kind of bracket goes at which position on each tooth.
  • Alternatively, a matching marker may be attached to the surgical site immediately before surgery. After an impression or oral scan is taken to obtain data, this data is matched against the already-matched scan data and CT results. Later, during the actual operation, the captured image is matched with this data and displayed.
  • The orthodontic guide device 100 uses the tooth image currently photographed and input to recognize the difference between the coordinate system of the camera viewpoint (called the object coordinate system) and the reference coordinate system on which the 3D coordinates in the planning are based.
  • This process can be understood as calculating a coordinate transformation function (or transformation vector) between the reference coordinate system and the object coordinate system.
  • This coordinate transformation allows virtual information to be synthesized precisely onto a real object, or onto an image of the object, so that augmented reality can be provided accurately.
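  • The coordinate-transformation step above can be illustrated with a short sketch. Assuming the matched feature points are available as corresponding 3D point sets in the reference (planning) and object (camera) coordinate systems, the rigid transform between them can be estimated with the Kabsch algorithm; the function name and example data below are illustrative, not part of the original disclosure.

```python
import numpy as np

def rigid_transform(ref_pts: np.ndarray, obj_pts: np.ndarray):
    """Estimate rotation R and translation t such that obj ~= R @ ref + t.

    ref_pts, obj_pts: (N, 3) arrays of corresponding points in the
    reference (planning) and object (camera) coordinate systems.
    """
    ref_c = ref_pts.mean(axis=0)
    obj_c = obj_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (ref_pts - ref_c).T @ (obj_pts - obj_c)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = obj_c - R @ ref_c
    return R, t
```

  • Applying the inverse of this transform maps guide coordinates planned in the reference system into the live camera view, which is the operation the device performs before overlaying the guide.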
  • FIG. 2A illustrates the guide 220 rendered as augmented reality on the display device 210 by the orthodontic guide device according to an embodiment.
  • Such photographing may be performed by a stand-alone camera but, according to an embodiment, may be performed by a camera attached to augmented-reality glasses worn by a medical person.
  • However, the present invention is not limited to photographing the image with a glasses-type wearable device and providing the augmented reality on that same device; the glasses-type wearable device described below is merely one example application.
  • A device in the form of a transparent display is also possible.
  • A transparent display positioned between the medical practitioner conducting the surgery and the patient allows the practitioner to operate more freely and comfortably.
  • FIG. 2A illustrates the orthodontic augmented-reality guide presented in a glasses-type wearable device according to this example application.
  • The guide 220 is information indicating the bonding position of the bracket for each tooth based on the orthodontic planning, and is represented by a dotted line in the embodiment of FIG. 2A.
  • The guide 220 may be distributed along the tooth arrangement based on feature points in the oral cavity.
  • The feature points used when generating the guide and the feature points currently identified in the tooth image (or intraoral image) through the display device 210 may be matched with each other, so that the guide accurately indicates the bonding position.
  • The guide 220 may further include a bracket order displayed at one side of the bonding position of each bracket, and may further include related information such as the type of bracket and the bonding order.
  • FIG. 2B illustrates an embodiment of bonding the bracket 230 according to a guide.
  • The operator can use a bracket holder to position the bracket 230 accurately at the bonding position along the guide displayed in augmented reality.
  • The orthodontic guide device may calculate the overlap rate between the guide and the bracket and update the guide with the calculated rate; that is, the display device may show in real time how much the bracket and the guide overlap, and thus how accurately the bracket is being bonded.
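  • The overlap rate above can be computed, for instance, as the intersection-over-union of the guide region and the detected bracket region in the image. This is a minimal sketch, assuming both regions are available as boolean pixel masks; the function name and data are illustrative.

```python
import numpy as np

def overlap_rate(guide_mask: np.ndarray, bracket_mask: np.ndarray) -> float:
    """Overlap between guide and bracket masks as IoU, in percent."""
    inter = np.logical_and(guide_mask, bracket_mask).sum()
    union = np.logical_or(guide_mask, bracket_mask).sum()
    return 100.0 * inter / union if union else 0.0
```

  • The resulting percentage could be rendered next to the guide and refreshed each frame, matching the real-time feedback described above.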
  • When bonding of the bracket at the current position is completed, the orthodontic guide device may also direct the operator to the guide for the next bracket in sequence, for example by changing the displayed line thickness, to signal that the current bracket has been bonded.
  • FIG. 3A is a view for explaining an embodiment of matching a guide and a tooth image by displaying a feature point 310 on the palatal rugae.
  • FIG. 3B is a view for explaining an embodiment of matching a guide and a tooth image by displaying a feature point 320 on a tooth.
  • FIG. 3C is a view for explaining an embodiment of matching a guide and a tooth image using a feature point 330 contained in the information of a single tooth. Since each tooth differs in shape and size, matching using the information of only one tooth is also possible.
  • The information of one tooth may include, for example, three or more feature points that do not lie on the same line.
  • This embodiment, which matches a close-up image of the bracket-bonding target tooth using information about that one tooth regardless of the surrounding teeth, can be used effectively.
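  • Since the single-tooth matching above requires three or more feature points that are not on one line, an implementation would typically verify non-collinearity before using the points. A small sketch follows; the tolerance value is an assumption.

```python
import numpy as np

def non_collinear(p1, p2, p3, eps: float = 1e-9) -> bool:
    """True if three 3D points span a plane (cross product is non-zero)."""
    v1 = np.asarray(p2, float) - np.asarray(p1, float)
    v2 = np.asarray(p3, float) - np.asarray(p1, float)
    return float(np.linalg.norm(np.cross(v1, v2))) > eps
```

  • Only point triples passing this test determine a unique pose for the tooth; collinear triples would leave a rotational ambiguity about their common line.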
  • When a camera linked to the wearable device photographs and sends the patient's tooth image, the orthodontic guide device recognizes the object coordinate system at the camera viewpoint using a plurality of feature points included in the tooth image.
  • Here, the tooth image need not be an image containing only teeth; it may be interpreted as an image of the intraoral cavity that includes teeth.
  • In one embodiment, the plurality of points are feature points included in at least one of the gum and the tooth.
  • The gum generally refers to the soft tissue surrounding the alveolar bone, but may also mean both soft and hard tissue.
  • Such a feature point may be a corner of a tooth, an interdental point between adjacent teeth, the palatal rugae, or the incisive papilla of the palate.
  • This markerless embodiment is possible because multidimensional feature points can be extracted from the existing teeth to a fairly precise level.
  • Since teeth protrude outward and are commonly present in 3D data such as X-ray and CT images, it is straightforward to extract tooth feature points from a tooth image and match them with the 3D data.
  • Meanwhile, marker-based recognition is also possible. Because the patient's mouth is imaged by X-ray or CT, a marker facilitates feature-point matching and coordinate transformation only if it appears in those images, so the marker material may be one having radiopaque or radiocontrast characteristics. For example, as illustrated by reference numeral 310 of FIG. 3A, the points may be made of a radiopaque material previously placed by a medical person on the patient's gum or palate, such as gutta-percha or a radiopaque resin block.
  • Points previously tattooed by a medical person on a gum or tooth may also be used.
  • Various similar embodiments may exist, and the listed embodiments are illustrative rather than limiting.
  • By additionally placing a marker and performing the scan again, the markers allow precise alignment between the real-time image and the CT image data during surgery.
  • The guide overlapping the tooth image may have various shapes, such as a dotted line, a solid line, an arrow, or an actual bracket shape.
  • The guide may be displayed as text or an image together with information indicating the bonding position of the bracket. For example, the patient's name, sex, age, disease classification, which tooth is currently being prepared for bonding, and comments previously left by the practitioner may optionally be presented. These various visual aids can be prepared in many ways to assist the caregiver and to prevent possible medical mistakes.
  • The guide information may be presented together with the visual guideline or, optionally, by voice.
  • FIG. 3D shows the placement position of the root and an orthodontic screw being guided with augmented reality by the orthodontic guide device according to an embodiment.
  • The orthodontic guide device according to one embodiment may assist the operator in placing an orthodontic screw in the patient's alveolar bone without damaging the patient's roots.
  • Orthodontic screws are generally placed in the alveolar bone between the roots so that the teeth can be moved in the desired direction without anchorage loss.
  • However, the spaces between the roots are narrow, so a root may be damaged if the screw is placed based only on the operator's estimate. Therefore, if the operator can confirm the position of the patient's roots through augmented reality when placing the orthodontic screw, the risk of root damage can be greatly reduced.
  • To this end, the orthodontic guide device may overlap and display a root image 340 representing the patient's roots on a tooth image of the patient's oral cavity captured in real time.
  • The root image 340 and/or the screw placement position guide 350 may be implemented in augmented reality on the tooth image taken in real time. In this way, the operator can easily check the position of the patient's roots and prevent root damage during orthodontic screw placement.
  • A CT image may be used to obtain the root image for augmented reality.
  • For example, a method of extracting a 3D image of a tooth from the CT image and matching the crown portion of the extracted 3D image with the crown portion in the tooth image of the oral cavity may be used.
  • In this way, the 3D image extracted for each tooth in the target region (teeth and gum) of the patient is matched to the corresponding tooth in the tooth image, and the root portion of the 3D image may be overlapped on the tooth image in augmented reality.
  • Based on the 3D image of the tooth, a screw placement position guide 350 that can prevent root damage may be generated according to the orthodontic planning and overlapped on the tooth image.
  • In addition, the bracket 360 may be used as a marker to implement the augmented reality.
  • That is, the accuracy of tooth recognition in the matching step can be improved by using the bracket 360 in the tooth image as a marker.
  • FIG. 3E is a diagram for describing a process of identifying an object coordinate system, according to an exemplary embodiment.
  • In one embodiment, markers may be placed at the vertices of an equilateral triangle to reduce the amount of computation while remaining accurate.
  • Furthermore, the accuracy of image recognition may be improved by placing two or more such equilateral-triangle marker sets.
  • The material used as a marker may be one having radiopaque or radiocontrast characteristics.
  • For example, these points may be made of a radiopaque material previously placed by a medical person on the patient's gum or palate.
  • As a marker, gutta-percha, a pin, a tattoo, or the like may be used.
  • Gutta-percha is a common dental material and can be used as a marker according to embodiments.
  • A pin with radiopaque material disposed on its head portion may also be used, and where it is difficult to insert a physical object into the patient's gums, a point previously tattooed by a medical person on a gum or tooth may be used instead.
  • Various similar embodiments may exist, and the listed embodiments are illustrative rather than limiting.
  • Let C denote the position vector of the camera providing the tooth image for augmented reality, and let u1, u2, and u3 denote the unit direction vectors from C toward the positions of the first, second, and third markers.
  • The position vectors p1, p2, and p3 of the markers are then expressed as the following equations: p1 = C + r·u1, p2 = C + s·u2, p3 = C + t·u3, where the scalar values r, s, and t are the distances between the camera position and the respective markers. Of course, in the example where there is a single marker, these vectors may instead correspond to a plurality of points within that marker.
  • When one or more of the principal vectors above are computed and compared with the corresponding vectors in the reference coordinate system in which the original 3D data was constructed, a transformation T between the coordinate systems can be obtained using vector operations. T converts the object coordinate system to the reference coordinate system, and vice versa; its inverse T⁻¹ may be used to transform the planning guide defined in the reference coordinate system into the real object coordinate system.
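  • As a sketch of the equations above: each marker position follows from the camera position plus a measured distance along a viewing direction, and three markers suffice to define an orthonormal object frame. All numeric values and function names here are hypothetical illustrations, not part of the disclosure.

```python
import numpy as np

def marker_positions(cam, dirs, dists):
    """p_i = cam + dist_i * unit(dir_i) for each marker i."""
    cam = np.asarray(cam, float)
    out = []
    for d, r in zip(dirs, dists):
        u = np.asarray(d, float)
        u = u / np.linalg.norm(u)   # normalize the viewing direction
        out.append(cam + r * u)
    return np.array(out)

def object_frame(p1, p2, p3):
    """Orthonormal frame (origin at p1) spanned by the marker triangle."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    x = p2 - p1
    x = x / np.linalg.norm(x)
    n = np.cross(p2 - p1, p3 - p1)   # triangle normal
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)
    return p1, np.stack([x, y, z], axis=1)  # origin, rotation matrix
```

  • Comparing this frame with the frame the same three markers define in the reference 3D data yields the transformation T (and its inverse) described above.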
  • FIG. 4 illustrates a light polymerizer control apparatus 400 according to an embodiment.
  • The photopolymerizer control apparatus 400 may determine whether the bracket is properly positioned on the tooth based on the orthodontic planning.
  • When the determination shows that the bracket is properly positioned, the light unit may be controlled to irradiate ultraviolet rays.
  • The photopolymerizer control apparatus 400 may include a processor 410 and a controller 420.
  • The processor 410 determines activation of the photopolymerizer based on the position of the bracket with respect to the tooth. That is, the processor 410 may determine activation of the photopolymerizer when the position of the bracket with respect to the tooth meets a predetermined criterion. For example, a planned position may be predetermined on the tooth according to the orthodontic plan, and the processor 410 may determine whether the criterion is met by determining whether the bracket is properly located at that planned position.
  • Specifically, the processor 410 may determine that the predetermined criterion is met when the planned position and the bracket position overlap by at least a threshold value, or when the planned position set on the tooth and the bracket position have an error less than or equal to a threshold value.
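  • The two criterion formulations above (overlap at or above a threshold, or positional error at or below a threshold) can be sketched as a simple predicate. The threshold values below are assumptions for illustration only, not values from the disclosure.

```python
def should_activate(overlap_pct: float, error_mm: float,
                    min_overlap: float = 95.0, max_error: float = 0.5) -> bool:
    """Activation criterion: sufficient overlap OR sufficiently small error.

    overlap_pct: measured overlap between bracket and planned position (%).
    error_mm: measured positional error of the bracket (mm, hypothetical unit).
    """
    return overlap_pct >= min_overlap or error_mm <= max_error
```

  • In a real device the thresholds would come from the orthodontic plan or device configuration rather than constants.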
  • When the photopolymerizer is activated, the controller 420 controls it to irradiate the bracket with ultraviolet rays in response to a control signal.
  • At this time, the controller 420 may control the photopolymerizer to irradiate ultraviolet rays by controlling at least one of the irradiation amount and the irradiation timing according to the control signal.
  • The control signal may be generated by the switching module 430, which is connected to the photopolymerizer by wire or wirelessly.
  • For example, generation of the control signal may be controlled by the switching module 430 being switched on and off by a foot switch.
  • As another example, the control signal may be generated through the switching module 430 in response to a predetermined voice signal.
  • FIG. 5 is a flowchart illustrating a method of operating a dental orthodontic guide device according to an embodiment.
  • A method of operating the orthodontic guide device (the orthodontic guide method) according to an embodiment generates matching information between the tooth image and guide information that includes the bonding position of the bracket according to the orthodontic planning (step 501).
  • Here, the orthodontic guide method may identify a plurality of points associated with a tooth in the input image and generate the positions of the identified points as the matching information.
  • Feature points included in at least one of the gum and teeth may be used as these points.
  • In addition, at least one of the plurality of points may correspond to a radiopaque material disposed on the gum or tooth, or to at least one of a tattoo point pre-colored on the gum or tooth and a color tape attached thereto.
  • Next, the orthodontic guide method overlaps the guide information on the tooth image based on the generated matching information (step 502), and then outputs the matching progress between the bracket and the guide information (step 503). For example, the method displays the overlap rate between the bracket and the guided bonding position as a percentage; the operator can check the displayed rate and use it to position the bracket, improving bonding accuracy.
  • FIG. 6 is a flowchart illustrating a method of operating a light polymerizer control apparatus according to an embodiment.
  • A method of operating the photopolymerizer control apparatus (the photopolymerizer control method) according to an embodiment checks the position of the bracket with respect to the tooth (step 601).
  • A planned position may be predetermined on the tooth according to the orthodontic plan, and the method may monitor in real time that planned position and the relative position of the tracked bracket.
  • Next, the method determines whether the bracket position identified in step 601 meets a predetermined criterion (step 602).
  • For example, it may determine whether the criterion is met by checking whether the planned position on the tooth and the bracket position overlap by at least a threshold value, or whether they have an error less than or equal to a threshold value.
  • If, as a result of the determination in step 602, the bracket position with respect to the identified tooth meets the predetermined criterion, the method activates the photopolymerizer (step 603).
  • That is, the method may activate the photopolymerizer when the planned position and the bracket position overlap by at least the threshold value, or when their error is at or below the threshold value.
  • If, as a result of the determination in step 602, the bracket position does not meet the predetermined criterion, the method branches back to step 601 and checks the bracket position again.
  • That is, the method branches back to step 601 when the planned position and the bracket position overlap by less than the threshold value, or when their error exceeds the threshold value.
  • Once activated, the photopolymerizer can irradiate ultraviolet rays.
  • When the photopolymerizer is activated, the method receives a control signal from the switching module (step 604) and controls the photopolymerizer to irradiate the bracket with ultraviolet rays according to the received control signal (step 605).
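  • The flow of steps 601-605 above can be sketched as a minimal control loop. Here `read_position`, `switch_pressed`, and `irradiate` stand in for the tracking, switching-module, and light-unit interfaces, which the source does not specify; they are hypothetical names.

```python
def photopolymerizer_loop(read_position, switch_pressed, irradiate,
                          criterion, max_iters: int = 1000) -> bool:
    """Loop until the bracket meets the criterion, then fire on the switch signal."""
    for _ in range(max_iters):
        error = read_position()        # step 601: bracket position vs. plan
        if not criterion(error):       # step 602: predetermined criterion
            continue                   # criterion not met: branch back to 601
        if switch_pressed():           # step 604: control signal received
            irradiate()                # step 605: light the bracket
            return True
    return False                       # gave up without firing
```

  • A real controller would also bound the irradiation amount and timing as described for the controller 420; this sketch only captures the branching structure of the flowchart.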
  • The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components.
  • For example, the devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, controller, arithmetic logic unit (ALU), digital signal processor, microcomputer, field programmable gate array (FPGA), programmable logic unit (PLU), microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to the execution of the software.
  • The processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing device may include a plurality of processors or one processor and one controller.
  • other processing configurations are possible, such as parallel processors.
  • the software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device independently or collectively.
  • software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, in order to be interpreted by the processing device or to provide instructions or data to the processing device.
  • the software may be distributed over networked computer systems so that it may be stored and executed in a distributed manner.
  • Software and data may be stored on one or more computer readable recording media.
  • the method according to the embodiment may be embodied in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the media may be those specially designed and constructed for the purposes of the embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape, optical media such as CD-ROMs and DVDs, and magneto-optical media such as floptical disks.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
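The branching control flow of steps 601–605 described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the function names, the overlap measure, and the threshold value are all hypothetical assumptions.

```python
# Hypothetical sketch of the photopolymerizer control flow (steps 601-605).
# The overlap measure and 0.9 threshold are illustrative assumptions.

OVERLAP_THRESHOLD = 0.9  # assumed minimum overlap between set position and bracket

def position_meets_criterion(overlap):
    """Step 602: the criterion fails when overlap is below the threshold."""
    return overlap >= OVERLAP_THRESHOLD

def control_photopolymerizer(overlap_readings):
    """Repeat steps 601-602 until the bracket sits correctly, then run 604-605."""
    actions = []
    for overlap in overlap_readings:
        actions.append("601: check bracket position")
        if not position_meets_criterion(overlap):
            continue  # branch back to step 601 on the next reading
        actions.append("604: receive control signal from switching module")
        actions.append("605: irradiate bracket with UV")
        break
    return actions

print(control_photopolymerizer([0.5, 0.95]))
```

With readings `[0.5, 0.95]`, the first check fails the criterion and the loop branches back to step 601; the second check passes, so the switching module's control signal is received and the bracket is irradiated.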

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Engineering & Computer Science (AREA)
  • Dentistry (AREA)
  • Epidemiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)

Abstract

The present invention relates to an orthodontic guide device, and more particularly to a device for providing a health care provider with visual guidance and feedback during the process of bonding brackets to teeth. According to one aspect, the orthodontic guide device comprises: a processor that generates matching information between a tooth image and a guide containing bracket positions according to an orthodontic plan; and a control unit that overlays the guide on the tooth image on the basis of the generated matching information.
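The two components named in the abstract can be illustrated with a minimal sketch, assuming a simple translation-only matching model between the tooth image and the guide; every name and value here is hypothetical, not the patent's actual method.

```python
# Hypothetical sketch of the abstract's two components: a "processor" that
# generates matching information between a tooth image and a bracket guide,
# and a "control unit" that overlays the guide using that information.
# The translation-only (dx, dy) model is an illustrative assumption.

def generate_matching_info(tooth_landmark, guide_landmark):
    """Match the guide to the tooth image with a simple 2D offset."""
    dx = tooth_landmark[0] - guide_landmark[0]
    dy = tooth_landmark[1] - guide_landmark[1]
    return (dx, dy)

def overlay_guide(bracket_positions, matching_info):
    """Map each planned bracket position into tooth-image coordinates."""
    dx, dy = matching_info
    return [(x + dx, y + dy) for x, y in bracket_positions]

info = generate_matching_info((120, 80), (100, 60))
print(overlay_guide([(10, 20), (30, 40)], info))  # -> [(30, 40), (50, 60)]
```

A real system would derive the matching information from image registration rather than a single landmark pair; the sketch only shows the generate-then-overlay division of labor.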
PCT/KR2016/006113 2015-06-09 2016-06-09 Orthodontic guide device and method WO2016200167A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020177035025A KR102067089B1 (ko) 2015-06-09 2016-06-09 Orthodontic guide device and method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2015-0081373 2015-06-09
KR20150081373 2015-06-09
KR1020150162406A KR20160144891A (ko) 2015-06-09 2015-11-19 Orthodontic guide device and method
KR10-2015-0162406 2015-11-19

Publications (1)

Publication Number Publication Date
WO2016200167A1 true WO2016200167A1 (fr) 2016-12-15

Family

ID=57504209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/006113 WO2016200167A1 (fr) 2015-06-09 2016-06-09 Orthodontic guide device and method

Country Status (1)

Country Link
WO (1) WO2016200167A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018112427A1 (fr) * 2016-12-16 2018-06-21 Align Technology, Inc. Augmented reality planning and viewing of dental treatment outcomes
WO2018164351A1 (fr) * 2017-03-10 2018-09-13 박태현 Apparatus and method for performing advertisement based on a temporary tattoo
WO2019031658A1 (fr) * 2017-08-11 2019-02-14 서울대학교병원 Orthodontic guide system and orthodontic guiding method using same
US10695150B2 (en) 2016-12-16 2020-06-30 Align Technology, Inc. Augmented reality enhancements for intraoral scanning
US20220233287A1 (en) * 2016-06-20 2022-07-28 Carestream Dental Llc Dental restoration assessment using virtual model

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7147464B2 (en) * 1997-09-30 2006-12-12 Cadent Ltd Placing an orthodontic element on a tooth surface
US7245753B2 (en) * 2003-06-26 2007-07-17 Carestream Health, Inc. Method for determining dental alignment using radiographs
US7993133B2 (en) * 2006-10-20 2011-08-09 3M Innovative Properties Company Digital orthodontic treatment planning
US8733928B1 (en) * 2005-10-07 2014-05-27 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear
KR20140111902A (ko) * 2013-03-12 2014-09-22 주식회사 이우소프트 Method for establishing a dental treatment plan

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220233287A1 (en) * 2016-06-20 2022-07-28 Carestream Dental Llc Dental restoration assessment using virtual model
US10733805B2 (en) 2016-12-16 2020-08-04 Align Technology, Inc. Augmented reality planning and viewing of dental treatment outcomes
US11317999B2 (en) 2016-12-16 2022-05-03 Align Technology, Inc. Augmented reality enhancements for dental practitioners
WO2018112427A1 (fr) * 2016-12-16 2018-06-21 Align Technology, Inc. Augmented reality planning and viewing of dental treatment outcomes
US11051914B2 (en) 2016-12-16 2021-07-06 Align Technology, Inc. Augmented reality enhancements for dental practitioners
US10997792B2 (en) 2016-12-16 2021-05-04 Align Technology, Inc. Kiosk for viewing of dental treatment outcomes
US10888399B2 (en) 2016-12-16 2021-01-12 Align Technology, Inc. Augmented reality enhancements for dental practitioners
US10467815B2 (en) 2016-12-16 2019-11-05 Align Technology, Inc. Augmented reality planning and viewing of dental treatment outcomes
US10695150B2 (en) 2016-12-16 2020-06-30 Align Technology, Inc. Augmented reality enhancements for intraoral scanning
KR101947258B1 (ko) * 2017-03-10 2019-02-12 박태현 Apparatus and method for performing advertisement based on a temporary tattoo
KR20180103540A (ko) * 2017-03-10 2018-09-19 박태현 Apparatus and method for performing advertisement based on a temporary tattoo
WO2018164351A1 (fr) * 2017-03-10 2018-09-13 박태현 Apparatus and method for performing advertisement based on a temporary tattoo
KR102024988B1 (ko) * 2017-08-11 2019-09-24 서울대학교병원 Orthodontic guide system and orthodontic guiding method using same
KR20190017377A (ko) * 2017-08-11 2019-02-20 서울대학교병원 Orthodontic guide system and orthodontic guiding method using same
WO2019031658A1 (fr) * 2017-08-11 2019-02-14 서울대학교병원 Orthodontic guide system and orthodontic guiding method using same

Similar Documents

Publication Publication Date Title
KR102067089B1 (ko) Orthodontic guide device and method
WO2016200167A1 (fr) Orthodontic guide device and method
WO2016200177A1 (fr) Drilling guide device and method
US20200214803A1 (en) System and method for digital tooth imaging
WO2018143497A1 (fr) Implant surgery guide method
JP6458042B2 (ja) Navigation system for dental and intraoral surgery and operating method thereof
WO2021145544A1 (fr) Method and apparatus for generating an orthodontic tooth arrangement shape
WO2016108452A1 (fr) Dental implant plan guide method, apparatus, and recording medium therefor
WO2019112301A2 (fr) Apparatus and method for designating an implant placement region
US20210306599A1 (en) Dental and medical loupe system for lighting control, streaming, and augmented reality assisted procedures
WO2021157966A1 (fr) Method for providing orthodontic information using a deep learning artificial intelligence algorithm, and device using same
WO2019141106A1 (fr) Method and apparatus for intelligent augmented-reality-assisted dental beautification based on a C/S architecture
WO2020184876A1 (fr) Method for establishing an implant surgery plan using artificial intelligence, and medical image processing device
US9936166B2 (en) Method for planning a dental treatment
US20180299675A1 (en) Eyeglasses having at least one partially transparent screen, and method for operation of eyeglasses
WO2014073818A1 (fr) Implant image creation method and implant image creation system
WO2017171295A1 (fr) Augmented reality system in which an estimate of the patient's cheek movement is reflected, and augmented reality providing method therefor
KR20170091847A (ko) Implant surgery guide method
WO2013067606A2 (fr) Method for producing a dental element or aid for adjusting the teeth, face-bow and articulator
WO2021137573A2 (fr) Margin line adjustment method and apparatus
WO2021006471A1 (fr) Implant surgery planning method through automatic placement of an implant structure, user interface providing method therefor, and dental image processing device therefor
Ströbele et al. Augmented reality in orthodontics for bracket placement using conventional mobile devices
JP2022545767A (ja) System and method for guiding dental treatment using mixed reality
WO2023229135A1 (fr) Marker integrally formed with oral stent
WO2019031658A1 (fr) Orthodontic guide system and orthodontic guiding method using same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16807807

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20177035025

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16807807

Country of ref document: EP

Kind code of ref document: A1