WO2023003383A1 - Method and apparatus for adjusting the scan depth of a 3D scanner
- Publication number: WO2023003383A1
- Application: PCT/KR2022/010687
- Authority: WO (WIPO/PCT)
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
- A61B1/00172—Optical arrangements with means for scanning
- A61B1/00194—Optical arrangements adapted for three-dimensional imaging
- A61B1/24—Instruments for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/0046—Data acquisition means or methods
- A61C9/0053—Optical means or methods, e.g. scanning the teeth by a laser or light beam
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/0012—Biomedical image inspection
- G06T2207/30036—Dental; Teeth
Definitions
- the present disclosure relates to a method and apparatus for adjusting a scan depth of a 3D scanner. Specifically, the present disclosure relates to a method and apparatus for automatically adjusting a scan depth of a scanner by analyzing data scanned through a 3D scanner.
- the 3D scanner may scan an object existing within a certain distance from the part of the scanner that scans the object (e.g., the probe tip of the scanner). In the present disclosure, this scannable distance is referred to as “scan depth”.
- if a user of the 3D scanner (e.g., a dentist) sets the scan depth deep, that is, sets the scan depth value large, the scannable area becomes larger, so more data can be obtained through scanning, but the possibility of generating noise data may increase. This is because, within the scanned area (e.g., the inside of the oral cavity), the possibility that something other than the measurement object (e.g., a user's finger or a treatment tool) is also scanned increases.
- if the scan depth is set shallow, that is, if the scan depth value is set small, the possibility of noise data is reduced, but less data is acquired, making it difficult to create a complete 3D image, and the scanner must be brought closer to the object, which is inconvenient. Therefore, it is necessary to determine an appropriate scan depth according to the situation.
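The trade-off above can be made concrete with a toy depth filter. This is a sketch under assumptions not stated in the disclosure: points are (x, y, z) tuples whose z coordinate is the distance from the scanner tip in millimetres, and the depth values are illustrative.

```python
def filter_by_scan_depth(points, scan_depth_mm):
    # Keep only points whose distance from the scanner tip (taken here
    # as the z coordinate, in mm) falls within the scan depth.
    return [p for p in points if 0.0 <= p[2] <= scan_depth_mm]

points = [(0, 0, 5.0), (1, 2, 12.5), (3, 1, 21.0)]
print(len(filter_by_scan_depth(points, 20.0)))  # 2: more data, more noise risk
print(len(filter_by_scan_depth(points, 10.0)))  # 1: less noise, less data
```

A deeper setting admits more of the scene (including potential noise sources), while a shallower one keeps only nearby geometry.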
- conventionally, in order to change the scan depth of a scanner, a user must manually adjust the scan depth through a user interface displayed on a display, using a mouse or a touch screen.
- if the user directly touches a mouse or a display to adjust the scan depth during a dental procedure, for example, there is a possibility of contaminating the user's hand, which may then enter the patient's oral cavity.
- the present disclosure solves the above problems of the prior art: by adjusting the scan depth of the intraoral scanner according to predetermined conditions, contamination that may occur when the user directly manipulates an input device is prevented, and at the same time convenient and rapid scanning is made possible.
- in addition, the distance between the scanner and the object can be adjusted naturally, while parts that should not be scanned (e.g., oral soft tissue) can be automatically filtered out.
- a method and apparatus for adjusting a scan depth of a 3D scanner may be proposed.
- a scan image processing method according to one aspect of the present disclosure is performed by one or more processors and comprises: acquiring, through the 3D scanner, scan data for a measurement object within a preset scan depth of the 3D scanner; determining whether the scan data is abnormal based on a scan image obtained from the 3D scanner; and adjusting the scan depth of the 3D scanner when it is determined that there is an abnormality in the scan data.
- An electronic device for adjusting a scan depth of a 3D scanner according to one aspect of the present disclosure includes a communication circuit communicatively connected to the 3D scanner; a display; and one or more processors, wherein the one or more processors may be configured to: obtain, through the 3D scanner, scan data for a measurement object within a preset scan depth of the 3D scanner; determine whether the scan data is abnormal based on a scan image obtained from the 3D scanner; and adjust the scan depth of the 3D scanner when it is determined that there is an abnormality in the scan data.
- a non-transitory computer readable recording medium recording instructions for controlling a scan depth of a 3D scanner may be proposed.
- a non-transitory computer-readable recording medium according to one aspect of the present disclosure records instructions that, when executed by one or more processors, cause the one or more processors to: obtain, through the 3D scanner, scan data for a measurement object within a preset scan depth of the 3D scanner; determine whether the scan data is abnormal based on a scan image obtained from the 3D scanner; and adjust the scan depth of the 3D scanner when it is determined that there is an abnormality in the scan data.
- according to the scan depth adjusting method and apparatus of the present disclosure, contamination that may occur when a user directly manipulates an input device is prevented by automatically adjusting the scan depth of the 3D scanner based on scan data, while convenient and quick scanning is enabled.
- according to the scan depth adjusting method and apparatus of the present disclosure, when a non-target part is being scanned, the user is notified of this and the scan depth is reduced, preventing unnecessary scanning and data processing.
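The behaviour summarised above (detect abnormal scan data, then reduce the scan depth) can be sketched as one step of a simple control loop. The abnormality threshold, step size, and depth bounds below are illustrative assumptions, not values taken from the disclosure.

```python
def adjust_scan_depth(depth_mm, abnormal_ratio,
                      threshold=0.3, step_mm=1.0,
                      min_mm=5.0, max_mm=25.0):
    # If too large a fraction of the acquired scan data is judged
    # abnormal (e.g. soft tissue or a treatment tool in view), shrink
    # the scan depth; otherwise relax it back toward the maximum.
    if abnormal_ratio > threshold:
        return max(min_mm, depth_mm - step_mm)
    return min(max_mm, depth_mm + step_mm)

print(adjust_scan_depth(20.0, 0.5))  # 19.0: abnormal data dominates, shrink
print(adjust_scan_depth(20.0, 0.1))  # 21.0: scan looks clean, relax
```

Clamping at `min_mm` keeps the scanner usable even when abnormality persists, rather than collapsing the depth to zero.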
- FIG. 1 is a diagram illustrating how to obtain an image of a patient's oral cavity using a 3D scanner according to various embodiments of the present disclosure.
- FIG. 2A is a block diagram of an electronic device and a 3D scanner according to various embodiments of the present disclosure.
- FIG. 2B is a perspective view of a 3D scanner according to various embodiments of the present disclosure.
- FIG. 3 is a diagram illustrating a method of generating a 3D image of an oral cavity according to various embodiments of the present disclosure.
- FIG. 4 is a diagram illustrating a scan depth of a 3D scanner according to various embodiments of the present disclosure.
- FIG. 5 is a diagram illustrating a screen of a data acquisition program displayed on a display of an electronic device according to various embodiments of the present disclosure.
- FIGS. 6A to 6C are diagrams illustrating cases in which a scan depth value is increased or decreased.
- FIG. 7 is a flowchart of a method for automatically adjusting a scan depth value of a 3D scanner according to various embodiments of the present disclosure.
- FIG. 8 is a diagram showing that a portion used for generating 3D data and a portion not used for generating 3D data among images acquired by a 3D scanner are separately displayed according to various embodiments of the present disclosure.
- FIG. 9 is a flowchart of a method for automatically adjusting a scan depth in a specific scan mode according to various embodiments of the present disclosure.
- FIG. 10 is a diagram illustrating an image in which a target object and a non-target object are separated and masked in an obtained 2D image according to various embodiments of the present disclosure.
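The masking of target and non-target objects shown in FIG. 10 can be illustrated with a toy binary mask applied to a 2D image. The list-of-lists image format and the pixel values are assumptions made for the sketch, not the format used by the disclosure.

```python
def apply_mask(image, mask):
    # Keep pixels where the mask marks a target object (e.g. a tooth),
    # and zero out pixels marked as non-target (e.g. soft tissue).
    return [[px if keep else 0 for px, keep in zip(row, mrow)]
            for row, mrow in zip(image, mask)]

image = [[5, 7], [9, 3]]
mask = [[1, 0], [0, 1]]   # 1 = target pixel, 0 = non-target pixel
print(apply_mask(image, mask))  # [[5, 0], [0, 3]]
```

In practice the mask would come from a segmentation model rather than being hand-written, but the filtering step itself is this simple elementwise selection.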
- Embodiments of the present disclosure are illustrated for the purpose of explaining the technical idea of the present disclosure.
- the scope of rights according to the present disclosure is not limited to the specific description of the embodiments or these embodiments presented below.
- the term “unit” used in the present disclosure means a software or hardware component such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). However, a “unit” is not limited to software or hardware.
- a “unit” may be configured to reside in an addressable storage medium and may be configured to reproduce on one or more processors.
- a “unit” refers to components such as software components, object-oriented software components, class components, and task components, as well as processors, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables. Functions provided within components and “units” may be combined into fewer components and “units” or further separated into additional components and “units”.
- the expression “based on” is used to describe one or more factors that affect the act of deciding or judging described in the phrase or sentence in which the expression appears, and does not preclude additional factors that may affect that decision or action.
- artificial intelligence means a technology that imitates human learning ability, reasoning ability, and perception ability, and implements them with a computer, and may include concepts of machine learning and symbolic logic.
- Machine learning may be an algorithm technology that classifies or learns features of input data by itself.
- Artificial intelligence technology may use machine learning algorithms to analyze input data, learn from the results of the analysis, and make judgments or predictions based on what has been learned.
- technologies that use machine learning algorithms to mimic the cognitive and judgmental functions of the human brain can also be understood as falling within the category of artificial intelligence. For example, the technical fields of linguistic understanding, visual understanding, inference/prediction, knowledge representation, and motion control may be included.
- machine learning may refer to a process of training a neural network model using experience of processing data; in other words, it may mean that computer software improves its own data processing capability.
- a neural network model is constructed by modeling a correlation between data, and the correlation may be expressed by a plurality of parameters.
- the neural network model derives a correlation between data by extracting and analyzing features from given data, and optimizing the parameters of the neural network model by repeating this process can be referred to as machine learning.
- a neural network model may learn a mapping (correlation) between an input and an output with respect to data given as an input/output pair.
- the neural network model may learn the relationship by deriving a regularity between given data.
- an artificial intelligence learning model, a machine learning model, or a neural network model may be designed to implement a human brain structure on a computer, and may include a plurality of weighted network nodes that simulate the neurons of a human neural network.
- a plurality of network nodes may have a connection relationship between them by simulating synaptic activities of neurons that transmit and receive signals through synapses.
- a plurality of network nodes can send and receive data according to a convolutional connection relationship while being located in layers of different depths.
- the artificial intelligence learning model may be, for example, an artificial neural network model or a convolutional neural network (CNN) model.
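The weighted-node structure described above can be sketched as a single simulated neuron. The sigmoid activation and the particular weights are illustrative assumptions for the sketch, not details taken from the disclosure.

```python
import math

def neuron(inputs, weights, bias):
    # One network node: the weights play the role of synaptic strengths,
    # and the sigmoid squashes the weighted sum into an activation level.
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With zero weights and bias the node is indifferent to its inputs.
print(neuron([1.0, 0.5], [0.0, 0.0], 0.0))  # 0.5
```

Training, as described above, amounts to repeatedly adjusting `weights` and `bias` so that the node's outputs match the correlations present in the data.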
- FIG. 1 is a diagram illustrating how an image of a patient's oral cavity is acquired using a 3D scanner 200 according to various embodiments of the present disclosure.
- the 3D scanner 200 may be a dental medical device for obtaining an image of the oral cavity of the object 20 .
- the 3D scanner 200 may be an intraoral scanner.
- a user 10 (e.g., a dentist or a dental hygienist) may acquire an image of the oral cavity of an object 20 (e.g., a patient) using the 3D scanner 200.
- the user 10 may also obtain an image of the mouth of the object 20 from a diagnostic model (e.g., a plaster model or an impression model) imitating the shape of the mouth of the object 20.
- hereinafter, it is described that an image of the oral cavity of the object 20 is acquired by scanning the oral cavity of the object 20, but the disclosure is not limited thereto; it is also possible to obtain an image of another part of the object 20 (e.g., the ear of the object 20).
- the 3D scanner 200 may have a shape capable of being drawn in and out of the oral cavity, and may be a handheld scanner in which the user 10 can freely adjust a scanning distance and a scanning angle.
- the 3D scanner 200 may acquire an image of the oral cavity by being inserted into the oral cavity of the object 20 and scanning the oral cavity in a non-contact manner.
- the image of the oral cavity may include at least one tooth, a gingiva, and an artificial structure insertable into the oral cavity (eg, an orthodontic device including a bracket and a wire, an implant, a denture, and an orthodontic aid inserted into the oral cavity).
- the 3D scanner 200 may irradiate light onto the oral cavity of the object 20 (e.g., at least one tooth or the gingiva of the object 20) using a light source (or projector), and may receive light reflected from the oral cavity of the object 20 through a camera (or at least one image sensor).
- the 3D scanner 200 may obtain an image of the oral cavity diagnostic model by scanning the oral cavity diagnostic model.
- since the diagnostic model of the oral cavity imitates the shape of the oral cavity of the object 20, the image of the oral diagnostic model may serve as an image of the oral cavity of the object.
- hereinafter, the case in which an image of the oral cavity is obtained by scanning the inside of the oral cavity of the object 20 is assumed, but the disclosure is not limited thereto.
- the 3D scanner 200 may obtain a surface image of the oral cavity of the object 20 as a 2D image based on information received through a camera.
- the surface image of the oral cavity of the object 20 may include at least one of at least one tooth, gingiva, artificial structure, cheek, tongue, or lip of the object 20 .
- the surface image of the oral cavity of the object 20 may be a two-dimensional image.
- the 2D image of the oral cavity obtained by the 3D scanner 200 may be transmitted to the electronic device 100 connected through a wired or wireless communication network.
- the electronic device 100 may be a computer device or a portable communication device.
- the electronic device 100 may generate a 3D image of the oral cavity (or a 3D oral image, or a 3D oral model) representing the oral cavity in three dimensions, based on the 2D image of the oral cavity received from the 3D scanner 200.
- the electronic device 100 may generate a 3D image of the oral cavity by 3D modeling the internal structure of the oral cavity based on the received 2D image of the oral cavity.
- according to another embodiment, the 3D scanner 200 may scan the oral cavity of the object 20 to acquire a 2D image of the oral cavity, generate a 3D image of the oral cavity based on the acquired 2D image, and transmit the generated 3D image of the oral cavity to the electronic device 100.
- the electronic device 100 may be communicatively connected to a cloud server (not shown).
- the electronic device 100 may transmit a 2D image or a 3D image of the oral cavity of the object 20 to the cloud server, and the cloud server may store the image of the object 20 received from the electronic device 100.
- a table scanner (not shown) fixed to a specific position may be used as the 3D scanner in addition to a handheld scanner inserted into the oral cavity of the object 20 for use.
- the table scanner may generate a three-dimensional image of the oral cavity diagnostic model by scanning the oral cavity diagnostic model.
- since the light source (or projector) and the camera of the table scanner are fixed, the user can scan the oral diagnostic model while moving the arm to which the diagnostic model is fixed.
- table scanners are less likely to produce noise caused by other objects intervening between the camera and the diagnostic model during the scanning operation; nevertheless, the embodiments of the present disclosure can be applied not only to handheld scanners but also to table scanners and other 3D scanners.
- FIG. 2A is a block diagram of an electronic device 100 and a 3D scanner 200 according to various embodiments of the present disclosure.
- the electronic device 100 and the 3D scanner 200 may be communicatively connected to each other through a wired or wireless communication network, and may transmit and receive various data to each other.
- the 3D scanner 200 may include a processor 201, a memory 202, a communication circuit 203, a light source 204, a camera 205, an input device 206, and/or a sensor module 207. At least one of the components included in the 3D scanner 200 may be omitted, or another component may be added to the 3D scanner 200. Additionally or alternatively, some of the components may be integrated, or implemented as a singular or plural entity. At least some of the components in the 3D scanner 200 are connected to each other through a bus, general purpose input/output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI), and may transmit and receive data and/or signals.
- the processor 201 of the 3D scanner 200 is a component capable of performing calculations or data processing related to control and/or communication of each component of the 3D scanner 200, and may be operatively connected with the components of the 3D scanner 200.
- the processor 201 may load commands or data received from other components of the 3D scanner 200 into the memory 202, process the commands or data stored in the memory 202, and store resultant data.
- the memory 202 of the 3D scanner 200 may store instructions for the operation of the processor 201 described above.
- the communication circuit 203 of the 3D scanner 200 may establish a wired or wireless communication channel with an external device (eg, the electronic device 100) and transmit/receive various data with the external device.
- the communication circuit 203 may include at least one port connected to the external device through a wired cable in order to communicate with the external device by wire.
- the communication circuit 203 may perform communication with an external device connected by wire through at least one port.
- the communication circuit 203 may be configured to be connected to a cellular network (eg, 3G, LTE, 5G, Wibro or Wimax) by including a cellular communication module.
- the communication circuit 203 may communicate with an external device using a millimeter wave frequency band, for example, a communication band of 30 GHz to 300 GHz.
- the communication circuit 203 may communicate with an external device through a wireless network using a frequency band of 60 GHz.
- the communication circuit 203 may include a short-range communication module to transmit and receive data with an external device using short-range communication (e.g., Wi-Fi, Bluetooth, Bluetooth Low Energy (BLE), or UWB), but is not limited thereto.
- the communication circuit 203 may include a non-contact communication module for non-contact communication.
- Non-contact communication may include, for example, at least one non-contact type proximity communication technology such as near field communication (NFC) communication, radio frequency identification (RFID) communication, or magnetic secure transmission (MST) communication.
- the light source 204 of the 3D scanner 200 may radiate light toward the oral cavity of the object 20 .
- the light emitted from the light source 204 may be structured light having a predetermined pattern (e.g., a stripe pattern in which straight lines of different colors appear repeatedly).
- the structured light pattern may be generated using, for example, a pattern mask or a digital micro-mirror device (DMD), but is not limited thereto.
- the camera 205 of the 3D scanner 200 may obtain an image of the oral cavity of the object 20 by receiving reflected light reflected by the oral cavity of the object 20 .
- the camera 205 may include, for example, a left camera corresponding to the left eye field of view and a right camera corresponding to the right eye field of view in order to build a 3D image according to the optical triangulation method.
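In the rectified two-camera case, optical triangulation of this kind reduces to the textbook relation Z = f * B / d. The focal length, baseline, and disparity values below are purely illustrative and are not taken from the disclosure.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    # Rectified stereo: a point whose image positions in the left and
    # right cameras differ by `disparity_px` pixels, with cameras
    # separated by `baseline_mm`, lies at depth Z = f * B / d.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

print(depth_from_disparity(800.0, 10.0, 40.0))  # 200.0 (mm)
```

Nearby surfaces produce large disparities and therefore small depths, which is why the scan depth limit can be enforced directly on the triangulated Z values.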
- the camera 205 may include at least one image sensor such as a CCD sensor or a CMOS sensor.
- the input device 206 of the 3D scanner 200 may receive a user input for controlling the 3D scanner 200 .
- the input device 206 may include a button for receiving a push manipulation of the user 10, a touch panel for detecting a touch of the user 10, and a voice recognition device including a microphone.
- the user 10 may control starting or stopping scanning using the input device 206 .
- the sensor module 207 of the 3D scanner 200 may detect an operating state of the 3D scanner 200 or an external environmental state (e.g., a user's motion) and generate an electrical signal corresponding to the detected state.
- the sensor module 207 may include, for example, at least one of a gyro sensor, an acceleration sensor, a gesture sensor, a proximity sensor, or an infrared sensor.
- the user 10 may control starting or stopping scanning using the sensor module 207. For example, when the user 10 holds the 3D scanner 200 in his or her hand and moves it, the processor 201 may control the 3D scanner 200 to start a scanning operation if the angular velocity measured through the sensor module 207 exceeds a predetermined threshold value.
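The trigger condition described above amounts to a simple threshold test on the gyro reading. The threshold value and the degrees-per-second unit below are illustrative assumptions, not values from the disclosure.

```python
def should_start_scan(angular_velocity_dps, threshold_dps=30.0):
    # Start the scanning operation only when the scanner is actually
    # being moved, i.e. when the angular velocity measured by the gyro
    # sensor exceeds the predetermined threshold.
    return angular_velocity_dps > threshold_dps

print(should_start_scan(45.0))  # True: the scanner is being moved
print(should_start_scan(5.0))   # False: the scanner is nearly still
```

A real implementation would typically also debounce the signal (e.g. require the threshold to be exceeded for several consecutive samples) so that a single jolt does not start a scan.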
- the 3D scanner 200 may start scanning upon receiving a user input for starting a scan through the input device 206 of the 3D scanner 200 or the input device 109 of the electronic device 100, or according to processing by the processor 201 of the 3D scanner 200 or the processor 101 of the electronic device 100.
- once scanning is started, the 3D scanner 200 may generate a 2D image of the oral cavity of the object 20 and transmit the 2D image of the oral cavity of the object 20 to the electronic device 100 in real time.
- the electronic device 100 may display the received 2D image of the oral cavity of the object 20 through the display.
- the electronic device 100 may generate (construct) a 3D image of the oral cavity of the object 20 based on the 2D image of the oral cavity of the object 20, and may display the generated 3D image of the oral cavity on the display.
- the electronic device 100 may display the 3D image being created through the display in real time.
- An electronic device 100 may include one or more processors 101, one or more memories 103, a communication circuit 105, a display 107, and/or an input device 109. At least one of the components included in the electronic device 100 may be omitted, or another component may be added to the electronic device 100. Additionally or alternatively, some of the components may be integrated, or implemented as a singular or plural entity. At least some of the components in the electronic device 100 are connected to each other through a bus, general purpose input/output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI), and may transmit and receive data and/or signals.
- the one or more processors 101 of the electronic device 100 may be configured to perform operations or data processing related to control and/or communication of each component (eg, the memory 103) of the electronic device 100.
- One or more processors 101 may be operatively connected to components of the electronic device 100 , for example.
- the one or more processors 101 may load commands or data received from other components of the electronic device 100 into the one or more memories 103, process the commands or data stored in the one or more memories 103, and store the resulting data.
- one or more memories 103 of the electronic device 100 may store instructions for the operation of one or more processors 101 .
- One or more memories 103 may store correlation models built according to machine learning algorithms.
- the one or more memories 103 may store data received from the 3D scanner 200 (eg, a 2D image of the oral cavity acquired through an oral cavity scan).
- the communication circuit 105 of the electronic device 100 may establish a wired or wireless communication channel with an external device (eg, the 3D scanner 200 or a cloud server) and transmit and receive various data to and from the external device.
- the communication circuit 105 may include at least one port connected to the external device through a wired cable in order to communicate with the external device through a wired connection.
- the communication circuit 105 may perform communication with an external device connected by wire through at least one port.
- the communication circuit 105 may be configured to be connected to a cellular network (eg, 3G, LTE, 5G, Wibro or Wimax) by including a cellular communication module.
- the communication circuit 105 may include a short-range communication module to transmit/receive data with an external device using short-range communication (eg, Wi-Fi, Bluetooth, Bluetooth Low Energy (BLE), UWB). However, it is not limited thereto.
- the communication circuitry 105 may include a contactless communication module for contactless communication.
- Non-contact communication may include, for example, at least one non-contact type proximity communication technology such as near field communication (NFC) communication, radio frequency identification (RFID) communication, or magnetic secure transmission (MST) communication.
- the display 107 of the electronic device 100 may display various screens based on the control of the processor 101 .
- the processor 101 may display, through the display 107, a 2D image of the oral cavity of the object 20 received from the 3D scanner 200 and/or a 3D image of the oral cavity in which the internal structure of the oral cavity is modeled in three dimensions. For example, the 2D image and/or 3D image of the oral cavity may be displayed through a specific application. In this case, the user 10 may edit, save, and delete the 2D image and/or 3D image of the oral cavity.
- the input device 109 of the electronic device 100 may receive, from the outside of the electronic device 100 (eg, the user), a command or data to be used by a component (eg, the one or more processors 101) of the electronic device 100.
- the input device 109 may include, for example, a microphone, mouse or keyboard.
- the input device 109 may be implemented in the form of a touch sensor panel capable of recognizing contact or proximity of various external objects by being combined with the display 107 .
- the 3D scanner 200 may include a main body 210 and a probe tip 220 .
- the body 210 of the 3D scanner 200 may be formed in a shape that is easy for the user 10 to grip and use.
- the probe tip 220 may be formed in a shape that facilitates insertion into and withdrawal from the oral cavity of the object 20 .
- the main body 210 may be combined with and separated from the probe tip 220 .
- components of the 3D scanner 200 described in FIG. 2A may be disposed inside the main body 210.
- An opening may be formed at one end of the main body 210 so that light output from the light source 204 may be irradiated to the object 20 .
- Light irradiated through the opening may be reflected by the target object 20 and introduced again through the opening. Reflected light introduced through the opening may be captured by a camera to generate an image of the object 20 .
- the user 10 may start scanning using the input device 206 (eg, a button) of the 3D scanner 200 . For example, when the user 10 touches or presses the input device 206 , light from the light source 204 may be radiated to the object 20 .
- 3 is a diagram illustrating a method of generating a 3D image 320 of an oral cavity according to various embodiments.
- the user 10 may scan the inside of the mouth of the object 20 while moving the 3D scanner 200.
- the 3D scanner 200 may obtain a plurality of 2D images 310 of the mouth of the object 20 by scanning.
- the 3D scanner 200 may obtain a 2D image of a region including the front teeth of the object 20 and a 2D image of a region including the molar teeth of the object 20 .
- the 3D scanner 200 may transmit the acquired plurality of 2D images 310 to the electronic device 100 .
- the user 10 may scan a diagnostic model of the oral cavity while moving the 3D scanner 200 and acquire a plurality of 2D images of the diagnostic model of the oral cavity.
- the electronic device 100 may transform each of the plurality of 2D images 310 of the oral cavity of the object 20 into a set of a plurality of points having 3D coordinate values.
- the electronic device 100 may convert each of the plurality of 2D images 310 into a point cloud, which is a set of data points having 3D coordinate values.
- a point cloud, which is a set of points having 3D coordinate values generated based on the plurality of 2D images 310, may be stored as raw data for the oral cavity of the object 20.
- the electronic device 100 may complete the entire tooth model by aligning a point cloud, which is a set of data points having 3D coordinate values.
- the electronic device 100 may reconstruct a 3D image of the oral cavity.
- the electronic device 100 may merge the point cloud sets stored as raw data, reconstruct a plurality of points using a Poisson algorithm, and transform the plurality of points into a closed 3D surface, thereby reconstructing the 3D image 320 of the oral cavity of the object 20.
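The pipeline above (2D images converted to point clouds, then merged as raw data) can be sketched in simplified form. The pinhole-model intrinsics, function names, and plain concatenation are assumptions for illustration only; an actual implementation would additionally align the clouds and apply Poisson surface reconstruction to obtain the closed 3D surface:

```python
def backproject(depth_px, fx=500.0, fy=500.0, cx=160.0, cy=120.0):
    """Convert a 2D depth image, given as a dict mapping pixel (u, v) to
    a depth z in mm, into a point cloud of (x, y, z) coordinates using a
    pinhole camera model. The intrinsics fx, fy, cx, cy are illustrative."""
    cloud = []
    for (u, v), z in depth_px.items():
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        cloud.append((x, y, z))
    return cloud

def merge_clouds(clouds):
    """Merge per-image point clouds into a single raw-data set; a real
    pipeline would align the clouds and then run Poisson surface
    reconstruction to produce the closed 3D surface."""
    merged = []
    for cloud in clouds:
        merged.extend(cloud)
    return merged
```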
- the 3D scanner 200 may acquire an image of the oral cavity by being inserted into the oral cavity of the object 20 and scanning the oral cavity in a non-contact manner. At this time, a scannable distance by the 3D scanner 200, that is, a distance at which data can be obtained by scanning is referred to as “scan depth”.
- FIG. 4 illustrates a scan depth of the 3D scanner 200 according to various embodiments of the present disclosure.
- a scan area 410 when the 3D scanner 200 having a scan depth of d scans the object 20 is shown.
- the scannable distance d from the camera portion of the 3D scanner 200, for example, from the probe tip 220, corresponds to the scan depth.
- a part 420 of the object 20 belongs to the scan area 410, and the surface 422 of the object belonging to the scan area 410 may be obtained as a 2D image.
- in one case, the scan depth refers to a distance set by software within the maximum distance at which the camera 205 of the 3D scanner 200 can acquire data. In this case, among the data actually acquired by the camera, only the data falling within the scan depth is used for subsequent data processing, and adjusting the scan depth may include adjusting, by software, the distance within which data is used for subsequent data processing.
- in another case, the scan depth refers to the distance that the camera 205 of the 3D scanner 200 can capture. In this case, all of the data acquired by the camera is used for subsequent data processing, and adjusting the scan depth includes adjusting the distance that the camera 205 can capture by adjusting the hardware of the camera 205 of the 3D scanner 200.
- the term scan depth means a distance at which a 3D scanner can obtain data, and may include both of the above cases. For example, when the scan depth value is 18.5 mm, data up to a distance of 18.5 mm from the scan portion (probe tip) of the 3D scanner is acquired for subsequent processing.
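Under the software interpretation of scan depth, keeping only data within the scan depth for subsequent processing can be sketched as follows; the function name and data layout are assumptions, and 18.5 mm is the example value mentioned above:

```python
# Example default from the text; "data up to a distance of 18.5 mm
# from the probe tip is acquired for subsequent processing".
DEFAULT_SCAN_DEPTH_MM = 18.5

def filter_by_scan_depth(samples, scan_depth_mm=DEFAULT_SCAN_DEPTH_MM):
    """Keep only samples whose distance from the probe tip falls within
    the scan depth. `samples` is a list of (point, distance_mm) pairs;
    this layout is an assumption for illustration."""
    return [point for point, dist in samples if dist <= scan_depth_mm]
```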
- the electronic device 100 may execute a data acquisition program by the processor 101 .
- the data acquisition program may be stored in the memory 103 as an application program.
- the electronic device 100 may display, on the display 107, the 2D image 310 received from the 3D scanner 200 and the 3D image generated based thereon.
- the electronic device 100 may display the images scanned by the 3D scanner 200 and provide a user interface for processing and editing them.
- the screen 500 of the data acquisition program includes user interfaces such as a data display area 510, a live view area 520, a model view area 530, a function box area 540, a function options area 550, and an icon display area 560.
- the above user interfaces are exemplary, and any additional user interfaces may be included in the screen 500 of the data acquisition program as needed.
- the data display area 510 is an area for displaying an image received from the intraoral scanner and a 3D image generated based thereon.
- the data display area 510 includes a live view area 520 and a model view area 530 .
- the data display area 510 includes a function options area 550 .
- the live view area 520 displays an image received from the 3D scanner 200.
- the electronic device 100 may display, in the live view area 520 in real time, a 2D image of the oral cavity currently being scanned by the 3D scanner 200.
- the live view area 520 can be moved and resized by a user, and can be separated from the screen 500 of the data acquisition program.
- the user may set the live view area 520 not to be displayed.
- the electronic device 100 may display, on the model view area 530, a 3D image model generated from the 2D images obtained from the 3D scanner 200.
- the function box area 540 includes a user interface for providing a function for modifying/editing or analyzing a displayed 3D image model and a user interface for displaying a device state.
- the function box area 540 includes a trimming function interface 542 for selecting and deleting unnecessary data obtained during the scanning process, a function tool interface 544 for editing or saving the generated 3D image, a treatment information interface 546 for displaying treatment information for each tooth, and a device status display interface 548 for displaying the device status of the 3D scanner 200.
- the electronic device 100 may display detailed options in the function option area 550 in response to a user input for selecting one function interface from the function box area 540 .
- the electronic device 100 may display, in the function option area 550, options for playing a recorded screen, for example, a play/pause button and an interface for controlling playback position and speed.
- the function options area 550 may include a scan filtering function option 552 .
- the scan filtering function includes a function of generating a scanned image by automatically removing unnecessary parts (eg, oral soft tissue) during scanning.
- when the no-filtering mode 552-1 of the scan filtering function option 552 is selected, unnecessary parts such as soft tissue are not removed from the scanned data, and an image of all the data is generated.
- when the tooth and gingiva mode 552-2 of the scan filtering function option 552 is selected, only the data corresponding to the teeth and the gingival portion adjacent to the teeth is retained, while the data corresponding to the soft tissue portion is removed.
- when the tooth mode 552-3 of the scan filtering function option 552 is selected, only the data corresponding to the teeth is retained and the remaining data is removed.
- whether the acquired data corresponds to teeth, gingiva, or soft tissue may be automatically determined using an AI learning model trained using intraoral images as learning data.
- when a function that does not require detailed options is selected in the function box area 540, the function option area 550 may not be displayed.
- the icon display area 560 is an area that provides functions for screen recording and capture, and may include a recording area setting icon 562 and a recording start/end icon 564 .
- a recording area setting icon 562 provides an interface for setting a screen area to be recorded.
- the recording start/stop icon 564 provides an interface for starting or ending recording of the screen 500 of the data acquisition program.
- the scan depth control bar 570 provides an interface for adjusting the scan depth of the 3D scanner 200 .
- the user can increase or decrease the scan depth of the 3D scanner.
- Upper and lower ends of the scan depth control bar 570 indicate the maximum and minimum values of the current depth range, and the position of the scan depth control button 571 may indicate the current scan depth value.
- a numerical value (eg, 0 mm) or percentage (eg, 0%) corresponding to the minimum scan depth may be displayed at the lower end of the scan depth control bar 570, and a numerical value (eg, 24 mm) or percentage (eg, 100%) corresponding to the maximum scan depth may be displayed at the upper end.
- the user may increase the scan depth value by moving the scan depth adjustment button 571 upward through the input device 109, and may decrease the scan depth value by moving the scan depth adjustment button 571 downward.
- when the scan depth value of the 3D scanner 200 is automatically adjusted, the scan depth adjustment bar 570 may be adjusted to reflect the automatically adjusted scan depth value. For example, the scan depth control button 571 of the scan depth control bar 570 may move in real time to the position corresponding to the automatically adjusted scan depth value.
- the data acquisition program 500 may further include a user interface capable of turning on/off the function of automatically adjusting the scan depth value of the 3D scanner 200 described below.
- through this user interface, the function of automatically adjusting the scan depth value of the 3D scanner 200 described below may be turned off, that is, released.
- the user interface of the data acquisition program 500 described above is just an example, and additional user interfaces and/or other types of user interfaces may be included in the data acquisition program.
- the user of the 3D scanner 200 may adjust the basic value of the scan depth or the scan depth range by changing the settings of the scanner as needed.
- the basic scan depth value of the 3D scanner is 18.5 mm, and the range of the scan depth value may be set from a minimum of 0 mm to a maximum of 12 to 24 mm, but is not limited thereto.
- a user may set an adjustable scan depth value range within a scan depth value range allowed by the 3D scanner 200 .
- the user may set the range of scan depth values adjustable through the user interface of the data acquisition program 500 to a narrower range, for example, 0 to 12 mm, 5 to 12 mm, 8.5 to 15 mm, or various other ranges. Numerical values or percentages corresponding to the maximum and minimum values of the set scan depth range may be displayed on the scan depth control bar 570 as described above.
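Restricting an adjusted value to the allowed (or user-configured) scan depth range amounts to a simple clamp; this sketch uses assumed names and the example limits given above:

```python
def clamp_scan_depth(value_mm: float, min_mm: float = 0.0,
                     max_mm: float = 24.0) -> float:
    """Clamp a requested scan depth to the range allowed by the scanner,
    or to a narrower range configured by the user (eg, 5 to 12 mm).
    The 0 to 24 mm defaults are the example limits from the text."""
    return max(min_mm, min(max_mm, value_mm))
```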
- the scan depth value can be adjusted through the user interface of the data acquisition program 500 .
- the user may adjust the scan depth value of the 3D scanner 200 by operating the scan depth adjustment button 571 through the input device 109, for example, a mouse or a touch panel.
- the scan depth value may be increased by moving the scan depth adjustment button 571 upward, and the scan depth value may be decreased by moving the scan depth adjustment button 571 downward.
- the scan depth value d1 may be a basic scan depth value basically used when the 3D scanner 200 operates.
- the basic scan depth value may be a predetermined setting value or may be a scan depth value last used in a previous scan of the 3D scanner 200 .
- the surface of the part 610 of the object 20 belonging to the scan depth d1 may be acquired as a 2D image.
- the scan depth value may be decreased by moving the scan depth adjustment button 571 of the data acquisition program 500 downward.
- the surface of the part 620 of the object 20 belonging to the scan depth d2 may be obtained as a 2D image. Due to the reduced scan depth value d2, the part 620 of the object belonging to the scan depth d2 may be smaller than the part 610 of the object belonging to the scan depth d1 of FIG. 6A.
- the scan depth value may be increased by moving the scan depth control button 571 of the data acquisition program 500 upward.
- the surface of the part 630 of the object 20 belonging to the scan depth d3 may be obtained as a 2D image. Due to the increased scan depth value d3, the part 630 of the object belonging to the scan depth d3 may be larger than the part 610 of the object belonging to the scan depth d1 of FIG. 6A.
- FIG. 7 is a flowchart of a method 700 for automatically adjusting a scan depth value of a 3D scanner 200 according to various embodiments of the present disclosure.
- At least part of the method according to the present disclosure may be a computer-implemented method, implemented, for example, by the electronic device 100.
- the method according to the present disclosure is not limited thereto and may be implemented by other devices, for example, the 3D scanner 200.
- although each step of the method according to the present disclosure is described in sequential order in the illustrated flowchart, the steps may, in addition to being performed sequentially, be performed in any order that may be combined in accordance with the present disclosure.
- the description according to this flowchart does not exclude changes or modifications to the method or algorithm, and does not imply that any step is necessary or desirable.
- at least some of the steps may be performed in parallel, iteratively or heuristically.
- at least some steps may be omitted or other steps may be added.
- in step 710, the scanning operation of the 3D scanner 200 is initiated. Initiating the scan operation may include initiating the operation of software associated with the 3D scanner 200.
- the 3D scanner 200 may perform a scan with a basic scan depth value.
- the basic scan depth value may be a predetermined setting value or may be a scan depth value last used in a previous scan of the 3D scanner 200 .
- in step 720, the object 20 is scanned using the 3D scanner 200.
- the object may include an oral cavity or a diagnostic model (eg, plaster, impression, etc.).
- in step 730, a scan image of the object is obtained from the measurement unit of the 3D scanner, for example, the probe tip 220 of the 3D scanner 200.
- the obtained scan image is a 2D image of the surface of the object existing within the scan depth.
- obtaining the scan image of the object includes selecting and acquiring, from the image data about the object acquired by the camera, only the image data that exists within the scan depth.
- the obtained 2D scan image may be transmitted to the electronic device 100 and displayed on the live view area 520 of the data acquisition program 500 .
- a portion used for generating 3D data and a portion not used for generating 3D data may be separately displayed in the live view area 520.
- FIG. 8 shows, among the images acquired by the 3D scanner 200 according to various embodiments, a part used for generating 3D data and a part not used for generating 3D data, as displayed in the live view area 800.
- the live view area 800 may correspond to the live view area 520 of the data acquisition program 500 .
- a portion 802 not used for generating 3D data is masked.
- a portion used for generating 3D data may be masked, or both a portion used for generating 3D data and a portion not used may be masked in a separate manner.
- a portion used for generating 3D data and a portion not used for generating 3D data may be classified according to various criteria.
- a portion used for generating 3D data corresponds to an area set as a target.
- the target object may be a region corresponding to a tooth or a region corresponding to a tooth and gingiva.
- the obtained scan image is a 2D image obtained by a 3D scanner regardless of the scan depth, and only scan images existing within the scan depth can be used as data obtained for generating a 3D image.
- all 2D images acquired by the 3D scanner are displayed in the live view area 800; only the data within the scan depth set by software is used as acquired data for generating the 3D image, and the area outside the scan depth may be masked and displayed as data not used for 3D image generation.
- in step 740, 3D data (a 3D image) of the object is generated based on the acquired data.
- an align step for connecting and aligning the generated 3D data is additionally performed.
- the generated 3D data may be transmitted to the display 107 of the electronic device 100 and displayed on the model view area 530 of the data acquisition program 500 .
- when the density of data (eg, 2D image data) obtained within the scan depth is less than a preset reference value (N%), the acquired scan data is not applied to 3D data generation. Accordingly, it is possible to prevent the occurrence of an "align issue" in which 3D data is not properly connected and aligned due to a scan area in which only a small amount of data is obtained.
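The density criterion above can be sketched as a per-frame check; the 30% threshold stands in for the unspecified reference value "N%", and the names and pixel-count criterion are assumptions for illustration:

```python
# Assumed stand-in for the preset reference value "N%" in the text.
DENSITY_THRESHOLD = 0.3

def use_for_3d(valid_pixels: int, total_pixels: int,
               threshold: float = DENSITY_THRESHOLD) -> bool:
    """Apply a frame to 3D data generation only when the fraction of
    valid depth pixels meets the reference density; sparse frames are
    dropped to help avoid the alignment ("align issue") failures
    described above."""
    if total_pixels == 0:
        return False
    return valid_pixels / total_pixels >= threshold
```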
- in step 750, it is determined whether or not the obtained data is abnormal based on the generated 3D data.
- when the 3D data does not include information about the measurement object, that is, when 3D data is not generated, it is determined that no object exists within the current scan depth and that there is an abnormality in the data.
- when information about the measurement object is not included in the 3D data for a predetermined time or a predetermined number of scan operations, it is determined that no object exists within the current scan depth and that there is an abnormality in the data.
- when the generated 3D data lacks information about the measurement object, for example, when discontinuous data exists, holes occur, or the density of data is less than a certain reference value, it can be determined that the object does not sufficiently fall within the current scan depth and that there is an abnormality in the data.
- when information about the measurement object is insufficient in the 3D data generated over a predetermined time or a predetermined number of scan operations, it is determined that the object does not sufficiently fall within the current scan depth and that there is an abnormality in the data.
- noise data refers to data generated by scanning an object that is not an object to be scanned, such as a tongue, soft tissue such as the inside of a cheek, a finger, a treatment/diagnostic instrument, and other external foreign substances. Whether the generated data is noise data may be automatically determined using an AI learning model trained using intraoral images as learning data.
- in step 750, all of these cases may be determined as cases in which there is an abnormality in the data.
- when it is determined that there is no abnormality in the data, the scan operation may continue by returning to step 720.
- in step 760, when it is determined that there is an abnormality in the data, the scan depth of the 3D scanner is automatically adjusted. Adjusting the scan depth may include adjusting the scan depth value in software or in hardware. In an embodiment, when it is determined that no object exists within the current scan depth or that the object does not sufficiently fall within the current scan depth, the scan depth value is increased to enable scanning over a longer distance. In one embodiment, increasing the scan depth value includes increasing the scan depth value by a preset amount. In another embodiment, increasing the scan depth value includes changing the scan depth value to a preset scan depth value, such as the maximum scan depth value.
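The automatic increase described in step 760 can be sketched as follows; the step size, maximum value, and function name are assumptions for illustration:

```python
MAX_SCAN_DEPTH_MM = 24.0  # example maximum from the text
DEPTH_STEP_MM = 1.0       # assumed "preset amount" increment

def increase_scan_depth(current_mm: float, to_maximum: bool = False) -> float:
    """Increase the scan depth when no object (or too little of it)
    falls within the current depth: either step up by a preset amount,
    or jump straight to a preset value such as the maximum scan depth."""
    if to_maximum:
        return MAX_SCAN_DEPTH_MM
    return min(current_mm + DEPTH_STEP_MM, MAX_SCAN_DEPTH_MM)
```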
- when noise data is included in the generated 3D data, it is determined that an object that is not a measurement target exists between the 3D scanner and the object, and the scan depth value is automatically reduced.
- decreasing the scan depth value includes reducing the scan depth value by a preset amount.
- decreasing the scan depth includes changing the scan depth value to a preset scan depth value.
- a distance between the 3D scanner and the object causing the noise may be determined based on the noise data, and the scan depth value may be changed to a value smaller than the distance.
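Reducing the scan depth below the estimated distance to the noise source can be sketched as follows; the safety margin and minimum value are assumed parameters, not values from the disclosure:

```python
def reduce_below_noise(noise_distance_mm: float, margin_mm: float = 0.5,
                       min_mm: float = 0.01) -> float:
    """Set the scan depth just below the estimated distance to the noise
    source (tongue, cheek, finger, instrument, ...) so that the noise
    source falls outside the scan area. The margin and the minimum
    depth floor are assumptions for illustration."""
    return max(noise_distance_mm - margin_mm, min_mm)
```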
- when the scan depth value is automatically reduced in this way, it is no longer possible to measure the object from the existing position of the 3D scanner due to the reduced scan depth. Accordingly, the user moves the 3D scanner closer to the object to narrow the distance between the object and the 3D scanner, which can prevent the noise-causing situation, for example, the user's hand or the patient's tongue or cheek intervening between the teeth and the scanner.
- when the scan depth value is automatically increased or decreased, the electronic device 100 or the 3D scanner 200 may notify the user that the scan depth value has been automatically changed through various visual, auditory, and tactile notifications. For example, the electronic device 100 may, through the data acquisition program 500, inform the user of the change in the scan depth value through text or a picture/video on the screen, or through a voice message or alarm (eg, a beep sound). Alternatively or additionally, a vibration or audible alarm may be provided to the user through the 3D scanner 200. In an embodiment, the method of notifying the user that the scan depth value is automatically changed may differ between the case of automatically increasing the scan depth value and the case of automatically decreasing it.
- the scan depth adjustment button 571 of the scan depth adjustment bar 570 of the data acquisition program 500 moves in real time to the position corresponding to the automatically adjusted scan depth value.
- the function of automatically adjusting the scan depth value of the 3D scanner 200 can be turned off, that is, released.
- steps 720 to 760 may be performed periodically, at a predetermined time interval or a predetermined scan (imaging) interval. In one embodiment, steps 750 and 760 are not performed if the function of automatically adjusting the scan depth value is turned off.
- in step 750, whether the data is abnormal may also be determined based on the 2D image data obtained in step 730.
- when information about the measurement object is absent or insufficient in the acquired 2D image data, it is determined that there is an abnormality in the data, and the scan depth value is accordingly increased in step 760.
- when noise data is included in the acquired 2D image data, it is determined that there is an abnormality in the data, and the scan depth value is accordingly reduced in step 760.
- the determination of whether there is an abnormality in the data may be performed partly based on the 2D image data and partly based on the 3D data. For example, whether information about the measurement object is absent or insufficient may be determined based on the 2D image data while whether noise data is included is determined based on the 3D data, or vice versa.
- a scan depth value may be automatically adjusted according to a scan mode of a 3D scanner and/or software related to the 3D scanner (eg, the data acquisition program 500).
- the scan mode may include the three modes of the scan filtering function option 552 described with respect to FIG. 5: the mode 552-1 that does not remove unnecessary parts such as soft tissue from the scanned data, the mode 552-2 that retains only data corresponding to the teeth and the gingival portion adjacent to the teeth while removing data corresponding to the soft tissue portion, and the mode 552-3 that retains only data corresponding to the teeth and removes the remaining data.
- in a mode that removes scan data other than the target object, when the target object is not within the scan area, scan data of the target object is not obtained, so it may be necessary to automatically adjust the scan depth.
- FIG. 9 is a flow diagram of a method 900 for automatically adjusting scan depth in a particular scan mode, according to various embodiments of the present disclosure.
- in step 910, a scan operation of the 3D scanner 200 is started, and a mode for removing scan data other than the target object is selected.
- in step 920, the object 20 is scanned using the 3D scanner 200.
- a scan image of an object existing within a scan depth is obtained from a measurement unit of the 3D scanner, for example, the probe tip 220 of the 3D scanner 200.
- the obtained scan image is a 2D image of a surface of an object existing within a scan depth.
- the obtained 2D scan image may be transmitted to the electronic device 100 and displayed on the live view area 520 of the data acquisition program 500 .
- a target object and a non-target area are distinguished and masked. For example, when the mode 552-2 retaining the teeth and gingiva is selected, the teeth and gingiva portions are separated from the other portions and masked. Alternatively, when the mode 552-3 retaining only the teeth is selected, the teeth are separated from the other portions and masked. In one embodiment, only the target object may be masked, or only the part other than the target object may be masked.
- Whether the acquired data is a target object (teeth or gingiva) or other soft tissues may be automatically determined using an AI learning model trained using intraoral images as learning data.
- FIG. 10 illustrates an image in which a target object and a non-target object are separated and masked in the obtained 2D image 1000 according to various embodiments of the present disclosure.
- a portion 1010 that is not a target object, that is, the soft tissue portion corresponding to the inside of the lips, is separated from the portions corresponding to the teeth 1020 and gingiva 1030 and is masked.
- the embodiment of FIG. 10 shows a case where the mode leaving teeth and gingiva is selected, but the same method can be performed even when the mode leaving only teeth is selected.
- the masked image may be displayed in the live view area 520 of the data acquisition program 500.
- in step 950, when it is determined that the target object does not exist in the acquired 2D image, the scan depth value is reduced.
- a scan depth value may not be reduced when at least a part of a target object is scanned in a 2D image acquired for a predetermined period of time or for a predetermined number of scan operations.
- reducing the scan depth value includes reducing the scan depth value to such an extent that scanning is substantially impossible.
- the scan depth value may be changed to the minimum scan depth value (eg, 0.01 mm), or alternatively to 0 or less.
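The mode-specific adjustment of step 950 can be sketched as follows; the pixel-count criterion for target presence is an assumption for illustration, and 0.01 mm is the example minimum value mentioned above:

```python
MIN_SCAN_DEPTH_MM = 0.01  # example minimum value from the text

def adjust_for_target(target_pixel_count: int,
                      current_depth_mm: float) -> float:
    """In a mode that removes non-target data: if no target (tooth or
    gingiva) pixels appear in the acquired 2D image, drop the scan depth
    to a value at which scanning is substantially impossible; otherwise
    keep the current scan depth. The pixel-count test is an assumed
    stand-in for the AI-based target detection described above."""
    if target_pixel_count == 0:
        return MIN_SCAN_DEPTH_MM
    return current_depth_mm
```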
- when scanning becomes substantially impossible due to the reduced scan depth value, the user may be notified of this through various visual, auditory, and tactile notifications as described above.
- After being notified, the user may move the scanner to a region where the target object exists and restart the scan operation, or may readjust the scan depth value to a scannable value using an input means of the 3D scanner 200, the scan depth control bar 570 of the data acquisition program 500, or the like.
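As a hedged illustration of the step-950 behavior (assumed values, not the disclosed implementation), the depth-reduction logic might look like the following, where the patience count, the 0.01 mm minimum, and the class name are assumptions:

```python
# Hedged sketch (assumed values, not the disclosed implementation) of the
# step-950 behavior: the scan depth value is reduced to a minimum, making
# scanning substantially impossible, only after the target object has been
# absent for `patience` consecutive scan operations.
MIN_SCAN_DEPTH_MM = 0.01  # assumed minimum scan depth value

class ScanDepthController:
    def __init__(self, depth_mm, patience=3):
        self.depth_mm = depth_mm    # current scan depth value
        self.patience = patience    # assumed number of tolerated misses
        self.misses = 0             # consecutive scans without the target

    def update(self, target_in_image):
        """Feed one scan result; return the (possibly reduced) depth value."""
        if target_in_image:
            self.misses = 0         # target seen recently: do not reduce
        else:
            self.misses += 1
            if self.misses >= self.patience:
                self.depth_mm = MIN_SCAN_DEPTH_MM  # effectively halt scanning
        return self.depth_mm
```

Restoring a scannable depth, e.g. from the scan depth control bar, would then amount to assigning a new value to `depth_mm`.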
- Various embodiments of the present disclosure may be implemented as software recorded on a machine-readable recording medium.
- The software may be software for implementing the various embodiments of the present disclosure described above.
- The software may be derived from the various embodiments of the present disclosure by programmers skilled in the art.
- For example, the software may be machine-readable instructions (e.g., code or code segments) or a program.
- A device is an apparatus capable of operating according to instructions retrieved from a recording medium and may be, for example, a computer.
- The device may be the device 100 according to embodiments of the present disclosure.
- The processor of the device may execute the retrieved instructions so that components of the device perform functions corresponding to the instructions.
- The processor may be the one or more processors 101 according to embodiments of the present disclosure.
- The recording medium may refer to any type of device-readable recording medium on which data is stored.
- The recording medium may include, for example, a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
- The recording medium may be the one or more memories 103.
- The recording medium may be implemented in a distributed form, such as in a computer system connected by a network.
- The software may be distributed, stored, and executed in a computer system or the like.
- The recording medium may be a non-transitory recording medium.
- A non-transitory recording medium refers to a tangible medium regardless of whether data is stored semi-permanently or temporarily, and does not include a transitory signal.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Medical Informatics (AREA)
- Optics & Photonics (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Epidemiology (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Quality & Reliability (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Endoscopes (AREA)
Abstract
Description
Claims (20)
- A method of processing scan images of a three-dimensional scanner, performed by one or more processors in an electronic device comprising the one or more processors and one or more memories storing instructions to be executed by the one or more processors, the method comprising: acquiring, through the three-dimensional scanner, scan data for a measurement object within a preset scan depth of the three-dimensional scanner; determining whether the scan data is abnormal based on a scan image acquired from the three-dimensional scanner; and adjusting the scan depth of the three-dimensional scanner when it is determined that the scan data is abnormal.
- The method of claim 1, wherein determining whether the scan data is abnormal based on the scan image acquired from the three-dimensional scanner comprises determining that the scan data is abnormal when the measurement object does not exist within the scan depth or when the density of the acquired scan image is less than or equal to a predetermined value.
- The method of claim 2, wherein adjusting the scan depth comprises increasing a scan depth value of the three-dimensional scanner.
- The method of claim 3, wherein increasing the scan depth value comprises setting the scan depth value to a preset scan depth value or increasing the scan depth value by a predetermined amount.
- The method of claim 1, wherein determining whether the scan data is abnormal based on the scan image acquired from the three-dimensional scanner comprises determining that the scan data is abnormal when the acquired scan image includes noise data.
- The method of claim 5, wherein adjusting the scan depth comprises decreasing a scan depth value of the three-dimensional scanner.
- The method of claim 3, wherein decreasing the scan depth value comprises setting the scan depth value to a preset scan depth value or decreasing the scan depth value by a predetermined amount.
- The method of claim 1, wherein determining whether the scan data is abnormal based on the scan image acquired from the three-dimensional scanner comprises determining that the scan data is abnormal when the acquired scan image does not include a target area.
- The method of claim 8, wherein adjusting the scan depth comprises setting the scan depth to a minimum scan depth value.
- The method of claim 1, wherein adjusting the scan depth comprises providing a visual, tactile, or auditory notification to a user.
- An electronic device comprising: a communication circuit communicatively connected to a three-dimensional scanner; a display; and one or more processors, wherein the one or more processors are configured to: acquire, through the three-dimensional scanner, scan data for a measurement object within a preset scan depth of the three-dimensional scanner; determine whether the scan data is abnormal based on a scan image acquired from the three-dimensional scanner; and adjust the scan depth of the three-dimensional scanner when it is determined that the scan data is abnormal.
- The electronic device of claim 11, wherein determining whether the scan data is abnormal based on the scan image acquired from the three-dimensional scanner comprises determining that the scan data is abnormal when the measurement object does not exist within the scan depth or when the density of the acquired scan image is less than or equal to a predetermined value.
- The electronic device of claim 12, wherein adjusting the scan depth comprises increasing a scan depth value of the three-dimensional scanner.
- The electronic device of claim 13, wherein increasing the scan depth value comprises setting the scan depth value to a preset scan depth value or increasing the scan depth value by a predetermined amount.
- The electronic device of claim 11, wherein determining whether the scan data is abnormal based on the scan image acquired from the three-dimensional scanner comprises determining that the scan data is abnormal when the acquired scan image includes noise data.
- The electronic device of claim 15, wherein adjusting the scan depth comprises decreasing a scan depth value of the three-dimensional scanner.
- The electronic device of claim 13, wherein decreasing the scan depth value comprises setting the scan depth value to a preset scan depth value or decreasing the scan depth value by a predetermined amount.
- The electronic device of claim 11, wherein determining whether the scan data is abnormal based on the scan image acquired from the three-dimensional scanner comprises determining that the scan data is abnormal when the acquired scan image does not include a target area.
- The electronic device of claim 18, wherein adjusting the scan depth comprises setting the scan depth to a minimum scan depth value.
- A non-transitory computer-readable recording medium storing instructions that, when executed by one or more processors, cause the one or more processors to: acquire, through a three-dimensional scanner, scan data for a measurement object within a preset scan depth of the three-dimensional scanner; determine whether the scan data is abnormal based on a scan image acquired from the three-dimensional scanner; and adjust the scan depth of the three-dimensional scanner when it is determined that the scan data is abnormal.
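The adjustment policy recited in claims 2 through 9 can be illustrated with the following sketch. It is not the claimed implementation; the density threshold, step size, and minimum depth are assumed values chosen for the example.

```python
# Illustrative sketch of the adjustment policy recited in claims 2-9; it is
# not the claimed implementation, and the density threshold, step size, and
# minimum depth below are assumed values.
def adjust_scan_depth(depth_mm, density, has_noise, has_target,
                      density_min=0.3, step_mm=1.0, min_mm=0.01):
    if not has_target:                 # claims 8-9: no target area in image
        return min_mm                  #   -> set to minimum scan depth value
    if has_noise:                      # claims 5-7: noise data present
        return max(min_mm, depth_mm - step_mm)   # -> decrease depth value
    if density <= density_min:         # claims 2-4: image density too low
        return depth_mm + step_mm      #   -> increase depth value
    return depth_mm                    # scan data normal: no adjustment
```

The claims also allow setting the depth directly to a preset value instead of stepping it by a predetermined amount; the stepped variant is shown here for brevity.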
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280049502.6A CN117677337A (zh) | 2021-07-21 | 2022-07-21 | 用于调节三维扫描仪的扫描深度的方法和装置 |
US18/571,838 US20240289954A1 (en) | 2021-07-21 | 2022-07-21 | Method and apparatus for adjusting scan depth of three-dimensional scanner |
EP22846253.7A EP4374769A1 (en) | 2021-07-21 | 2022-07-21 | Method and apparatus for adjusting scan depth of three-dimensional scanner |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20210095453 | 2021-07-21 | ||
KR10-2021-0095453 | 2021-07-21 | ||
KR1020220061439A KR20230014621A (ko) | 2021-07-21 | 2022-05-19 | 3차원 스캐너의 스캔 심도 조절을 위한 방법 및 장치 |
KR10-2022-0061439 | 2022-05-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023003383A1 true WO2023003383A1 (ko) | 2023-01-26 |
Family
ID=84979288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/010687 WO2023003383A1 (ko) | 2021-07-21 | 2022-07-21 | 3차원 스캐너의 스캔 심도 조절을 위한 방법 및 장치 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240289954A1 (ko) |
EP (1) | EP4374769A1 (ko) |
WO (1) | WO2023003383A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR101097607B1 (ko) * | 2010-01-12 | 2011-12-22 | Samsung Medison Co., Ltd. | Ultrasound system and method for setting scan angle, scan depth, and scan speed |
- KR20170103713A (ko) * | 2017-06-14 | 2017-09-13 | 이영종 | Confocal three-dimensional intraoral scanner using a high-speed image sensor with output-bit adjustment |
- KR20180010252A (ko) * | 2015-05-22 | 2018-01-30 | Sirona Dental Systems GmbH | Camera and method for three-dimensional measurement of dental objects |
- KR101903424B1 (ko) * | 2017-01-10 | 2018-11-13 | Korea Photonics Technology Institute | Optical tomography imaging system-based 3D intraoral scanner and method for diagnosing tooth conditions using the same |
- KR102091897B1 (ko) * | 2015-12-28 | 2020-03-20 | Korea Electronics Technology Institute | 3D intraoral scanner and 3D intraoral scanning method using the same |
-
2022
- 2022-07-21 EP EP22846253.7A patent/EP4374769A1/en active Pending
- 2022-07-21 WO PCT/KR2022/010687 patent/WO2023003383A1/ko active Application Filing
- 2022-07-21 US US18/571,838 patent/US20240289954A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR101097607B1 (ko) * | 2010-01-12 | 2011-12-22 | Samsung Medison Co., Ltd. | Ultrasound system and method for setting scan angle, scan depth, and scan speed |
- KR20180010252A (ko) * | 2015-05-22 | 2018-01-30 | Sirona Dental Systems GmbH | Camera and method for three-dimensional measurement of dental objects |
- KR102091897B1 (ko) * | 2015-12-28 | 2020-03-20 | Korea Electronics Technology Institute | 3D intraoral scanner and 3D intraoral scanning method using the same |
- KR101903424B1 (ko) * | 2017-01-10 | 2018-11-13 | Korea Photonics Technology Institute | Optical tomography imaging system-based 3D intraoral scanner and method for diagnosing tooth conditions using the same |
- KR20170103713A (ko) * | 2017-06-14 | 2017-09-13 | 이영종 | Confocal three-dimensional intraoral scanner using a high-speed image sensor with output-bit adjustment |
Also Published As
Publication number | Publication date |
---|---|
US20240289954A1 (en) | 2024-08-29 |
EP4374769A1 (en) | 2024-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2020204366A2 (ko) | Method for providing scan guide and image processing apparatus therefor | |
- WO2017191878A1 (ko) | Apparatus and method for tracking tooth movement | |
- KR20230014621A (ko) | Method and apparatus for adjusting scan depth of three-dimensional scanner | |
- WO2023003383A1 (ko) | Method and apparatus for adjusting scan depth of three-dimensional scanner | |
- WO2022255720A1 (ko) | Method and apparatus for processing scan images of three-dimensional scanner | |
- WO2023014107A1 (ko) | Method and apparatus for noise filtering in processing scan images of three-dimensional scanner | |
- WO2023277391A1 (ko) | Intraoral image processing apparatus and intraoral image processing method | |
- WO2022250403A1 (ko) | Electronic device and image processing method thereof | |
- WO2023003192A1 (ko) | Image processing apparatus and image processing method | |
- WO2023204509A1 (ko) | Electronic device, method, and recording medium for generating and aligning a three-dimensional image model of a three-dimensional scanner | |
- WO2023200128A1 (ko) | Method for processing scan images of an intraoral scanner, apparatus therefor, and recording medium storing instructions therefor | |
US20230386141A1 (en) | Method, apparatus and recording medium storing commands for processing scanned images of 3d scanner | |
- WO2023038313A1 (ko) | Electronic device and method for processing scan images of a three-dimensional scanner thereof | |
- WO2023063671A1 (ko) | Method and apparatus for aligning scan images of a three-dimensional scanner, and recording medium storing instructions | |
- KR102472127B1 (ko) | Intraoral image processing apparatus and intraoral image processing method | |
- CN117677337A (zh) | Method and apparatus for adjusting scan depth of three-dimensional scanner | |
- WO2023177213A1 (ko) | Method for determining color of an object, apparatus therefor, and recording medium storing instructions therefor | |
US20230096570A1 (en) | Electronic device and method for processing scanned image of three dimensional scanner | |
- WO2024144035A1 (ko) | Hologram display system and control method thereof | |
- WO2023238987A1 (ko) | Ultrasound diagnostic apparatus and operating method thereof | |
- WO2022164203A1 (ko) | Three-dimensional scan system and operating method thereof | |
EP4417161A1 (en) | Data processing apparatus and data processing method | |
- WO2022191665A1 (ko) | Method for identifying selected area in intraoral image and apparatus therefor | |
- WO2024096385A1 (ko) | Portable ultrasound device for detecting breast masses by controlling operation of a plurality of ultrasound transducers | |
- WO2024096384A1 (ko) | Portable ultrasound device for detecting breast masses by sensing movement speed of an ultrasound transducer | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22846253 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18571838 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280049502.6 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022846253 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022846253 Country of ref document: EP Effective date: 20240221 |