WO2022191665A1 - Method for identifying a selection region in an oral image and apparatus therefor - Google Patents
Method for identifying a selection region in an oral image and apparatus therefor
- Publication number: WO2022191665A1
- Application: PCT/KR2022/003459
- Authority: WO — WIPO (PCT)
Classifications
- G06T 7/0012 — Biomedical image inspection (G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection)
- G06T 19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T 1/0007 — Image acquisition (G06T 1/00 General purpose image data processing)
- G06T 11/60 — Editing figures and text; combining figures or text (G06T 11/00 2D image generation)
- G06T 2207/30036 — Dental; Teeth (G06T 2207/30004 Biomedical image processing)
- G06T 2210/41 — Medical (indexing scheme for image generation or computer graphics)
- G06T 2219/004 — Annotating, labelling (indexing scheme for manipulating 3D models or images)
- G06T 2219/2012 — Colour editing, changing, or manipulating; use of colour codes (G06T 2219/20 indexing scheme for editing of 3D models)
Definitions
- Embodiments of the present disclosure relate to a method and an apparatus for identifying a selection region in an oral cavity image, and more particularly, to a method and an apparatus for identifying a selection region in a three-dimensional model of an oral cavity.
- Dental CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) technology is widely used in dental treatment.
- The most important thing in dental treatment using CAD/CAM is to acquire precise three-dimensional (3D) data on the shape of an object, such as a patient's teeth, gums, and jawbone.
- Using 3D data obtained from an object has the advantage that accurate calculations can be performed by a computer.
- To obtain 3D data, computed tomography (CT), magnetic resonance imaging (MRI), or optical scanning may be used.
- A 3D scanning apparatus may acquire 3D surface shape information using light reflected from the object.
- The 3D surface data may be recorded in the form of a polygon mesh, and may include position information of the vertices on the surface of the object and connection relationship information between the vertices.
- Alternatively, the 3D surface data may be recorded in the form of a point cloud.
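The two recording forms described above can be pictured as simple data structures: a point cloud carries only positions, while a polygon mesh adds connectivity between vertices. The sketch below is illustrative only; the names `Mesh` and `PointCloud` and the triangle-index encoding of the connection relationship are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PointCloud:
    # position of each surface point as (x, y, z)
    points: list

@dataclass
class Mesh:
    # vertex positions (x, y, z); triangles are index triples, which
    # encode the connection relationship between vertices
    vertices: list
    triangles: list

    def neighbors(self, v):
        """Vertices connected to vertex v through a shared triangle."""
        out = set()
        for a, b, c in self.triangles:
            if v in (a, b, c):
                out.update((a, b, c))
        out.discard(v)
        return out

# two triangles sharing the edge (1, 2)
mesh = Mesh(
    vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)],
    triangles=[(0, 1, 2), (1, 3, 2)],
)
print(sorted(mesh.neighbors(1)))  # -> [0, 2, 3]
```

Scan data from an intraoral scanner is typically converted into one of these forms before region selection is performed on it.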
- When the oral image processing apparatus acquires an oral image from the scan data, the user may select a region in the acquired oral image.
- In the prior art, there was a problem in that a lot of time was required, or an unnecessary area was set, in the process of identifying a selected area in the oral image.
- Accordingly, a technique for accurately determining an area within an oral image may be required.
- An object of the disclosed embodiments is to provide a method for identifying a selection region in an oral image so that the region can be accurately specified according to a user input, an apparatus for performing operations according to the method, and a computer-readable recording medium storing a program or instructions for performing the method.
- According to an aspect, a method of identifying a selection region in an oral image may include: acquiring an oral image; determining a reference point of the oral image; determining a brush based on the reference point and at least one piece of distance information; identifying, as the selection region, an area of the oral image overlapping the area defined by the brush; and displaying the selection region in the oral image.
- The method for identifying a selection region in an oral image according to the disclosed embodiments, the apparatus for performing operations according to the method, and the computer-readable recording medium storing a program or instructions for performing the method identify the selection region based on a reference point of the oral image and at least one piece of distance information, and can thereby provide a user interface screen with increased accuracy of region selection in the oral image.
- FIG. 1 is a view for explaining an oral image processing system according to the disclosed embodiment.
- FIG. 2 is a diagram for explaining a method of identifying a selection region in an oral cavity image.
- FIG. 3 shows a block diagram of an apparatus for processing an oral image according to an embodiment.
- FIG. 4 is an example of a detailed block diagram of an apparatus for processing an oral image according to an embodiment.
- FIG. 5 is a flow diagram illustrating an embodiment of a method for identifying a selection region within an oral cavity image.
- FIG. 6 is a flowchart for explaining an operation of identifying an area overlapping an area determined by a brush in an oral image as a selection area.
- FIG. 7 is a reference diagram for explaining an operation of identifying a selection region in an oral cavity image when the brush is a two-dimensional brush.
- FIG. 8 is a reference diagram for explaining an operation of identifying a selection region in an oral cavity image when the brush is a two-dimensional brush.
- FIG. 9 is a reference diagram for explaining an operation of identifying a selection region in an oral image when the brush is a three-dimensional brush.
- FIG. 10 is a reference diagram for explaining an operation of identifying at least one boundary of a selection region of an oral cavity image.
- FIG. 11 is a reference diagram for explaining an embodiment of identifying at least one object of an oral image based on location information of a reference point.
- FIG. 12 is a reference diagram for explaining an embodiment in which the selection region is further identified based on the region of at least one identified object, and an embodiment in which the selection region is displayed with a different color according to the identified at least one object, following FIG. 11.
- FIG. 13 is a reference diagram for explaining an example of identifying at least one object of an oral image based on location information of a reference point.
- FIG. 14 is a reference diagram for explaining an embodiment in which the selection region is further identified based on the region of at least one identified object, and an embodiment in which the selection region is displayed with a different color according to the identified at least one object, following FIG. 13.
- FIG. 15 is a reference diagram for explaining an embodiment of displaying a selection region in an oral image when the selection region is identified based on a three-dimensional brush.
- FIG. 16 is a reference diagram for explaining an embodiment of displaying a selection area in which a hole is indicated when there is a hole in the area identified based on a three-dimensional brush.
- A first aspect of the present disclosure provides a method of identifying a selection region in an oral image, the method comprising: acquiring an oral image; determining a reference point of the oral image; determining a brush based on the reference point and at least one piece of distance information; identifying, as the selection region, an area of the oral image overlapping the area defined by the brush; and displaying the selection region in the oral image.
- The identifying of the selection region may include: identifying at least one boundary where the area defined by the brush and the oral image overlap, based on position information on a plurality of points of the area defined by the brush and position information on a plurality of points of the oral image; and identifying the selection region based on the at least one boundary.
- the area defined by the brush may include an area obtained by extending the two-dimensional brush in a normal direction of the display unit.
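Extending a 2D brush along the display normal can be pictured as a screen-space test: a vertex lies inside the extended area if its on-screen position falls within the brush outline, regardless of its depth. The sketch below assumes an orthographic view with the z axis as the display normal and a circular brush; the function name and these viewing assumptions are illustrative, not taken from the patent.

```python
import math

def select_with_2d_brush(vertices, center_xy, radius):
    """Indices of vertices inside the volume obtained by extending a
    circular 2D brush along the display normal (here: the z axis,
    assuming an orthographic view)."""
    cx, cy = center_xy
    selected = []
    for i, (x, y, z) in enumerate(vertices):
        # extending along the normal means z is ignored: only the
        # on-screen (x, y) position decides membership
        if math.hypot(x - cx, y - cy) <= radius:
            selected.append(i)
    return selected

verts = [(0.0, 0.0, 0.0), (0.5, 0.0, -3.0), (2.0, 2.0, 0.5)]
print(select_with_2d_brush(verts, (0.0, 0.0), 1.0))  # -> [0, 1]
```

Note that the second vertex is selected despite being far behind the first: depth plays no role, which is exactly why a purely 2D brush can capture unintended rear surfaces.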
- the identified selection region may include at least one boundary and a region of the oral cavity image located within the identified at least one boundary.
- the displaying of the selection area may include: displaying the selection area by emphasizing at least one boundary of the selection area; may include.
- The displaying of the selection area by emphasizing at least one boundary of the selection area may include: displaying the at least one boundary of the selection area with a preset color; and displaying the region of the oral image located within the at least one boundary by overlapping or blending the color of the oral image with the preset color.
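Blending the oral-image color with a preset color can be realized as a per-channel alpha blend. This is a minimal sketch; the particular colors and blend weight below are hypothetical, since the patent does not fix them.

```python
def blend(base_rgb, overlay_rgb, alpha=0.5):
    """Blend a preset selection color over the oral-image color.
    alpha is the weight of the overlay color (0 = base only, 1 = overlay only)."""
    return tuple(round((1 - alpha) * b + alpha * o)
                 for b, o in zip(base_rgb, overlay_rgb))

tooth = (240, 235, 220)    # hypothetical tooth color
highlight = (0, 120, 255)  # hypothetical preset selection color
print(blend(tooth, highlight, alpha=0.4))  # -> (144, 189, 234)
```

Rendering the interior of the selection with such a blend, while drawing the boundary in the pure preset color, gives the emphasized-boundary display the text describes.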
- The identifying of the selection region may include: identifying at least one object of the oral image based on location information of the reference point; and identifying the selection region further based on the region of the identified at least one object.
- The displaying of the selection region may include: identifying at least one object of the oral image included in the selection region; and displaying the selection region with different colors according to the at least one object.
- The at least one piece of distance information may include distance information between the reference point and another point in the oral image.
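One plausible reading of "a brush based on the reference point and at least one piece of distance information" is a sphere whose center is the reference point and whose radius is the distance information. The sketch below illustrates that reading; it is an assumption for illustration, not the patent's definitive construction of the brush.

```python
import math

def determine_brush(reference_point, distance):
    """A 3D brush sketched as a sphere: center at the reference point,
    radius given by the distance information."""
    return (reference_point, distance)

def select_with_3d_brush(vertices, brush):
    """Indices of vertices inside the spherical brush volume."""
    (cx, cy, cz), r = brush
    return [i for i, (x, y, z) in enumerate(vertices)
            if math.dist((x, y, z), (cx, cy, cz)) <= r]

verts = [(0, 0, 0), (0.5, 0.5, 0.5), (3, 0, 0)]
brush = determine_brush((0, 0, 0), 1.0)
print(select_with_3d_brush(verts, brush))  # -> [0, 1]
```

Unlike the extruded 2D brush, this selection is bounded in all three dimensions, so distant rear surfaces fall outside the brush volume.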
- An apparatus for processing an oral image according to a second aspect of the present disclosure includes: a display unit for displaying an oral image; a memory storing one or more instructions; and at least one processor executing the one or more instructions, wherein the at least one processor acquires an oral image, determines a reference point of the oral image, determines a brush based on the reference point and at least one piece of distance information, identifies an area of the oral image overlapping the area defined by the brush as a selection region, and controls the display unit to display the selection region in the oral image.
- A recording medium according to a third aspect of the present disclosure may include a computer-readable recording medium in which a program for performing the method on a computer is recorded.
- an 'object' is an object to be photographed, and the object may include a person, an animal, or a part thereof.
- The object may include a body part (such as an organ), an artificial structure attachable to or insertable into the object, or a phantom.
- The object may include teeth, gingiva, at least a portion of the oral cavity, artificial structures insertable into the oral cavity (e.g., orthodontic devices including brackets and wires, implants, artificial teeth, dental restorations including inlays and onlays, and orthodontic aids inserted into the oral cavity), or teeth or gingiva to which artificial structures are attached.
- An 'oral image' may refer to a two-dimensional image of an object, or a three-dimensional image that three-dimensionally represents the object.
- The oral image may include an image representing at least one tooth, or an image representing an oral cavity including at least one tooth (hereinafter, 'oral image').
- the oral image may include both a two-dimensional frame and a three-dimensional frame.
- the oral image may include a two-dimensional frame including two-dimensional images obtained from different viewpoints with respect to the object, a three-dimensional frame expressed in a point cloud form, or a polygonal mesh form.
- 'data' may refer to information necessary to represent an object in two or three dimensions, for example, raw data obtained from at least one image sensor.
- The raw data may be data acquired to generate an oral image, and may be data (e.g., two-dimensional data) obtained from at least one image sensor included in an intraoral scanner when scanning the inside of a patient's mouth, which is the object, using the intraoral scanner.
- Raw data obtained from an intraoral scanner may be referred to as scan data or two-dimensional image data.
- FIG. 1 is a view for explaining an oral image processing system according to the disclosed embodiment.
- an oral image processing system may include a scanner 100 and an oral image processing device 300 .
- the scanner 100 is a medical device for acquiring an image in an oral cavity.
- the scanner 100 may be a device for acquiring an image of the oral cavity including at least one tooth by being inserted into the oral cavity and scanning teeth in a non-contact manner.
- the scanner 100 may have a form that can be drawn in and out of the oral cavity, and can scan the inside of the patient's oral cavity using at least one image sensor (eg, an optical camera, etc.).
- The scanner 100 may acquire, as two-dimensional image data, surface information on the object, which may include at least one of teeth, gingiva, and artificial structures insertable into the oral cavity (e.g., orthodontic devices including brackets and wires, implants, artificial teeth, and orthodontic aids inserted into the oral cavity).
- the two-dimensional image data obtained by the scanner 100 may be raw data obtained to generate an oral image.
- the scanner 100 is illustrated as an oral scanner having a shape that is inserted into the oral cavity, but the scanner 100 applied to the embodiments disclosed in the present disclosure may include a model scanner that scans a tooth model.
- the two-dimensional image data obtained by the scanner 100 may be transmitted to the oral image processing apparatus 300 connected through a wired or wireless communication network.
- the oral image processing device 300 is connected to the scanner 100 through a wired or wireless communication network, and may receive two-dimensional image data obtained by scanning the oral cavity from the scanner 100.
- the oral cavity image processing apparatus 300 may generate a 3D oral cavity image according to the following embodiments.
- the scanner 100 may transmit raw data obtained through an oral scan to the oral image processing apparatus 300 or process and transmit the raw data.
- The oral image processing apparatus 300 may generate a 3D oral image representing the oral cavity in three dimensions based on the received raw data. Since the '3D oral image' can be generated by modeling the internal structure of the oral cavity in three dimensions based on the received raw data, it may be referred to as a '3D model', a 'three-dimensional model', a 'three-dimensional oral model', or a 'three-dimensional oral image'.
- Hereinafter, a model or image representing the oral cavity in two or three dimensions is collectively referred to as an 'oral image'.
- the scanner 100 may obtain raw data through an oral scan and process the obtained raw data to generate an image corresponding to the oral cavity, which is an object.
- the scanner 100 may transmit the oral cavity image generated to correspond to the oral cavity to the oral cavity image processing apparatus 300 .
- The oral image processing apparatus 300 may be any electronic device capable of generating, processing, displaying, and/or transmitting an oral image based on the two-dimensional image data received from the scanner 100. Specifically, the oral image processing apparatus 300 may generate at least one of information produced by processing the two-dimensional image data and an oral image generated by processing the two-dimensional image data, and may display the generated information and oral image through the display unit 310.
- The oral image processing apparatus 300 is a device for analyzing, processing, displaying, and/or transmitting the received image, and may be a computing device such as a smart phone, a laptop computer, a desktop computer, a PDA, or a tablet PC, but is not limited thereto.
- the oral image processing device 300 may be in the form of a server (or server device) for processing the oral cavity image.
- the oral image processing apparatus 300 may store and execute dedicated software linked to the scanner 100 .
- the dedicated software may be called a dedicated program or a dedicated application.
- dedicated software stored in the oral image processing device 300 may be connected to the scanner 100 to receive data acquired through object scan in real time.
- the dedicated software may be stored in the memory of the oral image processing device 300 .
- the dedicated software may provide a user interface for use of data acquired by the scanner 100 .
- the user interface screen provided by the dedicated software may include an oral image or a three-dimensional model.
- the oral cavity image processing apparatus 300 may display a selection area identified according to the reference point of the oral cavity image and at least one piece of distance information on the display unit 310 using dedicated software.
- The oral image processing apparatus 300 is an electronic device capable of generating an oral image three-dimensionally representing an oral cavity including one or more teeth, and of displaying, through the display unit 310, the oral image in which a selection region is displayed; it will be described in detail below.
- FIG. 2 is a diagram for explaining a method of identifying a selection region in an oral cavity image.
- A user may position or move a graphic tool called a brush over an oral image 210 in order to select an area 211 of the oral image 210.
- The oral image processing apparatus may output a screen including a graphic showing a preview of the selection area determined by the area in which the brush is positioned or over which the brush has passed.
- However, the area 211 selected by the brush may include, in addition to the area the user wants to select, an area 212 corresponding to the back gum and an area 213 of the three-dimensional model in which no point data exists. Accordingly, the areas 212 and 213 may be unnecessarily selected. According to the related art, there is thus a problem in that an unnecessary area is set when identifying a selected area in a 3D image.
- In addition, although the oral image 220 is a three-dimensional oral image or a three-dimensional oral model, the selection area 221 may be displayed as if it were two-dimensional. Therefore, according to the related art, since the selection region is displayed in two dimensions even though the oral image 220 is a three-dimensional image, there is a problem in that it is difficult to accurately select the selection region.
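One common way to keep a screen-space brush from also grabbing the far side of the model (such as area 212, the back gum) is to filter the candidate vertices by whether their surface normal faces the viewer. This is a generic visibility heuristic offered for illustration only; the disclosed embodiments instead address the problem using the reference point and distance information.

```python
def front_facing(vertex_normals, view_dir=(0.0, 0.0, -1.0)):
    """Indices of vertices whose surface normal faces the viewer.

    view_dir points from the camera into the scene (camera looking along
    -z here), so a vertex faces the viewer when normal . view_dir < 0."""
    return [i for i, (nx, ny, nz) in enumerate(vertex_normals)
            if nx * view_dir[0] + ny * view_dir[1] + nz * view_dir[2] < 0]

normals = [(0.0, 0.0, 1.0),   # points toward the camera: front surface
           (0.0, 0.0, -1.0),  # points away: e.g. the rear gum
           (0.0, 1.0, 0.0)]   # perpendicular to the view
print(front_facing(normals))  # -> [0]
```

Intersecting a brush selection with such a front-facing set discards rear-surface vertices like those of area 212 before the selection is displayed.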
- the oral image processing apparatus 300 proposes a method for identifying and displaying a selection area that a user wants to select with high speed and high accuracy.
- FIG. 3 shows a block diagram of an apparatus for processing an oral image according to an embodiment.
- the oral cavity image processing apparatus 300 may be an electronic device for processing an oral cavity image and displaying the processed oral cavity image.
- the oral image processing apparatus 300 may include a display 310 , a memory 320 , and a processor 330 .
- the components will be described in turn.
- the display unit 310 may display a screen, and may display a predetermined screen under the control of the processor 330 .
- The display unit 310 may include a display panel and a controller (not shown) for controlling the display panel, and may be a display built into the oral image processing apparatus 300.
- the display panel may be implemented in various types of displays, such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active-matrix organic light-emitting diode (AM-OLED), a plasma display panel (PDP), and the like.
- the display panel may be implemented to be flexible, transparent, or wearable.
- the display 310 may be provided as a touch screen by being combined with a touch panel of a user input unit (not shown).
- the touch screen may include an integrated module in which the display panel and the touch panel are combined in a stacked structure.
- the display unit 310 may display a user interface screen including an oral cavity image generated based on data obtained by scanning the patient's oral cavity with the scanner 100 .
- the display unit 310 may display a user interface screen including information related to a patient's dental treatment.
- the display 310 may provide a user interface screen including the selected area to the user by displaying the 3D oral model in which the area selected by the user is displayed.
- the memory 320 may store a program for processing and controlling the processor 330 .
- the memory 320 according to an embodiment of the present disclosure may store one or more instructions.
- the processor 330 may control the overall operation of the oral image processing apparatus 300, and may control the operation of the oral image processing apparatus 300 by executing one or more instructions stored in the memory 320.
- the processor 330 executes one or more instructions stored in the memory 320 to obtain an oral cavity image, determine a reference point of the oral cavity image, and determine a brush based on the reference point and at least one distance information and identify an area overlapping the area determined by the brush in the oral cavity image as the selection region, and control the display 310 to display the selection region in the oral cavity image.
- By executing one or more instructions stored in the memory 320, the processor 330 may identify at least one boundary where the area defined by the brush and the oral image overlap, based on the location information for a plurality of points in the area defined by the brush and the location information for a plurality of points in the oral image, and may identify the selection region based on the at least one boundary.
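Given a set of selected triangles, the boundary referred to above can be recovered as the edges that belong to exactly one selected triangle; interior edges are shared by two selected triangles. This is a standard mesh operation sketched here for illustration; the patent does not spell out this exact procedure.

```python
from collections import Counter

def selection_boundary(triangles, selected):
    """Edges on the boundary of a selected face set: an edge is a boundary
    edge if it belongs to exactly one selected triangle."""
    sel = set(selected)
    counts = Counter()
    for t_index, (a, b, c) in enumerate(triangles):
        if t_index not in sel:
            continue
        for e in ((a, b), (b, c), (c, a)):
            counts[tuple(sorted(e))] += 1  # undirected edge key
    return sorted(e for e, n in counts.items() if n == 1)

# two triangles sharing the interior edge (1, 2)
tris = [(0, 1, 2), (1, 3, 2)]
print(selection_boundary(tris, [0, 1]))  # -> [(0, 1), (0, 2), (1, 3), (2, 3)]
```

Once these boundary edges are known, the selection region can be taken as the boundary plus the faces enclosed by it, matching the description of the identified selection region above.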
- the processor 330 may control the display 310 to display the selection area by emphasizing at least one boundary of the selection area by executing one or more instructions stored in the memory 320 .
- By executing one or more instructions stored in the memory 320, the processor 330 according to an embodiment of the present disclosure may control the display unit 310 to display at least one boundary of the selection region with a preset color, and to display the region of the oral image located within the at least one boundary by overlapping or blending the color of the oral image with the preset color.
- The processor 330 may execute one or more instructions stored in the memory 320 to identify at least one object of the oral image based on the position information of the reference point, and to identify the selection region further based on the region of the identified at least one object.
- By executing one or more instructions stored in the memory 320, the processor 330 according to an embodiment of the present disclosure may identify at least one object of the oral image included in the selection area, and may control the display unit 310 to display the selection area with a different color according to the at least one object.
- The oral image processing apparatus 300 may be implemented by more components than the illustrated components, or by fewer components.
- the oral image processing apparatus 300 may include a display unit 310 , a memory 320 , a processor 330 , a communication unit 340 , and a user input unit 350 .
- FIG. 4 is an example of a detailed block diagram of an apparatus for processing an oral image according to an embodiment.
- the oral cavity image processing apparatus 300 may be an electronic device for processing an oral cavity image and displaying the processed oral cavity image.
- the oral image processing device 300 may be a computing device such as a smart phone, a laptop computer, a desktop computer, a PDA, or a tablet PC, but is not limited thereto.
- the oral image processing apparatus 300 may further include a communication unit 340 and a user input unit 350 in addition to the display unit 310, the memory 320, and the processor 330.
- The components will be described in turn. Since the description of the display unit 310 is the same as that given with reference to FIG. 3, it will be omitted.
- the memory 320 may store a program for processing and controlling the processor 330 .
- the memory 320 may store one or more instructions.
- the memory 320 may include at least one of an internal memory (not shown) and an external memory (not shown).
- the memory 320 may store various programs and data used for the operation of the oral image processing apparatus 300 .
- the memory 320 may store the generated three-dimensional oral image, and may store raw data or two-dimensional image data obtained from the scanner 100 .
- dedicated software linked to the scanner 100 may be stored.
- the dedicated software may be a program or application for providing a user interface for use of data acquired by the scanner 100 .
- the memory 320 may store position information of vertices of the surface of at least one object of the 3D oral image from the scanner 100 and connection relationship information between the vertices.
- the memory 320 may store surface data in the form of a point cloud, and the point cloud may include location information of vertices of the object surface and connection relationship information between the vertices.
- the memory 320 may include information for distinguishing at least one object of the 3D oral cavity image.
- the memory 320 may include color information on at least one object of the 3D oral cavity image.
- the memory 320 may include color information for displaying the selection area.
- The memory 320 may store at least one piece of distance information used to determine a brush.
- the at least one piece of distance information may indicate distance information between the reference point and another point of the three-dimensional oral image.
- The built-in memory may include, for example, at least one of a volatile memory (e.g., dynamic RAM (DRAM), static RAM (SRAM), synchronous dynamic RAM (SDRAM), etc.), a non-volatile memory (e.g., One Time Programmable ROM (OTPROM), Programmable ROM (PROM), Erasable and Programmable ROM (EPROM), Electrically Erasable and Programmable ROM (EEPROM), mask ROM, flash ROM, etc.), a hard disk drive (HDD), or a solid state drive (SSD).
- The external memory may include, for example, at least one of CF (Compact Flash), SD (Secure Digital), Micro-SD (Micro Secure Digital), Mini-SD (Mini Secure Digital), xD (eXtreme Digital), and Memory Stick.
- The processor 330 may include at least one of a RAM, a ROM, a CPU, a GPU, and a bus, and the RAM, ROM, CPU, GPU, and the like may be connected to each other via the bus.
- The processor 330 may include an AI-dedicated processor for identifying, as a selection region, an area of the oral image overlapping the area defined by a brush.
- the artificial intelligence-only processor may be controlled to process input data according to a predefined operation rule or an artificial intelligence model stored in a memory, or may be designed as a hardware structure specialized for processing a specific artificial intelligence model.
- A predefined operation rule or artificial intelligence model can be created through learning. Being created through learning means that a basic artificial intelligence model is trained with a plurality of pieces of training data by a learning algorithm, so that a predefined operation rule or artificial intelligence model set to perform a desired characteristic (or purpose) is created.
- Such learning may be performed in the device itself on which artificial intelligence according to the present disclosure is performed, or may be performed through a separate server and/or system.
- the communication unit 340 may communicate with at least one external electronic device through a wired or wireless communication network. Specifically, the communication unit 340 may communicate with the scanner 100 under the control of the processor 330 .
- the communication unit 340 may include one or more components that enable communication between the oral image processing apparatus 300 and a plurality of devices or servers located in its vicinity.
- the communication unit 340 may include one or more components that enable communication between the oral image processing apparatus 300 and the server. Also, the communication unit 340 may include a short-range communication unit 341 .
- the communication unit 340 may further include a long-distance communication module for performing communication with a server for supporting long-distance communication according to a telecommunication standard.
- the communication unit 340 may include a long-distance communication module for performing communication through a network for Internet communication.
- the communication unit 340 may include a long-distance communication module for performing communication through a communication network conforming to a communication standard such as 3G, 4G, and/or 5G.
- the communication unit 340 may include at least one port for connecting to an external electronic device by a wired cable in order to communicate with an external electronic device (eg, an intraoral scanner, etc.) by wire. Accordingly, the communication unit 340 may communicate with an external electronic device connected by wire through at least one port.
- the communication unit 340 may receive data obtained from the scanner 100, for example, raw data obtained through an intraoral scan under the control of the processor 330 .
- the user input unit 350 may receive various commands from the user, and the user input unit 350 may mean a means for the user to input data for controlling the oral image processing apparatus 300 .
- the user input unit 350 may include a keypad, a dome switch, a touch pad (a contact capacitive type, a pressure resistive film type, an infrared sensing type, a surface ultrasonic conduction type, an integral tension measurement type, a piezo effect type, etc.), a jog wheel, or a jog switch, but is not limited thereto.
- the keys may include various types of keys, such as mechanical buttons and wheels, formed in various areas such as the front, side, and rear of the exterior of the body of the oral image processing apparatus 300.
- the touch panel may detect a user's touch input and output a touch event value corresponding to the sensed touch signal.
- when the touch panel is combined with a display panel to form a touch screen (not shown), the touch screen may be implemented with various types of touch sensors, such as capacitive, pressure-sensitive, and piezoelectric types.
- the user input unit 350 may include, but is not limited to, user input devices such as a touch panel for detecting a user's touch, a button for receiving a user's push operation, and a mouse or a keyboard for designating or selecting a point on the user interface screen.
- the user interface 120 may include a voice recognition device for voice recognition.
- the voice recognition device may be a microphone, and the voice recognition device may receive a user's voice command or voice request. Accordingly, the processor may control an operation corresponding to the voice command or the voice request to be performed.
- FIG. 5 is a flow diagram illustrating an embodiment of a method for identifying a selection region within an oral cavity image.
- the oral image processing apparatus may acquire an oral cavity image.
- the scanner 100 may transmit raw data acquired through the oral cavity scan to the oral image processing device 300, and the oral image processing device 300 may process the received raw data to generate or acquire an oral image.
- the generated oral image is a three-dimensional oral image, and may be generated by modeling the internal structure of the oral cavity based on raw data in three dimensions.
- the scanner 100 may obtain raw data through an oral scan and process the obtained raw data to generate an image corresponding to the oral cavity as an object.
- the scanner 100 may transmit the raw data and an image generated based on the raw data together to the oral image processing apparatus 300.
- the oral cavity image processing apparatus 300 may acquire the oral cavity image from the scanner 100, but is not limited thereto.
- the oral image processing device 300 may store the generated oral image in its memory 320; thereafter, when the oral image processing device 300 receives a command from the user for determining a region in the oral image, the oral image stored in the memory 320 may also be used.
- the oral image of the present application may refer to an oral image of the user's oral cavity generated from a scan image of the user's oral cavity taken in real time.
- in step S520, the oral image processing apparatus according to an embodiment of the present disclosure may determine a reference point of the oral cavity image.
- the oral image processing apparatus 300 may receive a user's input through the user input unit 350, and may determine, based on the user's input, a reference point of the oral cavity image output on the user interface screen.
- the user may select a reference point of the oral cavity image through the user input unit 350 .
- the user may use the user input unit 350 including a mouse or keyboard to select a point on the user interface screen output on the display unit 310 .
- the oral image processing apparatus 300 may determine a point on the user interface screen as a reference point of the oral cavity image.
- the oral image processing apparatus 300 may determine the plurality of points on the user interface screen as a plurality of reference points of the oral cavity image.
- a point where the vector identified based on the mouse pointer on the user interface screen meets the oral image on the user interface screen may be a reference point, and the reference point may be composed of a plurality of points.
- the identified vector may mean a vector in a normal direction of the user interface screen among vectors passing through the mouse pointer on the user interface screen.
- the reference point herein may refer to a point where the determined vector and the oral image on the user interface screen first meet.
- the reference point herein may mean at least one point that is directly displayed on the user interface screen among points where the determined vector and the oral image on the user interface screen meet, but is not limited thereto. Reference point herein may refer to both one reference point and a plurality of points.
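The mouse-ray picking described above can be sketched in code. The following is a minimal, hypothetical illustration (not taken from the disclosure): the screen normal is assumed to be the z axis, and among the model points lying under the pointer within a small tolerance, the one nearest the viewer (largest z) is returned as the reference point; `None` is returned when the ray misses the model.

```python
import numpy as np

def pick_reference_point(points, pointer_xy, tol=0.5):
    """Return the mesh point the mouse ray first meets.

    The ray runs through the pointer along the screen normal (-z here),
    so we keep points whose screen (x, y) lies within `tol` of the
    pointer and return the one nearest the viewer (largest z).
    """
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points[:, :2] - np.asarray(pointer_xy, dtype=float), axis=1)
    hit = np.where(d <= tol)[0]
    if hit.size == 0:
        return None  # the ray does not meet the oral image
    return points[hit[np.argmax(points[hit, 2])]]

# two stacked points under the pointer: the nearer one (z = 5) is the reference
cloud = np.array([[0.0, 0.0, 5.0], [0.0, 0.0, 1.0], [3.0, 0.0, 2.0]])
ref = pick_reference_point(cloud, (0.1, 0.0))
```

A production picker would cast the ray against mesh triangles rather than raw points; the depth comparison above is only the "first meet" idea in miniature.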
- the oral image processing apparatus may determine a brush based on a reference point and at least one piece of distance information.
- a brush according to an embodiment of the present disclosure may include both a two-dimensional brush and a three-dimensional brush.
- a 2D brush may mean a circle, an ellipse, or a rectangle.
- the 3D brush may be a sphere or an ellipsoid, but is not limited thereto.
- the oral image processing apparatus 300 may determine a brush based on a reference point and at least one piece of distance information.
- the at least one piece of distance information may mean distance information between the reference point and another point of the oral image. Also, for example, the at least one piece of distance information may mean distance information between reference points, but is not limited thereto.
- the preset distance information may be determined in various ways.
- the preset distance information may be determined according to a value set as a default in the system or a user input for changing the size of the brush.
- the oral image processing apparatus 300 may determine a three-dimensional brush sphere by identifying a reference point determined by the user as the center of the sphere and identifying preset distance information as a radius of the sphere.
- the oral image processing apparatus 300 may identify the reference point determined by the user as the center of an ellipsoid and, according to a plurality of pieces of preset distance information, determine the x-axis, y-axis, and z-axis semi-major axes, thereby determining an ellipsoid as a three-dimensional brush.
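As a rough sketch of the sphere and ellipsoid brushes just described (all names and values are illustrative assumptions, not the disclosure's implementation), a membership test can be built directly from the reference point (center) and the preset distance information (radius or semi-axes):

```python
import numpy as np

def sphere_brush(center, radius):
    """3D brush: a sphere centred on the reference point with a preset radius."""
    c = np.asarray(center, dtype=float)
    return lambda p: float(np.linalg.norm(np.asarray(p, dtype=float) - c)) <= radius

def ellipsoid_brush(center, semi_axes):
    """3D brush: an ellipsoid whose x/y/z semi-axes come from preset distance info."""
    c = np.asarray(center, dtype=float)
    s = np.asarray(semi_axes, dtype=float)
    return lambda p: float(np.sum(((np.asarray(p, dtype=float) - c) / s) ** 2)) <= 1.0

inside_sphere = sphere_brush((0, 0, 0), 2.0)
inside_ellipsoid = ellipsoid_brush((0, 0, 0), (3.0, 1.0, 1.0))
```

Resizing the brush per a user input then amounts to rebuilding the test with new distance values.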
- the oral cavity image processing apparatus may identify a region overlapping the region determined by the brush in the oral cavity image as the selection region.
- the area defined by the brush may mean an area obtained by extending the two-dimensional brush in a normal direction of the display unit. Also, when the brush is a three-dimensional brush, the area defined by the brush may mean a surface area of the three-dimensional brush.
- based on location information on a plurality of points of the area defined by the brush and location information on a plurality of points of the oral image, the oral cavity image processing apparatus 300 may identify at least one boundary at which the area defined by the brush and the oral image overlap. Also, the oral cavity image processing apparatus 300 according to an embodiment of the present disclosure may identify a selection region of the oral cavity image based on the identified at least one boundary. The operation of identifying a selection region in a specific oral image will be described in detail with reference to FIGS. 6 to 9.
- in step S550, the oral image processing apparatus according to an embodiment of the present disclosure may display the selection area in the oral cavity image.
- the oral cavity image processing apparatus 300 may display an oral cavity image in which the identified region is displayed through the display unit 310 .
- the oral image processing apparatus 300 may display the selection region by emphasizing at least one boundary of the selection region so that the user can clearly recognize and distinguish the selection region of the oral cavity image.
- a selection region herein may be divided into at least one boundary and a region of the oral image located within the at least one boundary.
- the at least one boundary may mean an area including the vicinity of the boundary located at the outer edge of the selection area, and the area of the oral cavity image located within the at least one boundary may mean the part of the selection area in the oral cavity image excluding the at least one boundary.
- the oral image processing apparatus 300 may display at least one boundary of the selection area in a preset color, and may display the remaining region by overlapping or blending the color of the oral cavity image with the preset color.
- the oral image processing apparatus 300 may also output a region located outside the at least one boundary in a preset color, thereby displaying and providing the selection region to the user more appropriately.
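The boundary-plus-interior display described above amounts to simple alpha blending of a preset highlight colour over the oral-image colour. A minimal sketch (the colour values are hypothetical, not taken from the disclosure):

```python
def blend(base_rgb, highlight_rgb, alpha=0.5):
    """Alpha-blend a preset highlight colour over the oral-image colour."""
    return tuple(round((1 - alpha) * b + alpha * h)
                 for b, h in zip(base_rgb, highlight_rgb))

tooth = (235, 228, 215)      # hypothetical tooth colour from the point cloud
highlight = (255, 0, 0)      # preset selection colour
border = highlight                     # boundary: drawn in the preset colour itself
inner = blend(tooth, highlight, 0.3)   # interior: translucent mix of both
```

With `alpha` near 0 the underlying tooth colour dominates, so the interior of the selection stays legible while the boundary remains fully saturated.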
- FIG. 6 is a flowchart for explaining an operation of identifying an area overlapping an area determined by a brush in an oral image as a selection area.
- the oral image processing apparatus may obtain position information on a plurality of points in an area determined by a brush and position information on a plurality of points in the oral image.
- the location information on the plurality of points of the oral cavity image may include position information of points located on the surfaces of at least one object of the oral cavity image and/or connection relationship information between the respective points.
- location information on a plurality of points in an area defined by the brush may mean location information of points located on the surface of the 3D brush.
- the oral image processing apparatus 300 may identify a three-dimensional figure in which the brush is projected in a normal direction on the user interface screen, and location information on a plurality of points in an area determined by the brush may mean position information of points located on the identified 3D figure surface.
- the location information herein may be stored in the memory 320 of the oral image processing apparatus 300, and the oral image processing apparatus 300 may use the location information stored in the memory 320 to identify the selection area.
- by obtaining position information for a plurality of points in the region defined by the brush and position information for a plurality of points in the oral image, the oral image processing apparatus 300 may accurately identify the area where the region defined by the brush and at least one object of the oral image overlap.
- in step S620, the oral image processing apparatus according to an embodiment of the present disclosure may identify at least one boundary where the region defined by the brush and the oral cavity image overlap.
- the oral cavity image processing apparatus 300 may identify at least one point at which position information on a plurality of points in the area determined by the brush and position information on a plurality of points in the oral image match.
- here, the position information for the plurality of points in the area determined by the brush and the position information for the plurality of points in the oral image coinciding may mean that the coordinate values of the points on each axis of the three-dimensional model of the oral cavity are identical.
- each axis of the three-dimensional model of the oral cavity may mean an x-axis, a y-axis, and a z-axis.
- alternatively, the location information for a plurality of points in the area defined by the brush and the location information for a plurality of points in the oral image coinciding may mean that the distance between a point in the area defined by the brush and a point in the oral image is less than or equal to a preset threshold, but is not limited thereto.
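The distance-threshold notion of "coinciding" points can be illustrated with a brute-force sketch (a KD-tree would scale better; the names and the `eps` value are assumptions for illustration, not the disclosure's method):

```python
import numpy as np

def matching_points(brush_pts, mesh_pts, eps=0.05):
    """Indices of mesh points lying within the preset threshold eps of
    some point of the brush-defined area, i.e. boundary candidates."""
    brush_pts = np.asarray(brush_pts, dtype=float)
    mesh_pts = np.asarray(mesh_pts, dtype=float)
    # all pairwise distances: shape (n_mesh, n_brush)
    d = np.linalg.norm(mesh_pts[:, None, :] - brush_pts[None, :, :], axis=2)
    return np.where(d.min(axis=1) <= eps)[0]

brush_surface = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
mesh = [[0.0, 0.0, 0.01], [1.0, 0.02, 0.0], [5.0, 5.0, 5.0]]
hits = matching_points(brush_surface, mesh)
```

The threshold absorbs the fact that scanned surface points rarely coincide with the brush surface exactly; exact coordinate equality is the special case `eps = 0`.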
- the oral image processing apparatus may identify a selection region of the oral cavity image based on at least one boundary.
- the oral cavity image processing apparatus 300 may identify the selection region of the oral cavity image according to the at least one boundary. Specifically, the oral cavity image processing apparatus 300 may identify a plurality of points surrounded by the at least one boundary among the plurality of points of the oral cavity image. Accordingly, the oral image processing apparatus 300 may identify, as the selection region of the oral cavity image, a region including the plurality of points identified as being surrounded by the at least one boundary together with the at least one boundary identified in step S620.
- FIG. 7 is a reference diagram for explaining an operation of identifying a selection region in an oral cavity image when the brush is a two-dimensional brush.
- FIG. 7 an embodiment in which a selection area 730 in the oral cavity image 700 is identified according to the two-dimensional brush 720 of the oral cavity image 700 is illustrated.
- the oral image 700 displayed on the display unit 310 may include objects such as teeth (including a tooth 701 and a tooth 702) and a gum 703, and the oral image processing apparatus 300 may store location information and color information on at least one object included in the oral image 700.
- in an area of the oral image 700 where no object is present, point data including location information or color information may not exist. Accordingly, the selection region in the oral cavity image may be characterized in that regions in which point data does not exist are excluded.
- the two-dimensional brush 720 may be determined based on the reference point 710 and at least one piece of distance information. Specifically, the oral image processing apparatus 300 may project the two-dimensional brush 720, determined based on the reference point 710 located on the gum 703 and the at least one piece of distance information, in the normal direction 715, thereby identifying the three-dimensional area generated by the extension of the two-dimensional brush.
- the three-dimensional region generated by extending the two-dimensional brush may be referred to as a region defined by the two-dimensional brush.
- the two-dimensional brush 760 may be determined based on the reference point 750 positioned on the object 780 of the oral cavity image and distance information stored in the memory 320 .
- the two-dimensional brush 760 may be a circular brush.
- the 3D area 770 generated by extending the 2D brush may have a shape in which the 2D brush 760 is extended in the direction of a normal vector 765 .
- the selection area within the object 780 may refer to an area where the 3D area 770 generated by extending the 2D brush and the object 780 overlap.
- the oral cavity image processing apparatus 300 may identify the selection region 730 based on position information on a plurality of points of the region determined by the two-dimensional brush 720 and position information on a plurality of points of the oral cavity image 700 . Also, the selection area 730 may not include the area 740 without point data.
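The 2D-brush extrusion described for FIG. 7 can be sketched as a depth-independent test: a point is selected when its projection onto the screen plane falls inside the circle, whatever its depth along the normal. (All names here are illustrative; the screen normal is assumed to be the z axis.)

```python
import numpy as np

def extruded_circle_selection(points, center_xy, radius):
    """A 2D circular brush extruded along the screen normal (-z) selects
    every mesh point whose projected (x, y) falls inside the circle,
    regardless of its depth."""
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points[:, :2] - np.asarray(center_xy, dtype=float), axis=1)
    return d <= radius

# the second point is deep below the surface but still under the circle
cloud = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, -3.0], [2.0, 0.0, 0.0]])
sel = extruded_circle_selection(cloud, (0.0, 0.0), 1.0)
```

This depth-blindness is exactly why, in the FIG. 8 example below, an extruded 2D brush can sweep up gum points behind the intended tooth.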
- FIG. 8 is a reference diagram for explaining an operation of identifying a selection region in an oral cavity image when the brush is a two-dimensional brush.
- FIG. 8 an embodiment of identifying a selection region in the oral cavity image 800 according to the two-dimensional brush 830 in the oral cavity image 800 is illustrated.
- the oral image processing apparatus 300 may identify the two-dimensional brush 830 based on a reference point 810 of the oral cavity image 800 and a distance 820 according to preset distance information.
- the oral image processing apparatus 300 may set a region where the 3D model extended with the 2D brush 830 and the oral cavity image 800 overlap as a selection region within the oral cavity image 800 .
- the selection area in the oral cavity image 800 may include an area 850 that is a partial area of the tooth 851 and an area 860 that is a partial area of the tooth 861.
- the selection region in the oral cavity image 800 may include a region 840 that is a partial region of the gum.
- since the reference point 810 is located on the tooth 861, it may be appropriate for the area 850 and the area 860 to be included in the selection area within the oral image 800, considering that the area selected by the user may be limited to the tooth. On the other hand, considering that the reference point 810 selected by the user through the user input unit 350 is located on the tooth 861, so that the area the user intends to select may be limited to the tooth, it may be more appropriate not to include the area 840, which is a partial area of the gum, in the selection area within the oral image 800.
- the oral image processing apparatus 300 may change the type of brush to a three-dimensional brush.
- the oral image processing apparatus 300 may identify a selection region in the oral cavity image more suitable for the user and display it on the display unit 310 .
- the operation of changing the type of the brush to the 3D brush by the oral image processing apparatus 300 may include an operation of suggesting to change the type of the brush to the 3D brush.
- the changed region in the oral cavity image 800 may not include the region 840, which is a partial region of the gums.
- the oral cavity image processing apparatus 300 may provide the user with a changed selection area within the oral cavity image 800 .
- FIG. 9 is a reference diagram for explaining an operation of identifying a selection region in an oral image when the brush is a three-dimensional brush.
- FIG. 9 an embodiment of identifying a selection region 930 in the oral cavity image 900 according to the three-dimensional brush 920 of the oral cavity image 900 is illustrated.
- the oral image 900 displayed on the display unit 310 may include objects such as teeth (including a tooth 911 and a tooth 912) and gums.
- the oral cavity image processing apparatus 300 may determine a reference point of the oral cavity image 900 according to a user's input. Referring to FIG. 9 , the oral cavity image processing apparatus 300 may determine a reference point 910 of the oral cavity image 900 according to a user input. Specifically, the reference point 910 may correspond to a point located on the surface of the tooth 911 among the objects.
- the oral cavity image processing apparatus 300 may determine the 3D brush 920 based on the reference point 910 of the oral cavity image 900 and at least one piece of distance information. Referring to FIG. 9 , the oral image processing apparatus 300 may determine a radius of the 3D brush 920 based on at least one piece of distance information. Accordingly, the oral image processing apparatus 300 may identify the 3D brush 920 by setting the reference point 910 as the center of the sphere and determining the radius of the 3D brush 920 based on at least one piece of distance information.
- based on position information on a plurality of points located in the area determined by the three-dimensional brush 920 and position information on a plurality of points of the oral image 900, the oral image processing apparatus 300 may identify the selection area 930 within the oral image.
- the oral image processing apparatus 300 may identify at least one boundary where the region defined by the three-dimensional brush 920 and the oral image 900 overlap, and may identify the selection region 930 based on the identified at least one boundary.
- the area defined by the brush 920 may mean a surface area of the brush 920 .
- the selection area 930 may correspond to an area in which up to the curved portions of the surfaces of the teeth 911 and 912 are expressed in detail.
- the oral image processing apparatus 300 may display at least one boundary of the selection area 930 in a preset color, and may display the area of the selection area 930 other than the at least one boundary by overlapping or blending the colors of the tooth 911 and the tooth 912 with the preset color. Accordingly, the oral image processing apparatus 300 may display the selection area 930 through the display unit 310 so as to highlight the area of the oral cavity image 900 selected by the user.
- FIG. 10 is a reference diagram for explaining an operation of identifying at least one boundary of a selection region of an oral cavity image.
- FIG. 10 an embodiment of identifying at least one boundary 1020 of a selection area 1030 from an oral image 1000 is illustrated.
- the oral image processing apparatus 300 may identify a reference point 1010 from among a plurality of points located on the surface of a tooth 1070 that is an object.
- the oral image processing apparatus 300 may determine a 3D brush according to a radius determined based on the reference point 1010 and at least one piece of distance information. Accordingly, the oral image processing apparatus 300 may identify at least one boundary 1020 in which the determined 3D brush and the tooth 1070 of the oral image 1000 overlap, and may determine the selection area 1030 based on the at least one boundary 1020.
- the at least one boundary 1020 may include points separated from the reference point 1010 by a preset radius.
- a distance between a point 1040 and a reference point 1010 may coincide with a preset radius r.
- a plurality of points whose distance from the reference point 1010 coincides with the preset r may be included in the at least one boundary 1020.
- a plurality of points located outside the at least one boundary 1020 may mean points having a distance from the reference point 1010 greater than a preset r.
- the distance between the reference point 1010 and the point 1050 of the tooth 1080 located next to the tooth 1070 may be greater than a preset r.
- the point 1050 may be a point located outside the selection area 1030 .
- a plurality of points located within at least one boundary 1020 may mean points having a distance from the reference point 1010 smaller than a preset r.
- a distance between a point 1060 and a reference point 1010 may be smaller than a preset r.
- the point 1060 may be a point located within the selection area 1030 .
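The three cases above (on, outside, and inside the at least one boundary 1020) reduce to comparing each point's distance from the reference point with the preset radius r. A minimal sketch under assumed names:

```python
import numpy as np

def classify(points, ref, r, eps=1e-6):
    """Label each mesh point relative to a spherical brush of radius r
    centred on the reference point: 'boundary' at distance r, 'inside'
    below it, 'outside' above it."""
    d = np.linalg.norm(np.asarray(points, dtype=float)
                       - np.asarray(ref, dtype=float), axis=1)
    labels = np.where(np.abs(d - r) <= eps, "boundary",
                      np.where(d < r, "inside", "outside"))
    return labels.tolist()
```

With r = 2 and the reference point at the origin, a point at distance 1 is inside the selection, a point at distance 2 lies on the boundary, and a point at distance 3 (like the neighbouring tooth's point 1050 in FIG. 10) is outside.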
- the oral image processing apparatus 300 may output, in a preset color, the plurality of points separated from the reference point 1010 by the preset radius r among the plurality of points located on the surface of the oral cavity image 1000. Specifically, referring to FIG. 10, for the plurality of points located closer to the reference point 1010 than the preset radius r among the plurality of points located on the surface of the oral image 1000, the oral image processing apparatus 300 may blend the color of the tooth 1070 with the preset color and output them translucently.
- FIG. 11 is a reference diagram for explaining an embodiment of identifying at least one object of an oral cavity image based on location information of a reference point.
- the oral cavity image processing apparatus 300 may identify at least one object of the oral cavity image 1100 according to color information corresponding to the position information of the reference point 1110. For example, a plurality of objects included in the oral image 1100, such as teeth and gums, may have different color information according to the type of the object, and such color information, together with location information on a plurality of points located in the oral image 1100, may be stored in the memory 320 in the form of a point cloud.
- the oral image processing apparatus 300 may identify that the object on which the reference point 1110 is located is a tooth 1105 based on color information of the reference point 1110 . Also, the oral image processing apparatus 300 may identify at least one boundary 1130 of the object on which the reference point 1110 is located, based on the color information of the tooth 1105 and the color information of the other object. However, the present invention is not limited thereto, and the oral image processing apparatus 300 may identify curvature information between points of the oral image 1100 according to the position information, and identify that the object in which the reference point 1110 is located is the tooth 1105.
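A nearest-colour lookup is one simple way to realise the colour-based identification described above. Everything here (the palette colours and labels) is a hypothetical illustration, not the disclosure's model:

```python
def classify_object(rgb, palette):
    """Return the object label whose representative colour is nearest
    (in squared RGB distance) to the colour stored at the reference point."""
    return min(palette,
               key=lambda name: sum((a - b) ** 2 for a, b in zip(rgb, palette[name])))

# hypothetical representative colours for the two object types
palette = {"tooth": (235, 228, 215), "gingiva": (200, 100, 110)}
obj = classify_object((230, 225, 210), palette)  # colour sampled at the reference point
```

A real system could instead combine colour with the curvature information mentioned above, since tooth and gum surfaces also differ geometrically.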
- FIG. 12 is a reference diagram, following FIG. 11, for explaining an embodiment in which the selection region is further identified based on the region of the at least one identified object, and an embodiment in which the selection region is displayed in different colors according to the identified at least one object.
- the oral cavity image processing apparatus 300 may display a selection region in the oral cavity image according to various embodiments, based on the at least one object identified in FIG. 11 .
- the oral cavity image processing apparatus 300 may further identify the selection area 1210 of the oral cavity image 1200 based on at least one identified object of the oral cavity image 1200. Specifically, the oral image processing apparatus 300 may identify the selection region 1210 of the oral image 1200 according to the position information on the plurality of points of the brush 1230, the position information on the plurality of points of the oral image 1200, and the identified at least one object of the oral image 1200.
- the selection area 1210 is identified in consideration of at least one identified object, and a partial boundary of the selection area 1210 may correspond to a boundary between the tooth 1220, which is the object, and the gum.
- the oral image processing apparatus 300 may provide a more suitable selection area to the user by displaying the identified selection area 1210 on the display unit 310 in consideration of the boundary between the tooth 1220 and the gum.
- the oral cavity image processing apparatus 300 may display the selection region in different colors according to at least one identified object of the oral cavity image 1250. Specifically, the oral cavity image processing apparatus 300 may identify the selection region based on position information on a plurality of points in the region determined by the brush 1260 and position information on a plurality of points in the oral cavity image 1250. Also, the oral image processing apparatus 300 may display the selection area in different colors according to at least one object included in the selection area.
- for example, as shown in FIG. 12, the oral image processing apparatus 300 may display the area 1270 corresponding to the tooth 1290 in the selection area in a first color, and may display the area 1280 corresponding to the gum 1295 in a second color. That is, the oral image processing apparatus 300 may display the area 1270 and the area 1280 in different colors based on the boundary between the tooth 1290 and the gum 1295.
- the oral image processing apparatus 300 may provide a more suitable selection region to the user by displaying the region in the oral cavity image on the display unit 310 in consideration of the boundary between the tooth 1290 and the gum 1295 together.
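The per-object colouring just described can be sketched as a lookup from point to object label to display colour (the labels and colours below are assumed for illustration only):

```python
def paint_selection(selection, object_of, color_of):
    """Assign each selected point the display colour of the object it
    belongs to, so tooth and gum parts of one selection look different."""
    return {p: color_of[object_of[p]] for p in selection}

object_of = {0: "tooth", 1: "tooth", 2: "gum"}         # hypothetical point labels
color_of = {"tooth": (255, 0, 0), "gum": (0, 0, 255)}  # first / second colours
painted = paint_selection([0, 2], object_of, color_of)
```

Because the colour follows the object label rather than the brush, the displayed split lands exactly on the tooth/gum boundary regardless of where the brush overlapped.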
- FIG. 13 is a reference diagram for explaining an example of identifying at least one object of an oral cavity image based on location information of a reference point.
- the oral cavity image processing apparatus 300 may identify the at least one object of the oral cavity image 1300 according to color information corresponding to the position information for the reference point 1310.
- the oral image processing apparatus 300 may identify that the object on which the reference point 1310 is located is the gum 1320 based on the color information on the reference point 1310 .
- the oral image processing apparatus 300 may identify at least one boundary of the object on which the reference point 1310 is located, based on the color information of the gum 1320 and the color information on other objects (eg, teeth 1330 and 1340). have.
- the at least one boundary may mean a boundary between the teeth 1330 , 1340 , and the gum 1320 .
- FIG. 14 is a reference diagram, following FIG. 13, for explaining an embodiment in which the selection region is further identified based on the region of the at least one identified object, and an embodiment in which the selection region is displayed in different colors according to the identified at least one object.
- the oral cavity image processing apparatus 300 may display a selection region in the oral cavity image according to various embodiments, based on the at least one object identified in FIG. 13 .
- the oral image processing apparatus 300 may further identify the selection region 1420 of the oral cavity image 1400 based on at least one identified object of the oral cavity image 1400. Specifically, the oral image processing device 300 may identify the selection region 1420 of the oral image 1400 according to the position information for a plurality of points in the area determined by the brush 1410, the position information for the plurality of points of the oral image 1400, and the identified at least one object of the oral image 1400. Referring to FIG. 14, a partial boundary of the selection area 1420 may correspond to a boundary between the teeth (which may include a tooth 1430 and a tooth 1431) and a gum 1432 as objects. Also, the selection area 1420 may not include an area in which point data of the oral cavity image does not exist.
- the oral image processing apparatus 300 may display the identified selection area 1420 on the display unit 310 in consideration of the boundary between the teeth (which may include the teeth 1430 and 1431) and the gum 1432, thereby providing a selection area more suitable to the user's input. For example, the oral image processing apparatus 300 may display the selection area 1420 on the display unit 310 by emphasizing only the portion corresponding to the gum 1432.
- the oral image processing apparatus 300 may display the selection region 1470 in different colors according to at least one identified object of the oral cavity image 1450 .
- the oral cavity image processing apparatus 300 may identify the selection area 1470 based on location information on a plurality of points of the brush 1460 and the oral cavity image 1450 .
- the oral image processing apparatus 300 may display the selection area 1470 in different colors according to at least one object included in the selection area 1470. For example, as shown in FIG. 14, the oral image processing apparatus 300 may display the area 1481 corresponding to the tooth 1491 in the selection area 1470 in a first color, display the area 1482 corresponding to the tooth 1492 in the selection area 1470 in a second color, and display the area corresponding to the gum 1493 in the selection area 1470 in a third color.
- the area corresponding to the gum 1493 may correspond to the area other than the area 1481 and the area 1482 within the selection area 1470.
- the oral image processing apparatus 300 may display the area 1480 in different colors based on the boundary between the teeth (which may include the teeth 1491 and 1492) and the gum 1493. Accordingly, the oral image processing apparatus 300 may identify the tooth 1491, the tooth 1492, and the gum 1493, which are the at least one object within the selection area 1470, and may display the selection area 1470 in different colors according to the tooth 1491, the tooth 1492, and the gum 1493. By displaying the region within the oral image on the display unit 310 in consideration of the boundary between the teeth 1491 and 1492 and the gum 1493, the oral image processing apparatus 300 may provide a selection region better suited to the user.
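The per-object coloring described above amounts to a lookup from object label to display color, with everything outside the known tooth labels falling back to the gum color. A minimal sketch with a hypothetical palette (the labels and RGB values are illustrative, not taken from the patent):

```python
# Hypothetical palette mapping object labels to RGB display colors.
PALETTE = {"tooth_1491": (255, 99, 71), "tooth_1492": (100, 149, 237)}
GUM_COLOR = (152, 251, 152)

def color_selection(labels_in_selection):
    """Assign each selected point a color by its object label; any label
    without its own palette entry falls back to the gum color."""
    return [PALETTE.get(label, GUM_COLOR) for label in labels_in_selection]

colors = color_selection(["tooth_1491", "tooth_1492", "gum"])
```

Because the fallback covers every unlisted label, the gum region needs no explicit boundary of its own; it is simply the remainder of the selection, matching the description above.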
- FIG. 15 is a reference diagram for explaining an embodiment of displaying a selection region in an oral image when the selection region is identified based on a three-dimensional brush.
- referring to FIG. 15, an embodiment in which the oral image processing apparatus 300 displays a selection area in an oral image is illustrated.
- the oral cavity image processing apparatus 300 may identify a selection region according to a reference point of the oral image and location information on a plurality of points, and display the selection region in the oral image. For example, referring to FIG. 15, the oral image processing apparatus 300 may identify a reference point 1510 of the oral image 1500. When the brush is a sphere, the oral image processing apparatus 300 may identify the sphere 1520, which is a three-dimensional brush, based on the reference point 1510 and a preset distance. Since the oral image 1500 corresponds to the front of the teeth, the user may not be able to clearly identify whether the rear portions of the tooth 1521 and the tooth 1522 overlap the sphere 1520, which is the three-dimensional brush.
- the oral image 1550 may be an image in which it can be clearly identified whether the rear portions of the tooth 1521 and the tooth 1522 of the oral image 1500 overlap the sphere 1520, which is the three-dimensional brush.
- the oral cavity image processing apparatus 300 may identify the selection area 1570 according to the reference point of the oral cavity image 1550 and location information on a plurality of points, and display the selection area 1570 of the oral cavity image.
- a boundary of the selection area 1570 may include a point A 1530 and a point B 1540 .
- a point A 1530 located on the tooth 1521 of the oral cavity image 1550 and a point B 1540 located on the tooth 1522 may be points located at the top of the front portion of each tooth.
- the sub-region 1560, located at the rear portions of the tooth 1521 and the tooth 1522, and the boundary 1580 corresponding to the sub-region 1560 may belong to a region of the teeth that is not observed from the front. Therefore, through the oral image 1550, the oral image processing apparatus 300 can clearly indicate that the sub-region 1560 located on the rear portions of the teeth 1521 and 1522 is included in the selection region 1570 determined by the sphere 1520, which is the three-dimensional brush.
- accordingly, the user can identify the sub-region 1560, an area not clearly observed from the front of the teeth, as a part of the selection area 1570 in the oral image.
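The sphere-brush behavior of FIG. 15 can be illustrated with a plain Euclidean-distance test over the scan points: because the test is metric rather than view-dependent, points on the rear surface of a tooth are selected even when they are hidden in the front view. This is a sketch under an assumed point-cloud data layout, not the patent's implementation:

```python
import numpy as np

def sphere_brush(points, reference_point, preset_distance):
    """Return indices of scan points inside a spherical 3D brush.

    The membership test is a Euclidean-distance check against the
    reference point, so it is independent of the viewing direction.
    """
    d = np.linalg.norm(points - np.asarray(reference_point), axis=1)
    return np.flatnonzero(d <= preset_distance)

front = np.array([0.0, 0.0, 1.0])   # visible front-surface point
rear = np.array([0.0, 0.0, -1.0])   # rear-surface point, hidden in the front view
far = np.array([5.0, 0.0, 0.0])     # outside the brush
idx = sphere_brush(np.stack([front, rear, far]), [0.0, 0.0, 0.0], 1.5)
```

Both the front and the hidden rear point satisfy the distance test, which is why a rotated view like the oral image 1550 is needed to confirm visually what the brush has already selected.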
- FIG. 16 is a reference diagram for explaining an embodiment of displaying a selection area in which a hole is displayed, when there is a hole in an area identified based on a three-dimensional brush.
- the oral image processing apparatus 300 may identify a hole 1620 located in a selection area 1630 identified according to a reference point 1631 , and the oral image processing apparatus 300 may display the hole 1620 in the selection area 1630 .
- a hole in an oral image according to an embodiment of the present disclosure may be generated during a dental treatment process, and may mean an area in which point data does not exist due to a technical defect.
- when a hole is identified in the selected area, the oral image processing apparatus 300 may need to display the hole within that area and provide it to the user.
- the oral image processing apparatus 300 may identify a hole 1620 existing in the tooth 1610 corresponding to the location information of the reference point 1631 . Specifically, the oral image processing apparatus 300 may determine at least one area in which point data does not exist among a plurality of points located in the selection area 1630 as the hole 1620 of the selection area 1630 .
- the oral cavity image processing apparatus 300 may provide a selection area 1630 in the oral cavity image 1600 that is more suitable for the user by displaying the hole 1620 together with the selection area 1630.
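The hole determination described for FIG. 16 can be approximated by checking the footprint of the selection area for cells that contain no point data. This is a simplified sketch under assumed data structures (a 2D footprint rasterized onto a grid); the cell size, bounds format, and function name are hypothetical:

```python
import numpy as np

def find_holes(points_2d, bounds, cell=0.5):
    """Flag empty grid cells inside a selection footprint as holes.

    A 'hole' is modeled here as a cell of the selection area that
    contains no scan points -- a stand-in for the patent's 'area in
    which point data does not exist'.
    """
    (x0, x1), (y0, y1) = bounds
    nx = int(np.ceil((x1 - x0) / cell))
    ny = int(np.ceil((y1 - y0) / cell))
    occupied = np.zeros((nx, ny), dtype=bool)
    for x, y in points_2d:
        # Clamp boundary points into the last cell.
        i = min(int((x - x0) / cell), nx - 1)
        j = min(int((y - y0) / cell), ny - 1)
        occupied[i, j] = True
    return [(i, j) for i in range(nx) for j in range(ny) if not occupied[i, j]]

pts = [(0.1, 0.1), (0.6, 0.1), (0.1, 0.6)]  # one of four cells left empty
holes = find_holes(pts, ((0.0, 1.0), (0.0, 1.0)))
```

The empty cells can then be rendered together with the selection area, so the user sees both the selection and the missing-data regions inside it.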
- the method of processing an oral image according to an embodiment of the present disclosure may be implemented in the form of a program command that can be executed through various computer means and recorded in a computer-readable medium.
- an embodiment of the present disclosure may be a computer-readable storage medium in which one or more programs including at least one instruction for executing a method of processing an oral image are recorded.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- a 'non-transitory storage medium' is a tangible device and only means that the medium does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
- the 'non-transitory storage medium' may include a buffer in which data is temporarily stored.
- the method according to the various embodiments disclosed in this document may be provided by being included in a computer program product.
- Computer program products may be traded between sellers and buyers as commodities.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or distributed (e.g., downloaded or uploaded) online via an application store or directly between two user devices (e.g., smartphones).
- in the case of online distribution, at least a portion of the computer program product (e.g., a downloadable app) may be temporarily stored in, or temporarily created in, a machine-readable storage medium such as a memory of a manufacturer's server, a server of an application store, or a relay server.
Claims (17)
- A method of identifying a selection area in an oral image, the method comprising: obtaining the oral image; determining a reference point of the oral image; determining a brush based on the reference point and at least one piece of distance information; identifying, as the selection area, an area of the oral image that overlaps an area defined by the brush; and displaying the selection area in the oral image.
- The method of claim 1, wherein the identifying of the selection area comprises: identifying, based on position information on a plurality of points of the area defined by the brush and position information on a plurality of points of the oral image, at least one boundary at which the area defined by the brush and the oral image overlap; and identifying the selection area based on the at least one boundary.
- The method of claim 2, wherein the selection area includes the at least one boundary and an area of the oral image located within the identified at least one boundary.
- The method of claim 1, wherein the displaying of the selection area comprises displaying the selection area with at least one boundary of the selection area emphasized.
- The method of claim 4, wherein the displaying of the selection area with the at least one boundary emphasized comprises displaying the at least one boundary of the selection area in a preset color, and displaying the area of the oral image located within the at least one boundary by overlapping or blending the color of the oral image with the preset color.
- The method of claim 1, wherein the identifying of the selection area comprises: identifying at least one object of the oral image based on position information of the reference point; and identifying the selection area further based on an area of the identified at least one object.
- The method of claim 1, wherein the displaying of the selection area comprises: identifying at least one object of the oral image included in the selection area; and displaying the selection area in different colors according to the at least one object.
- The method of claim 1, wherein the at least one piece of distance information includes distance information between the reference point and another point of the oral image.
- An apparatus for processing an oral image, the apparatus comprising: a display unit configured to display the oral image; a memory storing one or more instructions; and at least one processor configured to execute the one or more instructions, wherein the at least one processor obtains the oral image, determines a reference point of the oral image, determines a brush based on the reference point and at least one piece of distance information, identifies, as a selection area, an area of the oral image that overlaps an area defined by the brush, and controls the display unit to display the selection area in the oral image.
- The apparatus of claim 9, wherein the processor, by executing the one or more instructions stored in the memory, identifies, based on position information on a plurality of points of the area defined by the brush and position information on a plurality of points of the oral image, at least one boundary at which the area defined by the brush and the oral image overlap, and identifies the selection area based on the at least one boundary.
- The apparatus of claim 10, wherein the selection area includes the at least one boundary and an area of the oral image located within the identified at least one boundary.
- The apparatus of claim 9, wherein the processor, by executing the one or more instructions stored in the memory, controls the display unit to display the selection area with at least one boundary of the selection area emphasized.
- The apparatus of claim 12, wherein the processor, by executing the one or more instructions stored in the memory, controls the display unit to display the at least one boundary of the selection area in a preset color, and to display the area of the oral image located within the at least one boundary by overlapping or blending the color of the oral image with the preset color.
- The apparatus of claim 9, wherein the processor, by executing the one or more instructions stored in the memory, identifies at least one object of the oral image based on position information of the reference point, and identifies the selection area further based on an area of the identified at least one object.
- The apparatus of claim 9, wherein the processor, by executing the one or more instructions stored in the memory, identifies at least one object of the oral image included in the selection area, and controls the display unit to display the selection area in different colors according to the at least one object.
- The apparatus of claim 9, wherein the at least one piece of distance information includes distance information between the reference point and another point of the oral image.
- A computer-readable recording medium on which a program for performing the method of any one of claims 1 to 8 on a computer is recorded.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280020214.8A CN116964639A (zh) | 2021-03-11 | 2022-03-11 | 识别口腔图像内的选择区域的方法及用于其的装置 |
US18/281,114 US20240161278A1 (en) | 2021-03-11 | 2022-03-11 | Method for identifying selection area in oral image and device therefor |
EP22767561.8A EP4307240A1 (en) | 2021-03-11 | 2022-03-11 | Method for identifying selection area in oral image and device therefor |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2021-0031871 | 2021-03-11 | ||
KR20210031871 | 2021-03-11 | ||
KR10-2022-0030315 | 2022-03-10 | ||
KR1020220030315A KR20220127764A (ko) | 2021-03-11 | 2022-03-10 | 구강 이미지 내의 선택 영역을 식별하는 방법 및 이를 위한 장치 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022191665A1 true WO2022191665A1 (ko) | 2022-09-15 |
Family
ID=83226936
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/003459 WO2022191665A1 (ko) | 2021-03-11 | 2022-03-11 | 구강 이미지 내의 선택 영역을 식별하는 방법 및 이를 위한 장치 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240161278A1 (ko) |
EP (1) | EP4307240A1 (ko) |
WO (1) | WO2022191665A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018064936A (ja) * | 2016-10-14 | 2018-04-26 | 株式会社モリタ製作所 | 医療用x線撮影装置の操作パネル表示装置、医療用x線撮影装置、医療用x線撮影装置の操作パネル表示装置における表示方法及び表示プログラム |
KR102004449B1 (ko) * | 2017-12-15 | 2019-07-26 | 주식회사 디디에스 | 가상 보철물 설계방법 |
JP6684221B2 (ja) * | 2014-02-04 | 2020-04-22 | シロナ・デンタル・システムズ・ゲゼルシャフト・ミット・ベシュレンクテル・ハフツング | デジタル3dモデルのコンピュータ支援編集のための方法 |
KR20200120036A (ko) * | 2019-04-11 | 2020-10-21 | 주식회사 디오 | 치아 오브젝트를 이용한 영상 정합 방법 및 장치 |
KR102176490B1 (ko) * | 2018-08-24 | 2020-11-10 | 이재우 | 구강 상태 진단, 예측 또는 관리를 위한 치아영상의 분할 및 처리 방법 |
Also Published As
Publication number | Publication date |
---|---|
US20240161278A1 (en) | 2024-05-16 |
EP4307240A1 (en) | 2024-01-17 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22767561; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 18281114; Country of ref document: US — Ref document number: 202280020214.8; Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 2022767561; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2022767561; Country of ref document: EP; Effective date: 20231011 |