WO2022216056A1 - Data processing apparatus, scanner, and operating method thereof - Google Patents
Data processing apparatus, scanner, and operating method thereof
- Publication number
- WO2022216056A1 (PCT/KR2022/004974)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- scanner
- image
- data processing
- display
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
- A61B1/0004—Operational features of endoscopes provided with input arrangements for the user for electronic operation
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
- A61B1/0005—Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- A61B1/00172—Optical arrangements with means for scanning
- A61B1/24—Instruments for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; instruments for opening or keeping open the mouth
- A61B5/00—Measuring for diagnostic purposes; identification of persons
- A61B5/0013—Medical image data (remote monitoring of patients using telemetry)
- A61B5/0062—Arrangements for scanning (diagnosis using light)
- A61B5/0068—Confocal scanning
- A61B5/0088—Diagnosis using light adapted for oral or dental tissue
- A61B5/748—Selection of a region of interest, e.g. using a graphics tablet
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/004—Means or methods for taking digitized impressions
- A61C9/0046—Data acquisition means or methods
- A61C9/0053—Optical means or methods, e.g. scanning the teeth by a laser or light beam
- G—PHYSICS
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T2210/41—Indexing scheme for image generation or computer graphics: Medical
- G06T2219/2021—Indexing scheme for editing of 3D models: Shape modification
- G16H—HEALTHCARE INFORMATICS
- G16H30/40—ICT specially adapted for processing medical images, e.g. editing
- G16H50/50—ICT specially adapted for simulation or modelling of medical disorders
Definitions
- the disclosed embodiment relates to a data processing apparatus, a scanner, and a method of operating the same. Specifically, the disclosed embodiment relates to a data processing apparatus, a scanner, and an operating method thereof for processing image data acquired by a scanner or enabling more convenient control when adjusting settings of a scanner.
- an oral scanner that is inserted into the patient's oral cavity to acquire an intraoral image is used.
- by scanning the patient's mouth using an oral scanner, maxillary scan data, mandibular scan data, and maxillary-mandibular occlusion scan data may be acquired.
- a 3D virtual model may be generated using the acquired scan data, and treatment or correction of teeth may be performed using the generated 3D virtual model.
- a user using the oral scanner may check the data acquired using the oral scanner on the display of the data processing device.
- the user can check the data displayed on the display and, if the data contains unnecessary or incorrect data, delete the data and then scan again.
- the data may be deleted using a separate external input device such as a mouse or keyboard capable of inputting data into the data processing device.
- An object of the disclosed embodiment is to provide a scanner, a data processing apparatus, and a method for providing a function of adjusting data acquired through the scanner using the scanner without using a separate external input device.
- An object of the disclosed embodiment is to provide a scanner that offers a function to change its own settings without using a separate external input device, and an operating method thereof.
- a data processing apparatus according to a disclosed embodiment includes a display, a communication interface, a memory storing one or more instructions, and a processor that executes the one or more instructions. By executing the one or more instructions, the processor controls the communication interface to receive, from a scanner, scan data of an object obtained by scanning the object; controls the display to display an image corresponding to three-dimensional data generated based on the scan data; while the image is displayed, controls the communication interface to receive from the scanner a control signal generated according to a user input sensed through one or more user interfaces of the scanner; performs a function of adjusting the generated three-dimensional data according to the control signal; and controls the display to display an image based on the adjusted three-dimensional data.
- by executing the one or more instructions, the processor may hide a preset number of the most recently input frames among the frames constituting the 3D data when the control signal corresponds to an undo command.
- the processor may restore a preset number of the most recently hidden frames when the control signal corresponds to a redo command.
- the 3D data may include at least one of voxel data generated based on the scan data and mesh data generated based on the scan data.
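The undo/redo behavior described above can be pictured as two stacks of scan frames. The sketch below is a minimal illustration, not the patent's implementation; the class and its names (`FrameHistory`, `undo_step`) are assumptions.

```python
class FrameHistory:
    """Toy model of the undo/redo frame handling described above.

    `visible` holds the frames currently composing the 3D data;
    `hidden` holds frames removed by undo and recoverable by redo.
    All names are assumptions, not taken from the patent.
    """

    def __init__(self, undo_step=1):
        self.visible = []           # frames currently shown
        self.hidden = []            # most recently hidden frames (redo stack)
        self.undo_step = undo_step  # preset number of frames per command

    def add_frame(self, frame):
        self.visible.append(frame)
        self.hidden.clear()         # new scan input invalidates the redo stack

    def undo(self):
        # Hide the preset number of most recently input frames.
        for _ in range(min(self.undo_step, len(self.visible))):
            self.hidden.append(self.visible.pop())

    def redo(self):
        # Restore the preset number of most recently hidden frames.
        for _ in range(min(self.undo_step, len(self.hidden))):
            self.visible.append(self.hidden.pop())
```

An undo then a redo returns the frame list to its prior state, as long as no new frame arrives in between.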
- by executing the one or more instructions, the processor may control the display to display, on a sub-screen, a first image corresponding to first 3D data generated based on scan data corresponding to a first area of the object, and to display, on a main screen, a second image corresponding to second 3D data generated based on scan data corresponding to a second area of the object spaced apart from the first area; may perform a function of adjusting the second 3D data corresponding to the second image displayed on the main screen according to a control signal received from the scanner; and may control the display to display the second image based on the adjusted 3D data.
- the processor may set a locking function to prevent a data adjustment function from being performed on the 3D data corresponding to the first image displayed on the sub-screen by executing the one or more instructions.
- by executing the one or more instructions, the processor may hide a preset number of the most recently input frames among the frames constituting the second 3D data when the control signal corresponds to an undo command, and when all frames of the second 3D data are hidden, may control the display to display the first image corresponding to the first 3D data of the sub-screen on the main screen.
- a scanner according to a disclosed embodiment includes an optical unit that performs a scan operation, one or more sensors, a communication interface, and a processor. The processor controls the communication interface to transmit, to a data processing device, scan data of an object obtained by scanning the object using the optical unit; obtains a user input sensed through the one or more sensors; controls the optical unit to stop the scan operation according to the user input; and performs a function corresponding to the user input, or controls the communication interface to transmit, to the data processing device, a control signal for controlling the data processing device to perform a function corresponding to the user input.
- each sensor of the one or more sensors may correspond to one or more functions among a plurality of functions.
- the plurality of functions may include at least one of a data adjustment function, a locking function, and a function for adjusting settings of the optical unit.
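A sensor-to-function mapping like the one just described (and illustrated as mapping information 300 in FIG. 3) could be modeled as a simple lookup table. The gestures and function names below are hypothetical examples for illustration, not taken from the patent.

```python
# Hypothetical mapping from (sensor, gesture) pairs to functions.
# Each sensor may correspond to one or more functions.
SENSOR_FUNCTION_MAP = {
    ("gyro", "shake_once"): "undo",
    ("gyro", "shake_twice"): "redo",
    ("touch", "long_press"): "lock",
    ("button", "double_press"): "adjust_optical_settings",
}


def resolve_function(sensor, gesture):
    """Return the function mapped to a sensed user input, or None."""
    return SENSOR_FUNCTION_MAP.get((sensor, gesture))
```

A table like this lets the same physical sensor trigger different functions depending on the detected gesture pattern.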
- a method of operating a data processing apparatus according to a disclosed embodiment includes: receiving, from a scanner, scan data of an object obtained by scanning the object; displaying an image corresponding to 3D data generated based on the scan data; while displaying the image, receiving from the scanner a control signal generated according to a user input sensed through one or more user interfaces of the scanner; performing a function of adjusting the generated 3D data according to the control signal; and displaying an image based on the adjusted 3D data.
- a method of operating a scanner including an optical unit and one or more sensors according to a disclosed embodiment includes: transmitting, to a data processing device, scan data of an object obtained by scanning the object using the optical unit; acquiring a user input sensed through the one or more sensors; controlling the optical unit to stop the scan operation according to the user input; and performing a function corresponding to the user input, or transmitting, to the data processing device, a control signal for controlling the data processing device to perform the function corresponding to the user input.
- the method of operating the data processing apparatus includes: receiving, from a scanner, scan data of an object obtained by performing a scan operation on the object; displaying an image corresponding to three-dimensional data generated based on the scan data; while displaying the image, receiving from the scanner a control signal generated according to a user input sensed through one or more user interfaces of the scanner; performing a function of adjusting the generated 3D data according to the control signal; and displaying an image based on the adjusted 3D data.
- according to the disclosed embodiments, when a user wants to adjust scan data while scanning an object with a scanner, the scan data may be adjusted using the scanner itself, without a separate user input device.
- likewise, when a scanner setting needs to be changed while an object is being scanned, the setting may be changed using the scanner itself, without a separate user input device.
- FIG. 1 is a view for explaining a digital oral model processing system according to the disclosed embodiment.
- FIG. 2 is an example of a detailed block diagram of a system including a scanner and a data processing apparatus according to an embodiment.
- FIG. 3 illustrates an example of user input-to-function mapping information 300 according to an embodiment.
- FIG. 4 is an example of a flowchart illustrating an operation process of a scanner and a data processing apparatus according to an exemplary embodiment.
- FIG. 5 is a reference diagram for explaining a process in which a data processing apparatus displays a three-dimensional virtual model generated using two-dimensional data obtained through a scanner, according to an exemplary embodiment.
- FIG. 7 is a detailed flowchart of an operation method performed by a data processing apparatus according to an embodiment.
- FIG. 8 is a reference diagram for explaining a method in which the data processing apparatus operates according to the undo command received from the scanner, according to an exemplary embodiment.
- FIG. 9 is a reference diagram for explaining a method in which a data processing apparatus operates according to a recovery (redo) command received from a scanner, according to an exemplary embodiment.
- FIG. 10 is a reference diagram for explaining a method in which a data processing apparatus operates according to a locking command received from a scanner according to an embodiment.
- FIG. 11 is a reference diagram for explaining an operation in which the data processing apparatus displays images corresponding to a plurality of regions of an object, according to an exemplary embodiment.
- FIG. 12 is a reference diagram for explaining an operation of displaying 3D data corresponding to a first region of an object on a display of a data processing apparatus according to an exemplary embodiment.
- FIG. 13 is a reference diagram for explaining an operation in which the data processing apparatus 100 automatically moves an image of the sub screen 1220 to the main screen 1210 when all images displayed on the main screen 1210 are hidden, according to an exemplary embodiment.
- in this specification, an image may include an image showing at least one tooth, or an image representing an oral cavity including at least one tooth (hereinafter, an 'oral image').
- an image may be a two-dimensional image of an object or a three-dimensional model or three-dimensional image representing the object three-dimensionally.
- an image may refer to data necessary to represent an object in two or three dimensions, for example, raw data obtained from at least one image sensor.
- raw data is data acquired to generate an oral image; when the inside of the oral cavity of a patient, which is an object, is scanned using an intraoral scanner, it may be data (e.g., two-dimensional data) obtained by at least one image sensor included in the intraoral scanner.
- Raw data obtained from an intraoral scanner may be referred to as scan data or two-dimensional image data.
- an 'object' may include teeth, gingiva, at least a portion of the oral cavity, and/or an artificial structure insertable into the oral cavity (e.g., an orthodontic device, an implant, an artificial tooth, an orthodontic aid inserted into the oral cavity, etc.).
- the orthodontic device may include at least one of a bracket, an attachment, an orthodontic screw, a lingual orthodontic device, and a removable orthodontic retainer.
- FIG. 1 is a view for explaining a digital oral model processing system according to the disclosed embodiment.
- the digital oral model processing system may include a scanner 200 and a data processing device 100 .
- the scanner 200 is a device for scanning an object.
- the scanner 200 may include, for example, an oral scanner that is inserted into the patient's mouth to scan the teeth, or a model scanner that scans an installed tooth model while moving around it.
- the scanner 200 may use a triangulation technique or a confocal method as a method of measuring three-dimensional information of an object.
- Optical triangulation is a technique for acquiring three-dimensional information of an object through triangulation, using the triangle formed by a light source, the object irradiated with light from the light source, and an image sensor that receives the light reflected from the object.
- the confocal method is a method of acquiring three-dimensional information of an object based on the position of the point at which the intensity of reflected light is at its maximum, according to the refractive index of the lens that passes the light irradiated onto the object.
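The core of the triangulation technique reduces to a similar-triangles relation. In a stereo (two-camera) arrangement, depth z follows from focal length f, baseline b, and disparity d as z = f·b/d. The sketch below assumes such a rectified stereo setup; it is an illustration of the general principle, not the patent's algorithm.

```python
def triangulate_depth(focal_length_px, baseline_mm, disparity_px):
    """Depth from the similar-triangles relation z = f * b / d.

    Assumes a rectified stereo pair: `focal_length_px` in pixels,
    `baseline_mm` between the two cameras, and `disparity_px` between
    matched image points. Returns depth in millimetres.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px
```

For example, an 800 px focal length, a 50 mm baseline, and a 40 px disparity give a depth of 1000 mm; larger disparities correspond to points nearer the cameras.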
- the oral scanner 301 may be a device for acquiring an image of an oral cavity including at least one tooth by being inserted into the oral cavity and scanning teeth in a non-contact manner.
- the oral scanner 301 may have a form that can be inserted into and withdrawn from the oral cavity, and may scan the inside of the patient's mouth using at least one image sensor (e.g., an optical camera).
- the oral scanner 301 may scan at least one of teeth, gingiva, and artificial structures insertable into the oral cavity (e.g., orthodontic devices including brackets and wires, implants, artificial teeth, orthodontic aids inserted into the oral cavity, etc.), and surface information about the object may be obtained as raw data.
- the image data acquired by the scanner 200 may be transmitted to the data processing device 100 connected through a wired or wireless communication network.
- the scanner may transmit scan data of the object obtained by scanning the object using the optical unit to the data processing apparatus.
- the scanner may obtain a user input sensed through one or more sensors provided in the scanner, and control the optical unit to stop a scanning operation according to the user input.
- the scanner may obtain a user input sensed through one or more sensors provided in the scanner, and perform a function corresponding to the user input according to the user input.
- the function corresponding to the user input may include a function of changing the setting of the optical unit provided in the scanner.
- the scanner may acquire a user input sensed through one or more sensors provided in the scanner and transmit a control signal for controlling to perform a function corresponding to the user input according to the user input to the data processing device.
- the function corresponding to the user input sent to the data processing device may include a function of adjusting the scan data.
- the scan data adjustment function may include a scan data undo processing function, a restore processing function, a locking function, and the like.
- the data processing device 100 may be any electronic device that is connected to the scanner 200 through a wired or wireless communication network, receives a two-dimensional image obtained by scanning an oral cavity from the scanner 200, and generates, processes, displays, and/or transmits an oral image based on the received two-dimensional image.
- based on the two-dimensional image data received from the scanner 200, the data processing device 100 may generate at least one of information obtained by processing the two-dimensional image data and an oral image generated by processing the two-dimensional image data, and may display the generated information and oral image through a display.
- the data processing device 100 may be a computing device such as a smart phone, a laptop computer, a desktop computer, a PDA, or a tablet PC, but is not limited thereto.
- the data processing device 100 may exist in the form of a server (or server device) for processing an oral image.
- the scanner 200 may transmit scan data (or may be referred to as raw data) acquired through the scan to the data processing device 100 as it is.
- the data processing apparatus 100 may generate three-dimensional data representing the oral cavity in three dimensions based on the received scan data.
- 'three-dimensional data' may be generated by three-dimensionally modeling the internal structure of the oral cavity based on the received scan data, and may therefore also be called a '3D oral model', a 'digital oral model', or a '3D oral image'.
- the data processing device 100 may analyze, process, display, and/or transmit 3D data generated based on the scan data to an external device.
- the scanner 200 may obtain raw data through a scan, process the obtained raw data to generate 3D data corresponding to an oral cavity, which is an object, and transmit it to the data processing apparatus 100 .
- the data processing device 100 may analyze, process, display, and/or transmit the received 3D data.
- the data processing device 100 is an electronic device capable of generating and displaying three-dimensional data representing an oral cavity including one or more teeth in three dimensions, which will be described in detail below.
- the data processing apparatus 100 may process the received scan data to generate 3D data.
- the data processing apparatus 100 may receive scan data of the object obtained by scanning the object from the scanner.
- the data processing apparatus 100 may display an image corresponding to 3D data generated based on scan data received from a scanner in real time.
- the data processing apparatus 100 receives a control signal generated according to a user input sensed through one or more sensors of the scanner from the scanner while displaying an image, and adjusts the 3D data generated according to the control signal It can perform a function and display an image using the adjusted three-dimensional data.
- the data processing apparatus 100 may hide a predetermined number of frames that are most recently input among frames constituting the 3D data.
- the data processing apparatus 100 may restore a predetermined number of the most recently hidden frames.
- the data processing apparatus 100 may confirm the 3D data in its current state, so that hiding processing or recovery processing is no longer permitted on it.
- the data processing apparatus 100 may divide the screen of the display into a plurality of screens, and display images corresponding to different regions of the object on each of the plurality of screens.
- the data processing apparatus 100 may control the display to display, on a sub-screen, a first image corresponding to first three-dimensional data generated based on scan data corresponding to a first area of the object, and to display, on a main screen, a second image corresponding to second three-dimensional data generated based on scan data corresponding to a second area of the object spaced apart from the first area; it may perform a function of adjusting the second 3D data corresponding to the second image displayed on the main screen according to a control signal received from the scanner, and display the second image using the adjusted 3D data.
- the data processing apparatus 100 may set a locking function to prevent the data adjustment function from being performed on the 3D data corresponding to the first image displayed on the sub-screen.
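The locking behavior can be pictured as a flag that makes one screen's data ignore adjustment commands. A minimal sketch follows; the class and method names are assumptions, not the patent's implementation.

```python
class ScreenData:
    """3D data shown on one screen; when locked, adjustments are ignored."""

    def __init__(self, frames):
        self.frames = list(frames)
        self.locked = False

    def apply(self, adjust):
        """Apply an adjustment function unless this data is locked."""
        if self.locked:
            return self.frames  # locking prevents the data adjustment function
        self.frames = adjust(self.frames)
        return self.frames
```

With the sub-screen's data locked, an undo command routed to the display would adjust only the main screen's data.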
- in response to a control signal corresponding to an undo command, the data processing apparatus 100 may hide a preset number of the most recently input frames among the frames constituting the second 3D data, and when all frames are hidden, the first image corresponding to the first 3D data of the sub-screen may be displayed on the main screen.
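The promotion of the sub-screen image once undo commands have emptied the main screen can be sketched as a small rule. Function and argument names are assumptions for illustration.

```python
def promote_if_empty(visible_frame_count, main_image, sub_image):
    """Return the (main, sub) screen images after an undo step.

    If every frame of the main screen's 3D data has been hidden and a
    sub-screen image exists, the sub-screen image moves to the main screen.
    """
    if visible_frame_count == 0 and sub_image is not None:
        return sub_image, None
    return main_image, sub_image
```

This mirrors the automatic move described for FIG. 13, where hiding all images on the main screen promotes the sub-screen image.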
- FIG. 2 is an example of a detailed block diagram of a system including a scanner and a data processing apparatus according to an embodiment.
- the system may include a scanner 200 , a data processing device 100 , and a communication network 50 enabling the scanner 200 and the data processing device 100 to communicate.
- the scanner 200 transmits two-dimensional scan data, obtained by scanning a patient's oral cavity or a dental cast model, to the data processing device 100 through the communication network 50; the data processing device 100 may process the received two-dimensional scan data to generate three-dimensional data, and may display an image corresponding to the generated 3D data on a display or transmit it to an external device.
- the three-dimensional data generated based on the scan data may be referred to as a three-dimensional virtual model or a three-dimensional oral model.
- the scanner 200 may include an optical unit 210 , a sensor 220 , a communication interface 230 , a memory 240 , and a processor 250 .
- the optical unit 210 may include a projector that projects light from a light source and one or more cameras.
- the optical unit 210 may include, for example, an L camera corresponding to a left field of view and an R camera corresponding to a right field of view, in order to restore a three-dimensional image according to the optical triangulation method.
- Each of the L camera and the R camera may acquire L image data corresponding to a left field of view and R image data corresponding to a right field of view.
- the raw data including the L image data and the R image data obtained from the optical unit 210 may be transmitted to the processor 250 for data processing.
- the user interface 220 may have various modalities for receiving a user input that controls settings of the scanner 200 or controls adjustment of scan data acquired by the scanner 200 .
- the user interface 220 may include a sensor unit 221 including one or more sensors capable of detecting a motion of the scanner 200 corresponding to a user input, a user input unit 222, or a microphone 223 for receiving a user's voice.
- the sensor unit 221 may include a gyro sensor that detects information about the movement of the scanner 200 .
- the gyro sensor may sense information about the motion of the scanner 200 along the x, y, and z axes. For example, if a user shakes the scanner 200 while performing a scan operation, the gyro sensor may detect the shaking as an input signal. Alternatively, the angle at which the scanner is held when the scanning operation is stopped may be detected as an input signal.
- the sensor unit 221 may detect the number of times the scanner 200 is shaken, a specific angle at which the scanner 200 is held, and a specific pattern of motion performed while holding the scanner 200 as different input signals.
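- the shake-based input described above can be sketched in software; the following Python snippet is a hypothetical illustration (the threshold, grouping window, and sensor API are assumptions, not the scanner's actual firmware interface):

```python
import time

class ShakeDetector:
    """Toy shake detector: counts angular-rate spikes from a gyro sensor.

    Hypothetical sketch; the threshold, grouping window, and method
    names are assumptions, not the scanner's actual firmware interface.
    A real implementation would also debounce consecutive samples.
    """

    def __init__(self, threshold_dps=200.0, window_s=1.0):
        self.threshold = threshold_dps   # deg/s spike that counts as one shake
        self.window = window_s           # shakes within this window are grouped
        self.events = []                 # timestamps of detected spikes

    def feed(self, gx, gy, gz, t=None):
        """Feed one gyro sample (deg/s per axis); return the shake count so far."""
        t = time.monotonic() if t is None else t
        magnitude = (gx * gx + gy * gy + gz * gz) ** 0.5
        # drop spikes that fell outside the grouping window
        self.events = [e for e in self.events if t - e <= self.window]
        if magnitude > self.threshold:
            self.events.append(t)
        return len(self.events)
```

- with the FIG. 3 mapping, a count of one would then trigger undo and a count of two redo.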
- the sensor unit 221 may include an acceleration sensor capable of detecting acceleration in a two-axis (x, y) direction or a three-axis (x, y, z) direction.
- for example, when the user taps the scanner, the acceleration sensor may detect the tap as an input signal.
- the acceleration sensor may detect, for example, a single tap and a double tap as different input signals.
- the sensor unit 221 may include a touch pad or a touch sensor.
- when the user touches or drags the touch pad, the touch sensor may detect this as an input signal.
- the touch sensor may detect different input signals according to the drag direction, the drag time, and the touch time or number of touches.
- the sensor unit 221 may sense rotation, angular displacement, tilt, position, orientation, and the like.
- the user input unit 222 may include one or more of a keypad, a button, one or more direction buttons or one or more direction keys, a scroll key or a jog dial, a rotation wheel, or a rotation ring.
- when the user input unit 222 includes a hard key button, the user can input a user command to control the settings of the scanner 200, or to adjust scan data acquired by the scanner 200, through a push operation of the hard key button.
- the user input unit 222 may detect the type of the button, the number of times the button is pressed, the time the button is pressed, and the like as different input signals.
- when the user input unit 222 includes a rotating wheel, it may sense different input signals according to the number of rotations of the wheel, the rotating direction, the rotating angle, and the like.
- inputs through the one or more direction buttons or one or more direction keys may be detected as different input signals.
- the user input unit 222 may include various types of input means that the user can operate, such as a scroll key or a jog key.
- the microphone 223 may receive a user's voice input.
- the user's voice received by the microphone 223 may be recognized by a voice recognition device or the like and used as a user input.
- the communication interface 230 may communicate with the data processing device 100 through a wired or wireless communication network. Specifically, the communication interface 230 may communicate with the data processing device 100 under the control of the processor 260 .
- the communication interface 230 may include at least one short-distance communication module for performing communication according to communication standards such as Bluetooth, Wi-Fi, BLE (Bluetooth Low Energy), NFC/RFID, Wi-Fi Direct, UWB, or ZigBee; a telecommunication module for performing communication with a server supporting long-distance communication according to a long-distance communication standard; and at least one port for connecting to an external electronic device through a wired cable in order to communicate by wire.
- the image processing unit 240 may perform operations for generating and/or processing an image. Specifically, the image processing unit 240 may image-process the two-dimensional image data obtained from the optical unit 210 . Alternatively, the image processing unit 240 may perform only the processing needed for data transmission and output the two-dimensional image data obtained from the optical unit 210 to the communication interface 230 for transmission to the data processing apparatus.
- the memory 250 may store at least one instruction. Also, the memory 250 may store at least one instruction or program to be executed by the processor 260 . Also, the memory 250 may temporarily store the two-dimensional image data received from the optical unit 210 in order to transmit it to the data processing device 100 .
- the memory 250 may store user input-function mapping information 300 defining a function corresponding to a user input received through the user interface 220 .
- the user input-function mapping information 300 includes information defining at least one user input received through the various means of the user interface 220 (that is, the sensor unit 221, the user input unit 222, and the microphone 223) and the function corresponding to each user input.
- the function corresponding to the user input may largely include at least one of a function of controlling the setting of the optical unit 210 of the scanner 200 or a function of controlling the scan data obtained by the scanner 200 to be adjusted by the data processing apparatus 100 .
- the user input-function mapping information 300 will be described with reference to FIG. 3 .
- FIG. 3 illustrates an example of user input-function mapping information 300 according to an embodiment.
- the user input-function mapping information 300 may include information in which a user input received through each means of the user interface is mapped to its corresponding function.
- the user interface may include one or more means of the user interface 220 included in the scanner 200 .
- the user input may represent an input received through the corresponding user interface.
- the function may define a function performed in response to a user input.
- the function corresponding to the user input may largely include a function of controlling the setting of the optical unit 210 of the scanner 200 or a function of controlling the scan data obtained by the scanner 200 to be adjusted by the data processing apparatus 100 .
- a sensor or button included in the user interface, a user input corresponding thereto, and a corresponding function may be appropriately determined in consideration of the situation of the system.
- the user input-function mapping information 300 may be information defined so that, for example, a first user input through a first sensor is mapped to a first function, a second user input through the first sensor to a second function, a third user input through the first sensor to a third function, a fourth user input through a second sensor to a fourth function, a fifth user input through the second sensor to a fifth function, a sixth user input through a button to a sixth function, and a seventh user input through the button to a seventh function.
- for example, the motion of shaking the scanner once (user input) detected through the gyro sensor corresponds to the undo function,
- the motion of shaking the scanner twice (user input) detected through the gyro sensor corresponds to the redo function,
- the operation of placing the scanner at a predetermined angle (user input) detected through the gyro sensor corresponds to the locking function,
- the single-tap operation (user input) detected through the acceleration sensor corresponds to the scan resolution change function,
- the double-tap operation (user input) detected through the acceleration sensor corresponds to the scan depth change function,
- the operation of pushing the button once (user input) corresponds to the scan light source color change function,
- and the operation of pushing the button twice (user input) can correspond to the filtering setting change function.
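- the FIG. 3 example mappings above can be modeled as a simple lookup table; the following Python sketch is illustrative only, and the key and function names are assumptions rather than the patent's implementation:

```python
# Illustrative model of the user input-function mapping information 300.
# The (means, input) keys mirror the FIG. 3 example; the dispatch API
# itself is an assumption, not the patent's implementation.
USER_INPUT_FUNCTION_MAP = {
    ("gyro", "shake_once"):          "undo",
    ("gyro", "shake_twice"):         "redo",
    ("gyro", "hold_at_angle"):       "lock",
    ("accelerometer", "single_tap"): "change_scan_resolution",
    ("accelerometer", "double_tap"): "change_scan_depth",
    ("button", "push_once"):         "change_light_source_color",
    ("button", "push_twice"):        "change_filtering_setting",
}

def resolve_function(means, user_input):
    """Look up the function mapped to a user input; None if unmapped."""
    return USER_INPUT_FUNCTION_MAP.get((means, user_input))
```

- a table like this could be stored at manufacture, updated from a server, or customized per user, matching the behaviors described below.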
- FIG. 3 is an example, and various functions may be mapped in response to user input through various sensors.
- the user input-function mapping information 300 may be stored in a memory when the scanner 200 is manufactured.
- such user input-function mapping information 300 may be downloaded or updated through a server after the scanner 200 is manufactured or sold.
- the user input-function mapping information 300 may be customized according to a user's setting.
- the user input-function mapping information 300 may be transmitted to the scanner 200 after a user customizes a setting through the scanner 200 or after customizing a setting in the data processing device 100 .
- the processor 260 may control at least one component included in the scanner to perform an intended operation. Accordingly, although a case in which the processor performs predetermined operations is described as an example, it may mean that the processor controls at least one component included in the scanner so that the predetermined operations are performed.
- the processor 260 may control the optical unit 210 to stop the scanning operation when detecting and receiving a user input from the user interface 220 while scanning the object. For example, when the user, holding the scanner 200 and scanning the patient's mouth, wants to change a setting of the optical unit or to adjust the scan data, the user may manipulate the scanner 200 so that the user interface 220 detects a user input. In this case, the processor 260 may stop the scanning operation of the optical unit 210 in order to process the user input received through the user interface 220 . After stopping the scanning operation of the optical unit 210 and performing a function corresponding to the user input, the processor 260 may resume the scanning operation of the optical unit 210 according to a user input for resuming the scanning operation.
- a user input for resuming the scan operation may be received in various forms.
- the scanner 200 may be provided with a button for toggling starting and stopping a scan operation.
- the processor 260 may resume the scanning operation of the optical unit 210 according to the reception of a user input of pressing the button prepared as described above.
- the processor 260 may identify a function corresponding to the user input received from the user interface 220 with reference to the user input-function mapping information 300 stored in the memory 250 .
- the processor 260 may control the setting of the optical unit 210 according to a user input.
- the function of controlling the setting of the optical unit 210 may include one or more of a scan resolution change, a scan depth change, a scan light source color change, and a filtering setting change.
- the function of controlling the setting of the optical unit 210 may include changing the scan resolution.
- the scan resolution may include standard resolution (SD) and high resolution (HD), and changing the scan resolution may include changing the standard resolution to a high resolution or changing the high resolution to a standard resolution.
- the function of controlling the setting of the optical unit 210 may include changing the scan depth.
- the scan depth may indicate the distance or depth from the tip portion of the scanner, in which the optical unit is located, to the region to be scanned during a scanning operation of the scanner 200 .
- the scan depth of the scanner may be adjusted, for example, in the range of 12 mm to 21 mm, and an exemplary scan depth default may be 8.5 mm.
- such a scan depth may be set using an input device of the data processing device 100, which may provide a graphical user interface for changing the scan depth.
- the scan depth may be adjusted using a user interface provided in the scanner 200 .
- when one input signal is repeated using the user interface 220 of the scanner 200, the scan depth may be changed in the sequence 12mm -> 15mm -> 18mm -> 21mm.
- for example, when the user interface 220 includes a gyro sensor and the motion of shaking the scanner 200 once corresponds to one input signal, shaking once changes the scan depth to 12mm, shaking twice to 15mm, three times to 18mm, and four times to 21mm, and a fifth shake returns it to 12mm.
- the user interface 220 may include two direction keys, the first direction key may correspond to a change from 12mm -> 15mm, and the second direction key may correspond to a change from 15mm -> 12mm.
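- the depth-cycling behavior above can be sketched as follows (a minimal illustration; the list of depths follows the 12mm -> 15mm -> 18mm -> 21mm example, and the function name is hypothetical):

```python
# Sketch of cycling the scan depth through the example sequence
# 12 mm -> 15 mm -> 18 mm -> 21 mm -> 12 mm when one input signal
# (e.g., one shake sensed by the gyro sensor) is repeated.
SCAN_DEPTHS_MM = [12, 15, 18, 21]

def next_scan_depth(current_mm):
    """Return the depth that follows current_mm in the cycle."""
    i = SCAN_DEPTHS_MM.index(current_mm)
    return SCAN_DEPTHS_MM[(i + 1) % len(SCAN_DEPTHS_MM)]
```

- with two direction keys, the same list could instead be stepped forward or backward without wrapping.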
- the function of controlling the setting of the optical unit 210 may include changing the color of the scan light source.
- Changing the scan light source color may include changing the color of the light source of the projector from blue light to white light or from white light to blue light.
- the function of controlling the setting of the optical unit 210 may include changing the filtering setting.
- the filtering setting may include transmitting both tooth and gingival data without filtering the entire oral cavity, or filtering the gingiva from the acquired scan data and transmitting only the tooth data.
- when the function corresponding to the user input is a function of processing the scan data, the processor 260 may control the communication interface 230 to transmit, to the data processing device 100, a control signal instructing the data processing device 100 to perform that function.
- the function of processing the scan data may include, for example, a function of adjusting the scan data.
- the scanner 200 may include only some of the components illustrated in FIG. 2 , or may include more components in addition to the components illustrated in FIG. 2 .
- the data processing apparatus 100 may include a communication interface 110 , a user interface 120 , a display 130 , an image processing unit 140 , a memory 150 , and a processor 160 .
- the communication interface 110 may communicate with at least one external electronic device through a wired or wireless communication network. Specifically, the communication interface 110 may communicate with the scanner 200 under the control of the processor 160 . The communication interface 110 may communicate with an external electronic device or server connected through a wired/wireless communication network under the control of the processor.
- the communication interface 110 may include at least one short-distance communication module for performing communication according to communication standards such as Bluetooth, Wi-Fi, BLE (Bluetooth Low Energy), NFC/RFID, Wi-Fi Direct, UWB, or ZigBee.
- the communication interface 110 may further include a telecommunication module for performing communication with a server for supporting telecommunication according to a telecommunication standard.
- the communication interface 110 may include a remote communication module for performing communication through a network for Internet communication.
- the communication interface 110 may include a long-distance communication module for performing communication through a communication network conforming to a communication standard such as 3G, 4G, and/or 5G.
- the communication interface 110 may include at least one port for connecting to an external electronic device by a wired cable in order to communicate with an external electronic device (eg, intraoral scanner, etc.) by wire. Accordingly, the communication interface 110 may communicate with an external electronic device connected by wire through at least one port.
- the user interface 120 may receive a user input for controlling the data processing device 100 .
- the user interface 120 may include, but is not limited to, a user input device including a touch panel for sensing a user's touch, a button for receiving a user's push operation, and a mouse or keyboard for designating or selecting a point on the user interface screen.
- the user interface 120 may include a voice recognition device for voice recognition.
- the voice recognition device may be a microphone, and the voice recognition device may receive a user's voice command or voice request. Accordingly, the processor may control an operation corresponding to the voice command or the voice request to be performed.
- the display 130 displays a screen. Specifically, the display 130 may display a predetermined screen under the control of the processor 160 . Specifically, the display 130 may display a user interface screen including an oral cavity image generated based on data obtained by scanning the patient's oral cavity with the scanner 200 . Alternatively, the display 130 may display a user interface screen including information related to a patient's dental treatment.
- the image processing unit 140 may perform operations for generating and/or processing an image. Specifically, the image processing unit 140 may receive raw data obtained from the scanner 200, and generate an oral image based on the received data. Specifically, the image processing unit 140 may obtain a three-dimensional oral model by processing the scan data received from the scanner 200 to generate three-dimensional data in a mesh form.
- the image processing unit 140 may generate voxel data by processing the scan data received from the scanner 200 in real time so as to show the scan data to the user in three-dimensional form, and may display an image corresponding to the generated voxel data on the display 130 .
- the data processing apparatus 100 may directly process the scan data into mesh data and display an image corresponding to the mesh data on the display 130 in real time.
- although the image processing unit 140 is illustrated as a separate component in FIG. 2 , in another example a program corresponding to the operations performed by the image processing unit 140 may be stored in the memory 150 , and the processor 160 may perform the image processing operations by executing the program stored in the memory 150 .
- the memory 150 may store at least one instruction or program to be executed by the processor 160 . Also, the memory 150 may store data received from the scanner 200 (e.g., raw data obtained through an intraoral scan), or may store an oral cavity image representing the oral cavity in three dimensions.
- the memory 150 may store all or part of the user input-function mapping information 300 . Specifically, the memory 150 may store all of the user input-function mapping information 300 or only information related to a control function performed by the data processing device 100 among the user input-function mapping information 300 .
- the processor 160 performs at least one instruction stored in the memory 150 to control an intended operation to be performed.
- at least one instruction may be stored in an internal memory included in the processor 160 or a memory 150 included in the data processing device separately from the processor.
- the processor 160 may control at least one configuration included in the data processing apparatus to perform an intended operation by executing at least one instruction. Accordingly, although a case in which the processor performs predetermined operations is described as an example, it may mean that the processor controls at least one component included in the data processing apparatus so that the predetermined operations are performed.
- the processor 160, by executing one or more instructions stored in the memory 150, may control the communication interface to receive scan data obtained by scanning the object from the scanner, control the display to display an image corresponding to three-dimensional data generated based on the scan data, control the communication interface to receive from the scanner, while the image is displayed, a control signal generated according to a user input sensed through one or more user interfaces of the scanner, perform a function of adjusting the generated 3D data according to the control signal, and control the display to display an image using the adjusted 3D data.
- when the control signal corresponds to an undo command, the processor 160 may, by executing one or more instructions stored in the memory 150, hide a preset number of the most recently input frames among the frames constituting the 3D data.
- when the control signal corresponds to a redo command, the processor 160 may, by executing one or more instructions stored in the memory 150, recover a predetermined number of the most recently hidden frames.
- when the control signal corresponds to a locking command, the processor 160 may, by executing one or more instructions stored in the memory 150, confirm the 3D data based on the current data so that the hiding process or the recovery process is no longer allowed.
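- the undo, redo, and locking behavior described above can be sketched as a frame buffer that hides frames rather than deleting them; this Python class is an illustrative sketch, and its names and the per-undo frame count are assumptions:

```python
class FrameBuffer:
    """Sketch of the undo/redo/locking behavior described above.

    Frames are hidden (not deleted) on undo so redo can recover them;
    a locking command confirms the current data and disallows further
    hiding or recovery. Class and method names are illustrative, and
    the preset per-undo frame count is an assumption.
    """

    def __init__(self, undo_step=5):
        self.visible = []           # frames making up the displayed 3D data
        self.hidden = []            # most recently hidden frames (redo source)
        self.locked = False
        self.undo_step = undo_step  # preset number of frames per undo

    def add_frame(self, frame):
        self.visible.append(frame)
        self.hidden.clear()         # new input invalidates the redo history

    def undo(self):
        """Hide the preset number of most recently input frames."""
        if self.locked:
            return
        n = min(self.undo_step, len(self.visible))
        if n:
            self.hidden = self.visible[-n:] + self.hidden
            del self.visible[-n:]

    def redo(self):
        """Recover the most recently hidden frames."""
        if self.locked or not self.hidden:
            return
        n = min(self.undo_step, len(self.hidden))
        self.visible.extend(self.hidden[:n])
        del self.hidden[:n]

    def lock(self):
        """Confirm the current 3D data; undo/redo are no longer allowed."""
        self.hidden.clear()
        self.locked = True
```

- hiding instead of deleting is what makes redo cheap: the frames stay in memory and only their visibility flag changes.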
- the processor 160, by executing one or more instructions stored in the memory 150, may control the display to display, on the sub-screen, a first image corresponding to first 3D data generated based on scan data corresponding to a first area of the object, and to display, on the main screen, a second image corresponding to second 3D data generated based on scan data corresponding to a second area of the object spaced apart from the first area; it may perform a function of adjusting the 3D data corresponding to the second image displayed on the main screen according to a control signal received from the scanner, and control the display to display the second image using the adjusted 3D data.
- the processor 160 may set a locking function so that the data adjustment function is not performed on the 3D data corresponding to the first image displayed on the sub-screen by executing one or more instructions stored in the memory 150 .
- when the control signal corresponds to an undo command, the processor 160 may, by executing one or more instructions stored in the memory 150, hide a preset number of the most recently input frames among the frames constituting the second 3D data.
- the processor 160 may control the display to display, on the main screen, the first image corresponding to the first three-dimensional data of the sub-screen.
- the processor 160 may be implemented in a form including at least one internal processor and a memory device (e.g., RAM, ROM, etc.) for storing at least one of a program, an instruction, a signal, and data to be processed or used by the internal processor.
- the processor 160 may include a graphics processing unit (GPU) for processing graphics corresponding to video.
- the processor may be implemented as a system on chip (SoC) in which a core and a GPU are integrated.
- the processor may include a single core or multiple cores.
- the processor may include a dual-core, triple-core, quad-core, hexa-core, octa-core, deca-core, dodeca-core, hexadeca-core, and the like.
- the processor 160 may generate an oral cavity image based on the two-dimensional image data received from the oral cavity scanner 500 .
- the communication interface 110 may receive data obtained from the intraoral scanner 500, for example, raw data obtained through an intraoral scan.
- the processor 160 may generate a three-dimensional oral cavity image representing the oral cavity in three dimensions based on the raw data received from the communication interface.
- the intraoral scanner may include an L camera corresponding to a left field of view and an R camera corresponding to a right field of view in order to restore a three-dimensional image according to the optical triangulation method.
- the intraoral scanner may acquire L image data corresponding to the left field of view and R image data corresponding to the right field of view from the L camera and the R camera, respectively.
- the intraoral scanner (not shown) may transmit raw data including L image data and R image data to the communication interface of the data processing device 100 .
- the communication interface 110 may transmit the received raw data to the processor, and the processor may generate an oral cavity image representing the oral cavity in three dimensions based on the received raw data.
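- depth recovery from an L/R camera pair can be illustrated with the rectified-stereo relation Z = f * B / d; this is a simplified sketch, since the scanner's optical triangulation also uses the projected light pattern, which is omitted here:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Rectified-stereo depth: Z = f * B / d.

    Simplified illustration of how an L/R camera pair yields depth for
    a matched point; a structured-light scanner's optical triangulation
    also exploits the projected pattern, which this sketch omits.
    focal_px: focal length in pixels; baseline_mm: camera separation;
    disparity_px: horizontal offset of the point between L and R images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```

- repeating this for every matched surface point yields the point coordinates that are later accumulated into a point cloud.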
- the processor 160 may control the communication interface to directly receive an oral image representing the oral cavity from an external server, a medical device, or the like.
- the processor may acquire a three-dimensional oral image without generating a three-dimensional oral image based on raw data.
- the processor 160 performing operations may mean not only that the processor 160 directly executes at least one instruction to perform the operations, but also that the processor 160 controls other components to perform the operations.
- the data processing apparatus 100 may include only some of the components illustrated in FIG. 2 , or may include more components in addition to the components illustrated in FIG. 2 .
- the data processing device 100 may store and execute dedicated software linked to the scanner.
- the dedicated software may be called a dedicated program, a dedicated tool, or a dedicated application.
- dedicated software stored in the data processing device 100 may be connected to the scanner 200 to receive real-time data acquired through an oral scan.
- Medit produces and distributes 'Medit Link', software for processing, managing, using, and/or transmitting data acquired from an intraoral scanner (e.g., the i500).
- here, 'dedicated software' refers to a program, tool, or application that can operate in conjunction with the intraoral scanner, and thus may be used in common with various intraoral scanners developed and sold by various manufacturers.
- the above-described dedicated software may be produced and distributed separately from the oral scanner that performs the oral scan.
- the data processing device 100 may store and execute dedicated software corresponding to the i500 product.
- the dedicated software may perform at least one operation to acquire, process, store, and/or transmit the oral image.
- the dedicated software may be stored in the processor.
- the dedicated software may provide a user interface for use of data acquired from the intraoral scanner.
- the user interface screen provided by the dedicated software may include an oral image generated according to the disclosed embodiment.
- FIG. 4 is an example of a flowchart illustrating an operation process of a scanner and a data processing apparatus according to an exemplary embodiment.
- the scanner 200 may scan an object and transmit scan data obtained by the scan to the data processing device 100 .
- the object may be, but is not limited to, the patient's oral cavity or a tooth cast model.
- the object may be any body part or object to be scanned.
- a physician may scan a patient's mouth by holding the scanner 200 and moving the scanner 200 along the patient's mouth.
- the scanned data may be transmitted to the data processing device 100 in real time.
- the data processing device 100 may receive scan data from the scanner 200 in real time, and display an image corresponding to the 3D data based on the received scan data. Specifically, the data processing apparatus 100 may generate 3D data corresponding to the oral model of the patient based on the scan data received from the scanner 200, and may display an image corresponding to the generated 3D data on the display.
- the 3D data may include voxel data or mesh data.
- FIG. 5 is a reference diagram for explaining a process in which a data processing apparatus according to an exemplary embodiment displays a three-dimensional virtual model generated using two-dimensional data obtained through a scanner.
- the scanner 200 projects light to the patient's oral cavity 510 through a projector, and acquires two-dimensional image data of the oral cavity through one or more cameras.
- the scanner 200 may project light onto a scan region of interest (ROI) 520 and acquire two-dimensional image data corresponding to the scan region 520 .
- Data obtained by scanning the scanner 200 while moving along the teeth of the oral cavity 510 may be transmitted to the data processing device 100 in real time.
- the data processing apparatus 100 may receive 2D image data corresponding to the scan ROI 520 from the scanner 200 in real time, and may generate 3D data by matching one or more received 2D image data.
- the data processing device 100 may display data on the display 130 through two windows, that is, a main window 530 and a sub-window 540 .
- the data processing device 100 may display, in the main window (or first window) 530 , an image corresponding to the 3D virtual model generated by the data processing device 100 , and may display, in the sub-window (or second window) 540 , an image of the 3D data corresponding to the ROI currently being scanned by the scanner.
- the data processing apparatus 100 may process the scan data of the scan ROI 520 received from the scanner 200 in real time and display an image corresponding to the obtained 3D data in the sub-window 540 , and may display, on the main window 530 , an image 550 of the 3D virtual model generated from the scan data received in real time.
- a box 560 for indicating a scan ROI in the 3D virtual model may be displayed.
- 3D data corresponding to an image displayed on the main window 530 or sub-window 540 of the display 130 may include voxel data and/or mesh data.
- FIG. 6 is a reference diagram for explaining voxel data and mesh data processed based on scan data according to an exemplary embodiment.
- the data processing apparatus 100 may obtain a 3D virtual model by receiving and processing scan data from the scanner 200 and generating mesh data including polygons and vertices.
- the data processing device 100 may establish and apply a plan for oral treatment or orthodontic correction based on the three-dimensional virtual model in mesh form.
- the data processing apparatus 100 may calculate coordinates of a plurality of illuminated surface points using a triangulation method. As the amount of scan data increases by scanning while moving the surface of the object using the scanner, coordinates of the surface points may be accumulated. As a result of this image acquisition, a point cloud of vertices can be identified to represent the extent of the surface. Points in the point cloud may represent actual measured points on the three-dimensional surface of the object.
- the surface structure can be approximated by forming a polygonal mesh in which adjacent vertices of a point cloud are connected by line segments.
- the polygonal mesh may be variously determined, such as a triangular, quadrangular, or pentagonal mesh.
- the relationship between the polygons of the mesh model and the neighboring polygons may be used to extract features of a tooth boundary, for example, a curvature, a minimum curvature, an edge, a spatial relationship, and the like.
- 3D data may be configured as a triangular mesh generated by connecting a plurality of vertices constituting a point cloud and adjacent vertices with a line. Each vertex may include position information and color information as its properties.
- the position information that each vertex has as an attribute may be composed of X, Y, and Z coordinates on a three-dimensional coordinate system.
- the color information that each vertex has as an attribute may have an RGB value indicating a color obtained by a camera or an image sensor provided in the scanning device. In this way, the shape, contour, and color of the three-dimensional oral model can be expressed by the properties of each vertex, that is, position information and color information.
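- the vertex attributes described above can be sketched as follows. This is an illustrative Python sketch (the names are hypothetical, not from the disclosure): each vertex carries position (x, y, z) and color (R, G, B) attributes, and the triangular mesh is a list of index triples over the point cloud.

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    # Position on a three-dimensional Cartesian coordinate system.
    x: float
    y: float
    z: float
    # Color obtained by the scanner's camera or image sensor.
    r: int
    g: int
    b: int

# Point cloud: measured points on the surface of the object.
vertices = [
    Vertex(0.0, 0.0, 0.0, 255, 250, 240),
    Vertex(1.0, 0.0, 0.1, 250, 245, 235),
    Vertex(0.0, 1.0, 0.2, 245, 240, 230),
    Vertex(1.0, 1.0, 0.3, 240, 235, 225),
]

# Triangular mesh: each triangle is a triple of indices into `vertices`.
triangles = [(0, 1, 2), (1, 3, 2)]  # two triangles sharing the edge (1, 2)
```

Storing indices rather than copies of the vertices lets adjacent polygons share vertices, which is what makes neighborhood relations (curvature, edges) cheap to query.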
- the data processing apparatus 100 may receive scan data in real time to generate and display a three-dimensional virtual model in the form of a mesh. However, since receiving the scan data and processing it into a mesh-form 3D virtual model in real time requires high processing speed and high performance from the data processing device 100, the data that the data processing device 100 displays on the screen in real time may use voxel data instead of mesh data.
- the data processing apparatus 100 may generate volume data including volume elements (ie, voxels) that are basic data used to represent a 3D object by using scan data.
- a voxel refers to a pixel that has a z coordinate, a third coordinate in addition to the x and y coordinates of the Cartesian coordinate system.
- voxels represent equal-sized cubes forming a discretely defined 3D space.
- a typical voxel-based 3D scene may include one or more voxel sets, and each voxel set includes one or more voxels.
- the 3D voxel data is rendered to create a 2D image on an appropriate output device, such as a display.
- Rendering refers to generating a 2D graphic image on an output device from a 3D voxel data file, and includes generating an image using color or texture to give the image a sense of reality.
- the data processing apparatus 100 may use voxel data to display, in real time, an image of the scan data received from the scanner 200.
- the data processing apparatus 100 may display an image corresponding to 3D data in real time using voxel data or mesh data based on scan data.
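- as an illustrative sketch (assumed for explanation, not part of the disclosure), mapping measured surface points into a set of equal-sized cubic voxels might look like:

```python
def voxelize(points, voxel_size=0.5):
    """Map measured surface points to a set of equal-sized cubic voxels.

    Each point (x, y, z) falls into the voxel whose integer grid index is
    floor(coordinate / voxel_size); points in the same cube collapse into
    a single voxel, which keeps the real-time display workload bounded.
    """
    return {
        (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        for (x, y, z) in points
    }

points = [(0.1, 0.2, 0.0), (0.3, 0.1, 0.1), (1.2, 0.2, 0.0)]
voxels = voxelize(points)  # the two nearby points share one voxel
```

The resulting voxel set discretely defines the 3D space and can be rendered directly, without the neighborhood bookkeeping a polygonal mesh requires.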
- the scanner 200 may detect a user input through the user interface while scanning the object.
- the scanner 200 may detect a user input through one or more sensors included in the user interface, a button, a microphone, or the like.
- the scanner 200 may identify a function corresponding to a user input detected through the user interface. Specifically, the scanner 200 may store information in which functions corresponding to various user inputs detected through the user interface are mapped. For example, the scanner 200 may store user input-function mapping information 300 as shown in FIG. 3 . Accordingly, when the scanner 200 detects a user input, it may find a corresponding function with reference to the user input-function mapping information 300 as shown in FIG. 3 . The scanner 200 may identify whether the function corresponding to the user input is related to the scanner setting function or the scan data processing function. If it is determined that the function corresponding to the user input corresponds to the scanner setting function, operation 450 may be performed.
- the scanner 200 may stop the scan operation.
- the scanner 200 may perform a scanner setting operation corresponding to a user input.
- the scanner 200 may refer to the user input-function mapping information 300 as shown in FIG. 3 , identify a scanner setting function corresponding to the sensed user input, and perform the identified scanner setting function.
- the scanner setting function is a function of changing the settings of the scanner, and may include, for example, changing a scan resolution, changing a scan depth, changing a color of a scan light source, changing a filtering setting, and the like.
- operation 470 may be performed.
- the scanner 200 may stop the scan operation. Transmission of scan data in real time may also be stopped according to the interruption of the scan operation.
- the scanner 200 may transmit a control signal corresponding to the user input to the data processing device 100 .
- the scanner 200 refers to the user input-function mapping information 300 as shown in FIG. 3 , identifies a scan data processing function corresponding to the sensed user input, and instructs to perform the identified scan data processing function control signal may be transmitted to the data processing device 100 .
- the scan data processing function may include an undo function, a redo function, and a locking function.
- the data processing apparatus 100 may receive a control signal corresponding to a user input from the scanner 200, and process 3D data according to the received control signal.
- the processing of the 3D data may include undo processing, redo processing, locking processing, and the like, of the 3D data corresponding to the image displayed on the display of the data processing apparatus 100 .
- the data processing apparatus 100 may display an image corresponding to the processed 3D data.
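- a minimal sketch of how the data processing apparatus might dispatch a parsed control signal to an undo, redo, or locking process (the frame-list data layout is an assumption made for illustration):

```python
def process_control_signal(command, data_3d):
    """Dispatch a control signal received from the scanner (sketch).

    `data_3d` is an assumed dict with 'visible' and 'hidden' frame lists.
    Undo moves the most recent visible frame onto the hidden stack
    (hiding, not deleting); redo moves it back; lock finalizes the
    current state by discarding the hidden stack.
    """
    if command == "undo" and data_3d["visible"]:
        data_3d["hidden"].append(data_3d["visible"].pop())
    elif command == "redo" and data_3d["hidden"]:
        data_3d["visible"].append(data_3d["hidden"].pop())
    elif command == "lock":
        data_3d["hidden"].clear()  # finalize: nothing left to restore
    return data_3d
```

After each dispatch the device would re-render the image from the `visible` frames, which is the step operation 790 describes.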
- FIG. 7 is a detailed flowchart of an operation method performed by a data processing apparatus according to an embodiment.
- the data processing device 100 may receive scan data of a scanned object from the scanner 200 in real time.
- the data processing device 100 generates 3D data based on the scan data received from the scanner 200 in order to display the part of the object currently being scanned on the display in real time, and generates an image corresponding to the generated 3D data. It can be displayed on the display in real time.
- the data processing device 100 may receive a control signal from the scanner 200 .
- the control signal received from the scanner 200 may include a command instructing the scanner 200 to perform a data processing function in the data processing device 100 according to a user input detected through the user interface of the scanner 200 .
- the data processing device 100 may parse the control signal received from the scanner 200 to determine whether the control signal corresponds to an undo command, a redo command, or a locking command. If it is determined that the control signal received from the scanner 200 corresponds to an undo command, operation 750 may be performed.
- the data processing apparatus 100 may hide a predetermined number of the most recently input frames among the frames constituting the 3D data according to an undo command.
- FIG. 8 is a reference diagram for explaining a method in which the data processing apparatus operates according to the undo command received from the scanner, according to an exemplary embodiment.
- the data processing device 100 may generate 3D data based on scan data received from the scanner 200 and display an image 811 corresponding to the 3D data 810 on the display 130 in real time.
- the 3D data 810 may include a plurality of frames.
- the 3D data 810 may include frames F1 - F11.
- the data processing apparatus 100 may display an image 811 corresponding to 3D data including frames F1 - F11 on the display 130 .
- the user may wish to hide the recently scanned data for various purposes or reasons.
- the user may issue a data processing command by providing a user input through the user interface provided in the scanner 200 .
- the user input is provided to the data processing device 100, and the data processing device 100, according to the undo command, may hide a predetermined number of frames among the frames constituting the generated 3D data 810 .
- the data processing device 100 obtains the three-dimensional data 820 by hiding the most recently input frames F9, F10, and F11 among the frames of the three-dimensional data 810, An image 821 corresponding to the 3D data 820 in which the frames F9, F10, and F11 are hidden may be displayed on the display 130 .
- when an additional undo command is received, the data processing device 100 may further hide a predetermined number of frames in the three-dimensional data 820 . For example, by additionally hiding the three frames F6, F7, and F8, three-dimensional data 830 in which a total of six frames, F6 through F11, are hidden is obtained, and an image 831 corresponding to the three-dimensional data 830 may be displayed on the display 130 .
- the hiding process according to the undo command makes some of the frames constituting the 3D data invisible to the user by not displaying them, and differs from deleting the frames themselves. Therefore, the user can make the hidden frames visible again according to a redo command.
- the 3D data hidden by the data processing apparatus 100 according to the undo command may include voxel data or mesh data generated based on scan data.
- the data processing apparatus 100 may determine variously the predetermined number of frames to be hidden in the 3D data according to the undo command.
- a user input corresponding to an undo command may be received by continuously pressing a button provided on the scanner 200 or by continuously shaking the scanner 200, and the data processing device 100 may receive from the scanner 200 a control signal corresponding to the continuous button-press operation.
- the data processing device 100 may continuously hide the 3D data starting from the most recently input frames until the button-press operation is released, and may stop the hiding operation at the point when the button-press operation is released.
- the data processing apparatus 100 may receive information about a time when a user presses a button provided on the scanner 200 from the scanner 200, and determine the number of frames to be hidden in the 3D data in proportion to the time the button is pressed. For example, if the button is pressed for a short time, the data processing device 100 may hide a small number of frames, and if the button is pressed for a long time, the data processing device 100 may hide a large number of frames.
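- the press-duration-to-frame-count rule could be sketched as follows; the frames-per-second rate and the minimum of one frame are assumed tuning choices, not values specified in the disclosure:

```python
def frames_to_hide(press_seconds, frames_per_second=5, max_frames=None):
    """Choose how many recent frames to hide in proportion to how long
    the scanner button was held (sketch; rate is an assumed constant).

    A longer press hides more frames; a short tap still hides at least
    one, and an optional cap keeps the count within the available frames.
    """
    n = int(press_seconds * frames_per_second)
    if max_frames is not None:
        n = min(n, max_frames)
    return max(n, 1)  # a short tap still hides at least one frame
```

The symmetric redo case described later could reuse the same rule to decide how many hidden frames to restore.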
- operation 760 may be performed.
- the data processing apparatus 100 may restore a predetermined number of frames that have been most recently hidden among frames constituting the 3D data according to a redo command.
- FIG. 9 is a reference diagram for explaining a method in which a data processing apparatus operates according to a revive command received from a scanner, according to an exemplary embodiment.
- the data processing apparatus 100 may display, on the display 130, an image 831 corresponding to the 3D data 830 in which the frames F6 - F11 are hidden according to two undo operations as illustrated in FIG. 8 .
- the user may wish to restore the recently hidden portion for various purposes or reasons.
- while the user holds the scanner 200 to scan the patient's oral cavity, using the input device of the data processing device 100 to restore hidden data is neither hygienic nor efficient for the scan operation.
- the user may issue a data processing command by providing a user input through the user interface provided in the scanner 200 .
- while the image 831 is displayed on the display 130 of the data processing device 100, the user may enter a user input corresponding to a redo command on the scanner 200 .
- the user input is provided to the data processing device 100, and the data processing device 100, according to the redo command, may restore a predetermined number of the hidden frames among the frames constituting the 3D data 830 .
- the data processing device 100 obtains the three-dimensional data 820 by restoring the most recently hidden frames F6, F7, and F8 among the frames of the three-dimensional data 830, and , an image 821 corresponding to the 3D data 820 in which the frames F6, F7, and F8 have been restored may be displayed on the display 130 .
- in addition, the data processing device 100 may obtain the three-dimensional data 810 by restoring the most recently hidden frames F9, F10, and F11 in the three-dimensional data 820, and may display on the display 130 an image 811 corresponding to the three-dimensional data 810 in which a total of six frames, F6 through F11, have been restored.
- the 3D data to be restored by the data processing apparatus 100 according to the restore command may include voxel data or mesh data generated based on scan data.
- the data processing apparatus 100 may variously determine the predetermined number of frames to be restored in the 3D data according to a redo command.
- a user input corresponding to a redo command may be received by continuously pressing a button provided on the scanner 200 or by continuously shaking the scanner 200, and the data processing device 100 may receive from the scanner 200 a control signal corresponding to the continuous button-press operation.
- the data processing device 100 may continuously restore the three-dimensional data starting from the most recently hidden frames until the button-press operation is released, and may stop the restoring operation when the button-press operation is released.
- the data processing apparatus 100 may receive from the scanner 200 information about the time the user presses a button provided on the scanner 200, and determine the number of frames to be restored in the 3D data in proportion to the time the button is pressed. For example, when the button is pressed briefly, the data processing device 100 may restore a small number of frames, and when the button is pressed for a long time, it may restore a large number of frames.
- operation 770 may be performed.
- according to a locking command, the data processing device 100 may finalize the 3D data in its current state so that the 3D data is no longer hidden or restored.
- FIG. 10 is a reference diagram for explaining a method in which a data processing apparatus operates according to a locking command received from a scanner according to an embodiment.
- the data processing apparatus 100 may display an image 821 corresponding to the 3D data 820 in which frames F9-F11 are hidden according to a one-time undo operation as shown in FIG. 8 , on the display 130 .
- the user may want to confirm the image in the current state so that the hiding process or the restoration process is not performed any more.
- the user may want the data in the current state to be definitively fixed without being modified any more after the undo operation and the restore operation are attempted.
- while the user holds the scanner 200 to scan the patient's oral cavity, using the input device of the data processing device 100 to finalize the data is neither hygienic nor efficient for the scan operation.
- the user may issue a data processing command by providing a user input through the user interface provided in the scanner 200 .
- the data processing apparatus 100 may generate 3D data 840 composed of F1 to F8 and may display an image 841 corresponding to the 3D data 840 .
- the image 821 and the image 841 are the same, but the user can no longer apply the undo function or the redo function to the image 841, because the image 841 has been locked so that it can no longer be edited.
- the data processing device 100 cannot apply an undo function or a redo function to the 3D data finalized by the locking operation, but can apply the undo function or the redo function to 3D data generated based on scan data newly received after the locking.
- scan data may subsequently be received, and three-dimensional data 850 including the new frames F9 - F14 may be obtained.
- the data processing device 100 may apply the undo function and the redo function to the frames F9 - F14 newly created after the locking operation.
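- the undo/redo/locking behavior described above, including the rule that frames finalized by locking are immune to undo while frames scanned after the lock are not, can be sketched as follows (the class and method names are illustrative, not from the disclosure):

```python
class ScanModel:
    """Sketch: 3D data kept as an ordered list of frames, supporting
    undo (hide), redo (restore), and lock. Hidden frames are retained,
    not deleted, so a redo command can bring them back."""

    def __init__(self):
        self.frames = []       # visible frames, oldest first
        self.hidden = []       # stack of most recently hidden frames
        self.locked_count = 0  # frames finalized by a locking command

    def add_frame(self, frame):
        self.frames.append(frame)
        self.hidden.clear()    # new scan data invalidates the redo stack

    def undo(self, n=1):
        # Only frames added after the last lock may be hidden.
        n = min(n, len(self.frames) - self.locked_count)
        for _ in range(n):
            self.hidden.append(self.frames.pop())

    def redo(self, n=1):
        for _ in range(min(n, len(self.hidden))):
            self.frames.append(self.hidden.pop())

    def lock(self):
        # Finalize the current state: it can no longer be hidden or restored.
        self.locked_count = len(self.frames)
        self.hidden.clear()
```

With frames F1 - F8 locked, a five-frame undo does nothing; after frames F9 and F10 are scanned, the same undo hides only those two new frames.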
- the data processing device 100 may generate three-dimensional data by aligning the received scan data based on overlapping portions.
- the user may scan a non-contiguous area of the oral cavity using the scanner 200 . For example, if the scan starts from the left molar and moves to the right molar in the middle to scan, the scanner can acquire scan data for two non-consecutive areas.
- an operation in which the data processing apparatus 100 performs an undo operation or a redo operation in a situation where 3D data for two or more regions of an object is simultaneously displayed on a display will be described with reference to FIGS. 11 to 13 .
- FIG. 11 is a reference diagram for explaining an operation in which the data processing apparatus displays images corresponding to a plurality of regions of an object, according to an exemplary embodiment.
- the data processing apparatus 100 may display a thumbnail of a first image corresponding to a first area of the object on the first sub screen 1110, display on the second sub screen 1120 a thumbnail of a second image corresponding to a second area spaced apart from the first area of the object, and display on the main screen 1130 a third image corresponding to a third area spaced apart from the first area or the second area.
- the number of sub-screens may be variously determined.
- the data processing apparatus 100 may perform a data processing operation, such as undo, redo, or locking, on the 3D data corresponding to the image displayed on the main screen. Therefore, in order to perform a data processing operation on an image displayed on a sub screen, the image corresponding to the sub screen may be moved to the main screen.
- the user may exchange an image displayed on the first sub screen or the second sub screen with the image displayed on the main screen according to a user input. For example, by dragging the thumbnail displayed on the first sub screen to the main screen, the user causes the first image corresponding to that thumbnail to be displayed on the main screen, and a thumbnail of the third image previously displayed on the main screen to be displayed on the first sub screen. After moving the images in this way, the data processing apparatus 100 may perform a data processing operation on the first image now displayed on the main screen 1130 .
- the data processing apparatus 100 may not perform any operation. In this case, the user can manually move the image displayed on the sub screen to the main screen.
- alternatively, when all images displayed on the main screen are hidden by repeated undo operations and there is no longer an image to display, the data processing device 100 may automatically move the image corresponding to a sub screen to the main screen.
- the data processing apparatus 100 may move images corresponding to the sub screens to the main screen in the order of recently scanned data. For example, if the image of the second sub screen is more recently scanned data, the data processing apparatus 100 may move the image corresponding to the second sub screen to the main screen.
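- promoting the most recently scanned sub-screen image to the main screen could be sketched as follows; the (name, last_scan_time) representation of a sub screen is an assumption made for illustration:

```python
def on_main_screen_empty(sub_screens):
    """Sketch: when every frame of the main-screen image has been hidden,
    promote the sub-screen image with the most recently scanned data.

    `sub_screens` is an assumed list of (name, last_scan_time) tuples;
    the promoted entry is removed from the list and returned.
    """
    if not sub_screens:
        return None  # nothing to promote; the main screen stays empty
    # Order by scan time, most recently scanned data first.
    sub_screens.sort(key=lambda s: s[1], reverse=True)
    return sub_screens.pop(0)
```

In the FIG. 11 layout, if the second sub screen holds the more recently scanned data, it is the one moved to the main screen.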
- the data processing apparatus may set locking for each sub-screen.
- the locking may mean a setting such that an undo operation or a redo operation is not executed.
- a lock icon 1140 indicating locking is displayed on the first sub screen. Locking may be indicated not only by the lock icon but also by a visual effect such as the color or thickness of the sub screen's border.
- FIG. 12 is a reference diagram for explaining an operation of displaying 3D data corresponding to a first region of an object on a display of a data processing apparatus according to an exemplary embodiment.
- the data processing apparatus 100 may receive scan data corresponding to a first region of the oral cavity, which is the object, process the received scan data to generate three-dimensional data, and display a first image corresponding to the generated three-dimensional data on the main screen 1210 of the display in real time.
- when the data processing device 100 receives scan data corresponding to a second area that is not continuous with the first area, that is, spaced apart from the first area, the thumbnail image of the first image corresponding to the first area is moved to the sub screen 1220 and displayed.
- the scan data corresponding to the second region may be processed to generate 3D data, and a second image corresponding to the generated 3D data may be displayed on the main screen 1210 of the display in real time.
- in a state in which images corresponding to different regions are displayed on the main screen 1210 and the sub screen 1220 of the display of the data processing device 100, the user can perform an undo operation or a redo operation on the image displayed on the main screen 1210, that is, the second image corresponding to the second region. That is, the user may input a user input corresponding to the undo operation by manipulating the scanner 200 while images corresponding to different regions are displayed on the main screen 1210 and the sub screen 1220 as shown in 1200B of FIG. 12 .
- the data processing device 100 may then receive a control signal corresponding to the user input from the scanner 200, and perform hiding processing on the 3D data of the second image corresponding to the second area displayed on the main screen 1210 . When the user repeatedly performs the undo operation on the second image displayed on the main screen 1210 until all of the second image is hidden and there is no longer a second image to display on the main screen 1210, the data processing device 100 may operate according to various scenarios.
- in one scenario, the data processing apparatus 100 may not automatically perform any operation. That is, since the data processing device 100 performs data processing operations, that is, undo, redo, and locking operations, on the 3D data corresponding to the image displayed on the main screen, the user must move the image of the sub screen to the main screen in order to perform a data processing operation on that image.
- the user may display the first image corresponding to the sub screen 1220 on the main screen 1210 through a user input such as dragging a thumbnail of the first image displayed on the sub screen 1220 to the main screen 1210.
- the data processing device 100 may receive a control signal corresponding to the user input from the scanner 200 and, in response to the reception of the control signal, perform hiding processing on the 3D data of the first image corresponding to the main screen 1210 .
- in a state in which the second image is displayed on the main screen 1210 of the display of the data processing device 100 and the first image is displayed on the sub screen 1220, if the user wants to perform a data processing operation on the first image displayed on the sub screen 1220, the user may exchange the positions of the second image displayed on the main screen 1210 and the first image displayed on the sub screen 1220; the first image is then displayed on the main screen 1210, and the data processing operation may be performed on the first image.
- in another scenario, when the second image displayed on the main screen 1210 is entirely hidden by the user repeatedly performing an undo operation on the second image corresponding to the second area,
- the data processing apparatus 100 may automatically move the image of the sub screen 1220 to the main screen 1210 .
- FIG. 13 is a reference diagram for explaining an operation in which the data processing apparatus 100 automatically moves an image of the sub screen 1220 to the main screen 1210 when all images displayed on the main screen 1210 are hidden, according to an exemplary embodiment.
- when, as the undo operation is repeatedly performed on the second image corresponding to the second area displayed on the main screen 1210, all of the second image displayed on the main screen 1210 is hidden and there is no longer an image to display on the main screen,
- the data processing device 100 may automatically move the first image corresponding to the thumbnail displayed on the sub screen 1220 to the main screen 1210 .
- the data processing apparatus 100 may perform hiding processing on the 3D data corresponding to the first image displayed on the main screen 1210 .
- the method of processing an oral image according to an embodiment of the present disclosure may be implemented in the form of a program command that can be executed through various computer means and recorded in a computer-readable medium.
- an embodiment of the present disclosure may be a computer-readable storage medium in which one or more programs including at least one instruction for executing a method of processing an oral image are recorded.
- the computer-readable storage medium may include program instructions, data files, data structures, and the like alone or in combination.
- examples of the computer-readable storage medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- the 'non-transitory storage medium' may mean that the storage medium is a tangible device.
- the 'non-transitory storage medium' may include a buffer in which data is temporarily stored.
- the method for processing an oral image according to various embodiments disclosed herein may be provided by being included in a computer program product.
- the computer program product may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)). Alternatively, it may be distributed online (eg, downloaded or uploaded) through an application store (eg, Play Store) or directly between two user devices (eg, smartphones).
- the computer program product according to the disclosed embodiment may include a storage medium in which a program including at least one instruction for performing the method for processing an oral image according to the disclosed embodiment is recorded.
- Computer-readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media. Also, computer-readable media may include computer storage media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- the disclosed embodiments may be implemented as a S/W program including instructions stored in a computer-readable storage medium.
- a computer may include an electronic device according to the disclosed embodiments as a device capable of calling a stored instruction from a storage medium and operating according to the called instruction according to the disclosed embodiment.
- the computer-readable storage medium may be provided in the form of a non-transitory storage medium.
- 'non-transitory' means that the storage medium does not include a signal and is tangible, but does not distinguish that data is semi-permanently or temporarily stored in the storage medium.
- the control method according to the disclosed embodiments may be provided included in a computer program product.
- Computer program products may be traded between sellers and buyers as commodities.
- the computer program product may include a S/W program and a computer-readable storage medium in which the S/W program is stored.
- the computer program product may include a product (eg, a downloadable app) in the form of a S/W program distributed electronically through a device manufacturer or an electronic market (eg, Google Play Store, App Store).
- the storage medium may be a server of a manufacturer, a server of an electronic market, or a storage medium of a relay server temporarily storing a SW program.
- the computer program product in a system consisting of a server and a device, may include a storage medium of the server or a storage medium of the device.
- alternatively, if there is a third device (eg, a smartphone) communicatively connected to the server, the computer program product may include a storage medium of the third device.
- the computer program product may include the S/W program itself transmitted from the server to the device or the third apparatus, or transmitted from the third apparatus to the device.
- one of the server, the device and the third apparatus may execute the computer program product to perform the method according to the disclosed embodiments.
- two or more of a server, a device, and a third apparatus may execute a computer program product to distribute the method according to the disclosed embodiments.
- a server may execute a computer program product stored in the server to control a device communicatively connected with the server to perform the method according to the disclosed embodiments.
- the third apparatus may execute a computer program product to control a device communicatively connected with the third apparatus to perform the method according to the disclosed embodiment.
- the third device may download the computer program product from the server and execute the downloaded computer program product.
- the third device may execute the computer program product provided in a preloaded state to perform the method according to the disclosed embodiments.
- a 'unit' may be a hardware component such as a processor or a circuit, and/or a software component executed by a hardware component such as a processor.
Claims (20)
- A data processing device comprising: a display; a communication interface; a memory storing one or more instructions; and a processor that executes the one or more instructions, wherein the processor, by executing the one or more instructions: controls the communication interface to receive, from a scanner, scan data of an object obtained by scanning the object; controls the display to display an image corresponding to three-dimensional data generated based on the scan data; controls the communication interface to receive from the scanner, while the image is displayed, a control signal generated according to a user input detected through one or more user interfaces of the scanner; performs a function of adjusting the generated three-dimensional data according to the control signal; and controls the display to display an image based on the adjusted three-dimensional data.
- The data processing device of claim 1, wherein the processor, by executing the one or more instructions, hides a predetermined number of the most recently input frames among the frames constituting the three-dimensional data when the control signal corresponds to an undo command.
- The data processing device of claim 2, wherein the processor, by executing the one or more instructions, restores the predetermined number of the most recently hidden frames when the control signal corresponds to a redo command.
- The data processing device of claim 3, wherein the processor, by executing the one or more instructions, finalizes the three-dimensional data based on its current state, so that the hiding and the restoring are no longer permitted, when the control signal corresponds to a locking command.
- The data processing device of claim 1, wherein the processor, by executing the one or more instructions: controls the display to display, on a sub screen, a first image corresponding to first three-dimensional data generated based on scan data corresponding to a first area of the object; controls the display to display, on a main screen, a second image corresponding to second three-dimensional data generated based on scan data corresponding to a second area of the object spaced apart from the first area; performs a function of adjusting the three-dimensional data corresponding to the second image displayed on the main screen according to the control signal received from the scanner; and controls the display to display the second image based on the adjusted three-dimensional data.
- The data processing device of claim 5, wherein the processor, by executing the one or more instructions, sets a locking function that prevents a data adjustment function from being performed on the three-dimensional data corresponding to the first image displayed on the sub screen.
- The data processing device of claim 5, wherein the processor, by executing the one or more instructions, controls the display to display, on the main screen, the first image corresponding to the first three-dimensional data of the sub screen when all of the second three-dimensional data has been hidden by the operation of hiding a predetermined number of the most recently input frames among the frames constituting the second three-dimensional data in response to the control signal corresponding to an undo command.
- A scanner comprising: an optical unit that performs a scan operation; one or more sensors; a communication interface; and a processor, wherein the processor: controls the communication interface to transmit, to a data processing device, scan data of an object obtained by scanning the object using the optical unit; obtains a user input detected through the one or more sensors; controls the optical unit to stop the scan operation according to the user input; and either performs a function corresponding to the user input or controls the communication interface to transmit, to the data processing device, a control signal for causing the function corresponding to the user input to be performed.
- The scanner of claim 8, wherein each of the one or more sensors corresponds to one or more functions among a plurality of functions.
- The scanner of claim 9, wherein the plurality of functions include at least one of a data adjustment function, a locking function, and a function of adjusting a setting of the optical unit.
- An operating method of a data processing device, the method comprising: receiving, from a scanner, scan data of an object obtained by scanning the object; displaying an image corresponding to three-dimensional data generated based on the scan data; receiving from the scanner, while the image is displayed, a control signal generated according to a user input detected through one or more user interfaces of the scanner; performing a function of adjusting the generated three-dimensional data according to the control signal; and displaying an image based on the adjusted three-dimensional data.
- The method of claim 11, further comprising hiding a predetermined number of the most recently input frames among the frames constituting the three-dimensional data when the control signal corresponds to an undo command.
- The method of claim 12, further comprising restoring the predetermined number of the most recently hidden frames when the control signal corresponds to a redo command.
- The method of claim 13, further comprising finalizing the three-dimensional data based on its current state, so that the hiding and the restoring are no longer permitted, when the control signal corresponds to a locking command.
- The method of claim 11, further comprising: displaying, on a sub screen, a first image corresponding to first three-dimensional data generated based on scan data corresponding to a first area of the object; displaying, on a main screen, a second image corresponding to second three-dimensional data generated based on scan data corresponding to a second area of the object spaced apart from the first area; performing a function of adjusting the three-dimensional data corresponding to the second image displayed on the main screen according to the control signal received from the scanner; and displaying the second image based on the adjusted three-dimensional data.
- The method of claim 15, further comprising setting a locking function that prevents a data adjustment function from being performed on the three-dimensional data corresponding to the first image displayed on the sub screen.
- The method of claim 15, further comprising displaying, on the main screen, the first image corresponding to the first three-dimensional data of the sub screen when all of the second three-dimensional data has been hidden by the operation of hiding a predetermined number of the most recently input frames among the frames constituting the second three-dimensional data in response to the control signal corresponding to an undo command.
- An operating method of a scanner including an optical unit and one or more sensors, the method comprising: transmitting, to a data processing device, scan data of an object obtained by scanning the object using the optical unit; obtaining a user input detected through the one or more sensors; controlling the optical unit to stop the scan operation according to the user input; and either performing a function corresponding to the user input or transmitting, to the data processing device, a control signal for causing the function corresponding to the user input to be performed.
- The method of claim 18, wherein each of the one or more sensors corresponds to one or more functions among a plurality of functions.
- A computer-readable recording medium on which one or more programs to be executed by a processor of a data processing device to implement an operating method of the data processing device are recorded, wherein the operating method comprises: receiving, from a scanner, scan data of an object obtained by scanning the object; displaying an image corresponding to three-dimensional data generated based on the scan data; receiving from the scanner, while the image is displayed, a control signal generated according to a user input detected through one or more user interfaces of the scanner; performing a function of adjusting the generated three-dimensional data according to the control signal; and displaying an image based on the adjusted three-dimensional data.
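The undo/redo/locking behavior claimed above (undo hides a predetermined number of the most recently input frames rather than deleting them, redo restores the most recently hidden frames, and locking finalizes the current state so that neither operation is permitted afterwards) can be sketched as a small frame-history model. This is an illustrative sketch only: the claims do not prescribe an implementation, and the class name, method names, and the value of `UNDO_STEP` are assumptions for demonstration.

```python
UNDO_STEP = 5  # assumed value for the "predetermined number" of frames per undo/redo


class ScanModel:
    """Illustrative model of three-dimensional scan data as a frame sequence."""

    def __init__(self):
        self.visible = []   # frames currently reflected in the displayed image
        self.hidden = []    # frames hidden by undo; restorable by redo (LIFO)
        self.locked = False

    def add_frame(self, frame):
        if self.locked:
            raise RuntimeError("data is finalized; no further adjustment allowed")
        self.hidden.clear()  # new scan input invalidates the redo history
        self.visible.append(frame)

    def undo(self):
        # Hide (not delete) the most recently input frames.
        if self.locked or not self.visible:
            return
        step = min(UNDO_STEP, len(self.visible))
        self.hidden.extend(self.visible[-step:])
        del self.visible[-step:]

    def redo(self):
        # Restore the most recently hidden frames.
        if self.locked or not self.hidden:
            return
        step = min(UNDO_STEP, len(self.hidden))
        self.visible.extend(self.hidden[-step:])
        del self.hidden[-step:]

    def lock(self):
        # Finalize the current state: hiding and restoring are no longer permitted.
        self.hidden.clear()
        self.locked = True
```

Because hidden frames are retained until the model is locked, a redo can exactly reverse an undo; locking then discards the hidden frames and freezes the data, matching the "finalize based on the current state" language of claims 4 and 14.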
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22784965.0A EP4321128A1 (en) | 2021-04-07 | 2022-04-06 | Data processing device, scanner, and method for operating same |
US18/285,694 US20240180397A1 (en) | 2021-04-07 | 2022-04-06 | Data processing device, scanner, and method for operating same |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20210045438 | 2021-04-07 | ||
KR10-2021-0045438 | 2021-04-07 | ||
KR10-2021-0131884 | 2021-10-05 | ||
KR1020210131884A KR102626891B1 (ko) | 2021-04-07 | 2021-10-05 | Data processing device, scanner, and method for operating same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022216056A1 true WO2022216056A1 (ko) | 2022-10-13 |
Family
ID=83545535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/004974 WO2022216056A1 (ko) | Data processing device, scanner, and method for operating same | 2021-04-07 | 2022-04-06 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240180397A1 (ko) |
EP (1) | EP4321128A1 (ko) |
WO (1) | WO2022216056A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100126700A (ko) * | 2008-01-23 | 2010-12-02 | SensAble Technologies, Inc. | Haptically enabled dental modeling system |
KR20170113412A (ko) * | 2016-03-30 | 2017-10-12 | Vatech Co., Ltd. | Dental intraoral scanner system |
KR20170125924A (ko) * | 2015-03-06 | 2017-11-15 | Align Technology, Inc. | Automatic selection and locking of intraoral images |
KR20180126166A (ko) * | 2017-05-17 | 2018-11-27 | Vatech Co., Ltd. | Intraoral scanner having a display and intraoral scanner system including the same |
US20190011996A1 (en) * | 2015-03-06 | 2019-01-10 | Align Technology, Inc. | Intraoral scanner with touch sensitive input |
- 2022
- 2022-04-06 EP EP22784965.0A patent/EP4321128A1/en active Pending
- 2022-04-06 US US18/285,694 patent/US20240180397A1/en active Pending
- 2022-04-06 WO PCT/KR2022/004974 patent/WO2022216056A1/ko active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20240180397A1 (en) | 2024-06-06 |
EP4321128A1 (en) | 2024-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2019039771A1 (en) | ELECTRONIC DEVICE FOR STORING DEPTH INFORMATION IN RELATION TO AN IMAGE BASED ON DEPTH INFORMATION PROPERTIES OBTAINED USING AN IMAGE, AND ITS CONTROL METHOD | |
- WO2016006946A1 (ko) | System for generating and reproducing augmented reality content, and method using same | |
- WO2021242050A1 (ko) | Oral image processing method, oral diagnosis device for performing operation according thereto, and computer-readable storage medium storing program for performing the method | |
- WO2021133053A1 (ko) | Electronic device and control method thereof | |
- WO2019035582A1 (en) | DISPLAY APPARATUS AND SERVER, AND METHODS OF CONTROLLING THE SAME | |
- WO2022092627A1 (ko) | Method for determining object area from three-dimensional model, and three-dimensional model processing device | |
- WO2016080653A1 (en) | Method and apparatus for image processing | |
- WO2022035221A1 (ko) | Oral image processing device and oral image processing method | |
- WO2022216056A1 (ko) | Data processing device, scanner, and method for operating same | |
- WO2022164175A1 (ko) | Method and device for processing three-dimensional oral model | |
- WO2021242053A1 (ko) | Three-dimensional data acquisition method and device, and computer-readable storage medium storing program for performing the method | |
- WO2022092802A1 (ko) | Method and device for processing three-dimensional oral model | |
- WO2022065756A1 (ko) | Oral image processing device and oral image processing method | |
- WO2022203305A1 (ko) | Data processing device and data processing method | |
- WO2023282619A1 (ko) | Method for adding text on three-dimensional model, and three-dimensional model processing device | |
- WO2022019647A1 (ko) | Oral image processing device and oral image processing method | |
- WO2023038455A1 (ko) | Method for processing oral image, and data processing device | |
- WO2023059166A1 (ko) | Method for processing oral image, and data processing device | |
- WO2022203354A1 (ko) | Three-dimensional oral model processing device and three-dimensional oral model processing method | |
- WO2017155365A1 (en) | Electronic apparatus for providing panorama image and control method thereof | |
- WO2023059167A1 (ko) | Oral image processing device and oral image processing method | |
- WO2023063767A1 (ko) | Oral image processing device and oral image processing method | |
- EP3366037A1 (en) | Electronic apparatus for providing panorama image and control method thereof | |
- WO2023063805A1 (ko) | Oral image processing device and oral image processing method | |
- WO2022265270A1 (ko) | Image processing device and image processing method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22784965 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 18285694 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 2022784965 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2022784965 Country of ref document: EP Effective date: 20231107 |