CN111476905B - Robot-assisted tooth preparation simulation system based on augmented reality - Google Patents
- Publication number
- CN111476905B (application CN202010261538.3A)
- Authority
- CN
- China
- Prior art keywords
- model
- curve
- preparation
- real
- curved surface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4023—Decimation- or insertion-based scaling, e.g. pixel or line decimation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/344—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30036—Dental; Teeth
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention discloses an augmented reality-based robot-assisted tooth preparation simulation system, relates to the technical field of robot-assisted tooth preparation, and aims to solve the problems that doctors currently view a patient's tooth model only through CT images, that tooth preparation information and the preparation plan cannot be visualized, and that effective communication with the patient is therefore difficult to achieve. By adding a computer-generated virtual model of the patient's teeth to the real environment around the doctor, a combined virtual-real environment is created that improves the visualization of the patient's oral cavity and the human-machine interaction mode, and improves treatment efficiency. The system comprises a real-time registration module, a tooth preparation trajectory planning module, and a UI and model real-time interaction module. The real-time registration module allows the doctor to display the virtual medical model in any real environment in real time; the tooth preparation trajectory planning module obtains the preparation trajectory for robot-assisted tooth preparation; the UI and model real-time interaction module provides doctors and other users with a more convenient and flexible interaction mode and a more expressive augmented reality experience, thereby enabling more reasonable preoperative planning and improving the machining precision of the preparation body.
Description
Technical Field
The invention relates to an augmented reality-based robot-assisted tooth preparation simulation system, and belongs to the technical field of robot-assisted tooth preparation.
Background
Caries is a major cause of tooth defects and seriously affects oral health. Dental restoration is an important means of treating tooth defects, and tooth preparation is an essential step of the restoration process: the operation in which a doctor quantitatively removes tooth tissue at the caries site and shapes the remaining tooth into the desired three-dimensional form. The traditional tooth preparation process depends on the doctor's manual operation and rich clinical experience, and requires a great deal of repeated fine adjustment of the carious tooth. However, the current imbalance between the numbers of doctors and patients in China cannot meet the extremely large demand for tooth preparation, so robots and auxiliary software need to be introduced to replace or assist doctors in completing preparation work efficiently; this can both relieve the doctor-patient imbalance and effectively improve the quality of tooth preparation and of oral treatment as a whole.
In actual clinical dental restoration, doctors view the patient's tooth model only through CT images; tooth preparation information and the preparation plan cannot be visualized, and effective communication with the patient is difficult to achieve. Computer simulation technology is therefore added on top of the introduction of the robot, improving the visualization of the patient's oral cavity and the doctor's human-machine interaction mode, and improving treatment efficiency.
Disclosure of Invention
In view of these problems, the invention aims to provide an augmented reality-based robot-assisted tooth preparation simulation system that remedies the lack of an effective visual simulation system in existing robot-assisted tooth preparation, improves the visualization of the oral cavity and the doctor's human-machine interaction mode, improves treatment efficiency, and thereby realizes robot-assisted tooth preparation.
The invention adopts the following scheme to solve the above problems:
An augmented reality-based robot-assisted tooth preparation simulation system, comprising: a real-time registration module, a tooth preparation trajectory planning module, and a UI and model real-time interaction module;
the real-time registration module applies image processing algorithms for markers with various identification features: the camera captures the marker picture, pattern detection identifies the ID information of each marker point, and the relative pose between the marker point and the viewpoint is calculated so as to match the virtual object to the prepared marker pattern; corresponding pairings between images of a time sequence are completed, realizing real-time-tracking augmented reality that integrates the doctor's actual usage scene and allows the doctor to display a virtual medical model in any real environment in real time;
the tooth preparation trajectory planning module processes the model of the target tooth to obtain the surface data of the target tooth, then divides, back-calculates, and interpolates the surface data; tooth preparation trajectory planning for the surface is completed according to the surface's data points and patch features, finally yielding the tooth preparation trajectory for robot-assisted tooth preparation;
the UI and model real-time interaction module provides doctors and other users with a more convenient and flexible interaction mode and a more expressive augmented reality experience, and divides the information data into four functional scenes according to intraoperative requirements: "display patient dentition", "display preparation model", "tooth preparation curve/surface" and "robot control part"; the module uses touch feedback to recognize finger states, realizing scaling, rotation and transparency changes of the model, which helps doctors understand the patient's information and the post-preparation state of the tooth intuitively and comprehensively, enables more reasonable preoperative planning, strengthens communication with the patient, raises post-preparation precision, and improves treatment efficiency;
the beneficial effects of the invention are as follows:
1. In the processing performed by the real-time registration module, the invention completes registration using the spatial positions and the characteristic textures or shapes on the registered object: the required spatial and feature information is extracted, corresponding pairings between images of a time sequence are completed, and real-time-tracking augmented reality is realized. Feature matching against the object features stored in a database then allows the full position and orientation of the camera to be calculated, which improves the registration precision in the depth direction, widens the scenes in which the system can be deployed, and allows a doctor to display the virtual medical model in any real environment in real time.
2. In the region-division stage of the surface, the invention takes the adjacency features of the teeth as references and uses the number N_s of quadrilateral patches S, the number N_t of triangular patches T, and the number N_m of special intersection points M generated in the preprocessing stage as dividing parameters. The triangular patches T and the special intersection points M are found to be distributed on the two sides where the arc changes. Taking the labial side as the reference direction, the functional tip bevel is divided into a left adjacent surface, a middle tip bevel, and a right adjacent surface, and two boundary curves with extension features in the u and v directions are obtained, meeting the individualized requirements of the functional tip bevel and optimizing the computation rate of the tooth preparation curve.
3. In the back-calculation of the functional tip bevel tooth preparation curve, considering that the adjacent teeth of the target tooth must not be touched during preparation, the invention exploits the property of curves obtained by the cumulative chord-length parameterization method, which reflects how the discrete points Q are distributed along the chord length; this effectively improves the preparation precision of the tooth preparation curve and keeps the functional tip bevel preparation curve uniform, smooth and continuous.
Drawings
For ease of illustration, the invention is described in detail by the following detailed description and the accompanying drawings.
FIG. 1 illustrates the principle of position registration;
FIG. 2 shows an example of the natural-marker-point method based on depth information;
FIG. 3 is a diagram of the relationships between the functional scenes;
FIG. 4 shows the start-interface UI design and the scene-switching C# program;
FIG. 5 shows the back-calculation flow for the surface control points;
FIG. 6 is a flow chart of the surface interpolation;
FIG. 7 shows the configuration interface and part of the C# program of the scaling model;
FIG. 8 shows the configuration interface and part of the C# program of the rotation model;
FIG. 9 shows the configuration interface and part of the C# program that changes the transparency of the model;
FIG. 10 shows screenshots of the published DPAS AR software;
FIG. 11 shows a test of the "display patient dentition" scene;
FIG. 12 shows the "display patient dentition" and "tooth preparation curve/surface" scenes;
fig. 13 shows a test of the "robot control part" scene.
Detailed Description
To make the objects, technical solutions and advantages of the invention clear, the invention is described below through the specific embodiments shown in the drawings. It should be understood that these descriptions are merely exemplary and are not intended to limit the scope of the invention; in the following, descriptions of well-known structures and techniques are omitted so as not to obscure the concepts of the invention unnecessarily.
Embodiment I
As shown in fig. 1, 2, 3, and 4, a robot-assisted dental preparation simulation system according to the present embodiment includes:
the system comprises a real-time registration module, a tooth preparation track planning module, a UI and model real-time interaction module;
the real-time registration module applies image processing algorithms for markers with various identification features: the camera captures the marker picture, pattern detection identifies the ID information of each marker point, and the relative pose between the marker point and the viewpoint is calculated so as to match the virtual object to the prepared marker pattern; corresponding pairings between images of a time sequence are completed, realizing real-time-tracking augmented reality that integrates the doctor's actual usage scene and allows the doctor to display a virtual medical model in any real environment in real time;
the tooth preparation trajectory planning module processes the model of the target tooth to obtain the surface data of the target tooth, then divides, back-calculates, and interpolates the surface data; tooth preparation trajectory planning for the surface is completed according to the surface's data points and patch features, finally yielding the tooth preparation trajectory for robot-assisted tooth preparation;
the UI and model real-time interaction module provides doctors and other users with a more convenient and flexible interaction mode and a more expressive augmented reality experience, and divides the information data into four functional scenes according to intraoperative requirements: "display patient dentition", "display preparation model", "tooth preparation curve/surface" and "robot control part"; the module uses touch feedback to recognize finger states, realizing scaling, rotation and transparency changes of the model, which helps doctors understand the patient's information and the post-preparation state of the tooth intuitively and comprehensively, enables more reasonable preoperative planning, strengthens communication with the patient, raises post-preparation precision, and improves treatment efficiency;
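The patent does not spell out the algorithm by which the relative pose between matched marker/feature points and the viewpoint is computed. As an illustrative sketch only: once feature points have been matched between the stored model and the camera view, a rigid pose can be recovered with the standard Kabsch/SVD point-set alignment (the function name is hypothetical, not from the patent):

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate the rotation R and translation t that align matched
    point sets src -> dst (Kabsch/SVD rigid alignment)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)    # centroids
    H = (src - cs).T @ (dst - cd)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # reflection fix: force a proper rotation (det R = +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Given at least three non-collinear matched points, `dst ≈ R @ src + t` holds exactly for noise-free data; with noisy matches the result is the least-squares optimal pose.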
Embodiment II
As shown in fig. 5 and 6, in this embodiment, the specific process of implementing the function of the tooth preparation trajectory planning module is as follows:
1) Preprocessing of the preparation body model
A standardized preparation model in obj format is obtained by scanning the clinical tooth preparation model prepared by the doctor. The standardized three-dimensional obj model is then processed with the reverse-engineering software Geomagic Wrap, and the generated triangular and quadrilateral surface patches, together with their shape parameters and patch features, are processed to obtain a smoother preparation model and to optimize the calculation speed;
2) Region division of functional tip bevels
The number N_s of quadrilateral patches S, the number N_t of triangular patches T, and the number N_m of special intersection points M generated in the preprocessing stage are used as dividing parameters, and the ratio R_s between the number N_s of quadrilateral patches S and the number N_t of triangular patches T is controlled within 0.05, where a special intersection point M is a node whose number of adjacent edge units in the topology optimization stage satisfies N_k > 4. Taking the labial side as the reference direction, the functional tip bevel is divided into a left adjacent surface, a middle tip bevel, and a right adjacent surface;
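The dividing parameters above can be computed directly from a patch list. The sketch below is a minimal illustration; reading R_s as N_t / N_s is an assumption, since the text only states that the ratio of the two counts is kept within 0.05:

```python
def division_parameters(faces):
    """Compute the dividing parameters of the region-division step.

    `faces` is a list of patches, each a tuple of vertex indices
    (length 3 = triangular patch T, length 4 = quadrilateral patch S).
    Returns N_s, N_t, the ratio R_s (assumed here to be N_t / N_s), and
    the set of special intersection points M: vertices whose number of
    incident edges N_k exceeds 4.
    """
    n_s = sum(1 for f in faces if len(f) == 4)
    n_t = sum(1 for f in faces if len(f) == 3)
    r_s = n_t / n_s if n_s else float("inf")

    edges_at = {}  # vertex -> set of incident (undirected) edges
    for f in faces:
        for a, b in zip(f, f[1:] + f[:1]):  # walk the closed edge loop
            e = frozenset((a, b))
            edges_at.setdefault(a, set()).add(e)
            edges_at.setdefault(b, set()).add(e)
    special = {v for v, es in edges_at.items() if len(es) > 4}
    return n_s, n_t, r_s, special
```

On a triangle fan of five triangles around one vertex plus one hundred disjoint quadrilaterals, this reports the fan center (six incident edges) as the only special point and a ratio of exactly 0.05.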
3) Determination of the functional tip bevel knot vectors U and V and back-calculation of the control points P_{i,j}
The data points Q_{i,j} of the i-th row in the u direction (i = 0, 1, 2, ..., n; j = 0, 1, ..., m) are parameterized by the cumulative chord-length method, and the parameters of the i-th row in the u direction are denoted u_{i,j}:
For the u- and v-direction parameterizations, the arithmetic mean of the same-column parameters over all rows can be taken, giving equations (2) and (3):
Because the functional tip bevel based on a NURBS surface has two directions, the data points Q_{i,j} of each row in the u direction must first be back-calculated as degree-p curves, so that the section line of the i-th row can be constructed and the intermediate control points R_{i,j} obtained. The j coordinate in the v direction is then fixed and a degree-q back-calculation is performed on the columns in the v direction, with R_{i,j} as the data points; the result of this back-calculation gives the control points P_{i,j} of the j-th curve, and the P_{i,j} so obtained are the control points of the surface;
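The cumulative chord-length parameterization and the row-averaging step referred to as equations (2) and (3) follow the standard scheme from the NURBS literature; a minimal sketch, with hypothetical function names:

```python
import numpy as np

def chord_length_params(points):
    """Cumulative chord-length parameterization of data points Q_k:
    u_0 = 0, u_n = 1, each increment proportional to the chord
    |Q_k - Q_{k-1}|, so the parameters reflect the distribution of the
    discrete points along the chord length."""
    Q = np.asarray(points, dtype=float)
    chords = np.linalg.norm(np.diff(Q, axis=0), axis=1)
    u = np.concatenate([[0.0], np.cumsum(chords)])
    return u / u[-1]

def average_row_params(params_per_row):
    """Arithmetic mean of the same-column parameters over all rows --
    the averaging step the text refers to as equations (2) and (3)."""
    return np.mean(np.asarray(params_per_row, dtype=float), axis=0)
```

For points spaced 1, 1 and 2 units apart the parameters come out as 0, 0.25, 0.5, 1, mirroring the chord distribution.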
4) Surface interpolation of the functional tip bevel
The interpolation of the functional tip bevel is implemented by MATLAB programming. An interpolation function "functional_level_interaction" is defined, whose inputs are the degrees p and q in the u and v directions, the counts N_u and N_v in the two directions, and the surface data points Q_{i,j}, and whose outputs are the knot vectors U and V and the control points P_x, P_y and P_z. The values output by this function are passed to the plotting function "plot_functional_level", and a grid is input to generate the total SUM value, completing the surface interpolation of the functional tip bevel;
5) Generation of the robot tooth preparation trajectory for the functional tip bevel
The surface is divided into two groups of parameter lines along the u and v directions; the curve lengths in the two directions are compared, and the longer line is selected as the reference line of the robot end trajectory, the direction of the other line serving as the feed direction. Based on the data points Q_{i,j} obtained in step 3), and so that the robot can accurately complete the desired adjacent-surface tooth preparation curve, the curve must be divided into segments; in theory, the more segments, the closer the robot end trajectory approaches the desired curve. A NURBS curve mathematical model is therefore introduced;
ω_i (i = 0, 1, ..., n) are the weight factors; P_i (i = 0, 1, ..., n) are the control vertices, n+1 in number; p is the degree of the NURBS curve; N_{i,p}(u) are the basis functions. The N_{i,p}(u) in formula (4) is:
In formula (5), u_i is an element of the non-uniform knot vector U, as shown in equation (6);
m+1 is the length of the knot vector U; m, p and n are related by m = n + p + 1; a and b take the values 0 and 1;
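Formulas (4)-(6), shown as images in the original, are the standard NURBS curve definition: C(u) = Σ N_{i,p}(u) ω_i P_i / Σ N_{i,p}(u) ω_i, with N_{i,p}(u) given by the Cox-de Boor recursion on the knot vector U. A direct, unoptimized sketch of that definition:

```python
import numpy as np

def basis(i, p, u, U):
    """Cox-de Boor recursion for the B-spline basis N_{i,p}(u)."""
    if p == 0:
        if U[i] <= u < U[i + 1]:
            return 1.0
        # convention: close the last non-empty span at u = U[-1]
        if u == U[-1] and U[i] < U[i + 1] == U[-1]:
            return 1.0
        return 0.0
    left = right = 0.0
    if U[i + p] != U[i]:
        left = (u - U[i]) / (U[i + p] - U[i]) * basis(i, p - 1, u, U)
    if U[i + p + 1] != U[i + 1]:
        right = (U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1]) * basis(i + 1, p - 1, u, U)
    return left + right

def nurbs_point(u, P, w, p, U):
    """Evaluate C(u) = sum_i N_{i,p}(u) w_i P_i / sum_i N_{i,p}(u) w_i."""
    P = np.asarray(P, dtype=float)
    w = np.asarray(w, dtype=float)
    N = np.array([basis(i, p, u, U) for i in range(len(P))])
    return (N * w) @ P / (N * w).sum()
```

With a clamped knot vector (a = 0 and b = 1 each repeated p+1 times), the curve interpolates its first and last control vertices, as expected from the formulas above.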
The interpolation principle is to use the time sequence {t_1, t_2, ..., t_k, ..., t_{n-1}, t_n} to segment the parameter sequence {u_1, u_2, ..., u_k, u_{k+1}, ..., u_{n-1}, u_n}, and then obtain the interpolation point sequence {C(u_1), C(u_2), ..., C(u_k), ..., C(u_{n-1}), C(u_n)}. The core of the interpolation calculation is to compute, using the interpolation period T, the relation between u_k and u_{k+1}, and then obtain C(u_{k+1}) from C(u_k);
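The u_k → u_{k+1} relation can be sketched with the common first-order Taylor step of a NURBS interpolator; the patent's exact formulas (the omitted images) may differ, so this is an assumption:

```python
import math

def next_parameter(u_k, feed, T, dC_du):
    """One step of the parameter update u_k -> u_{k+1} using the
    first-order approximation u_{k+1} ≈ u_k + V*T / |C'(u_k)|, where V
    is the feed rate, T the interpolation period, and dC_du the curve's
    first-derivative vector C'(u_k)."""
    norm = math.sqrt(sum(c * c for c in dC_du))
    return u_k + feed * T / norm
```

The step size adapts to the curve's local parameter speed: where |C'(u)| is large, the parameter advances more slowly so the chordal feed rate stays near V.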
First, the NURBS curve is differentiated to obtain the first derivative of the curve with respect to u, C^(1)(u);
where C^(1)(u) is the first derivative of the curve with respect to u;
Continuing, the second derivative of the curve is calculated, giving C^(2)(u):
where C^(2)(u) is the second derivative of the curve with respect to u, and the expression for the basis-function derivative N_{i,p}^(m)(u) is:
The feed rate on the NURBS curve is expressed as:
Rearranging and transforming formula (10) gives formula (11):
where C_x^(1)(u_k), C_y^(1)(u_k) and C_z^(1)(u_k) are the first derivatives of the curve in the x, y and z directions;
Continuing with the second derivative of formula (12) gives:
where C_x^(2)(u_k), C_y^(2)(u_k) and C_z^(2)(u_k) are the second derivatives of the curve in the x, y and z directions;
Rearranging yields the relation between the geometric characteristics and the motion characteristics of the NURBS-based functional tip bevel tooth preparation trajectory:
Embodiment III
As shown in fig. 7, 8, and 9, in this embodiment, the specific process of implementing the function of the UI and model real-time interaction module is:
1) Model scaling
A C# scaling script "Scale" is written and attached to the model in each scene. The script first records the touch state of the fingers: the zoom operation is based on the two-point gesture of multi-touch, with Input.GetTouch(0) recording the first touch point and Input.GetTouch(1) the second. The distance between the touch points recorded this time minus the previously recorded distance oldDistance captures the change of the gesture: if the difference is greater than 0 the gesture is zooming in, otherwise zooming out. The basic factor of the scaling operation is the product of the model's scale coefficients in the X, Y and Z directions, and the coefficients are derived from the distances, so the model changes size as the gesture zooms;
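The pinch-to-zoom logic described above can be sketched outside Unity as follows; the `sensitivity` coefficient and the lower clamp are illustrative values, not from the patent:

```python
import math

def pinch_scale(scale, p1_old, p2_old, p1_new, p2_new, sensitivity=0.01):
    """Two-finger pinch gesture -> uniform model scale.

    The distance between the two new touch points minus the previously
    recorded distance (oldDistance in the C# script) gives the gesture
    delta: positive enlarges the model, negative shrinks it.
    """
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    delta = dist(p1_new, p2_new) - dist(p1_old, p2_old)
    factor = max(0.05, 1.0 + sensitivity * delta)  # never flip or vanish
    return tuple(s * factor for s in scale)
```

Spreading the fingers apart yields a factor above 1 (enlarge); pinching them together yields a factor below 1 (shrink), applied equally in X, Y and Z.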
2) Model rotation
The model is rotated by adjusting a slider. The rotation interaction could also be completed through the same touch operation as scaling, but finger touch is inconvenient for quantifying the interaction and visualizing the operation, whereas a slider lets the rotation state be viewed and adjusted intuitively. In the C# script that controls model rotation by slider, an external slider parameter and the model Object are defined; the changing slider value is obtained with GetComponent<Slider>().value, rotation of the model is realized with the Object's transform.Rotate() function, and the obtained slider value is used as the model's rotation speed, so the rotation interaction of the model follows the sliding speed of the slider;
3) Changing model transparency
The doctor can change the transparency of the preparation-body model as required; this new interaction mode makes the relations of the three stages more intuitive to examine. Changing model transparency still uses the slider interaction mode: the change of the model's transparency alpha value is completed by changing the slider value. A Slider slide and the transparent model Object are defined externally; internally a float alpha is defined and assigned alpha = slide.value. When the value equals 1, the model's rendering mode is opaque; when the slider value is less than 1, the model switches to transparent rendering;
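The slider-to-transparency mapping reduces to a few lines; a sketch of the logic described above (the clamping of out-of-range slider values is an added assumption):

```python
def transparency_mode(slider_value):
    """Map the slider value to the model's alpha and rendering mode:
    alpha = slide.value, opaque at exactly 1, transparent below 1."""
    alpha = max(0.0, min(1.0, float(slider_value)))
    mode = "Opaque" if alpha == 1.0 else "Transparent"
    return alpha, mode
```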
Embodiment IV
To illustrate the use of the system, as shown in fig. 10, 11, 12, and 13, the operation of the invention is demonstrated below with a concrete example.
1) Start interface and Main Menu
The published DPAS AR app is installed on a Mate 20 Pro and run. DPAS AR opens to the start interface of the tooth preparation auxiliary software, which shows the software name, affiliation, version number and other information; the real-time dynamic background scene of the software matches the intended design. Clicking "start" in the start interface enters the main menu interface, which contains the four scenes and a return key for going back to the start interface;
2) Displaying patient dentition
In the main menu, "display patient dentition" is selected to enter preset scene I, where any of the patient tooth CT images on the right can be selected and viewed, and switching between CT images can be completed according to the needs of the doctor or user. After a patient's CT image is selected, clicking the preset "AR mode" enters AR mode; any plane in the surrounding environment can be selected, and after a few seconds a guide icon appears for guiding the placement of the virtual patient dentition. One of the placement conditions set in the test is that double-clicking the plane places the model: when the doctor or user double-clicks the plane, placement of the model is completed, and scaling and rotation of the model are then available through the slider below the model and touch operations;
3) Displaying the model of the preparation and the tooth preparation curve/curved surface
Clicking "return" in the scene displaying the patient dentition returns to the main menu, from which the scene displaying the preparation model is entered and the tooth preparation information is opened as required; clicking the three buttons on the left side selects the tooth of each preparation stage, so that the virtual tooth model of each stage can be displayed. Likewise, the main menu can be returned to in order to enter the tooth preparation curve/curved surface scene, where three models can be displayed according to the preset UI;
4) Robot control part
As shown in fig. 13, the "robot control part" scene is entered and the Bluetooth module of the robot is installed in the robot's communication port; in the initial state, when Bluetooth is not connected, the indicator lamps of the Bluetooth module show only the power indicator lamp. The designed tooth preparation auxiliary robot is then connected to the tooth preparation auxiliary software over Bluetooth communication, after which the connection-state indicator lamp of the Bluetooth module stays constantly on to show that the connection succeeded;
While there has been shown and described what are at present considered to be the basic principles, essential features and advantages of the invention, it will be understood by those skilled in the art that the invention is not limited by the foregoing embodiments; the foregoing examples and description merely illustrate the principles of the invention, and various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims. The scope of the invention is defined by the appended claims and equivalents thereof.
Claims (1)
1. An augmented reality-based robot-assisted dental preparation simulation system, characterized in that: the augmented reality-based robot-assisted tooth preparation simulation system comprises a real-time registration module, a tooth preparation trajectory planning module, and a UI and model real-time interaction module;
the real-time registration module uses image processing algorithms on markers with various identification features: a camera captures and detects the pattern in the marker picture to identify the ID information of the marker point, and the relative pose of the marker point and the viewpoint is calculated so as to register the virtual object to the prepared marker pattern and to complete the corresponding pairing between images of the time sequence, thereby realizing real-time-tracking augmented reality, integrating the doctor's actual usage scene, and meeting the requirement that a doctor can display a virtual medical model in any real environment in real time;
the tooth preparation trajectory planning module processes the model of the target tooth to obtain the curved surface data of the target tooth, divides, back-calculates and interpolates the curved surface data, completes the tooth preparation trajectory planning of the curved surface according to the data points and surface patch characteristics of the curved surface, and finally obtains the tooth preparation trajectory for robot-assisted tooth preparation;
the UI and model real-time interaction module provides a more convenient and flexible interaction mode for doctors or users and a more expressive augmented reality experience, dividing the information data into four functional scenes according to intraoperative requirements: "display patient dentition", "display preparation model", "tooth preparation curve/curved surface" and "robot control part"; the UI and model real-time interaction module uses touch feedback to identify finger states, thereby realizing scaling, rotation and transparency change of the model, helping doctors intuitively and comprehensively understand the patient's information and the condition of the teeth after preparation, achieving more reasonable preoperative planning, enhancing communication with patients, increasing the precision after preparation, and improving treatment efficiency;
the specific process by which the tooth preparation trajectory planning module realizes its functions is as follows:
1) Pretreatment of the preparation model
The doctor scans a clinical tooth preparation model to obtain a standardized preparation model in obj format; the standardized three-dimensional tooth preparation model in obj format is processed with Geomagic Wrap reverse engineering software, and the generated triangular and quadrilateral surface patches, together with the shape parameters and characteristics of the patches, are processed to obtain a smoother preparation model and to optimize the calculation speed;
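The obj-format model handled in this step has a simple text structure. Purely as an illustration (the patent performs the actual preprocessing in Geomagic Wrap), a minimal Python sketch of reading the vertices and the triangular/quadrilateral surface patches from such a file might look like:

```python
def load_obj(lines):
    """Minimal OBJ reader: collect vertices plus triangular and quadrilateral
    faces (surface patches) from an obj-format model.  `lines` is any iterable
    of text lines, e.g. an open file object."""
    verts, tris, quads = [], [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            # vertex record: v x y z
            verts.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # face record: indices are 1-based and may carry /vt/vn suffixes
            idx = [int(p.split("/")[0]) - 1 for p in parts[1:]]
            (tris if len(idx) == 3 else quads).append(idx)
    return verts, tris, quads
```

Counting the collected triangles and quadrilaterals would then give the patch numbers used as dividing parameters in step 2).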
2) Region division of functional tip bevels
The number N_s of quadrilateral surface patches S generated in the pretreatment stage, the number N_t of triangular surface patches T, and the number N_m of special intersection points M are taken as dividing parameters, and the ratio R_s of the number N_s of quadrilateral surface patches S to the number N_t of triangular surface patches T is controlled within 0.05, where a special intersection point M refers to a node whose number N_k of adjacent edge units in the topology optimization stage satisfies N_k > 4; with the labial side as the reference direction, the functional tip inclined plane is divided into a left adjacent surface, a middle tip inclined surface and a right adjacent surface;
3) Determination of the functional tip inclined plane node vectors U and V and back calculation of the control points P_{i,j}
The cumulative chord length parameterization method is adopted to determine the parameters of the data points Q_{i,j} (i=0, 1, 2, ..., n; j=0, 1, ..., m) of the i-th row in the u direction, and the parameter of the i-th row in the u direction is denoted u_{i,j}:
The u-direction and v-direction parameterizations can take the arithmetic mean of the same column parameters for all rows to arrive at equations (2), (3):
because the functional tip inclined plane based on the NURBS curved surface has two parameter directions, curve back calculation of degree p is first performed on the data points Q_{i,j} of the i-th row in the u direction, so that the section line of the i-th row can be constructed and the intermediate control points R_{i,j} obtained; then the column index j is fixed and back calculation of degree q is performed on the j-th column in the v direction, with R_{i,j} as the data points of the back calculation, so that the back calculation result yields the control points P_{i,j} of the j-th curve; the P_{i,j} thus obtained are the control points of the curved surface;
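The cumulative chord length parameterization and the row-averaging of equations (2) and (3) can be sketched in Python as follows (an illustrative sketch; function names are my own, not from the patent):

```python
import math

def chord_length_params(points):
    """Cumulative chord-length parameterization of one row of data points
    Q_{i,0..m}; returns parameters u_0..u_m normalized to [0, 1]."""
    chords = [math.dist(points[k], points[k + 1]) for k in range(len(points) - 1)]
    total = sum(chords)
    u = [0.0]
    for c in chords:
        u.append(u[-1] + c / total)
    u[-1] = 1.0  # guard against accumulated rounding error
    return u

def average_params(rows):
    """Arithmetic mean of the same-column parameters over all rows, as in
    equations (2) and (3)."""
    per_row = [chord_length_params(r) for r in rows]
    cols = len(per_row[0])
    return [sum(p[j] for p in per_row) / len(per_row) for j in range(cols)]
```

A row of four collinear points spaced 1, 2, 1 apart, for example, is parameterized as 0, 0.25, 0.75, 1.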
4) Curved surface interpolation of functional tip inclined plane
The interpolation process of the functional tip inclined plane is realized by MATLAB programming: an interpolation function "functional_level_interaction" is defined, whose variables are the degrees p and q in the u and v directions, the numbers of data points N_u and N_v in the two directions, and the surface data points Q_{i,j}, and whose outputs are the node vectors U and V and the control point coordinates P_x, P_y and P_z; the values output by the function are input into the graph-drawing function "plot_functional_level", and the grid is input to generate the total SUM value, thereby completing the curved surface interpolation of the functional tip inclined plane;
5) Robot tooth preparation track generation of functional tip inclined plane
The curved surface is divided into two parameter line groups according to the u and v directions, the curve lengths in the two directions are compared, and the longer reference line is selected as the robot end trajectory reference line, with the direction of the other reference line as the feed direction. Starting from the data points Q_{i,j} obtained in step 3), in order to enable the robot to accurately complete the expected adjacent-tooth preparation curve, the curve needs to be divided; theoretically, the more segments it is divided into, the more the robot end trajectory approaches the expected curve, and for this a NURBS curve mathematical model is introduced;
where ω_i (i=0, 1, ..., n) are the weight factors; P_i (i=0, 1, ..., n) are the control vertices, n+1 in number; p is the degree of the NURBS curve; and N_{i,p}(u) are the basis functions; N_{i,p}(u) in formula (4) is:
in formula (5): u_i is an element of the non-uniform node vector U, as shown in equation (6);
m+1 is the length of the node vector U; the relation among m, p and n is m = n + p + 1; and a and b take the values 0 and 1 respectively;
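The recursive basis-function definition of formula (5) and the rational curve of formula (4) can be sketched in Python for illustration (the function names and the clamped example knot vector below are illustrative assumptions, not from the patent; the 0/0 convention of the recursion is taken as 0):

```python
def basis(i, p, u, U):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)
    over the knot vector U; degenerate 0/0 terms are treated as 0."""
    if p == 0:
        return 1.0 if U[i] <= u < U[i + 1] else 0.0
    left = 0.0
    if U[i + p] != U[i]:
        left = (u - U[i]) / (U[i + p] - U[i]) * basis(i, p - 1, u, U)
    right = 0.0
    if U[i + p + 1] != U[i + 1]:
        right = (U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1]) * basis(i + 1, p - 1, u, U)
    return left + right

def nurbs_point(u, P, w, p, U):
    """Evaluate the NURBS curve C(u) of formula (4): a rational combination
    of the control vertices P_i with weights w_i."""
    dim = len(P[0])
    num, den = [0.0] * dim, 0.0
    for i, (pt, wi) in enumerate(zip(P, w)):
        b = basis(i, p, u, U) * wi
        den += b
        num = [a + b * c for a, c in zip(num, pt)]
    return [a / den for a in num]
```

With all weights equal to 1 and a clamped knot vector on [0, 1] (a = 0, b = 1), the curve reduces to a Bezier curve over its control vertices.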
the interpolation principle is to use the time sequence {t_1, t_2, ..., t_k, ..., t_{n-1}, t_n} to segment the parameter sequence {u_1, u_2, ..., u_k, u_{k+1}, ..., u_{n-1}, u_n} and then obtain the interpolation point sequence {C(u_1), C(u_2), ..., C(u_k), ..., C(u_{n-1}), C(u_n)}; the core interpolation calculation is to compute, using the interpolation period T, the relation between u_k and u_{k+1}, and then obtain C(u_{k+1}) from C(u_k);
First, the derivative of the NURBS curve is derived to obtain the first derivative C^{(1)}(u) of the curve with respect to u;
Wherein: c (C) (1) (u) is the first derivative of the curve with respect to u;
continuing to calculate the second derivative of the curve gives C^{(2)}(u):
Wherein: c (C) (2) (u) is the second derivative of the curve to u, the basis function N i,p (m) The expression of (u) is:
the feed rate on the NURBS curve is expressed as:
rearranging and transforming formula (10) gives formula (11):
wherein: c (C) x (1) (u k ) Is the first derivative of the curve x direction; c (C) y (1) (u k ) Is the first derivative of the curve y direction; c (C) z (1) (u k ) Is the first derivative of the curve z direction;
continuing to calculate the second derivative, formula (12) is obtained:
wherein: c (C) x (2) (u k ) Is the second derivative of the curve in the x direction; c (C) y (2) (u k ) Is the second derivative of the curve y direction; c (C) z (2) (u k ) Is the second derivative of the curve in the z direction;
rearranging gives the relation between the geometric characteristics and the motion characteristics of the functional tip inclined plane tooth preparation trajectory based on the NURBS curve:
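As an illustrative sketch of the interpolation core described above (obtaining u_{k+1} from u_k within the interpolation period T), the following Python code approximates C^{(1)} and C^{(2)} by central finite differences and applies a common second-order Taylor parameter update. The patent derives the derivatives analytically, so this numeric version is a simplification under stated assumptions, with the curve given as a callable u -> (x, y, z):

```python
import math

def derivatives(curve, u, h=1e-5):
    """Approximate C^{(1)}(u) and C^{(2)}(u) per coordinate by central
    differences (the patent computes these analytically)."""
    f0, fp, fm = curve(u), curve(u + h), curve(u - h)
    d1 = [(a - b) / (2 * h) for a, b in zip(fp, fm)]
    d2 = [(a - 2 * b + c) / (h * h) for a, b, c in zip(fp, f0, fm)]
    return d1, d2

def next_parameter(curve, u_k, feed, T):
    """Second-order Taylor update giving u_{k+1} from u_k for feed rate
    `feed` and interpolation period T:
    u_{k+1} = u_k + V*T/|C'| - (V*T)^2 (C'.C'') / (2 |C'|^4)."""
    d1, d2 = derivatives(curve, u_k)
    n1 = math.sqrt(sum(x * x for x in d1))     # |C'(u_k)|
    dot = sum(a * b for a, b in zip(d1, d2))   # C'(u_k) . C''(u_k)
    step = feed * T
    return u_k + step / n1 - step * step * dot / (2 * n1 ** 4)
```

For a straight line C(u) = (2u, 0, 0), for instance, |C'| = 2 and the curvature correction vanishes, so one period at feed rate 1 with T = 0.1 advances the parameter by 0.05.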
the specific process by which the UI and model real-time interaction module realizes its functions is as follows:
1) Model scaling
A C# scaling-model program Scale is written and attached to the model in each scene. In the program, the touch state of the fingers first needs to be recorded; the scaling operation is based on the two-point operation in multi-point touch, with the first touch point recorded by Input.GetTouch(0) and the second recorded by Input.GetTouch(1). The distance between the touch points recorded this time, minus the distance oldDistance between the touch points recorded last time, records the change of the gesture: if the difference is greater than 0 the gesture is enlarging, otherwise it is shrinking. The basic factor of the scaling operation is the product of the model's coefficients in the three XYZ directions, and these coefficients are obtained from the distances, so that the model changes size as the gesture zooms;
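The pinch-gesture logic described above (comparing the current two-finger distance with oldDistance) can be sketched outside Unity in plain Python; the sensitivity constant is an assumed tuning value, not taken from the patent:

```python
import math

def pinch_scale(old_distance, touch_a, touch_b, sensitivity=0.01):
    """Compare this frame's two-finger distance with the previous frame's
    oldDistance: a positive difference means an enlarging gesture, a negative
    one a shrinking gesture.  Returns (scale_factor, new_distance); in Unity
    the factor would multiply the model's scale in the X, Y and Z directions.
    `sensitivity` is an assumed tuning constant."""
    new_distance = math.dist(touch_a, touch_b)
    factor = 1.0 + (new_distance - old_distance) * sensitivity
    return max(factor, 0.0), new_distance
```

The returned distance is fed back in as oldDistance on the next frame, mirroring the per-frame update in the Scale script.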
2) Model rotation
The model is rotated by adjusting the slider; the interaction of rotating the model could also be completed within the touch operation of the scaling model, but in the finger-touch mode it is inconvenient to quantify the interaction and visualize the operation, whereas with the slider the rotation state can be visually checked and adjusted. When writing the C# program that controls the model rotation with the slider, an external parameter Slider and a model Object are defined; the changing slider value is obtained with GetComponent<Slider>().value, and the rotation of the model is realized with the Object's transform.Rotate() function, with the obtained slider value used as the rotation speed of the model, so that the rotation interaction of the model follows the sliding speed of the slider;
3) Changing model transparency
The doctor can change the transparency of the preparation-body model as required; this new interaction mode allows the relation between the three preparation stages to be inspected more intuitively. Changing the model transparency currently still uses the slider interaction: the change of the model's alpha value is completed by changing the slider value. First, a slider Slide and a transparent model Object are defined externally, and a float value alpha is defined internally; alpha is assigned the slider value alphaSlide.value. When the value equals 1, the model's rendering mode is opaque, and when the alphaSlide value is less than 1, the model switches to the transparent rendering mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010261538.3A CN111476905B (en) | 2020-04-04 | 2020-04-04 | Robot-assisted tooth preparation simulation system based on augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111476905A (en) | 2020-07-31 |
CN111476905B (en) | 2023-11-21 |
Family
ID=71749836
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010261538.3A Active CN111476905B (en) | 2020-04-04 | 2020-04-04 | Robot-assisted tooth preparation simulation system based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111476905B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109785374A (en) * | 2019-01-23 | 2019-05-21 | Beihang University | A real-time automatic markerless image registration method for dental augmented reality surgical navigation |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7717708B2 (en) * | 2001-04-13 | 2010-05-18 | Orametrix, Inc. | Method and system for integrated orthodontic treatment planning using unified workstation |
JP5676325B2 (en) * | 2010-08-10 | 2015-02-25 | 伊藤 秀文 | Information processing apparatus, information processing method, and program |
Non-Patent Citations (2)
Title |
---|
Computer-aided design basis for occlusion reconstruction with functional occlusal records; Liu Mingli et al.; Journal of Modern Stomatology; 143-145 *
Motion planning and simulation of a dental arch curve generator; Jiang Jingang et al.; Journal of Harbin University of Science and Technology; 32-36 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||