CN115546103A - Oral CBCT automatic registration method - Google Patents
- Publication number
- CN115546103A (application number CN202211023684.8A)
- Authority
- CN
- China
- Prior art keywords
- centroid
- point
- cbct
- tooth
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0012—Biomedical image inspection
- G06T5/70—Denoising; Smoothing
- G06T7/10—Segmentation; Edge detection
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06V10/763—Image or video recognition using machine-learning-based clustering; non-hierarchical techniques, e.g. based on statistics of modelling distributions
- G06T2207/10081—Computed x-ray tomography [CT]
- G06T2207/20081—Training; Learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30036—Dental; Teeth
Abstract
The invention provides an oral CBCT automatic registration method, which comprises the following steps: making a heat map label of the tooth centroid based on the oral CBCT image; training a tooth centroid detection model; performing tooth centroid detection and post-processing on the preoperative and postoperative CBCT images respectively to obtain the three-dimensional coordinates of each tooth centroid point; clustering the tooth centroid coordinates with a clustering algorithm to separate the upper jaw from the lower jaw; sorting the tooth centroid points of the jaw to obtain an ordered centroid point set for each of the two CBCT datasets; registering the tooth centroid points with a point registration algorithm; and performing CBCT point cloud registration to obtain the preoperative-to-postoperative oral CBCT registration transformation. The method combines deep-learning-based centroid extraction, a point registration algorithm and the ICP algorithm; it is simple to operate, fast and accurate, provides an accurate data basis for medical personnel's postoperative evaluation of dental implant surgery, and has broad application prospects and value.
Description
Technical Field
The invention relates to the field of medical image processing, in particular to an oral CBCT automatic registration method.
Background
In recent years, dynamic navigation technology has been widely applied and developed in the field of oral implant surgery. In a typical dynamic-navigation implant procedure, the position of the implant is planned in the patient's oral CBCT before the operation; during the operation, the CBCT model, the planned implant and the actual implant are tracked in real time, assisting the surgeon in placing the actual implant at the planned position, i.e. performing the operation more accurately.
Evaluating the placement accuracy after implant insertion is an important criterion for judging the effect of a dynamic-navigation implant operation. However, the registration accuracy of the preoperative and postoperative CBCT models directly affects the accuracy and reliability of the evaluation result. Therefore, before the placement accuracy can be evaluated, the postoperative oral CBCT must first be registered to the position of the preoperative oral CBCT; only then can the error between the actual implant and the planned implant be calculated.
The oral CBCT registration methods commonly used at present are the point registration algorithm and the iterative closest point (ICP) algorithm. The point registration algorithm works by manually selecting at least three corresponding points on the preoperative and postoperative CBCT models, such as tooth cusps or fossae, then computing the positional transformation between the two groups of selected points and applying it to register the CBCT models. Its advantage is that, because the points are selected manually, the registration success rate is very high; its drawbacks are that the manual point selection is cumbersome, and because the selected positions are subjective, the registration accuracy is limited. The ICP algorithm first extracts an isosurface from each of the two CBCT models, performs point cloud registration on the two isosurfaces to obtain their transformation, and finally applies this transformation to register the CBCT models. It is an automatic registration algorithm; when registration succeeds it is accurate and simple to operate, but because the two CBCT scans are taken at different times and may differ in whether an implant is present, the extracted isosurfaces can differ greatly, so the registration success rate is not high.
Disclosure of Invention
The invention aims to overcome the defects of the prior art described in the background, and provides an oral CBCT automatic registration method that combines deep-learning-based centroid extraction, a point registration algorithm and the ICP algorithm. It is simple to operate, fast and accurate, provides an accurate data basis for medical personnel's postoperative evaluation of dental implant surgery, and has broad application prospects and value.
The invention is realized by adopting the following technical scheme: an oral CBCT automatic registration method is provided, which comprises the following steps:
making a heat map label of the tooth centroid based on the oral CBCT image;
training a tooth centroid detection model;
respectively carrying out tooth centroid detection and post-processing on the CBCT images before and after the operation to obtain three-dimensional coordinates of each tooth centroid point;
clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper jaw from the lower jaw;
sorting the tooth centroid points of the jaw to obtain an ordered centroid point set for each of the two CBCT datasets;
registering the tooth centroid points of the jaw by using a point registration algorithm;
and carrying out CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration conversion relationship.
Further, the heat map labeling of tooth centroids based on oral CBCT images includes:
displaying three orthogonal slices including a transverse plane, a sagittal plane and a coronal plane based on the oral CBCT image;
delineating a foreground region and a background region on three orthogonal slices for each tooth;
based on the preliminarily delineated foreground and background regions, obtaining a tooth segmentation result by using a graph cut algorithm;
calculating the central point of the segmentation result of each tooth to obtain the centroid point of the tooth;
and performing Gaussian smoothing on the center of mass point image to obtain a heat map label of the center of mass of the tooth.
Further, the smoothing radius of the Gaussian smoothing of the centroid point image is set to 2 mm.
Further, the heat map label of the tooth centroid is a three-dimensional image with a value range of [0, 1], in which the value at a centroid point is 1 and the value decreases the farther a voxel is from the centroid point.
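As a concrete illustration of such a heat-map label, the sketch below builds a [0, 1] volume that equals 1 at each centroid and decays as a Gaussian with distance. The volume shape, centroid positions and `sigma_vox` are hypothetical; the patent specifies a 2 mm smoothing radius, so the equivalent value in voxels depends on the scan spacing:

```python
import numpy as np

def centroid_heatmap(shape, centroids_vox, sigma_vox):
    """Build a [0, 1] heat-map label: 1.0 at each tooth centroid,
    decaying as a Gaussian with distance (sigma_vox given in voxels)."""
    zz, yy, xx = np.indices(shape)
    heatmap = np.zeros(shape, dtype=np.float32)
    for cz, cy, cx in centroids_vox:
        d2 = (zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2
        # take the max so nearby teeth do not sum above 1
        heatmap = np.maximum(heatmap, np.exp(-d2 / (2.0 * sigma_vox ** 2)))
    return heatmap

# Hypothetical example: a single centroid in a small volume
hm = centroid_heatmap((32, 32, 32), [(16, 16, 16)], sigma_vox=4.0)
```

The max-combination of per-tooth Gaussians keeps the label in [0, 1] even when adjacent teeth are closer than a few sigma.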
Further, the training of the tooth centroid detection model comprises:
and training the input CBCT image and the corresponding mass center heat map label by adopting a deep learning model structure until a loss function is converged to obtain a neural network model parameter for mass center detection.
Further, the deep learning model adopts a UNet model, and the loss function adopts the mean squared error loss (MSELoss).
Further, the tooth centroid detection and post-processing are respectively performed on the CBCT images before and after the operation, so as to obtain three-dimensional coordinates of each tooth centroid point, including:
carrying out centroid detection on the CBCT images before and after the operation respectively by using the tooth centroid detection model, wherein the obtained centroid detection result is a heat map;
binarizing the heat map by taking 0.5 as a threshold value;
performing connected domain analysis on the binarized image, wherein each connected domain represents one tooth;
and calculating the central point of each connected domain to obtain the centroid coordinate of each tooth under the LPS coordinate system.
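The detection post-processing described above (binarize at 0.5, connected-component analysis, per-component center) can be sketched as follows. The `origin`/`spacing` handling for the voxel-to-LPS conversion is an assumption, since the patent does not spell out that mapping:

```python
import numpy as np
from scipy import ndimage

def heatmap_to_centroids(heatmap, origin, spacing, threshold=0.5):
    """Binarize the predicted heat map, split it into connected
    components (one per tooth) and return each component's center
    in physical coordinates (assumed: origin + index * spacing)."""
    mask = heatmap > threshold
    labels, n = ndimage.label(mask)  # one connected domain per tooth
    centers_vox = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return [tuple(np.asarray(origin) + np.asarray(c) * np.asarray(spacing))
            for c in centers_vox]

# Hypothetical example: two isolated "teeth" in a tiny volume
hm = np.zeros((10, 10, 10), dtype=np.float32)
hm[2, 2, 2] = 1.0
hm[7, 7, 7] = 1.0
pts = heatmap_to_centroids(hm, origin=(0.0, 0.0, 0.0), spacing=(1.0, 1.0, 1.0))
```

In practice the same routine is run once on the preoperative and once on the postoperative prediction.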
Further, the clustering the coordinates of the centers of mass of the teeth by using a clustering algorithm to separate the upper and lower jaws comprises:
taking two points with the minimum and maximum Z-axis coordinates in the centroid points as initial clustering centers, and calculating the distance from each centroid point to the two clustering centers in the Z direction;
assigning each centroid point to a cluster according to the distance, with the number of clusters set to 2, namely the upper jaw and the lower jaw; each centroid point belongs to the cluster whose cluster center is nearer;
after each centroid point is distributed with a cluster, calculating the central points of the two clusters, taking the two central points as new clustering centers, calculating the distance from each centroid point to the clustering centers again, and distributing the clusters;
the above process is repeated until the cluster center is no longer changed, at which time the centroid points are clustered into two clusters representing the set of centroid points for the maxillary teeth and the set of centroid points for the mandibular teeth, respectively.
Further, the clustering algorithm adopts a K-Means algorithm.
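Because only the Z coordinate drives the distance computation, the clustering described above reduces to a one-dimensional, 2-cluster K-Means. A minimal sketch under the assumption that both jaws are present and separable along Z (which cluster is the maxilla and which the mandible depends on scan orientation):

```python
import numpy as np

def split_jaws(centroids):
    """2-cluster K-Means on the Z coordinate: initial centers are the
    lowest and highest centroids; iterate until the centers are stable.
    Assumes both jaws contribute at least one centroid point."""
    z = np.asarray([c[2] for c in centroids], dtype=float)
    centers = np.array([z.min(), z.max()])
    while True:
        assign = np.abs(z[:, None] - centers[None, :]).argmin(axis=1)
        new_centers = np.array([z[assign == k].mean() for k in (0, 1)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    lower = [c for c, a in zip(centroids, assign) if a == 0]
    upper = [c for c, a in zip(centroids, assign) if a == 1]
    return lower, upper

# Hypothetical example: two well-separated groups along Z
pts = [(0, 0, 1), (0, 0, 2), (0, 0, 3), (0, 0, 10), (0, 0, 11), (0, 0, 12)]
lower, upper = split_jaws(pts)
```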
Further, the sorting of the tooth centroid points of the jaw to obtain an ordered centroid point set for each of the two CBCT datasets includes:
selecting the corresponding centroid point set according to the planned implant position: if the planned implant is in the upper jaw, sorting the centroid set of the lower jaw; if the planned implant is in the lower jaw, sorting the centroid set of the upper jaw;
the coordinate range of the oral CBCT image under the LPS coordinate system is marked as [ xmin, ymin, zmin ] to [ xmax, ymax, zmax ];
determining an angular point according to the image range, wherein four points can be used as the angular points, namely [ xmin, ymax, zmin ], [ xmin, ymax, zmax ], [ xmax, ymax, zmin ], [ xmax, ymax, zmax ];
selecting one of the corner points as the target corner point, finding the centroid point closest to the target corner point as the first point, then repeatedly taking the centroid point in the set closest to the most recently found point as the next point and removing each found point from the set;
repeating the above process until the set is empty, thereby obtaining the ordered centroid point sets of the two CBCT data, recorded as set P_A (preoperative) and set P_B (postoperative) respectively; at this point the preoperative and postoperative centroid points correspond one to one.
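The corner-seeded greedy ordering described above can be sketched as follows; the corner coordinates and the point values are placeholders:

```python
import numpy as np

def order_centroids(centroids, corner):
    """Greedy ordering: start from the centroid nearest the chosen
    corner point, then repeatedly append the nearest remaining
    centroid, removing each found point from the set."""
    remaining = [np.asarray(c, dtype=float) for c in centroids]
    current = np.asarray(corner, dtype=float)
    ordered = []
    while remaining:
        i = int(np.argmin([np.linalg.norm(p - current) for p in remaining]))
        current = remaining.pop(i)
        ordered.append(tuple(current))
    return ordered

# Hypothetical example: three points on a line, seeded from the origin
ordered = order_centroids([(3, 0, 0), (1, 0, 0), (2, 0, 0)], corner=(0, 0, 0))
```

Running the same routine with the same target corner on both scans yields two equally ordered sets, which is what makes the one-to-one correspondence possible.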
Further, the registering of the tooth centroid points of the jaw using a point registration algorithm comprises:
computing, with a point registration algorithm, the conversion matrix T1 from centroid point set P_B to set P_A, as shown in the following formula:
P_A = T1 · P_B
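The patent does not name a specific solver for T1; one common closed-form choice for rigid registration of corresponding point sets is the SVD-based Kabsch method, sketched here under that assumption (it needs at least three non-collinear correspondences):

```python
import numpy as np

def rigid_transform(P_B, P_A):
    """Least-squares rigid transform T1 (4x4 homogeneous) such that
    P_A ~= T1 applied to P_B, via the SVD-based Kabsch solution.
    P_B and P_A are (N, 3) arrays of corresponding points, N >= 3."""
    P_B, P_A = np.asarray(P_B, float), np.asarray(P_A, float)
    cb, ca = P_B.mean(axis=0), P_A.mean(axis=0)
    H = (P_B - cb).T @ (P_A - ca)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T1 = np.eye(4)
    T1[:3, :3] = R
    T1[:3, 3] = ca - R @ cb
    return T1

# Hypothetical check: recover a known rotation about Z plus a translation
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
P_B_demo = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
P_A_demo = P_B_demo @ R_true.T + t_true
T1 = rigid_transform(P_B_demo, P_A_demo)
```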
further, the CBCT point cloud registration is performed to obtain a preoperative and postoperative oral CBCT registration conversion relationship, including:
recording a preoperative CBCT model as A and a postoperative CBCT model as B;
moving the postoperative CBCT model B to a new position B' by the relationship matrix T1, as the coarse registration result:
B' = T1 · B
extracting isosurfaces from A and B' respectively to obtain two groups of point cloud data S_A and S_B', satisfying the following relationship:
S_A = iso(A), S_B' = iso(B')
computing, with the ICP algorithm, the conversion relationship T2 from point cloud S_B' to point cloud S_A, satisfying:
S_A ≈ T2 · S_B'
finally, the transformation matrix T that moves the postoperative CBCT model B to the preoperative CBCT model A can be obtained, satisfying the following relationship:
T = T2 · T1
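Treating the coarse point-based transform T1 and the ICP refinement T2 as 4x4 homogeneous matrices, the final conversion is their composition, with T1 applied first. A minimal sketch with placeholder translation-only matrices:

```python
import numpy as np

def compose(T2, T1):
    """Overall registration matrix: move the postoperative model by the
    coarse transform T1 first, then by the ICP refinement T2."""
    return T2 @ T1

def apply_to_points(T, pts):
    """Apply a 4x4 homogeneous transform to an (N, 3) point array."""
    pts = np.asarray(pts, float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ homog.T).T[:, :3]

# Hypothetical example: two pure translations compose into one
T1_demo = np.eye(4); T1_demo[:3, 3] = [1.0, 0.0, 0.0]
T2_demo = np.eye(4); T2_demo[:3, 3] = [0.0, 1.0, 0.0]
T = compose(T2_demo, T1_demo)
moved = apply_to_points(T, [[0.0, 0.0, 0.0]])
```

Note the order: matrix multiplication applies the right-hand factor first, matching "coarse registration, then ICP refinement".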
the invention provides an oral CBCT automatic registration method, which combines the centroid extraction based on deep learning, the point registration algorithm and the ICP algorithm, can automatically register oral CBCT images of the same patient, has the advantages of simple operation, high registration speed and high precision, provides accurate data basis for the postoperative evaluation of medical personnel for dental implant surgery, and has wide application prospect and value. Compared with the existing manual point selection registration algorithm, the method has the advantages that the operation is convenient, and the registration precision is higher; compared with the ICP algorithm, the method provided by the invention has a higher registration success rate.
Drawings
Features, advantages and technical effects of exemplary embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an oral CBCT automatic registration method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a thermal map labeling process for tooth centroids according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a centroid detection result provided by an embodiment of the present invention;
FIG. 4 is a diagram illustrating the results of model prediction and post-processing provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of four corner points of a CBCT image range according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a centroid point sorting process according to an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present disclosure will be described in detail below, and in order to make objects, technical solutions and advantages of the present disclosure more apparent, the present disclosure will be described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative only and are not intended to be limiting of the disclosure. It will be apparent to one skilled in the art that the present disclosure may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present disclosure by illustrating examples of the present disclosure.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another like element in a process, method, article, or apparatus that comprises the element.
For a better understanding of the present invention, embodiments thereof will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an oral CBCT automatic registration method according to an embodiment of the present invention.
As shown in fig. 1, the present invention provides an oral CBCT automatic registration method, which includes the following steps:
s101, manufacturing a heat map label of the tooth centroid based on the oral CBCT image;
s102, training a tooth centroid detection model;
s103, respectively carrying out tooth centroid detection and post-processing on the CBCT images before and after the operation to obtain three-dimensional coordinates of each tooth centroid point;
s104, clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper jaw and the lower jaw;
s105, sorting the tooth centroid points of the jaw to obtain an ordered centroid point set for each of the two CBCT datasets;
s106, registering the tooth centroid points of the jaw by using a point registration algorithm;
and S107, carrying out CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration conversion relation.
Fig. 2 is a schematic flow chart of thermal map labeling of tooth centroids according to an embodiment of the present invention.
As shown in fig. 2, the creating of the heat map label of the tooth centroid based on the oral CBCT image in S101 includes:
displaying three orthogonal slices including a transverse plane, a sagittal plane and a coronal plane based on the oral CBCT image;
delineating a foreground region and a background region on three orthogonal slices for each tooth;
based on the preliminarily sketched foreground and background areas, obtaining a segmentation result of the teeth by using an image segmentation algorithm;
calculating a central point of the segmentation result of each tooth to obtain a centroid point of the tooth;
and performing Gaussian smoothing on the centroid point image to obtain a heat map label of the tooth centroid.
Alternatively, other segmentation algorithms achieving the same or a similar effect, such as level sets or seeded region growing, may also be used to segment the teeth.
Optionally, a smoothing radius of gaussian smoothing of the centroid point image is set to 2mm.
Optionally, the heat map label of the tooth centroid is a three-dimensional image with a value range of [0, 1], wherein the value at the centroid point is 1 and the values of the voxel points farther from the centroid are smaller.
Optionally, the training of the tooth centroid detection model in S102 includes:
and training the input CBCT image and the corresponding mass center heat map label by adopting a deep learning model structure until a loss function is converged to obtain a neural network model parameter for mass center detection.
Optionally, the deep learning model adopts a UNet model, and the loss function adopts the mean squared error loss (MSELoss).
Alternatively, the deep learning model may also adopt other deep learning model structures capable of achieving the same or similar effects, and the loss function may also adopt other loss functions capable of achieving the same or similar effects.
Optionally, the performing tooth centroid detection and post-processing on the CBCT images before and after the operation in S103 to obtain three-dimensional coordinates of each tooth centroid point includes:
carrying out centroid detection on the CBCT image by using the tooth centroid detection model, wherein the obtained centroid detection result is a heat map as shown in figure 3;
binarizing the heat map by taking 0.5 as a threshold value;
performing connected domain analysis on the binarized image, wherein each connected domain represents one tooth as shown in fig. 4;
and calculating the central point of each connected domain to obtain the centroid coordinate of each tooth under the LPS coordinate system.
Optionally, the clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper and lower jaws in S104 includes:
taking two points with the minimum and maximum Z-axis coordinates in the centroid points as initial clustering centers, and calculating the distance from each centroid point to the two clustering centers in the Z direction;
assigning each centroid point to a cluster according to the distance, with the number of clusters set to 2, namely the upper jaw and the lower jaw; each centroid point belongs to the cluster whose cluster center is nearer;
after each centroid point is distributed with a cluster, calculating the central points of the two clusters, taking the two central points as new clustering centers, calculating the distance from each centroid point to the clustering centers again, and distributing the clusters;
the above process is repeated until the cluster center is no longer changed, at which time the centroid points are clustered into two clusters representing the set of centroid points for the maxillary teeth and the set of centroid points for the mandibular teeth, respectively.
Optionally, the clustering algorithm employs a K-Means algorithm.
Alternatively, the clustering algorithm may also adopt other algorithms that achieve the same or a similar effect, such as the Learning Vector Quantization (LVQ) algorithm or the KNN algorithm.
Optionally, the sorting of the centroid points of the jaw in S105 to obtain the ordered centroid point sets of the two CBCT data includes:
selecting the corresponding centroid point set according to the planned implant position: if the planned implant is in the upper jaw, sorting the centroid set of the lower jaw; if the planned implant is in the lower jaw, sorting the centroid set of the upper jaw;
the coordinate range of the oral CBCT image under the LPS coordinate system is marked as [ xmin, ymin, zmin ] to [ xmax, ymax, zmax ];
determining a corner point according to the image range, as shown in fig. 5, four points can be used as corner points, which are [ xmin, ymax, zmin ], [ xmin, ymax, zmax ], [ xmax, ymax, zmin ], [ xmax, ymax, zmax ];
selecting corner point 3 as the target corner point, as shown in fig. 6; finding the centroid point 1 closest to corner point 3 as the first point; then searching the centroid point set for the centroid point 2 closest to centroid point 1 as the next point, and removing centroid point 1 from the set;
repeating the above process until the set is empty, thereby obtaining the ordered centroid point sets of the two CBCT data, recorded as set P_A (preoperative) and set P_B (postoperative) respectively; at this point the preoperative and postoperative centroid points correspond one to one.
Optionally, the registering of the tooth centroid points using a point registration algorithm in S106 includes:
computing, with a point registration algorithm, the conversion matrix T1 from centroid point set P_B to set P_A, as shown in the following formula:
P_A = T1 · P_B
optionally, the performing CBCT point cloud registration in S107 to obtain a preoperative and postoperative oral CBCT registration conversion relationship includes:
recording a preoperative CBCT model as A and a postoperative CBCT model as B;
moving the postoperative CBCT model B to a new position B' by the relationship matrix T1, as the coarse registration result:
B' = T1 · B
extracting isosurfaces from A and B' respectively to obtain two groups of point cloud data S_A and S_B', satisfying the following relationship:
S_A = iso(A), S_B' = iso(B')
computing, with the ICP algorithm, the conversion relationship T2 from point cloud S_B' to point cloud S_A, satisfying:
S_A ≈ T2 · S_B'
finally, the transformation matrix T that moves the postoperative CBCT model B to the preoperative CBCT model A can be obtained, satisfying the following relationship:
T = T2 · T1
alternatively, other registration algorithms that can achieve the same or similar effect, such as RANSAC point cloud registration algorithm, 4PCS registration algorithm, etc., may also be used in performing CBCT point cloud registration.
The invention provides an oral CBCT automatic registration method that combines deep-learning-based centroid extraction, a point registration algorithm and the ICP algorithm, and can automatically register oral CBCT images of the same patient. It is simple to operate, fast and accurate, provides an accurate data basis for medical personnel's postoperative evaluation of dental implant surgery, and has broad application prospects and value. Compared with the existing manual point-selection registration algorithm, the method is more convenient to operate and achieves higher registration accuracy; compared with the ICP algorithm, it achieves a higher registration success rate.
While the invention has been described with reference to a preferred embodiment, various modifications may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In particular, the technical features mentioned in the embodiments can be combined in any way as long as there is no structural conflict. It is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (12)
1. An oral CBCT automatic registration method is characterized by comprising the following steps:
s101, manufacturing a heat map label of the tooth centroid based on the oral CBCT image;
s102, training a tooth centroid detection model;
s103, respectively carrying out tooth centroid detection and post-processing on the CBCT images before and after the operation to obtain three-dimensional coordinates of each tooth centroid point;
s104, clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper jaw and the lower jaw;
s105, sorting the tooth centroid points of the jaw to obtain an ordered centroid point set for each of the two CBCT datasets;
s106, registering the tooth centroid points of the jaw by using a point registration algorithm;
and S107, carrying out CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration conversion relation.
2. The oral CBCT automatic registration method as claimed in claim 1, wherein the step of making heat map labels of tooth centroids based on oral CBCT images in S101 comprises:
displaying three orthogonal slices including a transverse plane, a sagittal plane and a coronal plane based on the oral CBCT image;
delineating a foreground region and a background region on three orthogonal slices for each tooth;
based on the preliminarily sketched foreground and background areas, obtaining a tooth segmentation result by using a graph cutting algorithm;
calculating a central point of the segmentation result of each tooth to obtain a centroid point of the tooth;
and performing Gaussian smoothing on the center of mass point image to obtain a heat map label of the center of mass of the tooth.
3. The oral CBCT automatic registration method according to claim 2, wherein the smoothing radius for Gaussian smoothing of the centroid point image is set to 2 mm.
4. The oral CBCT automatic registration method as claimed in claim 2, wherein the heat map label of the tooth centroid is a three-dimensional image with a value range of [0, 1], in which the value at each centroid point is 1 and voxel values decrease with distance from the centroid.
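Claims 2–4 amount to: place a value of 1 at each centroid voxel, Gaussian-smooth with a ~2 mm radius, and rescale so the centroid value is again 1. A minimal sketch of such a label builder (function name and array conventions are illustrative, not from the patent; assumes NumPy/SciPy and axis-aligned voxel spacing):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def centroid_heatmap(shape, centroids, spacing, radius_mm=2.0):
    """Build a [0, 1] heat-map label: 1.0 at each tooth centroid voxel,
    decaying with distance via Gaussian smoothing (~2 mm radius)."""
    label = np.zeros(shape, dtype=np.float32)
    for c in centroids:
        label[tuple(int(round(v)) for v in c)] = 1.0
    # convert the physical radius into a per-axis voxel sigma,
    # so anisotropic CBCT spacing is handled
    sigma = [radius_mm / s for s in spacing]
    heat = gaussian_filter(label, sigma=sigma)
    return heat / heat.max()  # rescale so centroid voxels are ~1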
5. The oral CBCT automatic registration method as claimed in claim 1, wherein the training of the tooth centroid detection model in S102 comprises:
training a deep learning model on the input CBCT images and the corresponding centroid heat map labels until the loss function converges, obtaining neural network model parameters for centroid detection.
6. The oral CBCT automatic registration method of claim 5, wherein the deep learning model adopts a UNet model and the loss function adopts a mean square error loss (MSELoss) function.
7. The oral CBCT automatic registration method as claimed in claim 1, wherein the step S103 of performing tooth centroid detection and post-processing on the preoperative and postoperative CBCT images respectively to obtain three-dimensional coordinates of each tooth centroid point comprises:
performing centroid detection on the pre-operative and post-operative CBCT images respectively with the tooth centroid detection model, the detection result being a heat map;
binarizing the heat map with a threshold of 0.5;
performing connected-domain analysis on the binarized image, each connected domain representing one tooth;
and calculating the center point of each connected domain to obtain the centroid coordinates of each tooth in the LPS coordinate system.
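The post-processing of claim 7 (threshold at 0.5, label connected components, take each component's center in physical coordinates) can be sketched as follows; the function name and the simple origin-plus-spacing voxel-to-LPS mapping are illustrative assumptions, not taken from the patent:

```python
import numpy as np
from scipy.ndimage import label, center_of_mass

def heatmap_to_centroids(heatmap, origin, spacing, threshold=0.5):
    """Binarize the detection heat map, treat each connected component
    as one tooth, and return its centroid in physical (LPS) coordinates."""
    binary = heatmap > threshold
    components, n = label(binary)            # one integer label per tooth
    voxel_centroids = center_of_mass(binary, components, range(1, n + 1))
    # voxel index -> physical position: origin + index * spacing
    # (assumes an axis-aligned image without direction cosines)
    return [tuple(o + i * s for o, i, s in zip(origin, c, spacing))
            for c in voxel_centroids]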
8. The oral CBCT automatic registration method as claimed in claim 1, wherein said clustering the tooth centroid coordinates using a clustering algorithm to separate the upper and lower jaws in S104 comprises:
taking the two centroid points with the minimum and maximum Z-axis coordinates as the initial cluster centers, and calculating the Z-direction distance from each centroid point to the two cluster centers;
assigning each centroid point to the cluster whose center is closer, with the number of clusters set to 2, namely the upper jaw and the lower jaw;
after all centroid points have been assigned, calculating the mean point of each cluster, taking the two mean points as the new cluster centers, and re-calculating the distances and assignments;
repeating the above process until the cluster centers no longer change, at which point the centroid points form two clusters representing the centroid point set of the maxillary teeth and the centroid point set of the mandibular teeth, respectively.
9. The oral CBCT automatic registration method of claim 8, wherein said clustering algorithm employs K-Means algorithm.
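Claims 8–9 describe a one-dimensional K-Means (K = 2) on the Z coordinate, seeded with the extreme-Z centroids. A minimal sketch (which of the two clusters is the upper jaw depends on the image orientation; the function name is illustrative):

```python
def split_jaws(centroids):
    """Cluster tooth centroids into two jaws with 1-D K-Means (K=2)
    on the Z coordinate, seeded with the min- and max-Z points."""
    zs = [c[2] for c in centroids]
    centers = [min(zs), max(zs)]       # initial cluster centers
    assignment = None
    while True:
        new_assignment = [0 if abs(z - centers[0]) <= abs(z - centers[1]) else 1
                          for z in zs]
        if new_assignment == assignment:   # centers no longer change
            break
        assignment = new_assignment
        for k in (0, 1):                   # recompute cluster means
            members = [z for z, a in zip(zs, assignment) if a == k]
            if members:
                centers[k] = sum(members) / len(members)
    low = [c for c, a in zip(centroids, assignment) if a == 0]
    high = [c for c, a in zip(centroids, assignment) if a == 1]
    return low, high                       # low-Z cluster, high-Z cluster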
10. The oral CBCT automatic registration method as claimed in claim 1, wherein sorting the tooth centroid points in S105 to obtain ordered centroid point sets for the two CBCT data comprises:
selecting the corresponding centroid point set according to the planned implant position: if the planned implant is in the upper jaw, sorting the centroid set of the lower jaw; if the planned implant is in the lower jaw, sorting the centroid set of the upper jaw;
denoting the coordinate range of the oral CBCT image in the LPS coordinate system as [xmin, ymin, zmin] to [xmax, ymax, zmax];
determining corner points from the image range, four of which may serve as corner points, namely [xmin, ymax, zmin], [xmin, ymax, zmax], [xmax, ymax, zmin] and [xmax, ymax, zmax];
selecting one corner point as the target corner point, finding the centroid point closest to it as the first point, then repeatedly finding the centroid point in the set closest to the most recently found point as the next point, removing each found point from the set;
repeating this process until the set is empty, yielding the ordered centroid point sets of the two CBCT data, denoted as set P_A and set P_B respectively; at this point the pre-operative and post-operative centroid points correspond one to one.
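The ordering of claim 10 is a nearest-neighbour chain: start from the centroid closest to the chosen corner, then repeatedly append the nearest remaining centroid. A minimal sketch (function name is illustrative; brute-force search is fine for the ~16 teeth per jaw):

```python
import math

def order_centroids(centroids, corner):
    """Order centroid points as a nearest-neighbour chain, starting
    from the point closest to the target image corner."""
    remaining = list(centroids)
    current = min(remaining, key=lambda p: math.dist(p, corner))
    ordered = [current]
    remaining.remove(current)
    while remaining:
        # nearest remaining point to the most recently found point
        current = min(remaining, key=lambda p: math.dist(p, current))
        ordered.append(current)
        remaining.remove(current)
    return ordered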
11. The oral CBCT automatic registration method as claimed in claim 1, wherein the registration of the dental centroid points by using the point registration algorithm in S106 comprises:
calculating with a point registration algorithm the conversion matrix T_1 from centroid point set P_B to centroid point set P_A, satisfying the following relation:
P_A = T_1 · P_B
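The patent only names "a point registration algorithm" for the ordered, one-to-one corresponded centroid sets. A common closed-form choice for this problem is the SVD-based least-squares rigid fit (Kabsch/Umeyama without scaling); this is a plausible sketch under that assumption, not necessarily the algorithm the patent intends:

```python
import numpy as np

def rigid_fit(P_B, P_A):
    """Closed-form least-squares rigid transform T1 (4x4 homogeneous)
    such that T1 applied to P_B best matches P_A, assuming the two
    point sets are ordered in one-to-one correspondence."""
    P_B, P_A = np.asarray(P_B, float), np.asarray(P_A, float)
    mu_B, mu_A = P_B.mean(0), P_A.mean(0)
    H = (P_B - mu_B).T @ (P_A - mu_A)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = mu_A - R @ mu_B
    T1 = np.eye(4)
    T1[:3, :3], T1[:3, 3] = R, t
    return T1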
12. The oral CBCT automatic registration method as claimed in claim 1, wherein performing CBCT point cloud registration in S107 to obtain the pre-operative/post-operative oral CBCT registration transformation relation comprises:
denoting the pre-operative CBCT model as A and the post-operative CBCT model as B;
moving the post-operative CBCT model B to a new position B_1 by the relation matrix T_1, i.e. B_1 = T_1 · B, as the coarse registration result;
extracting isosurfaces from A and B_1 respectively to obtain two groups of point cloud data, denoted Q_A and Q_B;
calculating with the ICP algorithm the conversion relation T_2 from point cloud Q_B to point cloud Q_A, satisfying Q_A = T_2 · Q_B;
finally obtaining the transformation relation matrix T that moves the post-operative CBCT model B to the pre-operative CBCT model A, satisfying the relation T = T_2 · T_1.
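The coarse-to-fine scheme of S107 can be sketched end to end: an ICP refinement produces T2, and the final matrix is the composition T = T2 · T1. The brute-force nearest-neighbour search, fixed iteration count, and function names below are illustrative simplifications, not the patent's implementation:

```python
import numpy as np

def _rigid(P, Q):
    """Closed-form rigid fit mapping corresponded points P onto Q."""
    muP, muQ = P.mean(0), Q.mean(0)
    U, _, Vt = np.linalg.svd((P - muP).T @ (Q - muQ))
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, muQ - R @ muP
    return T

def icp(src, dst, iters=30):
    """Point-to-point ICP: returns a 4x4 matrix T2 aligning src onto dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    T2 = np.eye(4)
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour in dst for every current point
        nn = dst[np.argmin(np.linalg.norm(cur[:, None] - dst[None], axis=2),
                           axis=1)]
        step = _rigid(cur, nn)
        cur = cur @ step[:3, :3].T + step[:3, 3]
        T2 = step @ T2          # accumulate the incremental transforms
    return T2

# coarse-to-fine composition as in the claim: T = T2 @ T1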
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211023684.8A CN115546103A (en) | 2022-08-25 | 2022-08-25 | Oral CBCT automatic registration method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115546103A true CN115546103A (en) | 2022-12-30 |
Family
ID=84725664
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211023684.8A Pending CN115546103A (en) | 2022-08-25 | 2022-08-25 | Oral CBCT automatic registration method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115546103A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116883246A (en) * | 2023-09-06 | 2023-10-13 | 感跃医疗科技(成都)有限公司 | Super-resolution method for CBCT image |
CN116883246B (en) * | 2023-09-06 | 2023-11-14 | 感跃医疗科技(成都)有限公司 | Super-resolution method for CBCT image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||