CN115546103A - Oral CBCT automatic registration method - Google Patents

Oral CBCT automatic registration method Download PDF

Info

Publication number
CN115546103A
CN115546103A (application number CN202211023684.8A)
Authority
CN
China
Prior art keywords: centroid, point, cbct, tooth, points
Prior art date
Legal status
Pending
Application number
CN202211023684.8A
Other languages
Chinese (zh)
Inventor
乔天
王捷
聂智林
Current Assignee
Hangzhou Jianjia Medical Technology Co ltd
Original Assignee
Hangzhou Jianjia Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Jianjia Medical Technology Co., Ltd.
Priority to CN202211023684.8A
Publication of CN115546103A

Classifications

    • G06T 7/0012: Image analysis; Inspection of images, e.g. flaw detection; Biomedical image inspection
    • G06T 5/70: Image enhancement or restoration; Denoising; Smoothing
    • G06T 7/10: Image analysis; Segmentation; Edge detection
    • G06T 7/30: Image analysis; Determination of transform parameters for the alignment of images, i.e. image registration
    • G06V 10/762: Image or video recognition or understanding using pattern recognition or machine learning; using clustering, e.g. of similar faces in social networks
    • G06V 10/763: Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G06T 2207/10081: Image acquisition modality; Tomographic images; Computed x-ray tomography [CT]
    • G06T 2207/20081: Special algorithmic details; Training; Learning
    • G06T 2207/20084: Special algorithmic details; Artificial neural networks [ANN]
    • G06T 2207/30036: Subject of image; Biomedical image processing; Dental; Teeth

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Multimedia (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)

Abstract

The invention provides an oral CBCT automatic registration method, which comprises the following steps: making heat map labels of tooth centroids based on the oral CBCT image; training a tooth centroid detection model; performing tooth centroid detection and post-processing on the preoperative and postoperative CBCT images respectively to obtain the three-dimensional coordinates of each tooth centroid point; clustering the tooth centroid coordinates with a clustering algorithm to separate the upper and lower jaws; sorting the centroid points of one jaw to obtain ordered centroid point sets for the two CBCT data sets; registering the jaw centroid points with a point registration algorithm; and performing CBCT point cloud registration to obtain the preoperative-to-postoperative oral CBCT registration transformation. By combining deep-learning-based centroid extraction, a point registration algorithm and the ICP algorithm, the method is simple to operate, fast and accurate in registration, provides an accurate data basis for postoperative evaluation of dental implant surgery by medical personnel, and has broad application prospects and value.

Description

Oral CBCT automatic registration method
Technical Field
The invention relates to the field of medical image processing, in particular to an oral CBCT automatic registration method.
Background
In recent years, dynamic navigation technology has been widely applied and developed in the field of oral implant surgery. The main workflow of a dynamically navigated implant operation is to plan the implant position on the patient's oral CBCT before the operation and then, during the operation, to track the CBCT model and display the relative positions of the planned implant and the actual implant in real time, helping the surgeon place the actual implant at the planned position and thus perform the operation more accurately.
Evaluation of the placement accuracy after the implant has been placed is an important measure of the effect of a dynamically navigated implant operation, and the registration accuracy of the preoperative and postoperative CBCT models directly affects the accuracy and reliability of this evaluation. Before the placement accuracy can be assessed, the postoperative oral CBCT must therefore first be registered to the position of the preoperative oral CBCT; only then can the error between the actual implant and the planned implant be calculated.
The oral CBCT registration methods commonly used at present are the point registration algorithm and the iterative closest point (ICP) algorithm. The point registration algorithm manually selects at least three corresponding points on the preoperative and postoperative CBCT models, such as tooth cusps or fossae, computes the transformation between the two selected point sets, and applies it to the CBCT registration. Because the points are selected manually, the registration success rate is very high; the drawbacks are that the manual point selection is tedious and the point positions are chosen subjectively, so the registration accuracy is limited. The ICP algorithm first extracts an isosurface from each of the two CBCT models, performs point cloud registration between the two isosurfaces to obtain their transformation, and finally applies this transformation to the CBCT model registration. It is an automatic registration algorithm that, when it succeeds, is accurate and simple to operate; however, because the two CBCT scans are acquired at different times and may differ in whether an implant is present, the extracted isosurfaces can differ considerably, so the registration success rate is not high.
Disclosure of Invention
The invention aims to overcome the defects of the prior art described in the background and provides an oral CBCT automatic registration method that combines deep-learning-based centroid extraction, a point registration algorithm and the ICP algorithm. The method is simple to operate, fast and accurate in registration, provides an accurate data basis for postoperative evaluation of dental implant surgery by medical personnel, and has broad application prospects and value.
The invention is realized by adopting the following technical scheme: an oral CBCT automatic registration method is provided, which comprises the following steps:
making a heat map label of the tooth centroids based on the oral CBCT image;
training a tooth centroid detection model;
respectively carrying out tooth centroid detection and post-processing on the preoperative and postoperative CBCT images to obtain the three-dimensional coordinates of each tooth centroid point;
clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper jaw from the lower jaw;
sorting the centroid points of the jaw to obtain ordered centroid point sets for the two CBCT data sets;
registering the jaw centroid points by using a point registration algorithm;
and carrying out CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration transformation relationship.
Further, the making of the heat map label of the tooth centroids based on the oral CBCT image includes:
displaying three orthogonal slices, namely a transverse, a sagittal and a coronal slice, based on the oral CBCT image;
delineating a foreground region and a background region for each tooth on the three orthogonal slices;
based on the roughly delineated foreground and background regions, obtaining a segmentation result of each tooth by using a graph cut algorithm;
calculating the central point of the segmentation result of each tooth to obtain the centroid point of the tooth;
and performing Gaussian smoothing on the centroid point image to obtain the heat map label of the tooth centroids.
Further, the smoothing radius of the Gaussian smoothing of the centroid point image is set to 2 mm.
Further, the heat map label of the tooth centroids is a three-dimensional image with a value range of [0, 1], in which the value at a centroid point is 1 and the value of a voxel decreases with its distance from the centroid point.
Further, the training of the tooth centroid detection model includes:
training a deep learning model on the input CBCT images and the corresponding centroid heat map labels until the loss function converges, to obtain the neural network model parameters for centroid detection.
Further, the deep learning model adopts a UNet model, and the loss function adopts the mean square error (MSE) loss function.
Further, the tooth centroid detection and post-processing are respectively performed on the CBCT images before and after the operation, so as to obtain three-dimensional coordinates of each tooth centroid point, including:
carrying out centroid detection on the CBCT images before and after the operation respectively by using the tooth centroid detection model, wherein the obtained centroid detection result is a heat map;
binarizing the heat map by taking 0.5 as a threshold value;
performing connected domain analysis on the binarized image, wherein each connected domain represents one tooth;
and calculating the central point of each connected domain to obtain the centroid coordinate of each tooth under the LPS coordinate system.
Further, the clustering of the tooth centroid coordinates by using a clustering algorithm to separate the upper and lower jaws includes:
taking the two centroid points with the minimum and maximum Z-axis coordinates as the initial cluster centers, and calculating the distance in the Z direction from each centroid point to the two cluster centers;
assigning each centroid point to a cluster according to these distances, with the number of clusters set to 2, namely the upper jaw and the lower jaw; each centroid point is assigned to the cluster whose center is nearer;
after every centroid point has been assigned to a cluster, calculating the central points of the two clusters, taking these two central points as the new cluster centers, and recomputing the distances from each centroid point to the cluster centers and the cluster assignments;
repeating the above process until the cluster centers no longer change, at which point the centroid points are grouped into two clusters representing the centroid point set of the maxillary teeth and the centroid point set of the mandibular teeth, respectively.
Further, the clustering algorithm adopts a K-Means algorithm.
Further, the sorting of the jaw centroid points to obtain the ordered centroid point sets of the two CBCT data includes:
selecting the centroid point set to be sorted according to the planned implant position: if the planned implant is in the upper jaw, the centroid set of the lower jaw is sorted; if the planned implant is in the lower jaw, the centroid set of the upper jaw is sorted;
denoting the coordinate range of the oral CBCT image in the LPS coordinate system as [xmin, ymin, zmin] to [xmax, ymax, zmax];
determining corner points from the image range, where four points can serve as corner points, namely [xmin, ymax, zmin], [xmin, ymax, zmax], [xmax, ymax, zmin] and [xmax, ymax, zmax];
selecting one of the corner points as the target corner point, finding the centroid point closest to the target corner point as the first point, then searching the centroid point set for the centroid point closest to the most recently found point as the next point, and removing each found point from the set;
repeating the above process until the set is empty, which yields the ordered centroid point sets of the two CBCT data, denoted P_A and P_B respectively; the preoperative and postoperative centroid points then correspond to each other one by one.
Further, the registering of the jaw centroid points by using a point registration algorithm includes:
computing, with a point registration algorithm, the transformation matrix T1 from centroid point set P_B to centroid point set P_A, as shown in the following formula:
P_A = T1 · P_B
Further, the CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration transformation relationship includes:
recording the preoperative CBCT model as A and the postoperative CBCT model as B;
moving the postoperative CBCT model B to a new position B' with the relation matrix T1, i.e. B' = T1 · B, as the coarse registration result;
extracting isosurfaces from A and B' respectively to obtain two point clouds, denoted C_A and C_B';
computing, with the ICP algorithm, the transformation T2 from point cloud C_B' to point cloud C_A, which satisfies C_A = T2 · C_B';
finally obtaining the transformation matrix T that moves the postoperative CBCT model B to the preoperative CBCT model A, which satisfies the relationship T = T2 · T1.
the invention provides an oral CBCT automatic registration method, which combines the centroid extraction based on deep learning, the point registration algorithm and the ICP algorithm, can automatically register oral CBCT images of the same patient, has the advantages of simple operation, high registration speed and high precision, provides accurate data basis for the postoperative evaluation of medical personnel for dental implant surgery, and has wide application prospect and value. Compared with the existing manual point selection registration algorithm, the method has the advantages that the operation is convenient, and the registration precision is higher; compared with the ICP algorithm, the method provided by the invention has a higher registration success rate.
Drawings
Features, advantages and technical effects of exemplary embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an oral CBCT automatic registration method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the heat map labeling process for tooth centroids according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a centroid detection result provided by an embodiment of the present invention;
FIG. 4 is a diagram illustrating the results of model prediction and post-processing provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of four corner points of a CBCT image range according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a centroid point sorting process according to an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present disclosure will be described in detail below, and in order to make objects, technical solutions and advantages of the present disclosure more apparent, the present disclosure will be described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative only and are not intended to be limiting of the disclosure. It will be apparent to one skilled in the art that the present disclosure may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present disclosure by illustrating examples of the present disclosure.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of another like element in a process, method, article, or apparatus that comprises that element.
For a better understanding of the present invention, embodiments thereof will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an oral CBCT automatic registration method according to an embodiment of the present invention.
As shown in fig. 1, the present invention provides an oral CBCT automatic registration method, which includes the following steps:
S101, making a heat map label of the tooth centroids based on the oral CBCT image;
S102, training a tooth centroid detection model;
S103, respectively carrying out tooth centroid detection and post-processing on the preoperative and postoperative CBCT images to obtain the three-dimensional coordinates of each tooth centroid point;
S104, clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper jaw and the lower jaw;
S105, sorting the centroid points of the jaw to obtain ordered centroid point sets for the two CBCT data;
S106, registering the jaw centroid points by using a point registration algorithm;
and S107, carrying out CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration transformation relationship.
Fig. 2 is a schematic flow chart of the heat map labeling of tooth centroids according to an embodiment of the present invention.
As shown in fig. 2, the creating of the heat map label of the tooth centroid based on the oral CBCT image in S101 includes:
displaying three orthogonal slices including a transverse plane, a sagittal plane and a coronal plane based on the oral CBCT image;
delineating a foreground region and a background region on three orthogonal slices for each tooth;
based on the preliminarily sketched foreground and background areas, obtaining a segmentation result of the teeth by using an image segmentation algorithm;
calculating a central point of the segmentation result of each tooth to obtain a centroid point of the tooth;
and performing Gaussian smoothing on the centroid point image to obtain a heat map label of the tooth centroid.
Alternatively, other segmentation algorithms that achieve the same or similar effect, such as level sets or region growing from seed points, may also be used to segment the teeth.
Optionally, a smoothing radius of gaussian smoothing of the centroid point image is set to 2mm.
Optionally, the heat map label of the tooth centroid is a three-dimensional image with a value range of [0, 1], wherein the value at the centroid point is 1 and the values of the voxel points farther from the centroid are smaller.
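As a concrete illustration of this labeling step, the following is a minimal sketch of how such a heat map label could be generated with NumPy and SciPy, assuming the tooth centroids have already been converted to voxel indices; the function name, the spacing handling and the final normalization are illustrative assumptions rather than part of the patented method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def make_centroid_heatmap(shape, centroid_voxels, spacing_mm, radius_mm=2.0):
    """Build a [0, 1] heat-map label from tooth centroid voxel indices.

    shape           -- (z, y, x) size of the CBCT volume
    centroid_voxels -- list of (z, y, x) integer centroid indices
    spacing_mm      -- (z, y, x) voxel spacing in millimetres
    radius_mm       -- Gaussian smoothing radius (2 mm in this embodiment)
    """
    label = np.zeros(shape, dtype=np.float32)
    for z, y, x in centroid_voxels:
        label[z, y, x] = 1.0                          # impulse at each centroid
    sigma_voxels = [radius_mm / s for s in spacing_mm]  # mm -> voxels per axis
    label = gaussian_filter(label, sigma=sigma_voxels)
    # rescale so the peak at each centroid is ~1 (exact when the Gaussians do not overlap)
    label /= label.max() + 1e-8
    return label
```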
Optionally, the training of the tooth centroid detection model in S102 includes:
training a deep learning model on the input CBCT images and the corresponding centroid heat map labels until the loss function converges, to obtain the neural network model parameters for centroid detection.
Optionally, the deep learning model adopts a UNet model, and the loss function adopts the mean square error (MSELoss) function.
Alternatively, other deep learning model structures and other loss functions that achieve the same or similar effects may also be used.
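To make the training step concrete, the following is a schematic PyTorch training loop, assuming a 3D UNet implementation and a data loader yielding (CBCT volume, centroid heat map) tensor pairs; the names `model` and `loader` and the hyperparameters are illustrative placeholders and are not specified by the patent.

```python
import torch
import torch.nn as nn

def train_centroid_detector(model, loader, epochs=100, lr=1e-4, device="cuda"):
    """Train a heat-map regression model with mean-squared-error loss."""
    model = model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.MSELoss()                      # regression onto the [0, 1] heat map
    for epoch in range(epochs):
        running = 0.0
        for image, heatmap in loader:             # (N, 1, D, H, W) tensors
            image, heatmap = image.to(device), heatmap.to(device)
            optimizer.zero_grad()
            pred = model(image)                   # predicted centroid heat map
            loss = criterion(pred, heatmap)
            loss.backward()
            optimizer.step()
            running += loss.item()
        print(f"epoch {epoch}: loss {running / max(len(loader), 1):.6f}")
    return model
```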
Optionally, the tooth centroid detection and post-processing performed on the preoperative and postoperative CBCT images in S103 to obtain the three-dimensional coordinates of each tooth centroid point include:
carrying out centroid detection on the CBCT image by using the tooth centroid detection model, wherein the obtained centroid detection result is a heat map as shown in figure 3;
binarizing the heat map by taking 0.5 as a threshold value;
performing connected domain analysis on the binarized image, wherein each connected domain represents one tooth as shown in fig. 4;
and calculating the central point of each connected domain to obtain the centroid coordinate of each tooth under the LPS coordinate system.
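A minimal sketch of this post-processing with SciPy is shown below; the conversion from voxel index to LPS coordinate is simplified to an origin-plus-spacing mapping with matching axis order (no direction matrix), which is an assumption made for illustration.

```python
import numpy as np
from scipy import ndimage

def heatmap_to_centroids(heatmap, origin, spacing, threshold=0.5):
    """Turn a predicted heat map into per-tooth centroid coordinates (LPS, mm).

    origin, spacing -- image origin and voxel spacing in the LPS frame,
                       assumed to use the same axis order as the array.
    """
    binary = heatmap > threshold                        # binarise at 0.5
    labels, n = ndimage.label(binary)                   # one connected component per tooth
    centroids_vox = ndimage.center_of_mass(binary, labels, range(1, n + 1))
    # voxel index -> physical coordinate (origin + index * spacing)
    return [np.asarray(origin) + np.asarray(c) * np.asarray(spacing)
            for c in centroids_vox]
```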
Optionally, the clustering of the tooth centroid coordinates by using a clustering algorithm to separate the upper and lower jaws in S104 includes:
taking the two centroid points with the minimum and maximum Z-axis coordinates as the initial cluster centers, and calculating the distance in the Z direction from each centroid point to the two cluster centers;
assigning each centroid point to a cluster according to these distances, with the number of clusters set to 2, namely the upper jaw and the lower jaw; each centroid point is assigned to the cluster whose center is nearer;
after every centroid point has been assigned to a cluster, calculating the central points of the two clusters, taking these two central points as the new cluster centers, and recomputing the distances from each centroid point to the cluster centers and the cluster assignments;
repeating the above process until the cluster centers no longer change, at which point the centroid points are grouped into two clusters representing the centroid point set of the maxillary teeth and the centroid point set of the mandibular teeth, respectively.
Optionally, the clustering algorithm employs a K-Means algorithm.
Alternatively, other clustering algorithms that can achieve the same or similar effect, such as the learning vector quantization (LVQ) algorithm or the KNN algorithm, may also be used.
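The Z-direction K-Means described above reduces to a one-dimensional two-cluster problem; a minimal NumPy sketch is given below. The convention that the larger-Z cluster is the maxilla assumes an LPS orientation with Z pointing superiorly, and the function name is illustrative.

```python
import numpy as np

def split_jaws(centroids):
    """Cluster tooth centroids into upper and lower jaw by their Z coordinate (K = 2)."""
    z = np.asarray([c[2] for c in centroids], dtype=float)
    centers = np.array([z.min(), z.max()])              # initial cluster centres
    while True:
        # assign each centroid to the nearer centre along Z
        assign = np.argmin(np.abs(z[:, None] - centers[None, :]), axis=1)
        new_centers = np.array([z[assign == k].mean() if np.any(assign == k)
                                else centers[k] for k in (0, 1)])
        if np.allclose(new_centers, centers):            # centres stopped moving
            break
        centers = new_centers
    upper_idx = int(np.argmax(centers))                  # larger Z = superior (maxilla) in LPS
    upper = [c for c, a in zip(centroids, assign) if a == upper_idx]
    lower = [c for c, a in zip(centroids, assign) if a != upper_idx]
    return upper, lower
```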
Optionally, the sorting of the jaw centroid points in S105 to obtain the ordered centroid point sets of the two CBCT data includes:
selecting the centroid point set to be sorted according to the planned implant position: if the planned implant is in the upper jaw, the centroid set of the lower jaw is sorted; if the planned implant is in the lower jaw, the centroid set of the upper jaw is sorted;
denoting the coordinate range of the oral CBCT image in the LPS coordinate system as [xmin, ymin, zmin] to [xmax, ymax, zmax];
determining corner points from the image range, where, as shown in fig. 5, four points can serve as corner points, namely [xmin, ymax, zmin], [xmin, ymax, zmax], [xmax, ymax, zmin] and [xmax, ymax, zmax];
selecting corner point 3 as the target corner point, as shown in fig. 6, finding the centroid point 1 closest to corner point 3 as the first point, then searching the centroid point set for the centroid point 2 closest to centroid point 1 as the next point, and removing centroid point 1 from the set;
repeating the above process until the set is empty, which yields the ordered centroid point sets of the two CBCT data, denoted P_A and P_B respectively; the preoperative and postoperative centroid points then correspond to each other one by one.
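A greedy nearest-neighbour walk of this kind can be sketched as follows; applying it to the preoperative and postoperative centroid sets with the same target corner point yields the ordered sets P_A and P_B whose points correspond one to one. The function name and the use of Euclidean distance are illustrative assumptions.

```python
import numpy as np

def order_centroids(centroids, corner):
    """Order a jaw's centroid points by a greedy nearest-neighbour walk from a corner.

    corner -- the chosen target corner point, e.g. [xmin, ymax, zmin] in LPS.
    """
    remaining = [np.asarray(c, dtype=float) for c in centroids]
    ordered = []
    current = np.asarray(corner, dtype=float)
    while remaining:
        dists = [np.linalg.norm(p - current) for p in remaining]
        i = int(np.argmin(dists))                 # closest remaining centroid
        current = remaining.pop(i)                # remove it from the set
        ordered.append(current)                   # and append it to the ordered list
    return ordered
```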
Optionally, the registering of the jaw centroid points by using a point registration algorithm in S106 includes:
computing, with a point registration algorithm, the transformation matrix T1 from centroid point set P_B to centroid point set P_A, as shown in the following formula:
P_A = T1 · P_B
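The patent does not fix a particular point registration solver; one common closed-form choice for computing a rigid transform between two ordered, corresponding point sets is the SVD-based (Kabsch) solution sketched below, shown only as an illustrative possibility.

```python
import numpy as np

def rigid_transform(P_B, P_A):
    """Least-squares rigid transform T1 (4x4 matrix) such that P_A ≈ T1 · P_B.

    P_B, P_A -- (N, 3) arrays of corresponding centroid points.
    """
    P_B, P_A = np.asarray(P_B, float), np.asarray(P_A, float)
    cb, ca = P_B.mean(axis=0), P_A.mean(axis=0)         # centroids of the two sets
    H = (P_B - cb).T @ (P_A - ca)                       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                            # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = ca - R @ cb
    T1 = np.eye(4)
    T1[:3, :3], T1[:3, 3] = R, t
    return T1
```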
Optionally, the performing of the CBCT point cloud registration in S107 to obtain the preoperative and postoperative oral CBCT registration transformation relationship includes:
recording the preoperative CBCT model as A and the postoperative CBCT model as B;
moving the postoperative CBCT model B to a new position B' with the relation matrix T1, i.e. B' = T1 · B, as the coarse registration result;
extracting isosurfaces from A and B' respectively to obtain two point clouds, denoted C_A and C_B';
computing, with the ICP algorithm, the transformation T2 from point cloud C_B' to point cloud C_A, which satisfies C_A = T2 · C_B';
finally obtaining the transformation matrix T that moves the postoperative CBCT model B to the preoperative CBCT model A, which satisfies the relationship T = T2 · T1.
alternatively, other registration algorithms that can achieve the same or similar effect, such as RANSAC point cloud registration algorithm, 4PCS registration algorithm, etc., may also be used in performing CBCT point cloud registration.
The invention provides an oral CBCT automatic registration method that combines deep-learning-based centroid extraction, a point registration algorithm and the ICP algorithm and can automatically register oral CBCT images of the same patient. The method is simple to operate, fast and accurate in registration, provides an accurate data basis for postoperative evaluation of dental implant surgery by medical personnel, and has broad application prospects and value. Compared with the existing manual point selection registration algorithm, the method is more convenient to operate and achieves higher registration accuracy; compared with the ICP algorithm, it achieves a higher registration success rate.
While the invention has been described with reference to a preferred embodiment, various modifications may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In particular, the technical features mentioned in the embodiments can be combined in any way as long as there is no structural conflict. It is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (12)

1. An oral CBCT automatic registration method, characterized by comprising the following steps:
S101, making a heat map label of the tooth centroids based on the oral CBCT image;
S102, training a tooth centroid detection model;
S103, respectively carrying out tooth centroid detection and post-processing on the preoperative and postoperative CBCT images to obtain the three-dimensional coordinates of each tooth centroid point;
S104, clustering the tooth centroid coordinates by using a clustering algorithm to separate the upper jaw and the lower jaw;
S105, sorting the centroid points of the jaw to obtain ordered centroid point sets for the two CBCT data;
S106, registering the jaw centroid points by using a point registration algorithm;
and S107, carrying out CBCT point cloud registration to obtain the preoperative and postoperative oral CBCT registration transformation relationship.
2. The oral CBCT automatic registration method as claimed in claim 1, wherein the making of the heat map label of the tooth centroids based on the oral CBCT image in S101 comprises:
displaying three orthogonal slices, namely a transverse, a sagittal and a coronal slice, based on the oral CBCT image;
delineating a foreground region and a background region for each tooth on the three orthogonal slices;
based on the roughly delineated foreground and background regions, obtaining a tooth segmentation result by using a graph cut algorithm;
calculating the central point of the segmentation result of each tooth to obtain the centroid point of the tooth;
and performing Gaussian smoothing on the centroid point image to obtain the heat map label of the tooth centroids.
3. The oral CBCT automatic registration method according to claim 2, wherein a smoothing radius for Gaussian smoothing of the centroid point images is set to 2mm.
4. The oral CBCT automatic registration method as claimed in claim 2, wherein the heat map label of the tooth centroid is a three-dimensional image with a value range of [0, 1], wherein the value at the centroid point is 1, and the voxel point values farther from the centroid are smaller.
5. The oral CBCT automatic registration method as claimed in claim 1, wherein the training of the tooth centroid detection model in S102 comprises:
training a deep learning model on the input CBCT images and the corresponding centroid heat map labels until the loss function converges, to obtain the neural network model parameters for centroid detection.
6. The oral CBCT automatic registration method of claim 5, wherein the deep learning model adopts a UNet model and the loss function adopts a mean square error loss (MSELoss) function.
7. The oral CBCT automatic registration method as claimed in claim 1, wherein the step S103 of performing tooth centroid detection and post-processing on the preoperative and postoperative CBCT images respectively to obtain three-dimensional coordinates of each tooth centroid point comprises:
carrying out centroid detection on the CBCT images before and after the operation respectively by using the tooth centroid detection model, wherein the obtained centroid detection result is a heat map;
binarizing the heat map by taking 0.5 as a threshold value;
performing connected domain analysis on the binarized image, wherein each connected domain represents one tooth;
and calculating the central point of each connected domain to obtain the centroid coordinate of each tooth under the LPS coordinate system.
8. The oral CBCT automatic registration method as claimed in claim 1, wherein the clustering of the tooth centroid coordinates using a clustering algorithm to separate the upper and lower jaws in S104 comprises:
taking the two centroid points with the minimum and maximum Z-axis coordinates as the initial cluster centers, and calculating the distance in the Z direction from each centroid point to the two cluster centers;
assigning each centroid point to a cluster according to these distances, with the number of clusters set to 2, namely the upper jaw and the lower jaw; each centroid point is assigned to the cluster whose center is nearer;
after every centroid point has been assigned to a cluster, calculating the central points of the two clusters, taking these two central points as the new cluster centers, and recomputing the distances from each centroid point to the cluster centers and the cluster assignments;
repeating the above process until the cluster centers no longer change, at which point the centroid points are grouped into two clusters representing the centroid point set of the maxillary teeth and the centroid point set of the mandibular teeth, respectively.
9. The oral CBCT automatic registration method of claim 8, wherein said clustering algorithm employs K-Means algorithm.
10. The oral CBCT automatic registration method as claimed in claim 1, wherein the sorting of the jaw centroid points in S105 to obtain the ordered centroid point sets of the two CBCT data comprises:
selecting the centroid point set to be sorted according to the planned implant position: if the planned implant is in the upper jaw, the centroid set of the lower jaw is sorted; if the planned implant is in the lower jaw, the centroid set of the upper jaw is sorted;
denoting the coordinate range of the oral CBCT image in the LPS coordinate system as [xmin, ymin, zmin] to [xmax, ymax, zmax];
determining corner points from the image range, where four points can serve as corner points, namely [xmin, ymax, zmin], [xmin, ymax, zmax], [xmax, ymax, zmin] and [xmax, ymax, zmax];
selecting one of the corner points as the target corner point, finding the centroid point closest to the target corner point as the first point, then searching the centroid point set for the centroid point closest to the most recently found point as the next point, and removing each found point from the set;
repeating the above process until the set is empty, which yields the ordered centroid point sets of the two CBCT data, denoted P_A and P_B respectively; the preoperative and postoperative centroid points then correspond to each other one by one.
11. The oral CBCT automatic registration method as claimed in claim 1, wherein the registering of the jaw centroid points by using the point registration algorithm in S106 comprises:
computing, with a point registration algorithm, the transformation matrix T1 from centroid point set P_B to centroid point set P_A, as shown in the following formula:
P_A = T1 · P_B
12. The oral CBCT automatic registration method as claimed in claim 1, wherein the performing of the CBCT point cloud registration in S107 to obtain the preoperative and postoperative oral CBCT registration transformation relationship comprises:
recording the preoperative CBCT model as A and the postoperative CBCT model as B;
moving the postoperative CBCT model B to a new position B' with the relation matrix T1, i.e. B' = T1 · B, as the coarse registration result;
extracting isosurfaces from A and B' respectively to obtain two point clouds, denoted C_A and C_B';
computing, with the ICP algorithm, the transformation T2 from point cloud C_B' to point cloud C_A, which satisfies C_A = T2 · C_B';
and finally obtaining the transformation matrix T that moves the postoperative CBCT model B to the preoperative CBCT model A, which satisfies the relationship T = T2 · T1.
CN202211023684.8A 2022-08-25 2022-08-25 Oral CBCT automatic registration method Pending CN115546103A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211023684.8A CN115546103A (en) 2022-08-25 2022-08-25 Oral CBCT automatic registration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211023684.8A CN115546103A (en) 2022-08-25 2022-08-25 Oral CBCT automatic registration method

Publications (1)

Publication Number Publication Date
CN115546103A true CN115546103A (en) 2022-12-30

Family

ID=84725664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211023684.8A Pending CN115546103A (en) 2022-08-25 2022-08-25 Oral CBCT automatic registration method

Country Status (1)

Country Link
CN (1) CN115546103A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116883246A (en) * 2023-09-06 2023-10-13 感跃医疗科技(成都)有限公司 Super-resolution method for CBCT image
CN116883246B (en) * 2023-09-06 2023-11-14 感跃医疗科技(成都)有限公司 Super-resolution method for CBCT image

Similar Documents

Publication Publication Date Title
CN109410256B (en) Automatic high-precision point cloud and image registration method based on mutual information
CN110189352B (en) Tooth root extraction method based on oral cavity CBCT image
CN109934855A (en) A kind of livewire work scene power components three-dimensional rebuilding method based on cloud
CN109785374A (en) A kind of automatic unmarked method for registering images in real time of dentistry augmented reality surgical navigational
CN111696210A (en) Point cloud reconstruction method and system based on three-dimensional point cloud data characteristic lightweight
AU2020101836A4 (en) A method for generating femoral x-ray films based on deep learning and digital reconstruction of radiological image
CN102222357B (en) Foot-shaped three-dimensional surface reconstruction method based on image segmentation and grid subdivision
CN101706971A (en) Automatic division method of dental crowns in dental models
CN102254317A (en) Method for automatically extracting dental arch curved surface in dental implantation navigation
CN112037200A (en) Method for automatically identifying anatomical features and reconstructing model in medical image
CN105389444B (en) A kind of gum edge curve design method of personalization tooth-implanting
CN106127753B (en) CT images body surface handmarking's extraction method in a kind of surgical operation
CN103294883A (en) Method and system for intervention planning for transcatheter aortic valve implantation
CN111968146A (en) Three-dimensional tooth jaw mesh model segmentation method
EP1525560B1 (en) Automated measurement of objects using deformable models
CN112790879B (en) Tooth axis coordinate system construction method and system of tooth model
CN116229007B (en) Four-dimensional digital image construction method, device, equipment and medium using BIM modeling
CN115619773B (en) Three-dimensional tooth multi-mode data registration method and system
CN107680110A (en) Inner ear three-dimensional level-set segmentation methods based on statistical shape model
CN115546103A (en) Oral CBCT automatic registration method
CN106327479A (en) Apparatus and method for identifying blood vessels in angiography-assisted congenital heart disease operation
CN109965979A (en) A kind of steady Use of Neuronavigation automatic registration method without index point
CN113870326B (en) Structural damage mapping, quantifying and visualizing method based on image and three-dimensional point cloud registration
CN113223063B (en) Tooth registration method based on ICP algorithm and point cloud elimination algorithm
CN113974920A (en) Knee joint femur force line determining method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination