CN118154536A - Mammary gland bilateral difference detection method based on 3D imaging point cloud data - Google Patents
- Publication number: CN118154536A
- Application number: CN202410265256.9A
- Authority: CN (China)
- Prior art keywords: point cloud, breast, points, cloud data
- Prior art date: 2024-03-08
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/0012: Biomedical image inspection (G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection)
- G06F17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- G06T5/70: Denoising; Smoothing
- G06T7/11: Region-based segmentation (G06T7/10 Segmentation; Edge detection)
- G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
- G06V10/762: Image or video recognition or understanding using pattern recognition or machine learning, using clustering
- G06T2207/30068: Mammography; Breast
- G06T2207/30096: Tumor; Lesion
Abstract
The invention provides a method for detecting bilateral breast differences based on 3D imaging point cloud data, belonging to the technical field of bilateral breast difference detection. The method establishes a breast difference assessment environment and places marker points on the patient. A plane model is updated iteratively until it converges to a well-fitted reference plane, leveling the point cloud data onto that plane; the marker points and the rotation matrix of this process together define a new coordinate system. A breast region segmentation method based on mean-shift (coreless drift) clustering is designed. The point cloud of the breast region is mirror-flipped, and the data before and after flipping are finely registered with the ICP (iterative closest point) algorithm, iterating continuously to reduce the error. This brings the affected-side breast into close coincidence with the mirrored contralateral breast and solves the difficulty of locating the plane of symmetry between the two breasts.
Description
Technical Field
The invention belongs to the technical field of bilateral difference detection of mammary glands, and relates to a bilateral difference detection method of mammary glands based on 3D imaging point cloud data.
Background
At present, breast cancer has become the most common malignancy among women worldwide. As health awareness grows and medical care advances, breast cancer patients place increasing demands on aesthetics. Breast reconstruction after breast cancer surgery can repair the defects of the breast, and it has a positive effect on rebuilding patients' confidence and improving their quality of life. By improving patients' satisfaction with their breast appearance and meeting their aesthetic expectations, breast reconstruction is also an important aspect of their physical and mental rehabilitation.
Breast cancer resection is a common treatment that removes the focal region of the disease. However, the procedure also leaves a difference between the patient's two breasts. To help patients restore the shape and appearance of the breast, breast reconstruction surgery is receiving growing attention from both doctors and patients.
Besides patients who undergo breast cancer resection, there are also individuals dissatisfied with their own breasts who hope to achieve the desired result through breast plastic surgery. When performing such chest surgery, the doctor must understand the differences between the patient's two breasts to ensure the operation achieves the intended result.
Whether for breast reconstruction or breast plastic surgery, the doctor needs to understand the difference between the affected-side breast before and after the operation, or between the healthy-side and affected-side breasts. The differences appear in two aspects: first, breast volume; second, breast position. For breast cancer patients, the affected-side breast loses volume after resection, producing a pre- versus post-operative volume difference; for breast plastic surgery patients, a volume difference between the two breasts likewise arises during the procedure. The shape of the breast also changes with surgery: after a partial or total resection, the shape of the affected-side breast differs from the healthy-side breast or from its own preoperative shape, so the missing portion of the affected-side breast also differs in position from the healthy side.
Currently, the detection of bilateral breast differences divides mainly into subjective and objective methods. Objective methods fall into three types: 2D imaging methods, 3D imaging methods, and mechanical scale or volume measurement methods.
Subjective methods rely on the doctor directly observing the breasts during surgery and comparing their morphological and volume differences, but it is difficult to evaluate bilateral breast differences accurately this way.
Objective measurement and evaluation methods based on 2D or 3D imaging of the body surface can be used to measure breast volume. 2D imaging methods typically acquire cross-sectional images of the breast using X-ray or CT scanning; the breast volume is then obtained by calculating the area and thickness of breast tissue in these images. This method is simple and easy to implement, but it cannot provide detailed information about the internal structure of the breast. 3D imaging methods generally acquire a three-dimensional image of the breast using magnetic resonance imaging (MRI), ultrasound, or similar techniques; the images are then processed and analyzed by computer software to obtain the volume and shape of the breast. This approach provides detailed information about the internal structure of the breast, but it requires more expensive equipment and longer scan times. Mechanical scale or volume measurement methods cannot easily measure and express the non-rigid, curved shape of the breast accurately; they are contact measurements and are inconvenient to use in an operative environment.
The breast is an irregular geometry affected by many factors, and how to select an appropriate measurement method to guide clinical practice is a matter of concern to clinicians.
In summary, existing bilateral breast difference measurement and assessment methods either perform poorly or are unsuitable for full-scale measurement and analytical assessment of the breast in clinical settings, especially during breast remodeling procedures. In view of this, the invention provides a system that feeds back the difference between the healthy and affected sides in real time during breast appearance remodeling surgery. Based on 3D imaging technology, it performs accurate non-contact 3D measurement and difference assessment of both breasts with the patient in the supine position during surgery, enabling accurate navigation of affected-side breast trimming and helping doctors determine the filler volume and the filling position.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a bilateral difference comparison algorithm for affected-side resection and appearance remodeling in breast surgery. It addresses the difference between the affected-side breast before and after the operation in clinical surgery in two main aspects: first, calculating the breast volume difference; second, the breast position difference. This enables accurate intraoperative navigation of breast trimming.
In order to achieve the above purpose, the present invention provides the following technical solutions:
a breast bilateral difference detection method based on 3D imaging point cloud data comprises the following steps:
First, a breast difference assessment environment is constructed, and marker points are set on the subject before images are acquired.
1.1) Prepare the scanning environment: adjust the ambient light so that the target object shows no reflections or shadows, and clear the scan field of view so that only the target object is scanned.
1.2) Set marker points: one marker point is placed at the midpoint of the upper edge of the manubrium and one at the xiphoid process. These two marker points provide the symmetry axis (the midline) and the symmetry plane of the actual human body for the bilateral breast difference detection method. Specifically: in the middle of the chest there is a vertical bone, the sternum. The sternal midline passes through the sternum; its two end points are the top of the sternum (the midpoint of the upper edge of the manubrium) and the tail of the sternum (the xiphoid process). One marker point is placed at each end point of the sternal midline.
1.3) Set up the scanning device: set the scanning device parameters to acquire scan data of optimal quality; aim the scanning device at the breast area to be scanned, start the scanning procedure, make sure the device can capture the entire area intact, and move the device to cover all required angles and details. The parameters are set as follows: arrange the scanning device according to its instructions or guidelines, and adjust parameters such as resolution, scan angle, and scan distance to obtain the best quality scan data. The scanning device is a camera or a scanner; the camera is preferably a 3D camera.
Second, with the subject lying on the operating table, the following processing is performed on the data of interest for surgical navigation: the 3D point cloud image and its surface image, the sagging information of the affected-side breast image, the volume difference between the affected-side and healthy-side breasts, and coordinate values.
2.1) Data preprocessing: first, denoise the acquired model point cloud data, removing noise points with various filtering techniques (such as Gaussian filtering and median filtering). At the same time, remove outliers based on the point cloud density and point normals to reduce the influence of noise.
2.2) Data normalization: apply scale normalization and coordinate normalization to the point cloud data. Scale normalization handles the fact that point clouds may come at different scales and ensures the analysis is carried out at the same scale; coordinate normalization ensures the coordinates of all point cloud data lie in the same coordinate system.
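As a minimal illustration of steps 2.1) and 2.2), the sketch below (in Python with numpy, an assumption since the patent does not prescribe an implementation language) removes statistical outliers from an N×3 point array and normalizes origin and scale. The neighbour count k and the std_ratio threshold are illustrative values, not values fixed by the method.

```python
import numpy as np

def preprocess(points, k=8, std_ratio=2.0):
    """Denoise an (N, 3) point cloud and normalize its origin and scale.

    A point is kept when its mean distance to its k nearest neighbours is
    within std_ratio standard deviations of the global mean of that statistic.
    """
    # Full pairwise distances; fine for small clouds, use a KD-tree for large ones.
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    dists.sort(axis=1)
    mean_knn = dists[:, 1:k + 1].mean(axis=1)     # skip the zero self-distance
    keep = mean_knn <= mean_knn.mean() + std_ratio * mean_knn.std()
    cleaned = points[keep]
    # Coordinate/scale normalization: centre on the centroid, unit bounding scale.
    centred = cleaned - cleaned.mean(axis=0)
    return centred / np.abs(centred).max()
```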
Third, the points in the point cloud data are divided into interior points and outliers, and the plane model is iteratively updated so that it fits the interior points better; upon convergence, a well-fitted reference plane is obtained. Then, a rotation matrix is obtained by computing the rotation axis and rotation angle between the normal vector of the reference plane and the normal vector of the plane to be aligned. Finally, the rotation matrix is applied to the point cloud data, rotating the points via matrix multiplication and thereby leveling the point cloud data onto the reference plane.
3.1) Initialize the plane model: three points are randomly selected from the point cloud as the initial plane model, giving the algorithm a starting state for the iterative update. Let these three points be p1(x1, y1, z1), p2(x2, y2, z2) and p3(x3, y3, z3).
3.2) Iteratively update the plane model:
First, for each point pi(xi, yi, zi), its distance to the current plane model is calculated using the following formula:

di = |(Pi - A) · n| / ||n|| (1)

where Pi is the coordinate vector of point pi, A is a point on the plane, and n is the normal vector of the plane. The normal vector n is obtained by calculating the cross product of two vectors lying in the plane:
n=(P2-P1)×(P3-P1) (2)
Then, the points are classified as interior points or outliers according to the preset distance threshold: if the distance di from the point to the plane model is less than or equal to the threshold, the point is classified as an interior point; otherwise, it is classified as an outlier.
Finally, the parameters of a new plane model are re-estimated using the interior points, for example with least squares or another fitting algorithm. Denote the index set of interior points by Inliers; in the re-estimation step, the points in Inliers are used to solve the following system of equations by least-squares fitting to obtain the parameters of the new plane model:
n·P=c (3)
Where P is the coordinate matrix of the interior points and c is a constant vector related to the plane position.
3.3) Convergence judgment: check whether the number of iterations has reached the maximum or whether the change in the plane model is smaller than a set threshold. If either condition is met, the iteration stops and the reference plane of the point cloud data is obtained.
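A compact sketch of the plane fitting of steps 3.1)-3.3), read as a RANSAC-style loop with a least-squares refit of the interior points; the distance threshold, iteration limit, and the SVD-based refit are assumptions consistent with equations (1)-(3), not details fixed by the patent.

```python
import numpy as np

def fit_reference_plane(points, dist_threshold=2.0, max_iters=200, seed=None):
    """Fit a reference plane n . p = c to an (N, 3) cloud per equations (1)-(3)."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(max_iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)                # equation (2)
        if np.linalg.norm(n) < 1e-12:                 # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        d = np.abs((points - p1) @ n)                 # point-plane distance, equation (1)
        inliers = d <= dist_threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Least-squares refit on the interior points (equation (3)):
    P = points[best_inliers]
    centroid = P.mean(axis=0)
    _, _, vt = np.linalg.svd(P - centroid)            # last right-singular vector = normal
    n = vt[-1]
    return n, n @ centroid                            # plane as n . p = c
```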
3.4) Calculate the rotation matrix: to align the point cloud data with the reference plane, a rotation matrix is first calculated to rotate the point cloud data onto the reference plane. The rotation matrix is represented by a rotation axis and a rotation angle, where the rotation axis is computed as the cross product of the normal vector Nref of the reference plane and the normal vector Ndesired of the desired alignment plane:
Raxis=Nref×Ndesired (4)
The rotation axis is normalized to ensure the validity of the rotation matrix:

Raxis = Raxis / ||Raxis|| (5)
The rotation angle is calculated from the dot product of the normal vector Nref of the reference plane and the normal vector Ndesired of the desired alignment plane:
θ=cos-1(Nref·Ndesired) (6)
The rotation axis and rotation angle are converted into a rotation matrix using the axis-angle representation:
R=axang2rotm([Raxis,θ]) (7)
3.5) Apply the rotation matrix to the point cloud data to achieve plane alignment. The coordinate vector P of each point is rotated via the matrix multiplication Paligned = R·P, leveling the point cloud data.
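axang2rotm in equation (7) is a MATLAB function. An equivalent construction via the Rodrigues formula, as a sketch assuming Nref and Ndesired are unit normals, is:

```python
import numpy as np

def rotation_from_axis_angle(axis, theta):
    """Rodrigues formula: rotation matrix for a unit axis and angle theta."""
    axis = axis / np.linalg.norm(axis)                # equation (5)
    kx, ky, kz = axis
    K = np.array([[0.0, -kz,  ky],
                  [ kz, 0.0, -kx],
                  [-ky,  kx, 0.0]])                   # cross-product matrix of the axis
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def level_point_cloud(points, n_ref, n_desired):
    """Rotate the cloud so n_ref maps onto n_desired (steps 3.4-3.5)."""
    axis = np.cross(n_ref, n_desired)                 # equation (4)
    if np.linalg.norm(axis) < 1e-12:                  # normals already parallel
        return points
    theta = np.arccos(np.clip(n_ref @ n_desired, -1.0, 1.0))   # equation (6)
    R = rotation_from_axis_angle(axis, theta)         # equation (7)
    return points @ R.T                               # P_aligned = R . P per point
```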
Fourth, a three-dimensional coordinate system of the point cloud data is established
4.1) Determine one axis of the three-dimensional coordinate system: compute the difference vector between the two marker points set during image acquisition (the midpoint of the upper edge of the manubrium and the xiphoid process) and normalize it; the normalized direction vector is the Y axis of a three-dimensional coordinate system parallel to the reference plane obtained by plane fitting.
4.2) Calculate the marker-point center: compute the center coordinate of the central axis and take it as the origin of the three-dimensional coordinate system.
4.3) Establish the coordinate system: two vectors perpendicular to the Y axis are chosen as the other two axes of the new coordinate system. The X axis is perpendicular to the Y axis and parallel to the reference plane obtained by plane fitting, and the Z axis is perpendicular to both the X and Y axes; these perpendicular vectors are obtained by cross products.
4.4) Determine the coordinate-system equation: convert each point of the point cloud data into the new coordinate system; the new coordinates (x', y', z') of a point are calculated as

(x', y', z')^T = M · (x - x0, y - y0, z - z0)^T (8)

where (x, y, z) are the coordinates of the point in the original coordinate system, (x0, y0, z0) are the origin coordinates of the new coordinate system, and M is the rotation matrix whose rows are the unit vectors of the new X, Y and Z axes.
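The fourth step can be sketched as follows, assuming marker points m1 (midpoint of the upper edge of the manubrium) and m2 (xiphoid process) and the unit normal n_plane of the fitted reference plane; the function and variable names are illustrative.

```python
import numpy as np

def to_marker_frame(points, m1, m2, n_plane):
    """Express the cloud in the marker-based frame of steps 4.1)-4.4).

    Y runs along the sternal midline, X lies in the reference plane,
    Z completes the right-handed frame; the origin is the midline centre.
    """
    y = m2 - m1
    y = y / np.linalg.norm(y)            # 4.1) normalized direction vector
    origin = (m1 + m2) / 2.0             # 4.2) marker-point centre as origin
    x = np.cross(y, n_plane)             # 4.3) in-plane axis perpendicular to Y
    x = x / np.linalg.norm(x)
    z = np.cross(x, y)                   # perpendicular to both X and Y
    M = np.stack([x, y, z])              # rows are the new axes' unit vectors
    return (points - origin) @ M.T       # equation (8) applied to every point
```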
Fifth, segmentation of the breast region is achieved through a mean-shift (coreless drift) clustering algorithm.
5.1) Apply the mean-shift clustering algorithm and determine the bandwidth parameter: bandwidth is an important parameter of the mean-shift algorithm; it determines the window size, i.e., the neighborhood around each point. The choice of bandwidth is critical, and the best bandwidth is typically chosen by cross-validation or from characteristics of the data.
5.2) Select initial seed points in the point cloud data as the starting centers of the clustering. For each initial seed point, a drift vector pointing in the direction of the maximum data-density gradient is computed, and the density estimate involved in the drift vector uses a Gaussian kernel function:

K(x) = exp(-x^2 / (2h^2)) (9)

where h is the bandwidth parameter and x is the distance between points. For a given point xi, the drift vector m(xi) guides the direction in which the point moves to find the location of greatest density. The drift vector is calculated as

m(xi) = [ Σ_{xj ∈ N(xi)} K(||xj - xi||) · xj ] / [ Σ_{xj ∈ N(xi)} K(||xj - xi||) ] - xi (10)

where xj is one of the neighbors of xi and N(xi) is the neighborhood of all points within the bandwidth centered on xi.
5.3) Based on the drift vector, update the position of each initial seed point so that it moves in the direction of maximum data density, and cluster the point cloud data using the bandwidth parameter. These steps are repeated until the seed points converge to the cluster centers, finally forming the cluster regions. The position of seed point xi is updated based on the drift vector:

xi ← xi + m(xi) (11)
5.4) Segment the breast-region point cloud: identify the point cloud data belonging to the breast region according to the clustering result, and extract it for the subsequent symmetry assessment.
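A minimal mean-shift sketch matching equations (9)-(11). The bandwidth h and the convergence tolerance are illustrative, and the Gaussian weighting over all points stands in for the neighborhood restriction N(xi), since the kernel decays rapidly outside the bandwidth.

```python
import numpy as np

def mean_shift(points, h=10.0, tol=1e-3, max_iters=100):
    """Shift every seed towards its local density maximum (equations (9)-(11))."""
    modes = points.copy()                 # every point starts as a seed
    for _ in range(max_iters):
        shifted = np.empty_like(modes)
        for i, x in enumerate(modes):
            d2 = np.sum((points - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * h * h))           # Gaussian kernel, eq. (9)
            # weighted mean = x + m(x), i.e. the update of eqs. (10)-(11)
            shifted[i] = (w[:, None] * points).sum(axis=0) / w.sum()
        converged = np.abs(shifted - modes).max() < tol
        modes = shifted
        if converged:
            break
    return modes   # seeds sharing a mode (within ~h of each other) form one cluster
```

Grouping the converged modes then yields the cluster regions of step 5.4), with the dense, central cluster taken as the breast region.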
Sixth, the volume difference of the two breasts is calculated and the position difference is visualized via mirror flipping.
6.1) Mirror flipping, and registration and alignment of the point cloud data before and after flipping, to visualize the position difference of the two breasts: for bilateral breast 3D surface difference analysis, the point cloud data of the segmented breast region is mirror-flipped about the YOZ plane of the spatial coordinate system in which it lies; with the supine-position breast 3D surface as the reference, the bilateral breast differences are analyzed and the bilateral differences fusing breast 3D shape and sagging information are measured. The specific process is as follows:

In the 3D rectangular coordinate system, the bilateral breast 3D image captured with the subject in the supine position is denoted ω0(x, y, z). The segmented breast-region point cloud is mirror-flipped using the YOZ plane of the three-dimensional coordinate system as the plane of symmetry, and the flipped breast 3D surface is denoted ω0'(x, y, z). The data set of the breast surface ω0 extracted from (x, y, z) is denoted A, and the data set after the flip update is expressed as

A' = { (-x, y, z) | (x, y, z) ∈ A } (12)

where (x, y, z) in the above formulas denotes the 3 coordinate variables of the new 3D coordinate system.

The original breast surface data set A and the flipped breast surface data set A' obtained by mirror flipping about the YOZ plane give the following correspondence between breast image coordinates before and after flipping: the x-axis coordinates are negated, while the y-axis and z-axis coordinates are unchanged.
The solid (un-flipped) 3D image is denoted ω0(x, y, z), and ω0'(x, y, z) represents the flipped breast 3D surface. The flipped 3D surface ω0' is taken as the standard surface and the solid 3D image ω0 as the comparison object, and coarse registration followed by fine registration is performed. The initial coarse matching uses several pairs of special marker points set in advance by the doctor as matching labels; this manual selection provides a good initial position for the subsequent precise matching. The fine registration adopts the ICP (iterative closest point) algorithm, reducing the error over successive iterations to reach the desired accuracy.
A point of the flipped point cloud ω0' is denoted pi; the point of the original point cloud ω0 with the shortest Euclidean distance to pi is found and denoted qi. Taking pi and qi as corresponding points, a transformation matrix is obtained; after several iterations the optimal transformation matrix is reached so that the two point clouds coincide. The stopping condition of the iteration is set with the following formula:

E(R, T) = (1/k) Σ_{i=1}^{k} ||qi - (R·pi + T)||^2 (13)

where R denotes the rotation transformation matrix and T the translation transformation matrix; k denotes the total number of points in the point cloud ω0'; R·pi denotes the rotational transformation of a selected point of the point cloud. The registered ω0' and ω0 are updated to ω'(x, y, z) and ω(x, y, z).
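A point-to-point ICP sketch consistent with equation (13): brute-force nearest neighbours, a Kabsch (SVD) solve for the rigid transform per iteration, and a change-in-error stopping test. The tolerance and iteration limit are assumptions.

```python
import numpy as np

def icp(src, dst, max_iters=50, tol=1e-6):
    """Align src (the flipped cloud) to dst (the original cloud) by ICP."""
    R, T = np.eye(3), np.zeros(3)
    cur, prev_err = src.copy(), np.inf
    for _ in range(max_iters):
        # Correspondences q_i: nearest dst point to each current p_i.
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        q = dst[d.argmin(axis=1)]
        # Best rigid transform via SVD of the cross-covariance (Kabsch).
        mu_p, mu_q = cur.mean(axis=0), q.mean(axis=0)
        U, _, Vt = np.linalg.svd((cur - mu_p).T @ (q - mu_q))
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:             # guard against a reflection
            Vt[-1] *= -1
            Ri = Vt.T @ U.T
        Ti = mu_q - Ri @ mu_p
        cur = cur @ Ri.T + Ti
        R, T = Ri @ R, Ri @ T + Ti            # accumulate the total transform
        err = np.mean(np.sum((q - cur) ** 2, axis=1))   # E(R, T), equation (13)
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, T, cur
```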
Defining a volume error function of the solid breast after gridding and the standard breast curved surface after mirroring:
V(3D)(xi,yj,zi,j)=∫∫S[ω(xi,yj,zi,j)-ω′(-xi,yj,zi,j)]dxdy (14)
i and j denote the indices along the x and y directions of the new coordinate system, z_{i,j} denotes the surface height and shape corresponding to (xi, yj), and S denotes the set of sub-regions with volume differences in the solid or standard breast obtained by image segmentation.

The sub-regions of the set whose error exceeds the doctor-specified minimum are screened out and drawn as a contour map, with colors running from light to dark according to the error to reflect the differing degrees of difference across sub-regions. Finally, the regions of the 3D surface whose symmetry error exceeds the doctor-specified minimum are shown on the display, and the position and shape differences between ω'(x, y, z) and the point cloud ω(x, y, z) are displayed in different colors.
6.2) Calculation of the volume difference of the two breasts: the registered point cloud ω(x, y, z) is chosen as the bottom surface and the registered point cloud ω'(x, y, z) as the top surface; a suitable step length is set and the bottom surface is divided into a number of discrete small grid cells, updating formula (14) as

V(3D) ≈ Σ_i Σ_j [ω(xi, yj, z_{i,j}) - ω'(-xi, yj, z_{i,j})] · Δx · Δy (15)
The volume of the cell corresponding to each grid element is calculated and summed; since ω(x, y, z) and ω'(x, y, z) together comprise two left and two right breasts, the total difference volume is divided by 2 to give V0. The calculation of the bilateral breast volume difference works in concert with the visualization of the position difference, assisting the physician in taking a corresponding volume of tissue according to V0 for filling at any stage of the procedure. The procedure is repeated, trimming the breast appearance until it is satisfactory.
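A discretized volume-difference sketch in the spirit of formulas (14)-(15): both registered surfaces are resampled as height fields on a common XY grid and the per-cell height differences are summed. The grid step, the linear interpolation, and the use of scipy are assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def volume_difference(omega, omega_prime, step=1.0):
    """Approximate the bilateral volume difference V0 on a common XY grid."""
    lo = np.minimum(omega[:, :2].min(axis=0), omega_prime[:, :2].min(axis=0))
    hi = np.maximum(omega[:, :2].max(axis=0), omega_prime[:, :2].max(axis=0))
    gx, gy = np.meshgrid(np.arange(lo[0], hi[0], step),
                         np.arange(lo[1], hi[1], step))
    # Resample both registered surfaces as height fields z(x, y) over the grid.
    z_bot = griddata(omega[:, :2], omega[:, 2], (gx, gy), method='linear')
    z_top = griddata(omega_prime[:, :2], omega_prime[:, 2], (gx, gy), method='linear')
    diff = np.abs(z_top - z_bot)
    total = np.nansum(diff) * step * step   # cells outside either surface are NaN
    return total / 2.0   # each cloud holds both breasts, so halve per the method
```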
The invention has the following beneficial effects. Aiming at the bilateral breast difference detection problem in existing breast plastic surgery and post-breast-cancer appearance remodeling surgery, and considering differences in breast area, position and the like, the invention builds a bilateral breast difference detection method based on 3D imaging point cloud data. It establishes a breast difference evaluation environment and sets marker points on the patient; by iteratively updating a plane model until convergence it obtains a well-fitted reference plane and levels the point cloud data onto it; it combines the marker points and the rotation matrix of this process to define a new coordinate system; it designs a breast region segmentation method based on mean-shift clustering; and it applies the ICP iterative closest point algorithm to the data before and after flipping in the fine registration, iterating continuously to reduce the error. This brings the affected-side breast into close coincidence with the mirrored contralateral breast, solves the difficulty of finding the plane of symmetry between the two breasts, and greatly changes the current practice in breast plastic surgery and post-breast-cancer surgery of relying mainly on visual observation and judgment. The breast images and data obtained in real time by the 3D camera enable calculation of the bilateral breast volume difference and visualization of the position difference, providing a relatively objective and important reference for the doctor's personal judgment; the measurement equipment is simple and accurate, and the evaluation method is simple and easy to implement, meeting the requirements of clinical surgery.
Drawings
FIG. 1 is a schematic diagram of a layout structure of a 3D point cloud imaging device according to the present invention;
FIG. 2 is an overall logic block diagram of the present invention;
FIG. 3 is a schematic view of the location of the midpoint of the upper edge of the manubrium and the xiphoid process with marker points;
FIG. 4 is a bilateral breast three-dimensional coordinate system as described herein;
FIG. 5 is a flow chart of a 3D point cloud data acquisition portion of the present invention;
FIG. 6 is a flow chart of the present invention for achieving horizontal fitting;
FIG. 7 is a flow chart of the present invention for establishing a three-dimensional coordinate system;
FIG. 8 is a flow chart for segmenting breast regions;
FIG. 9 is a flow chart of two-sided breast volume difference calculation and position difference visualization;
in the figure: 1-main control computer, 2-3D camera, 3-operating table, 4-davit.
Detailed Description
The invention is further illustrated below with reference to specific examples.
To achieve bilateral breast difference detection based on 3D imaging point cloud data, a dedicated operating environment must be built. Its core components are a main control computer 1, a 3D camera 2 in signal communication with the main control computer 1, and a simulated operating table 3. In addition, a universal boom 4 is needed to support the devices and adjust their position and angle. Specifically, the 3D camera 2 and projector are stacked one above the other and fixed on a base plate by bolts; the top of the base plate is bolted to the bottom of the universal boom 4, and the top of the universal boom 4 is fixed to the ceiling above the operating table 3. This design makes it possible to flexibly change the position and orientation of the 3D camera and projector by adjusting the universal boom to suit different surgical needs. In addition, 2 marker points must be placed on the patient's body surface in advance as reference points for determining the central axis of the 3D image.
A detection method for calculating two-side breast volume difference and visualizing position difference based on 3D imaging point cloud data comprises the following steps:
A. First step: layout of the 3D point cloud imaging device
A1. According to the overall layout shown in FIG. 1, connect the main control computer 1 and the 3D camera 2.
A2. As shown in FIG. 3, place marker points at the midpoint of the upper edge of the manubrium and at the xiphoid process of the subject.
A3. Adjust the boom 4 to aim the 3D camera 2 attached to it at the subject's chest, keeping the distance between the front end of the camera lens and the chest surface at 950 mm.
B. Second step: denoise and normalize the acquired point cloud data
B1. Acquire a point cloud of the patient's chest with the 3D camera.
B2. Use Gaussian filtering, median filtering, and outlier removal to remove noise points from the acquired point cloud data.
B3. Ensure the coordinates of all point cloud data are in the same coordinate system. If there is no scaling relationship between two or more point cloud data sets (i.e., their scales are the same), the rotation matrix and translation vector can be solved with the classical ICP (Iterative Closest Point) method to align the point sets.
C. Third step: obtain the reference plane (horizontal plane) in the point cloud data and level the point cloud data
C1. Randomly select three points from the point cloud, p1(x1, y1, z1), p2(x2, y2, z2) and p3(x3, y3, z3), to initialize the plane model.
C2. For each point pi(xi, yi, zi), calculate its distance to the current plane model using equation (1).
C3. Classify the points as interior points or outliers according to the distance threshold, re-estimate the plane model parameters from the interior points, and stop iterating when the change of the plane model falls below the set threshold or the maximum number of iterations is reached, yielding the reference plane of the point cloud data.
C4. Calculate the rotation matrix using equations (4), (5), (6) and (7), and rotate the point cloud data onto the reference plane according to the obtained matrix, leveling the point cloud data to the horizontal plane.
D. Fourth step: establish a new three-dimensional coordinate system from the two marker points in the plane-fitted point cloud data
D1. Determine a direction vector from the difference vector V = (x2 - x1, y2 - y1, z2 - z1) between the two marker points, where the coordinates of the midpoint of the upper edge of the manubrium are (x1, y1, z1) and those of the xiphoid process are (x2, y2, z2), and normalize it: V' = V / ||V||. Finally, take the cross product of the normalized difference vector V' and the normal vector of the reference plane obtained in C3 to obtain the Y axis of the new three-dimensional coordinate system.
D2. Calculate the center coordinate of the segment between the two marker points to obtain the origin of the three-dimensional coordinate system.
D3. Obtain the X axis, perpendicular to the Y axis and parallel to the reference plane, by a cross product; the Z axis is perpendicular to both the X and Y axes, completing the three-dimensional coordinate system.
D4. Convert the original point cloud coordinates (x, y, z) into the new coordinate system via formula (8) to obtain the new coordinates (x', y', z') of the point cloud data.
E. Fifth step: segment the bilateral breast regions using the mean-shift clustering algorithm
E1. Determine the optimal bandwidth parameter. The point cloud data set is divided into a training set and a validation set for cross-validation. A search range for the bandwidth parameter is set and a series of candidate values is selected within it. Clustering is performed on the training set for each candidate value, and the clustering result is evaluated on the validation set. In this process the clustering quality is quantified with the silhouette coefficient and the Davies-Bouldin index, and the bandwidth parameter giving the best validated clustering result is selected.
E2. Select an initial seed point xi from the point cloud data as the starting center of the cluster, and compute the density estimate involved in the drift vector using equation (9).
E3. Update the position of each seed point based on equation (10), and cluster the point cloud data with the best bandwidth parameter found in E1. Repeat until the seed points converge to the cluster centers, finally forming the cluster regions; the seed-point positions are updated from the drift vector using equation (11).
E4. Segment the breast-region point cloud according to the clustering result. Each point is assigned to a particular cluster; the clusters represent different tissues or structures, with the breast-region clusters located at the center of the data and having higher density, while other clusters represent other tissues or noise.
F. Sixth step: calculate the volume difference and visualize the position difference of the two breasts
F1. Mirror-flip the point cloud data of the segmented breast region using the YOZ plane of the three-dimensional coordinate system as the plane of symmetry, and denote the flipped breast 3D surface ω0'(x, y, z).
F2. Select any point pi from the flipped point cloud ω0' and find the point qi of the original point cloud ω0 with the shortest Euclidean distance. Taking pi and qi as corresponding points, obtain the transformation matrix, and complete the registration with formula (13) as the iteration stopping condition.
F3. Display the registered point clouds ω'(x, y, z) and ω(x, y, z) in different colors, making their position and shape differences visually apparent.
F4. Take the registered point cloud ω(x, y, z) as the bottom surface and the point cloud ω'(x, y, z) as the top surface, divide the bottom surface into a number of discrete small grid cells, calculate the volume of the cell corresponding to each grid element, and sum them to obtain the total difference volume V0.
The examples described above represent only embodiments of the invention and are not to be understood as limiting the scope of the patent of the invention, it being pointed out that several variants and modifications may be made by those skilled in the art without departing from the concept of the invention, which fall within the scope of protection of the invention.
Claims (2)
1. The method for detecting the bilateral differences of the mammary glands based on the 3D imaging point cloud data is characterized by comprising the following steps of:
Firstly, constructing a breast difference evaluation environment, and setting mark points for a detected person before collecting an image;
1.1) Prepare the scanning environment: adjust the scanning ambient light to ensure the target object shows no reflections or shadows; clear the scan field of view to ensure only the target object is scanned;
1.2) Set marker points: one marker point is placed at the midpoint of the upper edge of the manubrium and one at the xiphoid process; these two marker points provide the symmetry axis and symmetry plane of the actual human body for the bilateral breast difference detection method;
1.3) Set up the scanning device: set the scanning device parameters to acquire scan data of optimal quality; aim the scanning device at the breast area to be scanned, start the scanning procedure, ensure the device can capture the entire area intact, and move the device to cover all required angles and details;
Second, with the subject lying on the operating table, the following processing is performed on the data of interest for surgical navigation: the 3D point cloud image and its surface image, the sagging information of the affected-side breast image, the volume difference between the affected-side and healthy-side breasts, and coordinate values;
2.1) Data preprocessing: first, denoise the acquired model point cloud data, removing noise points with various filtering techniques; at the same time, remove outliers based on the point cloud density and point normals to reduce the influence of noise;
2.2) Data normalization: apply scale normalization and coordinate normalization to the point cloud data; scale normalization handles the fact that point clouds may come at different scales and ensures analysis at the same scale, while coordinate normalization ensures the coordinates of all point cloud data lie in the same coordinate system;
Third, divide the points in the point cloud data into interior points and outliers, and iteratively update the plane model so that it fits the interior points better; upon final convergence a well-fitted reference plane is obtained; then, obtain a rotation matrix by computing the rotation axis and rotation angle between the normal vector of the reference plane and the normal vector of the plane to be aligned; finally, apply the rotation matrix to the point cloud data, rotating the points via matrix multiplication and thereby leveling the point cloud data onto the reference plane;
3.1) Initialize the plane model: randomly select three points from the point cloud as the initial plane model, giving the algorithm a starting state for the iterative update; let these three points be p1(x1, y1, z1), p2(x2, y2, z2) and p3(x3, y3, z3);
3.2) Iteratively update the plane model:
First, for each point pi(xi, yi, zi), calculate its distance to the current plane model using the following formula:

di = |(Pi - A) · n| / ||n|| (1)

where Pi is the coordinate vector of point pi, A is a point on the plane, and n is the normal vector of the plane; the normal vector n is obtained by calculating the cross product of two vectors lying in the plane:
n=(P2-P1)×(P3-P1) (2)
Then, classify the points as interior points or outliers according to the preset distance threshold: if the distance di from the point to the plane model is less than or equal to the threshold, classify the point as an interior point; otherwise, classify it as an outlier;
Finally, re-estimate the parameters of the new plane model using the interior points; the re-estimation may use least squares or another fitting algorithm;
3.3) Convergence judgment: check whether the number of iterations has reached the maximum or whether the change in the plane model is smaller than a set threshold; if either condition is met, stop the iteration and obtain the reference plane of the point cloud data;
3.4) Calculate the rotation matrix: to align the point cloud data with the reference plane, first calculate a rotation matrix, which may be represented by a rotation axis and a rotation angle, where the rotation axis may be computed as the cross product of the normal vector Nref of the reference plane and the normal vector Ndesired of the desired alignment plane:
Raxis=Nref×Ndesired (4)
The rotation axis is normalized to ensure the validity of the rotation matrix:

Raxis = Raxis / ||Raxis|| (5)
The rotation angle is calculated from the dot product of the normal vector Nref of the reference plane and the normal vector Ndesired of the desired alignment plane:
θ=cos-1(Nref·Ndesired) (6)
The rotation axis and angle are converted into a rotation matrix using the axis-angle representation:
R=axang2rotm([Raxis,θ]) (7)
3.5) Apply the rotation matrix to the point cloud data to achieve plane alignment; rotate the coordinate vector P of each point via the matrix multiplication Paligned = R·P, leveling the point cloud data;
Fourth, a three-dimensional coordinate system of the point cloud data is established;
4.1) Determine one axis of the three-dimensional coordinate system: compute the difference vector between the two marker points set during image acquisition, normalize it, and take the result as the Y axis of a three-dimensional coordinate system parallel to the reference plane obtained by plane fitting;
4.2) Calculate the marker-point center: compute the center coordinate of the central axis and take it as the origin of the three-dimensional coordinate system;
4.3) Establish the coordinate system: select two vectors perpendicular to the Y axis as the other two axes of the new coordinate system, where the X axis is perpendicular to the Y axis and parallel to the reference plane obtained by plane fitting, and the Z axis is perpendicular to both the X and Y axes; the perpendicular vectors can be obtained by cross products;
4.4) Determine the coordinate-system equation: convert each point of the point cloud data into the new coordinate system; the new coordinates (x', y', z') of a point are calculated as

(x', y', z')^T = M · (x - x0, y - y0, z - z0)^T (8)

where (x, y, z) are the coordinates of the point in the original coordinate system, (x0, y0, z0) are the origin coordinates of the new coordinate system, and M is the rotation matrix whose rows are the unit vectors of the new X, Y and Z axes;
Fifth, achieve segmentation of the breast region through the mean-shift (coreless drift) clustering algorithm;
5.1) Apply the mean-shift clustering algorithm, first determining the bandwidth parameter: select the optimal bandwidth by cross-validation or from characteristics of the data;
5.2) Select initial seed points in the point cloud data as the starting centers of the clustering; for each seed point, compute a drift vector pointing in the direction of the maximum data-density gradient, with the density estimate involved in the drift vector given by a Gaussian kernel function:

K(x) = exp(-x^2 / (2h^2)) (9)

where h is the bandwidth parameter and x is the distance between points; for a given point xi, the drift vector m(xi) guides the direction in which the point moves to find the location of greatest density; the drift vector is calculated as

m(xi) = [ Σ_{xj ∈ N(xi)} K(||xj - xi||) · xj ] / [ Σ_{xj ∈ N(xi)} K(||xj - xi||) ] - xi (10)

where xj is one of the neighbors of xi and N(xi) is the neighborhood of all points within the bandwidth centered on xi;
5.3) Update the position of each seed point based on the drift vector so that it moves in the direction of maximum data density; cluster the point cloud data using the bandwidth parameter; repeat until the seed points converge to the cluster centers, finally forming the cluster regions; the position of seed point xi is updated based on the drift vector:

xi ← xi + m(xi) (11)
5.4) Segment the breast-region point cloud: identify the point cloud data belonging to the breast region according to the clustering result, and extract it for the subsequent symmetry assessment;
Sixth, calculate the volume difference of the two breasts and visualize the position difference via mirror flipping;
6.1) Mirror flipping, and registration and alignment of the point cloud data before and after flipping, to visualize the position difference of the two breasts: for bilateral breast 3D surface difference analysis, mirror-flip the point cloud data of the segmented breast region about the YOZ plane of the spatial coordinate system in which it lies; with the supine-position breast 3D surface as the reference, analyze the bilateral breast differences and measure the bilateral differences fusing breast 3D shape and sagging information; in the 3D rectangular coordinate system, the bilateral breast 3D image captured with the subject in the supine position is denoted ω0(x, y, z), and the flipped breast 3D surface is denoted ω0'(x, y, z); the data set of the breast surface ω0 extracted from the multivariate fusion information (x, y, z, d) is denoted A, and the flipped data set can be expressed as

A' = { (-x, y, z) | (x, y, z) ∈ A } (12)

where (x, y, z) in the above formulas denotes the 3 coordinate variables of the new 3D coordinate system;

the original breast surface data set A and the flipped breast surface data set A' obtained by mirror flipping about the YOZ plane give the following correspondence between breast image coordinates before and after flipping: the x-axis coordinates are negated, while the y-axis and z-axis coordinates are unchanged;
the solid (un-flipped) 3D image is denoted ω0(x, y, z), with ω0'(x, y, z) representing the flipped breast 3D surface; the flipped 3D surface ω0' is taken as the standard surface and the solid 3D image ω0 as the comparison object, and coarse registration followed by fine registration is performed; the initial coarse matching can be realized by selecting several pairs of special marker points set in advance by the doctor as matching labels, the manual selection providing a good initial position for the subsequent precise matching; the fine registration adopts the ICP iterative closest point algorithm, reducing the error over successive iterations to reach the desired accuracy;
a point of the flipped point cloud ω0' is denoted pi; the point of the original point cloud ω0 with the shortest Euclidean distance to pi is found and denoted qi; taking pi and qi as corresponding points, a transformation matrix is obtained, and after several iterations the optimal transformation matrix is reached so that the two point clouds coincide; the stopping condition of the iteration is set with the following formula:

E(R, T) = (1/k) Σ_{i=1}^{k} ||qi - (R·pi + T)||^2 (13)

where R denotes the rotation transformation matrix and T the translation transformation matrix; k denotes the total number of points in the point cloud ω0'; R·pi denotes the rotational transformation of a selected point of the point cloud; the registered ω0' and ω0 are updated to ω'(x, y, z) and ω(x, y, z);
defining a volume error function of the solid breast after gridding and the standard breast curved surface after mirroring:
V(3D)(xi,yj,zi,j)=∫∫S[ω(xi,yj,zi,j)-ω′(-xi,yj,zi,j)]dxdy (14)
i and j denote the indices along the x and y directions of the new coordinate system, z_{i,j} denotes the surface height and shape corresponding to (xi, yj), and S denotes the set of sub-regions with volume differences in the solid or standard breast obtained by image segmentation; the sub-regions of the set whose error exceeds the doctor-specified minimum are screened out and drawn as a contour map, with colors running from light to dark according to the error to reflect the differing degrees of difference across sub-regions; finally, the regions of the 3D surface whose symmetry error exceeds the doctor-specified minimum are shown on the display, and the position and shape differences between ω'(x, y, z) and the point cloud ω(x, y, z) are displayed in different colors;
6.2) Calculation of the volume difference of the two breasts: the registered point cloud ω(x, y, z) is chosen as the bottom surface and the registered point cloud ω'(x, y, z) as the top surface; a suitable step length is set and the bottom surface is divided into a number of discrete small grid cells, updating formula (14) as

V(3D) ≈ Σ_i Σ_j [ω(xi, yj, z_{i,j}) - ω'(-xi, yj, z_{i,j})] · Δx · Δy (15)
calculate the volume of the cell corresponding to each grid element and sum; since ω(x, y, z) and ω'(x, y, z) together comprise two left and two right breasts, the total difference volume is divided by 2 to obtain V0; the calculation of the bilateral breast volume difference works in concert with the visualization of the position difference.
2. The method for detecting bilateral breast differences based on 3D imaging point cloud data according to claim 1, wherein in step 3.2), denoting the index set of interior points by Inliers, the re-estimation of the plane model uses the points in Inliers to solve the following system of equations by a least-squares fitting algorithm to obtain the new plane model parameters:
n·P=c (3)
Where P is the coordinate matrix of the interior points and c is a constant vector related to the plane position.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202410265256.9A | 2024-03-08 | 2024-03-08 | Mammary gland bilateral difference detection method based on 3D imaging point cloud data

Publications (1)

Publication Number | Publication Date
---|---
CN118154536A | 2024-06-07
Family ID: 91288162
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | |