CN116452661A - Wing point body surface projection positioning method, device, storage medium and equipment based on NCCT image data - Google Patents
- Publication number
- CN116452661A (application CN202310334635.4A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/13—Edge detection
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/763—Non-hierarchical techniques, e.g. based on statistics of modelling distributions
- G06T2207/10081—Computed x-ray tomography [CT]
- G06T2207/20112—Image segmentation details
- G06T2207/20164—Salient point detection; Corner detection
- G06T2207/30008—Bone
- G06T2207/30016—Brain
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention discloses a wing point (pterion) body surface projection positioning method based on NCCT image data, belonging to the technical field of medical image processing. The method uses NCCT sequence data from routine imaging examinations available at any time to reconstruct the skull surface, automatically and individually identifies eyebrow-arch features through corner detection and K-means clustering, and accurately determines the outer edge point of the eyebrow arch; it then advances 3.5 cm along the body surface in the direction of the eyebrow-arch extension line and moves 4 cm straight up from the midpoint of the zygomatic arch to locate the wing point, which has important clinical significance.
Description
Technical Field
The invention belongs to the technical field of medical image processing, and particularly relates to a wing point body surface projection positioning method based on NCCT image data.
Background
The wing point (pterion) is an H-shaped junction of sutures formed where the frontal, parietal, temporal and sphenoid bones meet. It lies in the temporal fossa, about two fingerbreadths above the midpoint of the zygomatic arch. The wing point is also called the "butterfly point". It is located about 3.5 cm posterior to the lateral canthus of the palpebral fissure and 4 cm above the midpoint of the zygomatic arch. In practice, placing the thumb of one hand behind the frontal process of the cheekbone (zygomatic bone) and the index and middle fingers of the other hand on the zygomatic arch forms a triangle whose apex is the wing point.
In clinical surgery, pterional craniotomy is one of the important surgical methods: via the pterional (wing point) approach, the skull is opened from the lateral side and brain tissue is exposed by removing part of the frontal bone, the temporal bone, and the greater wing of the sphenoid. In drainage surgery for cerebral hemorrhage, the wing point serves as a golden reference point for locating the motor functional area (the precentral and postcentral gyri), and the current wing-point localization practice is that doctors rely on experience and measure with a ruler before the operation.
Disclosure of Invention
In view of the defects existing in the prior art, the invention mainly provides a wing point body surface projection positioning method based on NCCT image data.
In order to achieve the above purpose, the invention adopts the following technical scheme: a wing point body surface projection positioning method based on NCCT image data comprises the following steps:
first, correcting head CT data;
secondly, extracting the overall outline of the head;
thirdly, extracting a skull region according to the overall head outline obtained in the second step to obtain three-dimensional data of the whole head;
fourthly, positioning the eyebrow arch, and determining the (x, y) coordinates of the wing point in the three-dimensional coordinate system:
extracting the axial slice data of the eyebrow-arch bone region, performing Harris_T corner detection on the outer edge of the bone structure of each slice, and determining the corner points of all slices;
performing K-means clustering on the obtained point clusters to obtain the outer edge point P1 of the eyebrow arch;
mapping the outer edge point of the eyebrow arch to the body surface to obtain the body-surface projection point P2 of the eyebrow-arch outer edge;
according to the body-surface projection point P2, determining the body-surface point P3 located 3.5 cm below along the sagittal plane as the x, y coordinates of the wing point;
fifthly, upwards moving the middle point of the zygomatic arch by 4cm, and determining the z-axis coordinate of the wing point;
and sixthly, obtaining wing point coordinates (x, y, z).
Compared with the prior art, the invention has the following advantages. The wing point body surface projection positioning method based on NCCT image data belongs to the technical field of medical image processing. The method uses NCCT sequence data from routine imaging examinations available at any time to reconstruct the skull surface, automatically and individually identifies eyebrow-arch features through corner detection and K-means clustering, and accurately determines the outer edge point of the eyebrow arch; it then advances 3.5 cm along the body surface in the direction of the eyebrow-arch extension line and moves 4 cm straight up from the midpoint of the zygomatic arch, automatically and rapidly locating the wing point. Body-surface wing-point positioning is thus achieved automatically, intelligently and individually, replacing manual measurement, saving time and improving the positioning precision of the wing point.
Drawings
Fig. 1 is a flowchart of a wing point body surface projection positioning method based on NCCT image data.
Fig. 2 is an effect diagram of bed-board removal from the standard CT data.
Fig. 3 is an effect diagram of some slices after three-dimensional correction.
Fig. 4 is a diagram of head-contour extraction results for some slices.
Fig. 5 is a diagram of the three-dimensional reconstruction of the skull region and the definition of the three-dimensional spatial coordinate system.
Fig. 6 is a coordinate-plane definition diagram.
Fig. 7 is a schematic illustration of the extraction of candidate layers for the outer edge of the eyebrow-arch bone.
Fig. 8 is a schematic view of the quadrant division of the candidate layers for the outer edge of the eyebrow-arch bone.
Fig. 9 is a diagram of the outer-edge corner detection results for some candidate slices.
Fig. 10 is a schematic view of the projection of the outer edge point of the eyebrow arch.
Fig. 11 is a schematic illustration of determining the z-axis coordinate from the midpoint of the zygomatic arch.
Detailed Description
The invention is described in further detail below with reference to specific examples and figures. The following examples are merely illustrative of the present invention and should not be construed as limiting the invention.
First, three-dimensional correction is performed on the image.
1) A brain CT without lesions, without obvious bone abnormalities, and without angular offset of the head during CT scanning is selected as the correction reference and is called the standard data.
2) Removing the bed-board interference from the standard data. The bed-board removal is performed by conventional methods in the art, for example by finding the largest connected region of the bony structure and removing the highlighted bed board (as shown in fig. 2).
3) Removing the bed-board interference from the data to be corrected (the brain CT in which the wing point is to be determined).
4) Three-dimensional correction of the data to be corrected (the brain CT in which the wing point is to be determined) is performed using an ITK three-dimensional rigid registration tool; the corrected results of three of the slices are shown in fig. 3.
Second, extracting the whole outline of the head
Extracting the overall head contour from the corrected brain CT to be detected obtained in step one. The overall head contour refers to the head skin and all head regions enclosed by the skin; the head-contour ROI region is denoted vAirMask. The CT value of air is about −1000 HU, and since the three-dimensional correction algorithm normalizes the pixels of part of the region near the head to 0, the region with pixel values greater than 0 is selected to determine the contour region of each plain-scan CT image:

vAirMask_i = 1, where vImage_i > 0

where vImage is the plain-scan CT image and i denotes the i-th slice; the range of i is determined by the number of scan layers of the specific CT image. Taking a head CT with a slice thickness of 5 mm and a total of 32 plain-scan CT images as an example, i = 1, 2, 3, …, 32.
Fig. 4 shows the contour maps of three of the slices; the solid vAirMask, i.e. the head-contour ROI region, is obtained using three-dimensional morphological layer-by-layer processing.
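As a concrete illustration of this step, a minimal NumPy-only sketch is given below. Function and variable names such as `head_contour_mask` are illustrative, not from the patent, and a BFS flood fill from the image border stands in for the three-dimensional morphological layer-by-layer processing:

```python
from collections import deque
import numpy as np

def head_contour_mask(v_image_slice):
    """Binary head-contour ROI for one plain-scan CT slice.

    Pixels > 0 are taken as candidate head region (air/background was
    normalized to 0 by the correction step); interior holes are then
    filled by flood-filling the background from the image border.
    """
    mask = v_image_slice > 0            # vAirMask_i = 1 where vImage_i > 0
    h, w = mask.shape
    outside = np.zeros_like(mask)
    q = deque()
    # seed the flood fill with every background pixel on the border
    for r in range(h):
        for c in (0, w - 1):
            if not mask[r, c] and not outside[r, c]:
                outside[r, c] = True; q.append((r, c))
    for c in range(w):
        for r in (0, h - 1):
            if not mask[r, c] and not outside[r, c]:
                outside[r, c] = True; q.append((r, c))
    while q:                            # BFS over 4-connected background
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc] and not outside[rr, cc]:
                outside[rr, cc] = True; q.append((rr, cc))
    return ~outside                     # head plus enclosed cavities, solid
```

On a ring-shaped "skull" slice this returns the solid enclosed disc, matching the solid vAirMask described above.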
Third, extracting the skull mask (gray values).
The skull part is extracted within the vAirMask range obtained in step two. According to the bone-structure CT value range of 150–1000 HU, the skull region is taken out and the three-dimensional spatial coordinate system is determined; fig. 5 shows the three-dimensional reconstruction of the extracted skull.
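This thresholding step translates almost directly into code; a hedged NumPy sketch (names and exact bounds usage are illustrative) is:

```python
import numpy as np

def skull_mask(hu_volume, v_air_mask, lo=150, hi=1000):
    """Extract the skull region: voxels whose CT value lies in the bone
    range [lo, hi] HU, restricted to the head-contour ROI so that
    high-density structures outside the head (e.g. bed board) are excluded."""
    return v_air_mask & (hu_volume >= lo) & (hu_volume <= hi)
```

Restricting to the ROI is what keeps residual bed-board voxels out even when their HU value falls in the bone range.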
Fourth, positioning the eyebrow arch and determining the (x, y) coordinates of the wing point.
1) Locking the slices containing the eyebrow arch.
The sagittal plane of the brain is defined as the xoz plane, the transverse (axial) plane as the xoy plane, and the coronal plane as the yoz plane (as shown in fig. 6). Taking the axial slice with the largest brain-tissue ROI area, iSlice_ROI, as the reference, all slices within 2.5 cm above it are taken as candidate layers for extracting the outer edge of the eyebrow-arch bone (as shown in fig. 7).
2) Locking the quadrant containing the eyebrow arch.
After 3D correction of the image data, the region of the eyebrow arches lies in the first and second quadrants of the axial plane. Because the left and right sides are symmetric, in actual use the eyebrow arch on the same side as the wing point to be located is detected; taking the left wing point as an example, it lies in the second quadrant of the axial plane (as shown in fig. 8).
3) Corner detection: the Harris_T corner detection algorithm is applied to the outer edge of the bone structure of the candidate slices determined in step 1), and the corner points of all slices are found (as shown in fig. 9).
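The patent names a "Harris_T" detector without further specification; as a hedged stand-in, the sketch below computes the classic Harris corner response with NumPy (the window size `win` and sensitivity `k` are illustrative choices, not values from the patent):

```python
import numpy as np

def harris_response(img, k=0.05, win=3):
    """Classic Harris corner response R = det(M) - k * trace(M)^2, with the
    structure tensor M box-averaged over a win x win window."""
    img = img.astype(float)
    iy, ix = np.gradient(img)                     # image gradients (rows, cols)
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy

    def box(a):                                   # box filter via an integral image
        p = win // 2
        ap = np.pad(a, p, mode="edge")
        c = ap.cumsum(0).cumsum(1)
        c = np.pad(c, ((1, 0), (1, 0)))
        h, w = a.shape
        return (c[win:win + h, win:win + w] - c[:h, win:win + w]
                - c[win:win + h, :w] + c[:h, :w])

    sxx, syy, sxy = box(ixx), box(iyy), box(ixy)
    return (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
```

Applied to a binary slice of the bone outer edge, positive local maxima of the response mark corner candidates; edges give negative response and flat regions give roughly zero.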
4) K-means clustering is performed on the point clusters (corner points) obtained in step 3) to obtain the final coordinates P1(x1, y1, z1) of the outer edge point of the eyebrow arch.
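The clustering step can be sketched with a small Lloyd's-algorithm implementation; k, the iteration count, and the seeding are not specified in the patent and are chosen here purely for illustration:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Lloyd's algorithm on an (n, d) point array; returns (centers, labels)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():               # leave an emptied cluster in place
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels
```

The eyebrow-arch outer edge point P1 would then be taken from the relevant cluster of the pooled corner points.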
5) The outer-edge coordinates of the eyebrow arch are mapped to the body surface. The sagittal plane (xoz plane) containing P1(x1, y1, z1) is determined from its coordinates, and the z-direction line through P1 in this sagittal plane is moved along the x direction until it intersects the head contour vAirMask(:, iSlice_ROI); that is, P1(x1, y1, z) is projected onto the skin surface to obtain the body-surface projection P2(x2, y2, z2) of the outer edge of the eyebrow arch.
6) The x-direction line through P2(x2, y2, z2) is advanced downward (along −z) on the current layer of vAirMask (i.e. iSlice) by 3.5 cm to P3(x3, y3, z3), which determines the projection coordinates x3, y3 of the wing point on the xoy plane (as shown in fig. 10).
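Steps 5) and 6) can be sketched on a single binary slice: ray-cast from P1 outward to the skin, then walk 3.5 cm along the surface contour. The marching direction, the 1 mm pixel spacing, and the boundary-walk heuristic below are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

def project_to_surface(mask, p, step=(0, 1)):
    """March from point p inside a 2-D head mask along `step` (row, col)
    until leaving the mask; the last inside pixel is the skin point."""
    r, c = p
    while (0 <= r + step[0] < mask.shape[0]
           and 0 <= c + step[1] < mask.shape[1]
           and mask[r + step[0], c + step[1]]):
        r, c = r + step[0], c + step[1]
    return r, c

def walk_surface(mask, start, dist_mm, spacing_mm=1.0):
    """Follow the mask boundary from `start`, moving toward decreasing row
    index, accumulating arc length until dist_mm is reached."""
    # boundary = mask pixels with at least one 4-neighbour outside the mask
    pad = np.pad(mask, 1)
    interior = (pad[2:, 1:-1] & pad[:-2, 1:-1] & pad[1:-1, 2:] & pad[1:-1, :-2])
    boundary = mask & ~interior
    cur, prev, travelled = start, None, 0.0
    while travelled < dist_mm:
        nbrs = [(cur[0] + dr, cur[1] + dc)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
        cand = [n for n in nbrs
                if 0 <= n[0] < mask.shape[0] and 0 <= n[1] < mask.shape[1]
                and boundary[n] and n != prev and n[0] <= cur[0]]
        if not cand:
            break
        nxt = min(cand, key=lambda n: n[0])       # prefer the upward neighbour
        travelled += spacing_mm * np.hypot(nxt[0] - cur[0], nxt[1] - cur[1])
        prev, cur = cur, nxt
    return cur
```

On a circular head phantom the projected point lands on the skin boundary, and the walk stays on that boundary while accumulating the requested arc length.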
Fifth, the z-axis coordinate of the wing point is determined using the midpoint of the zygomatic arch (as shown in fig. 11).
The midpoint of the zygomatic arch is moved up by 4 cm to determine the z-axis coordinate of the wing point. The midpoint of the zygomatic arch is the position at 0.4 times the length of the line connecting the nasion (nasal root) and the occipital protuberance, measured from the nasion; the nasion and the occipital protuberance are each determined according to the prior art.
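The arithmetic of this step is simple enough to write directly; the nasion and occipital-protuberance coordinates (assumed here to be in mm with the z axis pointing up) are taken from the prior-art landmark detection mentioned above:

```python
import numpy as np

def wing_point_z(nasion_mm, inion_mm, offset_mm=40.0, ratio=0.4):
    """z coordinate of the wing point: the zygomatic-arch midpoint is taken
    at `ratio` of the nasion -> occipital-protuberance line (measured from
    the nasion), then moved `offset_mm` upward along z."""
    nasion = np.asarray(nasion_mm, float)
    inion = np.asarray(inion_mm, float)
    arch_mid = nasion + ratio * (inion - nasion)
    return float(arch_mid[2] + offset_mm)
```

Combining this z with the (x3, y3) from the fourth step yields the full three-dimensional wing-point coordinate, as the sixth step states.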
Sixth, combining the fourth and fifth steps, the three-dimensional spatial coordinates of the wing point are accurately determined.
Claims (9)
1. The wing point body surface projection positioning method based on NCCT image data is characterized by comprising the following steps of:
first, correcting head CT data;
secondly, extracting the overall outline of the head according to the corrected brain CT to be detected obtained in the first step;
thirdly, extracting a skull region according to the overall head outline obtained in the second step to obtain three-dimensional data of the whole head;
fourthly, positioning the eyebrow arch, and determining the x and y coordinates of the wing point in the three-dimensional (x, y, z) coordinate system:
extracting the axial slice data of the eyebrow-arch bone region, performing Harris_T corner detection on the outer edge of the bone structure of each slice, and extracting the corner points of all slices;
performing K-means clustering on the obtained point clusters to obtain the outer edge point P1 of the eyebrow arch;
mapping the outer edge point of the eyebrow arch to the body surface to obtain the body-surface projection point P2 of the eyebrow-arch outer edge;
according to the body-surface projection point P2, determining the body-surface point P3 located 3.5 cm below along the sagittal plane as the x and y coordinates of the wing point;
fifthly, upwards moving the middle point of the zygomatic arch by 4cm, and determining the z-axis coordinate of the wing point;
and sixthly, obtaining wing point coordinates (x, y, z).
2. The method according to claim 1, characterized in that:
in the first step, an ITK three-dimensional rigid registration tool is adopted, and three-dimensional correction is carried out on brain CT data of the wing points to be determined through standard CT data.
3. The method according to claim 1, characterized in that:
and step two, adopting three-dimensional morphology layer-by-layer processing to obtain a solid head outline area.
4. The method according to claim 1, characterized in that:
and thirdly, extracting the skull part in the vAirmask range obtained in the second step to obtain head three-dimensional data.
5. The method according to claim 1, characterized in that:
In the fourth step, the eyebrow arch is positioned and the (x, y) coordinates of the wing point are determined by the following specific steps:
(1) Locking the slices containing the eyebrow arch:
taking the slice with the largest brain-tissue area as a reference, all slices within 2.5 cm above it are used as candidate layers for extracting the outer edge of the eyebrow-arch bone;
(2) Locking the quadrant containing the eyebrow arch:
dividing each candidate layer determined in step (1) into four quadrants, and determining the quadrant containing the eyebrow arch according to the side of the wing point to be located;
(3) Performing corner detection: applying the Harris_T corner detection algorithm to the outer edge of the bone structure of the candidate slices determined in step (1), and finding the corner points in the target quadrant of all slices;
(4) Performing K-means clustering on the corner-point clusters obtained in step (3) to obtain the coordinates P1(x1, y1, z1) of the outer edge point of the eyebrow arch;
(5) Mapping the eyebrow-arch outer-edge coordinates P1(x1, y1, z1) to the body surface to obtain the projection P2(x2, y2, z2) of the outer edge of the eyebrow arch;
(6) Advancing the point P2(x2, y2, z2) 3.5 cm along the surface of the axial slice toward the cerebellum to P3(x3, y3, z3); the projection coordinates of the wing point on the xoy plane are x3, y3, and the z coordinate z3 is determined by moving 4 cm upward from the midpoint of the zygomatic arch; together these give the three-dimensional spatial coordinates (x3, y3, z3) of the wing point.
6. A model for positioning the body surface projection of a wing point, which is constructed by the method for positioning the body surface projection of the wing point based on NCCT image data according to any one of claims 1 to 5.
7. A wing-point body-surface projection positioning device, comprising the wing-point body-surface projection positioning model according to claim 6.
8. An electronic device comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor, when executing the program, performs the steps in the NCCT image data based wing point body surface projection localization method of any one of claims 1 to 5.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps in the NCCT image data based wing point body surface projection positioning method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310334635.4A CN116452661A (en) | 2023-03-30 | 2023-03-30 | Wing point body surface projection positioning method, device, storage medium and equipment based on NCCT image data |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116452661A true CN116452661A (en) | 2023-07-18 |
Family
ID=87131468
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||