CN116452661A - Wing point body surface projection positioning method, device, storage medium and equipment based on NCCT image data - Google Patents



Publication number
CN116452661A
CN116452661A (application CN202310334635.4A)
Authority
CN
China
Prior art keywords
wing
point
body surface
points
outer edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310334635.4A
Other languages
Chinese (zh)
Inventor
鲍龙
马丽娟
吴安华
张霞
蔡睿锴
蔡巍
程文
王希
崔晓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shengjing Hospital of China Medical University
Shenyang Neusoft Intelligent Medical Technology Research Institute Co Ltd
Original Assignee
Shengjing Hospital of China Medical University
Shenyang Neusoft Intelligent Medical Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shengjing Hospital of China Medical University, Shenyang Neusoft Intelligent Medical Technology Research Institute Co Ltd filed Critical Shengjing Hospital of China Medical University
Priority to CN202310334635.4A priority Critical patent/CN116452661A/en
Publication of CN116452661A publication Critical patent/CN116452661A/en

Classifications

    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/13: Edge detection
    • G06V 10/22: Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections
    • G06V 10/763: Clustering using non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G06T 2207/10081: Computed x-ray tomography [CT]
    • G06T 2207/20164: Salient point detection; corner detection
    • G06T 2207/30008: Bone
    • G06T 2207/30016: Brain
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change


Abstract

The invention discloses a wing point (pterion) body surface projection positioning method based on NCCT image data, belonging to the technical field of medical image processing. The method uses NCCT sequence data from any routine imaging examination, reconstructs the appearance of the skull, automatically and individually identifies the eyebrow-arch features through corner detection and K-means clustering, and accurately determines the outer edge point of the eyebrow arch; from the outer edge of the eyebrow arch it advances 3.5 cm along the body-surface curve in the direction of the eyebrow extension line, and moves 4 cm straight up from the midpoint of the zygomatic arch. The method has important clinical significance.

Description

Wing point body surface projection positioning method, device, storage medium and equipment based on NCCT image data
Technical Field
The invention belongs to the technical field of medical image processing, and particularly relates to a wing point body surface projection positioning method based on NCCT image data.
Background
The wing point (pterion) is an H-shaped suture formed at the junction of the frontal, parietal, temporal and sphenoid bones. It lies in the temporal fossa, about two finger-breadths above the midpoint of the zygomatic arch. The wing point is also called the "butterfly point". It is located about 3.5 cm posterior to the lateral canthus of the eye and 4 cm above the midpoint of the zygomatic arch. That is, when the thumb of one hand is placed on the zygomatic process of the frontal bone and the index and middle fingers of the other hand are placed on the zygomatic arch, the three form a triangle that marks the wing point.
In clinical surgery, the pterional craniotomy (wing point approach) is one of the important surgical methods: the skull is opened from the lateral side, and brain tissue is exposed by removing part of the frontal bone, the temporal bone and the greater wing of the sphenoid. In drainage surgery for cerebral hemorrhage, the wing point serves as a gold-standard reference point for locating the motor functional areas (the precentral and postcentral gyri). In the current positioning method, doctors rely on experience and measure with a ruler before the operation.
Disclosure of Invention
In view of the defects existing in the prior art, the invention mainly provides a wing point body surface projection positioning method based on NCCT image data.
In order to achieve the above purpose, the invention adopts the following technical scheme: a wing point body surface projection positioning method based on NCCT image data comprises the following steps:
first, correcting the head CT data;
secondly, extracting the overall head contour;
thirdly, extracting the skull region from the overall head contour obtained in the second step to obtain three-dimensional data of the whole head;
fourthly, positioning the eyebrow arch and determining the (x, y) coordinates of the wing point:
extracting transverse slice data of the eyebrow-arch bone region, performing Harris_T corner detection on the outer edge of the bone structure of each slice, and determining the corner points of all slices;
performing K-means clustering on the obtained point cluster to obtain the outer edge point P1 of the eyebrow arch;
mapping the outer edge point of the eyebrow arch to the body surface to obtain its body-surface projection point P2;
determining the body-surface point P3, located 3.5 cm below the projection point P2 along the sagittal plane, as the x- and y-axis coordinates of the wing point;
fifthly, moving the midpoint of the zygomatic arch upward by 4 cm to determine the z-axis coordinate of the wing point;
and sixthly, obtaining the wing point coordinates (x, y, z).
Compared with the prior art, the invention has the following advantages: the wing point body surface projection positioning method based on NCCT image data uses NCCT sequence data from any routine imaging examination, reconstructs the appearance of the skull, automatically and individually identifies the eyebrow-arch features through corner detection and K-means clustering, accurately determines the outer edge point of the eyebrow arch, advances 3.5 cm along the body-surface curve in the direction of the eyebrow extension line, moves 4 cm straight up from the midpoint of the zygomatic arch, and thus positions the wing point automatically and rapidly. Body-surface wing point positioning is realized automatically, intelligently and individually, replacing manual measurement, saving time and greatly improving the positioning precision of the wing point.
Drawings
Fig. 1 is a flowchart of a wing point body surface projection positioning method based on NCCT image data.
Fig. 2 shows the bed-board removal effect for the standard CT data.
Fig. 3 shows the effect on some slices after three-dimensional correction.
Fig. 4 shows the head-contour extraction results for some slices.
Fig. 5 shows the three-dimensional reconstruction of the skull region and the definition of the spatial three-dimensional coordinate system.
Fig. 6 is the coordinate-plane definition diagram.
Fig. 7 is a schematic diagram of the extraction of candidate layers for the outer edge of the eyebrow-arch bone.
Fig. 8 is a schematic diagram of the quadrant division of the candidate layers for the outer edge of the eyebrow-arch bone.
Fig. 9 shows the outer-edge corner detection results for some candidate slices.
Fig. 10 is a schematic diagram of the projection of the outer edge point of the eyebrow arch.
Fig. 11 is a schematic diagram of determining the z-axis coordinate from the midpoint of the zygomatic arch.
Detailed Description
The invention is described in further detail below with reference to specific examples and figures. The following examples are merely illustrative of the present invention and should not be construed as limiting the invention.
First, three-dimensional correction is performed on the image.
1) A brain CT without lesions, without obvious abnormalities of the bone features, and without angular offset of the head during the CT scan is selected as the correction reference and is called the standard data.
2) Removing the bed-board interference from the standard data. The bed-board removal is performed by conventional methods in the art, for example by finding the largest connected region of the bony structure and removing the highlighted bed board (as shown in fig. 2).
3) Removing the bed-board interference from the data to be corrected (the brain CT whose wing point is to be determined).
4) Three-dimensional correction of the data to be corrected is performed with an ITK three-dimensional rigid registration tool; three corrected slices are shown in fig. 3.
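The ITK registration itself is out of scope here, but the effect of the rigid correction can be sketched with a toy numpy example: landmarks from a head scanned with a known rotation are brought back to the standard orientation by applying the inverse rigid transform. The coordinates and the 12-degree angle are invented for the demo, not values from the patent.

```python
import numpy as np

# Illustrative landmark coordinates (mm) in the standard pose; these
# values are invented for the demo, not taken from the patent.
standard = np.array([[0.0, 80.0, 0.0],     # a nasion-like point
                     [60.0, 0.0, 10.0],    # left temporal point
                     [-60.0, 0.0, 10.0]])  # right temporal point

def rot_z(angle_deg):
    """Rotation about the z axis, modelling head yaw during scanning."""
    a = np.deg2rad(angle_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Simulate a scan acquired with a 12-degree head rotation.
tilted = standard @ rot_z(12.0).T

# The "correction" applies the inverse rigid transform; for a pure
# rotation the inverse is the transpose, i.e. rotating back by -12 deg.
corrected = tilted @ rot_z(12.0)

print(np.allclose(corrected, standard))  # → True
```

In the actual method the rotation and translation are estimated by registering the data to be corrected against the standard data; this sketch only shows why a rigid transform suffices for a bony structure.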
Second, extracting the whole outline of the head
The overall head contour is extracted from the corrected brain CT to be examined from step one. The overall head contour means the head skin and all head regions enclosed by the skin; the head-contour ROI region is denoted vAirMask. The CT value of air is about -1000 HU, and since the three-dimensional correction algorithm normalizes some pixels near the head to 0, the contour region of each plain-scan CT image is determined by selecting the region with pixel values greater than 0:

vAirMask_i = 1, where vImage_i > 0

where vImage is the plain-scan CT image and i denotes the i-th slice; the range of i depends on the number of scan layers of the specific CT image. Taking a head CT with a 5 mm layer thickness and 32 plain-scan images in total as an example, i = 1, 2, 3, ….

Fig. 4 shows the contour maps of three of the slices. The solid vAirMask, i.e. the head-contour ROI region, is obtained by three-dimensional morphological layer-by-layer processing.
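As a rough illustration of this step, the following numpy/scipy sketch thresholds a toy volume at pixel value > 0 and fills the interior layer by layer. The toy volume and its sizes are invented for the demo, and `binary_fill_holes` stands in for whatever three-dimensional morphological processing the authors actually use.

```python
import numpy as np
from scipy import ndimage

def head_contour_roi(vimage):
    """Per-slice head-contour mask vAirMask: threshold the corrected
    scan at pixel value > 0 (air is about -1000 HU and the correction
    zeroes part of the region near the head), then fill the interior
    so the mask covers the skin and everything enclosed by it."""
    vairmask = vimage > 0
    for i in range(vairmask.shape[0]):            # layer-by-layer, as in the text
        vairmask[i] = ndimage.binary_fill_holes(vairmask[i])
    return vairmask

# Toy volume with one slice: a 5x5 "head" of soft tissue in air, with
# an internal air pocket that the morphological fill should close.
vol = np.full((1, 9, 9), -1000.0)
vol[0, 2:7, 2:7] = 40.0       # soft tissue
vol[0, 4, 4] = -1000.0        # internal air pocket (e.g. a sinus)

mask = head_contour_roi(vol)
print(int(mask[0].sum()))     # → 25  (the 5x5 head block, pocket filled)
```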
Third, extracting the skull mask.
The skull part is extracted within the vAirMask range obtained in step two: voxels with CT values (HU) in the bone range of 150 to 1000 are taken as the skull region, and a three-dimensional spatial coordinate system is determined. Fig. 5 shows the three-dimensional reconstruction of the extracted skull.
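A minimal sketch of the bone thresholding, assuming the HU window of 150 to 1000 stated above; the toy image values are invented:

```python
import numpy as np

def skull_mask(vimage, vairmask, lo=150, hi=1000):
    """Bone extraction inside the head-contour ROI: keep voxels whose
    CT value (HU) lies in the bone window [150, 1000] stated above."""
    return vairmask & (vimage >= lo) & (vimage <= hi)

# Toy slice: air, soft tissue, and two bony voxels (invented values).
img = np.array([[-1000, 40, 400, 40, -1000],
                [-1000, 40, 700, 40, -1000]])
roi = img > -500                       # crude head ROI for the demo
print(int(skull_mask(img, roi).sum())) # → 2
```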
Fourth, positioning the eyebrow arch and determining the (x, y) coordinates of the wing point.
1) Locking the slices containing the eyebrow arch.
The sagittal plane of the brain is defined as the xoz plane, the transverse plane as the xoy plane, and the coronal plane as the yoz plane (as shown in fig. 6). Taking the transverse slice with the largest brain-tissue area as a reference, all slices iSlice_ROI within 2.5 cm above it are taken as candidate layers for extracting the outer edge of the eyebrow-arch bone (as shown in fig. 7).
2) Locking the quadrant in which the eyebrow arch is located.
After 3D correction of the image data, the region of the eyebrow arch lies in the first and second quadrants of the transverse plane. Since the left and right sides are symmetrical, in actual use the eyebrow arch on the same side as the wing point to be located is detected; taking the left wing point as an example, detection is performed in the second quadrant of the transverse plane (as shown in fig. 8).
3) Corner detection: the Harris_T corner detection algorithm is applied to the outer edge of the bone structure of the candidate slices determined in step 1), and the corner points of all slices are found (as shown in fig. 9).
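The patent names a "Harris_T" variant without specifying it; as a hedged illustration, the following is the textbook Harris response (structure tensor, det - k·trace²) implemented in plain numpy and applied to a toy L-shaped bone edge, where the strongest response should appear near the corner:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Textbook Harris corner response: det(M) - k * trace(M)^2 of the
    locally averaged structure tensor M."""
    iy, ix = np.gradient(img.astype(float))     # gradients along rows, cols
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy

    def box(a, r=1):
        """Crude box filter: mean over a (2r+1)^2 window, clamped at borders."""
        out = np.zeros_like(a)
        h, w = a.shape
        for y in range(h):
            for x in range(w):
                out[y, x] = a[max(0, y - r):y + r + 1,
                              max(0, x - r):x + r + 1].mean()
        return out

    sxx, syy, sxy = box(ixx), box(iyy), box(ixy)
    return sxx * syy - sxy * sxy - k * (sxx + syy) ** 2

# Toy "bone edge": a bright block whose only interior corner is near
# row 6, col 5; straight edges should score low, the corner high.
img = np.zeros((12, 12))
img[6:, :6] = 1.0

r = harris_response(img)
cy, cx = np.unravel_index(np.argmax(r), r.shape)
print(abs(cy - 6) <= 2 and abs(cx - 5) <= 2)  # → True
```

In the method itself the response would be evaluated only along the outer bone edge within the target quadrant of each candidate slice.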
4) K-means clustering is performed on the point cluster (corner points) obtained in the corner-detection step 3) to obtain the final coordinate P1(x1, y1, z1) of the outer edge point of the eyebrow arch.
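The clustering step can be sketched with a plain Lloyd's-algorithm K-means over toy 3-D corner points; the deterministic initialization and the point coordinates are illustrative choices, not the patent's:

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Plain Lloyd's algorithm; deterministic init with k points spread
    evenly through the array (an illustrative choice, not the patent's)."""
    idx = np.linspace(0, len(points) - 1, k).astype(int)
    centers = points[idx].astype(float)
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Assign each point to the nearest centre.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centre to the mean of its assigned points.
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# Toy corner points from several slices, clustering around two 3-D spots.
pts = np.array([[10.0, 50, 20], [11, 51, 20], [9, 49, 21],
                [40.0, 80, 25], [41, 81, 24], [39, 79, 25]])
centers, _ = kmeans(pts, 2)
xs = sorted(float(c[0]) for c in centers)
print([round(x, 1) for x in xs])  # → [10.0, 40.0]
```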
5) The outer-edge coordinate of the eyebrow arch is mapped to the body surface. The sagittal plane (xoz plane) containing P1(x1, y1, z1) is determined, and the z-direction line through P1 in that sagittal plane is moved along the x direction until it intersects the head contour vAirMask(:, iSlice_ROI); that is, P1(x1, y1, z1) is projected onto the skin surface, giving the body-surface projection P2(x2, y2, z2) of the outer edge of the eyebrow arch.
6) The x-direction line through P2(x2, y2, z2) is advanced downward (z direction) along the current-layer vAirMask(:, iSlice) by 3.5 cm to P3(x3, y3, z3), which determines the projection coordinates (x3, y3) of the wing point on the xoy plane (as shown in fig. 10).
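A minimal sketch of the two geometric operations in these steps, using a toy binary contour line and an assumed pixel spacing of 0.7 mm: the surface projection takes the outermost contour voxel along the search direction, and the 3.5 cm advance is converted into a voxel offset.

```python
import numpy as np

# Toy line through the head contour along the x axis (True = inside head);
# the sizes and spacing below are assumptions for the demo.
row = np.zeros(50, dtype=bool)
row[10:31] = True                        # head spans x = 10 .. 30

# Step 5) in miniature: the body-surface projection is the outermost
# contour voxel along the search direction on this line.
x2 = int(np.nonzero(row)[0].max())
print(x2)                                # → 30

# Step 6) in miniature: convert the 3.5 cm advance into a voxel offset
# using the (assumed) 0.7 mm pixel spacing of the scan.
spacing_mm = 0.7
offset_voxels = int(round(35.0 / spacing_mm))
print(offset_voxels)                     # → 50
```

In the full method the 3.5 cm advance follows the curved body surface rather than a straight line, so in practice it would be accumulated along consecutive surface voxels.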
Fifth, the z-axis coordinate of the wing point is determined using the midpoint of the zygomatic arch (as shown in fig. 11).
The midpoint of the zygomatic arch is moved up by 4 cm to give the z-axis coordinate of the wing point. The midpoint of the zygomatic arch is the position at 0.4 times the length of the line connecting the nasion and the external occipital protuberance, measured from the nasion; the nasion and the external occipital protuberance are each determined according to the prior art.
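The zygomatic-arch midpoint construction is simple linear interpolation and can be written out directly; the nasion and protuberance coordinates below are invented for illustration, and the +z direction is assumed to point superiorly:

```python
import numpy as np

def zygomatic_midpoint(nasion, protuberance, frac=0.4):
    """Point at 0.4 of the nasion-to-occipital-protuberance line,
    measured from the nasion, as described above."""
    nasion = np.asarray(nasion, dtype=float)
    protuberance = np.asarray(protuberance, dtype=float)
    return nasion + frac * (protuberance - nasion)

def wing_point_z(midpoint_z_mm, offset_mm=40.0):
    """z coordinate of the wing point: 4 cm above the arch midpoint
    (assuming +z points superiorly)."""
    return midpoint_z_mm + offset_mm

# Invented coordinates in mm, purely for illustration.
nasion = [90.0, 0.0, 20.0]
protuberance = [-90.0, 10.0, 30.0]

mid = zygomatic_midpoint(nasion, protuberance)
print([round(float(v), 6) for v in mid])          # → [18.0, 4.0, 24.0]
print(round(float(wing_point_z(mid[2])), 6))      # → 64.0
```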
Sixth, combining the fourth and fifth steps, the spatial three-dimensional coordinates of the wing point are accurately determined.

Claims (9)

1. The wing point body surface projection positioning method based on NCCT image data is characterized by comprising the following steps of:
first, correcting head CT data;
secondly, extracting the overall outline of the head according to the corrected brain CT to be detected obtained in the first step;
thirdly, extracting a skull region according to the overall head outline obtained in the second step to obtain three-dimensional data of the whole head;
fourthly, positioning the eyebrow arch, and determining the x and y coordinates of the wing point in a three-dimensional (x, y, z) coordinate system;
extracting transverse slice data of the eyebrow-arch bone region, performing Harris_T corner detection on the outer edge of the bone structure of each slice, and extracting the corner points of all slices;
performing K-means clustering on the obtained point cluster to obtain the outer edge point P1 of the eyebrow arch;
mapping the outer edge point of the eyebrow arch to the body surface to obtain the body-surface projection point P2 of the outer edge of the eyebrow arch;
according to the body-surface projection point P2, determining the body-surface point P3, located 3.5 cm below P2 along the sagittal plane, as the x- and y-axis coordinates of the wing point;
fifthly, moving the midpoint of the zygomatic arch upward by 4 cm to determine the z-axis coordinate of the wing point;
and sixthly, obtaining the wing point coordinates (x, y, z).
2. The method according to claim 1, characterized in that:
in the first step, an ITK three-dimensional rigid registration tool is adopted, and three-dimensional correction is carried out on brain CT data of the wing points to be determined through standard CT data.
3. The method according to claim 1, characterized in that:
and step two, adopting three-dimensional morphology layer-by-layer processing to obtain a solid head outline area.
4. The method according to claim 1, characterized in that:
and thirdly, extracting the skull part in the vAirmask range obtained in the second step to obtain head three-dimensional data.
5. The method according to claim 1, characterized in that:
in the fourth step, the eyebrow arch is positioned and the (x, y) coordinates of the wing point are determined as follows:
(1) locking the slices containing the eyebrow arch:
taking the transverse slice with the largest brain-tissue area as a reference, all slices within 2.5 cm above it are taken as candidate layers for extracting the outer edge of the eyebrow-arch bone;
(2) locking the quadrant in which the eyebrow arch is located:
dividing the candidate layers determined in step (1) into four quadrants, and determining the quadrant of the eyebrow arch according to the side of the wing point to be determined;
(3) corner detection: applying the Harris_T corner detection algorithm to the outer edge of the bone structure of the candidate slices determined in step (1), and finding the corner points in the target quadrant of all slices;
(4) performing K-means clustering on the corner-point cluster obtained in step (3) to obtain the coordinate P1(x1, y1, z1) of the outer edge point of the eyebrow arch;
(5) mapping the outer-edge coordinate P1(x1, y1, z1) of the eyebrow arch to the body surface to obtain the body-surface projection P2(x2, y2, z2) of the outer edge of the eyebrow arch;
(6) advancing the point P2(x2, y2, z2) by 3.5 cm along the body surface of the transverse slice toward the cerebellum to P3(x3, y3, z3), so that the projection coordinates of the wing point on the xoy plane are (x3, y3); combined with the z-axis coordinate z3 obtained by moving the midpoint of the zygomatic arch upward by 4 cm, the three-dimensional spatial coordinates of the wing point are (x3, y3, z3).
6. A model for positioning the body surface projection of a wing point, which is constructed by the method for positioning the body surface projection of the wing point based on NCCT image data according to any one of claims 1 to 5.
7. A wing-point body-surface projection positioning device, comprising the wing-point body-surface projection positioning model according to claim 6.
8. An electronic device comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor, when executing the program, performs the steps in the NCCT image data based wing point body surface projection localization method of any one of claims 1 to 5.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps in the NCCT image data based wing point body surface projection positioning method according to any one of claims 1 to 5.
CN202310334635.4A 2023-03-30 2023-03-30 Wing point body surface projection positioning method, device, storage medium and equipment based on NCCT image data Pending CN116452661A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310334635.4A CN116452661A (en) 2023-03-30 2023-03-30 Wing point body surface projection positioning method, device, storage medium and equipment based on NCCT image data


Publications (1)

Publication Number Publication Date
CN116452661A true CN116452661A (en) 2023-07-18

Family

ID=87131468




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination