KR20130013506A - Marker recognition method according to the direction of x-ray scan - Google Patents

Marker recognition method according to the direction of x-ray scan

Info

Publication number
KR20130013506A
Authority
KR
South Korea
Prior art keywords
image
ultrasound
present disclosure
registration
marker according
Prior art date
Application number
KR1020110075203A
Other languages
Korean (ko)
Inventor
김철영
양우석
Original Assignee
주식회사 사이버메드
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 사이버메드 filed Critical 주식회사 사이버메드
Priority to KR1020110075203A priority Critical patent/KR20130013506A/en
Publication of KR20130013506A publication Critical patent/KR20130013506A/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02: Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03: Computed tomography [CT]
    • A61B6/032: Transmission computed tomography [CT]
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/13: Tomography
    • A61B8/14: Echo-tomography
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39: Markers, e.g. radio-opaque or breast lesions markers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Theoretical Computer Science (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Pulmonology (AREA)
  • Software Systems (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present disclosure provides a method of recognizing a marker according to a change in the X-ray photographing angle, the method comprising: rotating the X-ray equipment; correcting the recognition range of the marker according to the photographing angle by using state information that accounts for the decrease in the marker recognition rate at that angle; and recognizing the marker as a normal marker.

Description

MARKER RECOGNITION METHOD ACCORDING TO THE DIRECTION OF X-RAY SCAN

The present disclosure generally relates to a method of recognizing a marker according to a change in the X-ray imaging angle.

This section provides background art related to the present disclosure; it is not necessarily prior art that is already known.

Ultrasonic diagnostic systems show the internal structure of an object in real time in a non-destructive, non-invasive manner using ultrasound and, unlike computed tomography (CT) or magnetic resonance (MR) imaging, are almost harmless to the human body. However, because an ultrasound image has a low signal-to-noise ratio, registration of a CT image with the ultrasound image is performed to compensate for this.

Non-rigid image registration is a key process in a variety of medical applications. Image-guided intervention is one of the applications that requires registration between pre- and post-operative images. In the case of the liver, three-dimensional ultrasound images are often used for image-guided interventions. To overcome the poor quality of the ultrasound image, a high-quality pre-procedure CT or MR image corresponding to the 3D ultrasound image is displayed during the procedure. For this purpose, alignment of the ultrasound and CT images of the liver is necessary. Because the ultrasound and CT images are captured at different breathing phases, local deformations appear and non-rigid image registration must be performed.

Because of their different characteristics, matching CT and ultrasound images is a difficult task. Several algorithms have been proposed for non-rigid registration of liver 3D ultrasound images with CT (or MR) images. In one approach to registering a 3D ultrasound image with an MR image, each image is converted into a vessel probability image, and the two are matched by maximizing the normalized cross-correlation between the vessel probability images. A method that performs registration between a 3D ultrasound image and a CT image by extracting and registering the centerlines of blood vessels has also been proposed. In these methods, however, the registration accuracy depends on the accuracy of the segmentation of the ultrasound and CT images.

This will be described later in the Detailed Description of the Invention.

SUMMARY OF THE INVENTION Herein, a general summary of the present disclosure is provided; it should not be construed as limiting the scope of the present disclosure, and it is not a comprehensive disclosure of its full scope or of all of its features.

According to an aspect of the present disclosure, there is provided a method of recognizing a marker according to a change in the X-ray photographing angle, the method comprising: rotating the X-ray apparatus; correcting the recognition range of the marker according to the photographing angle by using state information that accounts for the decrease in the marker recognition rate at that angle; and recognizing the marker as a normal marker.

This will be described later in the Detailed Description of the Invention.

FIG. 1 is a block diagram showing a non-rigid image registration system for an ultrasound image and a CT image according to an embodiment of the present disclosure.
FIG. 2 is a flowchart illustrating a method of performing non-rigid image registration of an ultrasound image and a CT image in the matching unit 130 according to an exemplary embodiment of the present disclosure.
FIG. 3 is an exemplary view showing, for each slice, the ultrasound image (A), the CT image before non-rigid registration (B), and the ultrasound-CT registration image (C).
FIG. 4 is an exemplary view showing the accuracy of the non-rigid image registration method according to the present disclosure.
FIG. 5 is a view showing a CT 3D model matching method using the raw data of the X-ray according to the present disclosure.
FIG. 6 illustrates a method of constructing a 3D model using bi-plane X-rays according to the present disclosure.
FIG. 7 illustrates a method of recognizing a marker according to a change in the X-ray photographing angle according to the present disclosure.

The present disclosure will now be described in detail with reference to the accompanying drawing (s).

In this study, two features of the liver, blood vessels and the diaphragm, are used for non-rigid image registration. To find an appropriate transform for non-rigid image registration, we define a cost function from the objective functions of the two features and minimize that cost through an optimization process.

FIG. 1 is a block diagram illustrating a non-rigid image registration system for an ultrasound image and a CT image according to an exemplary embodiment of the present disclosure. Referring to FIG. 1, the registration system 100 includes an ultrasound image forming unit 110 that transmits an ultrasound signal to an object (e.g., the liver) and receives its echoes to form an ultrasound image. In an embodiment of the present disclosure, the ultrasound image may be a 3D ultrasound image obtained in B-mode.

The registration system 100 further includes a CT image forming unit 120 that forms a CT image. In an embodiment of the present disclosure, the CT image may be a 3D CT image. The ultrasound and CT images may be formed at different breathing phases.

The registration system 100 may further include a matching unit 130 for performing non-rigid image registration on the ultrasound image and the CT image. Hereinafter, the operation of the matching unit 130 will be described in detail with reference to FIG. 2.

FIG. 2 is a flowchart illustrating a method of performing non-rigid image registration of an ultrasound image and a CT image in the matching unit 130 according to an exemplary embodiment of the present disclosure. Referring to FIG. 2, when the ultrasound image forming unit 110 and the CT image forming unit 120 have formed a 3D B-mode ultrasound image and a 3D CT image of the liver of the same patient at arbitrary breathing phases, iterative closest point (ICP) based affine registration of the ultrasound image and the CT image is performed (210).
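As a rough illustration of such an ICP-based affine step, the following sketch alternates closest-point correspondence with a least-squares affine fit on generic 3-D point sets. It is a minimal sketch, not the disclosure's implementation; the function name and brute-force correspondence search are illustrative choices.

```python
import numpy as np

def icp_affine(src, dst, iters=20):
    """Iterative-closest-point affine alignment of two 3-D point sets.

    src: (N, 3) feature points from one image, dst: (M, 3) from the other.
    Returns (A, t) such that src @ A.T + t approximates dst.
    """
    A = np.eye(3)
    t = np.zeros(3)
    for _ in range(iters):
        moved = src @ A.T + t
        # Closest-point correspondence (brute force, for clarity only).
        d2 = ((moved[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        corr = dst[d2.argmin(axis=1)]
        # Least-squares affine fit: corr ~ src @ A.T + t.
        X = np.hstack([src, np.ones((len(src), 1))])
        sol, *_ = np.linalg.lstsq(X, corr, rcond=None)
        A, t = sol[:3].T, sol[3]
    return A, t
```

In practice the correspondence search would use a k-d tree, and the affine result serves only as the initialization for the non-rigid stage described next.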

In an embodiment of the present disclosure, a B-spline free-form deformation (FFD) may be used as the transformation for modeling local deformation of the liver in the ultrasound image and the CT image. The FFD is defined by the displacements of control points with uniform spacing, and the displacement is expressed by the deformation variable Φ. Since local deformation is expected to be smooth throughout the liver image, the transformation is assumed to be smooth; therefore, a 3D constraint C_smooth corresponding to the 2D bending energy of a thin metal plate is defined and used.
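Evaluating the B-spline FFD at a point means blending the displacements Φ of the 4 x 4 x 4 surrounding control points with cubic B-spline weights. The sketch below shows the standard formulation under assumed conventions (uniform unit-indexed grid, names illustrative); it is not the disclosure's code.

```python
import numpy as np

def bspline_basis(u):
    # Cubic B-spline blending functions B0..B3 at fractional offset u in [0, 1).
    return np.array([
        (1 - u) ** 3 / 6.0,
        (3 * u**3 - 6 * u**2 + 4) / 6.0,
        (-3 * u**3 + 3 * u**2 + 3 * u + 1) / 6.0,
        u ** 3 / 6.0,
    ])

def ffd_displacement(p, phi, spacing):
    """Displacement at 3-D point p from control-point displacements phi (Phi).

    phi: (nx, ny, nz, 3) grid of control-point displacement vectors.
    spacing: uniform control-point spacing along each axis.
    """
    idx = np.floor(p / spacing).astype(int)
    u = p / spacing - idx
    bx, by, bz = bspline_basis(u[0]), bspline_basis(u[1]), bspline_basis(u[2])
    d = np.zeros(3)
    for l in range(4):
        for m in range(4):
            for n in range(4):
                d += (bx[l] * by[m] * bz[n]
                      * phi[idx[0] + l - 1, idx[1] + m - 1, idx[2] + n - 1])
    return d
```

Because the basis functions sum to one, a uniform control-point displacement yields the same uniform displacement everywhere, which is a convenient sanity check.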

The relationship between the brightness values of blood vessels and the diaphragm differs between the ultrasound and CT images. In the ultrasound image, the diaphragm exhibits a large brightness value because of the strong reflection of the ultrasound signal; this brightness characteristic corresponds to the large gradient magnitude with which the diaphragm is represented in the CT image. In the blood vessel region, on the other hand, although the contrast is reversed, the brightness value of the ultrasound image can still be related to the brightness value of the CT image. Accordingly, the first objective function C_diaphragm for the diaphragm region is obtained from the brightness value in the ultrasound image and the gradient magnitude in the CT image, whereas the second objective function C_vessel for the vessel region is obtained from the brightness values of the ultrasound image and the CT image.

To obtain the objective functions in the diaphragm and blood vessel regions, a plurality of ROI regions are defined in the CT image (220). Specifically, the blood vessels and the liver are segmented using a region-growing scheme, and the edges of the segmented regions are extended to define the respective ROI regions. Since the computation required for registration is performed only where the CT image and the ultrasound image overlap, it suffices to define the ROI regions in only one of the two images; in an embodiment of the present disclosure, the ROI regions are defined in the CT image, in which vessel and liver boundaries are relatively distinct. ROI masking is performed separately for the blood vessel region and the diaphragm region, so that the masking divides the two regions. Thereafter, the objective function C_diaphragm of the diaphragm region and the objective function C_vessel of the blood vessel region are formed within the ROI regions (230). In an embodiment of the present disclosure, for accurate matching, the objective functions are computed from the brightness values, gradient magnitudes, and edge direction angles of the ultrasound image and the CT image.

The gradient magnitude and edge direction angle are obtained as follows. First, a structure matrix is computed at each voxel, and its eigenvectors and eigenvalues are extracted through eigen-analysis. The eigenvector with the largest eigenvalue defines the edge direction of the image, and that eigenvalue defines the gradient magnitude. If two images are perfectly matched, the corresponding edge direction angles will be identical; considering this relationship, the degree of edge-direction coincidence can be defined as the square of the inner product of the two edge direction vectors.
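The per-voxel eigen-analysis described above can be sketched as follows. In this minimal version the structure matrix is the plain outer product of the gradient with itself (no neighborhood smoothing), so the largest eigenvalue equals the squared gradient magnitude; function names are illustrative, not from the disclosure.

```python
import numpy as np

def edge_features(vol):
    """Per-voxel gradient magnitude and edge direction via the structure matrix."""
    g = np.gradient(vol.astype(float))            # gradients along the 3 axes
    mag = np.zeros(vol.shape)
    direc = np.zeros(vol.shape + (3,))
    for v in np.ndindex(vol.shape):
        grad = np.array([g[a][v] for a in range(3)])
        S = np.outer(grad, grad)                  # structure matrix at this voxel
        w, E = np.linalg.eigh(S)                  # eigen-analysis (ascending order)
        mag[v] = w[-1]                            # largest eigenvalue -> gradient size
        direc[v] = E[:, -1]                       # its eigenvector -> edge direction
    return mag, direc

def direction_match(d1, d2):
    # Degree of edge-direction coincidence: squared inner product,
    # which is insensitive to the sign ambiguity of eigenvectors.
    return np.dot(d1, d2) ** 2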

The two objective functions C_vessel and C_diaphragm are given as follows.

In the blood vessel region, the brightness values of the ultrasound image and the CT image are correlated. Therefore, C_vessel is defined by measuring the statistical entropy of the ultrasound brightness value, the CT brightness value, and the degree of edge-direction coincidence.

In the diaphragm region, the brightness value of the ultrasound image and the gradient magnitude of the CT image are correlated. Therefore, C_diaphragm is defined by measuring the statistical entropy of the ultrasound brightness value, the CT gradient magnitude, and the degree of edge-direction coincidence.
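A common way to realize such entropy-based objectives is the Shannon entropy of a joint histogram of the two feature maps; lower joint entropy indicates a tighter statistical relationship, i.e. better alignment. The sketch below shows this for a pair of feature maps; the binning choice and function name are assumptions, not the disclosure's definition.

```python
import numpy as np

def joint_entropy(a, b, bins=32):
    """Shannon entropy of the joint histogram of two feature maps."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h / h.sum()          # joint probability estimate
    p = p[p > 0]             # drop empty bins before taking logs
    return -(p * np.log(p)).sum()
```

Under this reading, C_vessel could be approximated by the joint entropy of the ultrasound and CT brightness values inside the vessel ROI, and C_diaphragm by that of the ultrasound brightness and CT gradient magnitude inside the diaphragm ROI.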

Subsequently, the two objective functions C_diaphragm and C_vessel form a cost function as shown in Equation 1 below (240).

C(Φ) = C_vessel(Φ) + C_diaphragm(Φ) + λ C_smooth(Φ)   (Equation 1)

Here, λ represents the trade-off between the alignment of the two images and the smoothness term C_smooth of the transformation. Following the gradient descent scheme, an optimization process updates the deformation variable Φ using the gradient of the cost function, as shown in Equation 2 below (250).

Φ_(k+1) = Φ_k - μ ∇C(Φ_k)   (Equation 2)

In Equation 2, μ represents the step size, and k is the iteration index (a natural number) of the gradient descent. For implementation, the gradient can be approximated by finite differences. When the condition shown in Equation 3 is satisfied for a small quantity ε, the optimization process ends.

|C(Φ_(k+1)) - C(Φ_k)| < ε   (Equation 3)

Finally, the optimized parameters are applied to the CT image to form the deformed CT image, producing a CT image registered to the ultrasound image (260).
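The update of Equation 2 combined with the stopping rule of Equation 3 can be sketched with finite-difference gradients over a flat parameter vector. The step size, tolerance, and names here are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def optimize(cost, phi0, mu=0.1, eps=1e-10, max_iter=1000):
    """Gradient descent on a registration cost.

    cost: callable mapping a flat parameter vector Phi to a scalar cost C(Phi).
    phi0: initial deformation parameters.
    """
    phi = np.asarray(phi0, dtype=float).copy()
    h = 1e-6
    for _ in range(max_iter):
        grad = np.zeros_like(phi)
        for i in range(phi.size):
            step = np.zeros_like(phi)
            step[i] = h
            # Central-difference approximation of dC/dphi_i.
            grad[i] = (cost(phi + step) - cost(phi - step)) / (2 * h)
        nxt = phi - mu * grad                     # update of Equation 2
        if abs(cost(nxt) - cost(phi)) < eps:      # stopping rule of Equation 3
            return nxt
        phi = nxt
    return phi
```

For the full FFD the gradient would be computed analytically or per control point for efficiency; the finite-difference loop is only the simplest faithful rendering of the text.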

The registration system 100 according to the present disclosure may further include a display unit 140 that displays the ultrasound image, the CT image, and the registered ultrasound-CT image.

FIG. 3 shows, for each slice, the ultrasound image (A), the CT image before non-rigid registration (B), and the ultrasound-CT registration image (C).

FIG. 4 illustrates the accuracy of the non-rigid registration method according to the present disclosure: the markers (M1, M2) displayed on the ultrasound image (A) and on the CT image before non-rigid registration (B) are shown to match exactly in all slice images, in the diaphragm as well as in the blood vessel regions.

FIG. 5 is a diagram illustrating a CT 3D model matching method using the raw data of X-rays according to the present disclosure. The method comprises: forming 1944 x 1636, 14-bit raw data using the X-ray equipment; performing DICOM conversion via medical image processing software; windowing the DICOM-converted image; and converting the result to an 8-bit gray image and outputting it to the screen.
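The windowing and 8-bit conversion steps above can be sketched as follows. The 1944 x 1636 resolution and 14-bit depth come from the description; the particular window center and width used in any call are hypothetical, and the function name is illustrative.

```python
import numpy as np

def window_to_8bit(raw, center, width):
    """Apply a window (center/width) to raw X-ray intensity data and
    rescale the windowed range to an 8-bit gray image."""
    lo = center - width / 2.0
    hi = center + width / 2.0
    clipped = np.clip(raw.astype(float), lo, hi)   # windowing step
    return np.round((clipped - lo) / (hi - lo) * 255).astype(np.uint8)
```

For 14-bit data (values 0 to 16383), a full-range window such as `window_to_8bit(raw, center=8192, width=16384)` simply rescales, while a narrower window stretches the contrast of the anatomy of interest before screen output.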

FIG. 6 illustrates a method of constructing a 3D model using bi-plane X-rays according to the present disclosure. The method comprises: performing bi-plane X-ray angiography; determining the position of the robot in the matched image; and displaying the position of the robot in the matched 3D model.
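Under an idealized orthogonal parallel-beam bi-plane geometry (an assumption; real angiography systems use calibrated cone-beam projections), recovering a 3-D position such as the robot tip from the two views reduces to combining coordinates, with the shared axis averaged to absorb calibration error. Names and conventions below are illustrative.

```python
import numpy as np

def triangulate_orthogonal(frontal_xy, lateral_zy):
    """Recover a 3-D position from two orthogonal parallel-beam projections.

    frontal_xy: (x, y) coordinates in the frontal (AP) image.
    lateral_zy: (z, y) coordinates in the lateral image.
    The y coordinate appears in both views; averaging the two readings
    absorbs small calibration disagreement between the planes.
    """
    x, y1 = frontal_xy
    z, y2 = lateral_zy
    return np.array([x, (y1 + y2) / 2.0, z])
```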

FIG. 7 illustrates a method of recognizing a marker according to a change in the X-ray photographing angle according to the present disclosure, the method comprising: rotating the X-ray apparatus; correcting the recognition range of the marker according to the photographing angle by using state information that accounts for the decrease in the marker recognition rate at that angle; and recognizing the marker as a normal marker.
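One plausible reading of the correction step is that a marker's apparent size shrinks roughly with the cosine of the shooting angle, so the recognition tolerance is widened accordingly before deciding whether a candidate is a normal marker. The sketch below encodes only that reading; the cosine model, the floor value, and all names are assumptions, not the claim's specified method.

```python
import math

def corrected_recognition_range(base_range, angle_deg, floor=0.2):
    """Widen the marker matching tolerance as the X-ray shooting angle
    foreshortens the marker (apparent size falling roughly with cos(angle))."""
    factor = max(abs(math.cos(math.radians(angle_deg))), floor)
    return base_range / factor

def is_normal_marker(measured, expected, base_range, angle_deg):
    # Accept a candidate as a normal marker when its measured size lies
    # within the angle-corrected recognition range of the expected size.
    return abs(measured - expected) <= corrected_recognition_range(base_range, angle_deg)
```

At 0 degrees the tolerance equals the stored base range; at 60 degrees, where cos is 0.5, it doubles, so a marker foreshortened by oblique projection can still be accepted rather than rejected as abnormal.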

Ultrasound image: A; CT image before non-rigid registration: B

Claims (1)

A method of recognizing a marker according to a change in the X-ray imaging angle, the method comprising:
rotating the X-ray equipment;
correcting the recognition range of the marker according to the photographing angle by using state information that accounts for the decrease in the marker recognition rate at that angle; and
recognizing the marker as a normal marker.
KR1020110075203A 2011-07-28 2011-07-28 Marker recognition method according to the direction of x-ray scan KR20130013506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110075203A KR20130013506A (en) 2011-07-28 2011-07-28 Marker recognition method according to the direction of x-ray scan

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110075203A KR20130013506A (en) 2011-07-28 2011-07-28 Marker recognition method according to the direction of x-ray scan

Publications (1)

Publication Number Publication Date
KR20130013506A true KR20130013506A (en) 2013-02-06

Family

ID=47893947

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110075203A KR20130013506A (en) 2011-07-28 2011-07-28 Marker recognition method according to the direction of x-ray scan

Country Status (1)

Country Link
KR (1) KR20130013506A (en)

Similar Documents

Publication Publication Date Title
KR101059312B1 (en) Non-rigid Image Matching Device for Ultrasound and CT Images Using Brightness and Gradient Information
RU2663649C2 (en) Segmentation of large objects from multiple three-dimensional views
US8165372B2 (en) Information processing apparatus for registrating medical images, information processing method and program
US20180211391A1 Coupled segmentation in 3D conventional ultrasound and contrast-enhanced ultrasound images
US11672505B2 (en) Correcting probe induced deformation in an ultrasound fusing imaging system
JP4584553B2 (en) An improved method for displaying temporal changes in spatially matched images
EP3174467B1 (en) Ultrasound imaging apparatus
JP5706389B2 (en) Image processing apparatus, image processing method, and image processing program
US10667786B2 (en) Ultrasound imaging apparatus and method for segmenting anatomical objects
KR20120000729A (en) Position tracking method for vascular treatment micro robot through registration between x ray image and ct 3d model
KR20120000722A Non rigid fusion method for multi medical modality
EP2710553B1 (en) Determination of a physically-varying anatomical structure
KR20130013504A (en) Ct 3d model registration method using x-ray raw data
CN111587449B (en) Image data processing method, device and system
KR20120000727A (en) Virtual x-ray image generation method with fiducial marker for multi medical modality
KR20120000731A (en) Vascular path generation method using x-ray image
KR20130013506A (en) Marker recognition method according to the direction of x-ray scan
KR20130013505A (en) 3d model construction method using bi-plane x-ray
KR20120000725A (en) Position tracking method for vascular treatment micro robot through non-rigid fusion

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination