CN113205459B - Motion correction of angiographic images for 3D reconstruction of coronary arteries - Google Patents



Publication number
CN113205459B
Authority
CN
China
Prior art keywords: medical image, tree, detected, mapping, markers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110052649.8A
Other languages
Chinese (zh)
Other versions
CN113205459A
Inventor
Shi Bibo
L. C. Garcia-Peraza-Herrera
A. Kapoor
M. A. Gulsun
T. Passerini
T. Mansi
Current Assignee
Siemens Healthineers AG
Original Assignee
Siemens Healthineers AG
Priority date
Filing date
Publication date
Priority claimed from US 16/744,295 (granted as US 11,151,732 B2)
Application filed by Siemens Healthineers AG filed Critical Siemens Healthineers AG
Publication of CN113205459A
Application granted granted Critical
Publication of CN113205459B
Legal status: Active


Abstract

Systems and methods are provided for computing a transformation to correct for motion between a first medical image and a second medical image. One or more markers are detected in the first medical image and the second medical image. A first tree of the anatomical structure is generated from the first medical image based on the one or more markers detected in the first medical image, and a second tree of the anatomical structure is generated from the second medical image based on the one or more markers detected in the second medical image. The one or more markers detected in the first medical image are mapped to the one or more markers detected in the second medical image based on the first tree and the second tree. A transformation for aligning the first medical image and the second medical image is calculated based on the mapping.

Description

Motion correction of angiographic images for 3D reconstruction of coronary arteries
Technical Field
The present invention relates generally to motion correction of angiographic images, and more particularly to motion correction of angiographic images for 3D reconstruction of coronary arteries.
Background
Coronary heart disease is caused by blockage or stenosis (narrowing) of the arteries supplying the heart, usually due to the build-up of cholesterol plaque on the arterial wall. X-ray coronary angiography is an imaging modality used to diagnose coronary heart disease and guide its treatment, and is popular because of its high spatial and temporal resolution. However, due to the loss of information inherent in projection radiography, accurately reconstructing the 3D coronary arteries from angiographic images remains a challenge. In particular, respiratory motion must be accounted for in order to reconstruct the 3D coronary arteries accurately.
Disclosure of Invention
In accordance with one or more embodiments, systems and methods are provided for computing a transformation to correct for motion between a plurality of medical images. One or more markers are detected in the first medical image and the second medical image. A first tree of the anatomical structure is generated from the first medical image based on the one or more markers detected in the first medical image, and a second tree of the anatomical structure is generated from the second medical image based on the one or more markers detected in the second medical image. The one or more markers detected in the first medical image are mapped to the one or more markers detected in the second medical image based on the first tree and the second tree. A transformation for aligning the first medical image and the second medical image is calculated based on the mapping.
In one embodiment, the first tree comprises the one or more markers detected in the first medical image, the second tree comprises the one or more markers detected in the second medical image, and the mapping is performed by: for each respective marker of the one or more markers in the first tree, calculating a set of candidate mappings between the respective marker and the one or more markers in the second tree, filtering the set of candidate mappings to remove ancestry-violating candidate mappings, in which a descendant of the respective marker is not mapped to a descendant of the marker in the second tree given by the candidate mapping, and selecting a candidate mapping from the filtered set of candidate mappings based on a distance associated with each candidate mapping. The set of candidate mappings may include all possible mappings between the respective marker and the one or more markers in the second tree.
In one embodiment, the transformation of the second medical image is calculated by projecting the one or more markers detected in the first medical image onto the respective epipolar lines of the one or more markers in the second medical image, determining a transformation of the second medical image that moves the one or more markers in the second medical image toward the closest points on their respective epipolar lines, applying the transformation to the second medical image to move the one or more markers in the second medical image, and repeating the projecting, determining, and applying until a stop condition is met.
In one embodiment, a first tree is generated to include one or more markers detected in a first medical image between a first starting point and a first ending point selected by a user, and a second tree is generated to include one or more markers detected in a second medical image between a second starting point and a second ending point selected by the user.
In one embodiment, the anatomical structure is a coronary artery and the one or more markers are detected by detecting one or more bifurcations of the coronary arteries in the first medical image and the second medical image. The first medical image and the second medical image may be different views of the anatomical structure and may be x-ray angiographic images.
In one embodiment, one or more markers may be detected in one or more additional medical images of the anatomical structure, a tree of the anatomical structure may be generated for each respective image of the additional medical images based on the markers detected in the respective image, the markers detected in the first medical image may be mapped to the markers detected in the second medical image and the markers detected in the additional medical images, and a transformation may be calculated based on the mapping to align the first medical image, the second medical image, and the additional medical images.
These and other advantages of the present invention will be apparent to those of ordinary skill in the art by reference to the following detailed description and accompanying drawings.
Drawings
FIG. 1 illustrates a workflow for compensating for motion between X-ray angiographic images in accordance with one or more embodiments;
FIG. 2 illustrates a method for aligning medical images in accordance with one or more embodiments;
FIG. 3 illustrates a network architecture for training a full convolutional network in accordance with one or more embodiments;
FIG. 4 illustrates a method for mapping markers in a first tree to the same corresponding markers in a second tree in accordance with one or more embodiments;
FIG. 5 illustrates a method for determining a transformation to align a first medical image and a second medical image in accordance with one or more embodiments;
FIG. 6 illustrates a schematic diagram depicting epipolar geometry of markers depicted in different imaging planes in accordance with one or more embodiments; and
FIG. 7 depicts a high-level block diagram of a computer.
Detailed Description
The present invention relates generally to a method and system for motion correction of angiographic images for 3D reconstruction of coronary arteries. Embodiments of the present invention are described herein to give a visual understanding of such methods and systems for motion correction of angiographic images for 3D reconstruction of coronary arteries. A digital image is typically made up of a digital representation of one or more objects (or shapes). The digital representation of an object is generally described herein in terms of identifying and manipulating the object. Such manipulations are virtual manipulations that are accomplished in the memory or other circuitry/hardware of a computer system. Thus, it should be understood that embodiments of the invention may be performed within a computer system using data stored within the computer system.
Furthermore, it should be appreciated that while the embodiments discussed herein may be discussed with respect to motion correction of angiographic images for 3D reconstruction of coronary arteries, the invention is not so limited. Embodiments of the present invention may be applied to align images for any application.
FIG. 1 illustrates a workflow 100 for compensating for motion between x-ray angiographic images in accordance with one or more embodiments. In workflow 100, x-ray angiographic images 102 and 104 are input into an artificial intelligence (AI) system 106 trained for marker detection. The x-ray angiographic images 102 and 104 are different views of the same coronary artery of the patient and are misaligned due to respiratory motion of the patient during image acquisition. The AI system 106 outputs heat map images 108 and 110 that identify bifurcations 112 and 114 (or other anatomical markers) of the coronary arteries in the x-ray angiographic images 102 and 104, respectively. A mapping 116 between the bifurcations 112 and 114 is determined and an in-plane motion matrix 118 is calculated. The in-plane motion matrix 118 represents a transformation used to align the x-ray angiographic images 102 and 104 so that they spatially correspond to each other, compensating for patient motion. The in-plane motion matrix 118 may be applied to x-ray angiographic image 104 to move bifurcations 114 closer to the epipolar lines projected from bifurcations 112 of x-ray angiographic image 102.
FIG. 2 illustrates a method 200 for aligning medical images in accordance with one or more embodiments. The steps of method 200 may be performed by any suitable computing device, such as computer 702 of fig. 7. The method 200 will be described with reference to the workflow 100 of fig. 1.
At step 202, a first medical image and a second medical image of an anatomical structure are received. In one embodiment, the anatomical structure is a coronary artery of the patient; however, the anatomical structure may be any suitable anatomical structure of the patient. In one embodiment, the first medical image and the second medical image received at step 202 are the x-ray angiographic images 102 and 104 of fig. 1 depicting coronary arteries.
In one embodiment, the first medical image and the second medical image depict different views of the anatomical structure. For example, the first medical image and the second medical image may be acquired simultaneously or at different times at different locations relative to the anatomical structure (e.g., with a certain separation angle between the acquisitions of the first medical image and the second medical image). In another embodiment, the first medical image and the second medical image depict the same view of the anatomical structure acquired at different times but in different deformed states of the anatomical structure, e.g., due to movement of the patient (e.g., respiratory movement).
In one embodiment, the first medical image and the second medical image are x-ray angiographic images; however, it should be appreciated that the first medical image and the second medical image may be of any suitable modality, such as, for example, x-ray, magnetic resonance imaging (MRI), ultrasound (US), single photon emission computed tomography (SPECT), positron emission tomography (PET), or any other suitable modality or combination of modalities. The first medical image and the second medical image may be received directly from an image acquisition device used to acquire the medical images, such as, for example, image acquisition device 714 of fig. 7 (e.g., an x-ray scanner, etc.). Alternatively, the first medical image and the second medical image may be received by loading medical images previously stored on a memory or storage device of a computer system (e.g., a picture archiving and communication system (PACS)) or by receiving medical image data from a remote computer system via a network transmission.
At step 204, one or more markers are detected in the first medical image and the second medical image. In an embodiment, for example where the anatomical structure is a coronary artery, the markers comprise corresponding bifurcations of the coronary artery detected in both the first medical image and the second medical image. Detecting such bifurcations is advantageous because the bifurcations define the geometry of the underlying coronary arteries and generally co-exist across different views of the coronary arteries. The detected markers may be identified in any suitable form, such as, for example, a heat map, a binary map, or the like. In one embodiment, the markers detected at step 204 are the bifurcations 112 and 114 identified on heat maps 108 and 110, detected from x-ray angiographic images 102 and 104, respectively, in fig. 1. In one embodiment, the markers comprise a catheter tip or a stenosis. It should be appreciated that the markers may be any suitable markers representing anatomically significant locations on organs, bones, blood vessels, etc.
In one embodiment, the markers are detected using a machine learning network. Such a machine learning network is illustratively represented in fig. 1 as AI system 106. In one example, the machine learning network may be a fully convolutional network (FCN) with an encoder-decoder structure, as shown in fig. 3. However, it should be appreciated that the machine learning network may have any suitable design or architecture and is not limited to the network architecture shown in fig. 3.
Fig. 3 illustrates a network architecture 300 for training an FCN in accordance with one or more embodiments. As shown in fig. 3, the FCN has an encoder-decoder structure that includes an encoder 304 and a decoder 306. During a training or offline phase, as shown in fig. 3, encoder 304 receives as input a training image 302 depicting a coronary artery (e.g., an x-ray angiographic training image) and encodes training image 302 into a code that is substantially smaller in size than training image 302. The decoder 306 decodes the code to generate a heat map 310 that identifies the bifurcations of the coronary artery depicted in the training image 302. Layer 308 is the last-layer feature tensor before the final output layer. All intermediate information generated in the encoder 304 is shared with the decoder 306 so that there is no information loss during encoding. A training loss 314 is defined between heat map 310 and a ground truth annotation image 312 of the bifurcations in training image 302. The locations of the bifurcations in the ground truth annotation image 312 are diffused with a Gaussian blur to account for the uncertainty of the annotators and to ease training. According to one embodiment, during an online or inference phase, the trained FCN may be applied to detect markers in the first medical image and the second medical image at step 204 of fig. 2. In particular, the trained FCN receives one or more input images (e.g., the first medical image and the second medical image received at step 202 of fig. 2) and outputs, for each input image, a heat map that identifies the markers in that input image. The heat map has the same resolution and size as the input image. The locations (e.g., coordinates) of the markers may be determined by applying image processing techniques such as, for example, thresholding and connected component analysis. In one embodiment, in addition to the one or more input images, temporally adjacent frames are also input into the trained FCN.
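The heat-map pipeline described above can be sketched in a few lines. The following is an illustrative NumPy sketch, not the patent's implementation: `gaussian_heatmap` renders the Gaussian-diffused ground truth used for training, and `extract_landmarks` recovers marker coordinates from a predicted heat map by thresholding, grouping above-threshold pixels into connected components, and taking each component's intensity-weighted centroid. All function names and parameters are hypothetical.

```python
import numpy as np

def gaussian_heatmap(shape, points, sigma=2.0):
    """Render a ground-truth heat map: one Gaussian blob per marker,
    diffusing each annotation to absorb annotator uncertainty."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    hm = np.zeros(shape)
    for (y, x) in points:
        hm = np.maximum(hm, np.exp(-((ys - y) ** 2 + (xs - x) ** 2)
                                   / (2 * sigma ** 2)))
    return hm

def extract_landmarks(heatmap, thresh=0.5):
    """Recover marker coordinates from a predicted heat map by
    thresholding, flood-fill connected-component labelling, and
    taking each component's intensity-weighted centroid."""
    mask = heatmap >= thresh
    labels = np.zeros(heatmap.shape, dtype=int)
    n = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x] == 0:  # unvisited seed: start a new component
            n += 1
            stack = [(y, x)]
            labels[y, x] = n
            while stack:
                cy, cx = stack.pop()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = cy + dy, cx + dx
                    if (0 <= ny < heatmap.shape[0]
                            and 0 <= nx < heatmap.shape[1]
                            and mask[ny, nx] and labels[ny, nx] == 0):
                        labels[ny, nx] = n
                        stack.append((ny, nx))
    points = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        w = heatmap[ys, xs]
        points.append((float((ys * w).sum() / w.sum()),
                       float((xs * w).sum() / w.sum())))
    return points
```

For well-separated blobs, the extracted centroids recover the annotated coordinates to sub-pixel accuracy.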
In one embodiment, the input image includes an additional channel containing a centerline of the coronary artery, which may serve as an attention map to improve overall detection performance.
At step 206 of fig. 2, a first tree of the anatomical structure is generated from the first medical image based on the one or more markers detected in the first medical image, and a second tree of the anatomical structure is generated from the second medical image based on the one or more markers detected in the second medical image. Each tree includes a plurality of points representing paths from a starting point on the anatomical structure to one or more ending points in the first medical image and the second medical image. For example, where the anatomical structure is a coronary artery, the tree includes a plurality of points representing paths from the root of the coronary artery to one or more distal end points (leaves) of the coronary artery.
A start point and an end point are defined in the first medical image and the second medical image based on input received from a user (e.g., a clinician). For example, a user may interact with the computing device (e.g., using a mouse) to select a seed defining a start point and an end point of an anatomical structure in the first medical image and the second medical image. The first tree is generated based on the first medical image, the markers detected in the first medical image, and the start and end points defined in the first medical image. A second tree is generated based on the second medical image, the markers detected in the second medical image, and the start and end points defined in the second medical image. The first tree and the second tree are generated to include points corresponding to the detected landmarks in the first medical image and the second medical image, respectively. The first tree and the second tree may be automatically constructed based on, for example, a trace-based approach, a graph-based approach, or any other suitable approach. In one embodiment, the first tree and the second tree may be generated according to the method disclosed in U.S. patent 10,206,646, titled "Method and System for Extracting Centerline Representation of Vascular Structures in Medical Images via Optimal Path in Computational Flow Field", the disclosure of which is incorporated herein by reference in its entirety.
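As a concrete illustration of the tree structure over detected markers, one minimal representation is a parent map ordered from the user-selected start point toward the end points; the descendant relation used later when filtering candidate mappings then follows by walking parent links. This is a hypothetical sketch, not the patent's data structure:

```python
from typing import Dict, Optional

# A tree as a parent map over marker ids: child -> parent, where None
# marks the root (the user-selected start point). Hypothetical names.
Tree = Dict[str, Optional[str]]

def is_descendant(tree: Tree, a: str, b: str) -> bool:
    """True if marker `a` lies farther down the tree than marker `b`,
    i.e. `b` is on the path from the start point to `a`."""
    node = tree[a]
    while node is not None:
        if node == b:
            return True
        node = tree[node]
    return False
```

For example, with a root and two branches, a bifurcation on one branch is a descendant of the root but not of a bifurcation on the other branch.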
At step 208, the one or more markers detected in the first medical image are mapped to the one or more markers detected in the second medical image based on the first tree and the second tree. Fig. 1 illustratively shows a mapping 116 between markers in accordance with one embodiment. The mapping may be represented as an injective (but not necessarily bijective) mapping function M that maps each marker in the first tree (i.e., each marker detected in the first medical image) to the same corresponding marker in the second tree (i.e., the same corresponding marker detected in the second medical image). Formally, given a first tree T1(V1, E1) with vertices V1 representing points in tree T1 and edges E1 connecting the vertices V1, and a second tree T2(V2, E2) with vertices V2 representing points in tree T2 and edges E2 connecting the vertices V2, the mapping function M matches a marker n1 ∈ V1 with a marker n2 ∈ V2. In one embodiment, a best mapping M that maps a marker in the first tree to a corresponding marker in the second tree is determined according to method 400 of fig. 4. In one embodiment, the mapping may be determined for all points in the first tree and the second tree, and is not limited to markers.
FIG. 4 illustrates a method 400 for mapping markers in a first tree to the same corresponding markers in a second tree in accordance with one or more embodiments. The steps of method 400 may be performed at step 208 of fig. 2 for each respective marker in the first tree.
At step 402, a set of candidate mappings for the respective marker in the first tree is calculated. The set of candidate mappings for the respective marker in the first tree represents all possible mappings between the respective marker in the first tree and the markers in the second tree. If the first tree and the second tree have different numbers of markers, a mapping of N points is performed, where N is the number of markers in the tree with the fewest markers.
At step 404, the set of candidate mappings for the respective marker in the first tree is filtered to remove ancestry-violating candidate mappings. A candidate mapping that maps the respective marker n1 to a marker n2 is ancestry-violating if a descendant of n1 is not mapped to a descendant of n2. The descendants of a marker are any points farther down the tree, from the start point toward the end points. Candidate mappings that are not ancestry-violating are said to respect ancestry.
At step 406, a candidate mapping for the respective marker in the first tree is selected from the filtered set of candidate mappings for that marker. In one embodiment, the candidate mapping having the smallest cost is selected from the set of candidate mappings. For example, in one embodiment, the candidate mapping with the smallest cost may be the candidate mapping associated with the shortest distance (e.g., by a Euclidean distance metric) to the epipolar line. Specifically, in one embodiment, each candidate mapping maps a marker P1 in image A to a marker P2 in image B. When the marker P1 in image A is projected onto the epipolar line L1 in image B, the Euclidean distance between L1 and P2 is calculated as the cost. The sum of all Euclidean distances between the markers in image B and the epipolar lines projected from image A is the total cost of the candidate mapping. Of all candidate mappings, the mapping with the smallest cost is the final optimal mapping. In another embodiment, the candidate mapping with the smallest cost may be determined based on the location of the mapped marker relative to the epipolar line. In particular, the epipolar line divides the image into two regions, and different costs may be associated with a candidate mapping based on which region the mapped marker is located in. It should be appreciated that the cost may be any other suitable metric.
In one embodiment, the method 400 is not performed for every marker in the first tree. Instead, the quality of the mapping is compared for different numbers of markers (not necessarily all markers in the first tree). This makes the mapping more robust to false positive bifurcation detections. In another embodiment, the method 400 is performed for all points in the first tree and is not limited to the markers in the first tree.
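The selection in steps 402-406 can be sketched as a brute-force search over injective mappings, feasible only for small numbers of markers. In this hypothetical sketch, `is_desc1` and `is_desc2` encode the descendant relations of the two trees, and `cost[a][b]` holds, e.g., the Euclidean distance from marker b in the second image to the epipolar line projected from marker a in the first image; none of these names come from the patent.

```python
from itertools import permutations

def best_mapping(t1_nodes, t2_nodes, is_desc1, is_desc2, cost):
    """Exhaustively score injective mappings from tree-1 markers to
    tree-2 markers, keep only ancestry-respecting ones, and return
    the mapping with the smallest total cost."""
    n = min(len(t1_nodes), len(t2_nodes))  # map N points, N = fewest markers
    best, best_cost = None, float("inf")
    for tgt in permutations(t2_nodes, n):
        m = dict(zip(t1_nodes[:n], tgt))
        # ancestry filter: if a is a descendant of b in tree 1,
        # then m[a] must be a descendant of m[b] in tree 2
        ok = all(not is_desc1(a, b) or is_desc2(m[a], m[b])
                 for a in m for b in m if a != b)
        if not ok:
            continue
        c = sum(cost[a][m[a]] for a in m)  # e.g. summed epipolar distances
        if c < best_cost:
            best, best_cost = m, c
    return best
```

On two tiny trees, the ancestry filter discards the crossed mapping and the epipolar-distance cost picks the anatomically consistent one.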
At step 210 of fig. 2, a transformation for aligning the first medical image and the second medical image is calculated based on the mapping. In one embodiment, the transformation is a motion compensation transformation for compensating for patient motion (e.g., respiratory motion). In one embodiment, the transformation may be an in-plane motion matrix, as illustratively shown by in-plane motion matrix 118 in FIG. 1, however, the transformation may be of any suitable form. According to one embodiment, the transformation may be calculated according to the method 500 of FIG. 5.
FIG. 5 illustrates a method 500 for determining a transformation to align a first medical image and a second medical image in accordance with one or more embodiments. The steps of method 500 may be performed at step 210 of fig. 2.
At step 502, the markers in the first medical image are projected to the respective epipolar lines of the markers in the second medical image. An epipolar line in the second medical image represents the possible points in the second medical image at which a particular marker depicted in the first medical image may be located.
Referring to FIG. 6, a schematic diagram 600 depicting the epipolar geometry of a marker visualized in different imaging planes is illustratively shown in accordance with one or more embodiments. Schematic diagram 600 shows imaging plane 602 of marker P 610 acquired by image acquisition device A1 606 and imaging plane 604 of marker P 610 acquired by image acquisition device A2 608, where image acquisition device A2 608 may be the same as or different from image acquisition device A1 606. In one embodiment, imaging plane 602 is the first medical image, imaging plane 604 is the second medical image, and marker P1 612 is the respective marker in method 500 of FIG. 5. It should be appreciated that the marker P 610 may be located between the image acquisition devices A1 606 and A2 608 and the imaging planes 602 and 604, respectively, where, for example, the image acquisition devices A1 606 and A2 608 are x-ray image acquisition devices.
The marker P 610 is captured by image acquisition device A1 606 along a line of sight 616, appearing in imaging plane 602 as point P1 612, and by image acquisition device A2 608 along a line of sight 618, appearing in imaging plane 604 as point P2 614. When point P1 612 in imaging plane 602 is projected onto imaging plane 604, the underlying marker may lie at any point along the portion of line of sight 616 visible in imaging plane 604, such as exemplary candidate point 624. The portion of the line of sight 616 that is visible in the imaging plane 604 is referred to as the epipolar line 620.
At step 504 of fig. 5, a transformation X of the second medical image is determined that moves the markers in the second medical image toward the closest points on their respective epipolar lines. The transformation X may be any transformation, such as, for example, a rigid or affine transformation, depending on the number of markers. The closest point may be determined based on the Euclidean distance or any other suitable distance metric. As shown in fig. 6, a transformation X of imaging plane 604 is determined that moves point P2 614 toward point 626.
At step 506, the transformation X is applied to the second medical image to move the markers in the second medical image.
At step 508, it is determined whether a stop condition is met. In one embodiment, the stop condition is satisfied when the transformation X converges (i.e., approaches the identity matrix). In another embodiment, the stop condition is met after a predetermined number of iterations. Other criteria for the stop condition are also contemplated. If the stop condition is not met at step 508, the method 500 returns to step 502 for another iteration. If the stop condition is met at step 508, the method 500 ends at step 510. The transformations determined over the one or more iterations of method 500 together represent the transformation used to align the first medical image and the second medical image.
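Method 500 can be sketched as an iterative point-to-epipolar-line alignment. The sketch below assumes the epipolar geometry is given by a known fundamental matrix F (so the epipolar line of a point p1 from the first image is l = F·p1 in homogeneous coordinates) and, for simplicity, restricts the transformation X to a pure in-plane translation; the patent contemplates rigid or affine transformations as well. All names are hypothetical.

```python
import numpy as np

def closest_point_on_line(line, p):
    """Foot of the perpendicular from point p onto the homogeneous
    2D line (a, b, c), i.e. the line a*x + b*y + c = 0."""
    a, b, c = line
    d = (a * p[0] + b * p[1] + c) / (a * a + b * b)  # assumes (a, b) != 0
    return np.array([p[0] - a * d, p[1] - b * d])

def estimate_motion(F, pts1, pts2, iters=50, tol=1e-8):
    """Iteratively estimate an in-plane translation of the second image:
    for each marker pair, project the image-1 marker to its epipolar
    line in image 2 (l = F @ [x, y, 1]), move the image-2 markers toward
    the closest points on those lines, and stop once the per-iteration
    update approaches the identity (here: a near-zero translation)."""
    pts1 = np.asarray(pts1, dtype=float)
    pts2 = np.asarray(pts2, dtype=float).copy()
    t = np.zeros(2)
    for _ in range(iters):
        targets = np.array([
            closest_point_on_line(F @ np.array([p1[0], p1[1], 1.0]), p2)
            for p1, p2 in zip(pts1, pts2)])
        step = (targets - pts2).mean(axis=0)  # least-squares translation
        pts2 += step
        t += step
        if np.linalg.norm(step) < tol:  # stop condition: convergence
            break
    return t
```

With a fundamental matrix whose epipolar lines are vertical, a horizontal shift of the second image's markers is recovered exactly in one iteration.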
At step 212 of fig. 2, the transformation is output. For example, the transformation may be output by displaying the transformation on a display device of the computer system, storing the transformation on a memory or storage device of the computer system, or by sending the transformation to a remote computer system.
Advantageously, embodiments of the present invention provide an automatic marker detection and motion correction method. In particular, embodiments of the invention may be applied to correct for motion between a first medical image and a second medical image for 3D reconstruction of the coronary arteries.
It should be appreciated that while the method 200 of fig. 2 is described as being used to align a first medical image with a second medical image, the present invention may also be applied to align any plurality of medical images. For example, one or more markers may be detected in one or more additional medical images of the anatomical structure, a tree of the anatomical structure may be generated for each respective image of the additional medical images based on the markers detected in the respective image, step 208 may be performed by mapping the markers detected in the first medical image with the markers detected in the second medical image and the markers detected in the additional medical images, and step 210 may be performed by computing a transformation based on the mapping to align the first medical image, the second medical image, and the additional medical images.
The systems, apparatus, and methods described herein may be implemented using digital electronic circuitry, or using one or more computers using well known computer processors, memory units, storage devices, computer software, and other components. Generally, a computer includes a processor for executing instructions and one or more memories for storing instructions and data. A computer may also include, or be coupled to, one or more mass storage devices, such as one or more magnetic disks, internal hard and removable disks, magneto-optical disks, and the like.
The systems, apparatuses, and methods described herein may be implemented using a computer operating in a client-server relationship. Typically, in such systems, the client computer is located remotely from the server computer and interacts via a network. The client-server relationship may be defined and controlled by computer programs running on the respective client and server computers.
The systems, apparatuses, and methods described herein may be implemented within a network-based cloud computing system. In such network-based cloud computing systems, a server or another processor connected to a network communicates with one or more client computers via the network. For example, a client computer may communicate with a server via a web browser application residing on and operating on the client computer. The client computer may store data on the server and access the data via the network. The client computer may send a request for data or a request for online services to a server via a network. The server may execute the requested service and provide data to the client computer(s). The server may also transmit data suitable for causing the client computer to perform specified functions (e.g., perform calculations, display specified data on a screen, etc.). For example, the server may send a request adapted to cause the client computer to perform one or more of the steps or functions of the methods and workflows described herein, including one or more of the steps or functions of fig. 2 and 4-5. Certain steps or functions of the methods and workflows described herein (including one or more of the steps or functions of fig. 2 and 4-5) may be performed by a server or by another processor in a network-based cloud computing system. Some steps or functions of the methods and workflows described herein (including one or more of the steps of fig. 2 and 4-5) can be performed by a client computer in a network-based cloud computing system. The steps or functions of the methods and workflows described herein (including one or more of the steps of fig. 2 and 4-5) can be performed by a server and/or client computer in a network-based cloud computing system in any combination.
The systems, apparatus, and methods described herein may be implemented using a computer program product tangibly embodied in an information carrier (e.g., in a non-transitory machine-readable storage device) for execution by a programmable processor; and the method and workflow steps described herein (including one or more of the steps or functions of fig. 2 and 4-5) may be implemented using one or more computer programs executable by such a processor. A computer program is a set of computer program instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
An exemplary computer 702 that may be used to implement the systems, apparatuses, and methods described herein is depicted in FIG. 7. The computer 702 includes a processor 704 operatively coupled to a data storage device 712 and a memory 710. The processor 704 controls the overall operation of the computer 702 by executing computer program instructions that define such operations. The computer program instructions may be stored in the data storage device 712 or another computer-readable medium and loaded into the memory 710 when execution of the computer program instructions is desired. Thus, the method and workflow steps or functions of FIGS. 2 and 4-5 may be defined by the computer program instructions stored in the memory 710 and/or the data storage device 712 and controlled by the processor 704 executing the computer program instructions. For example, the computer program instructions may be embodied as computer executable code programmed by one skilled in the art to perform the method and workflow steps or functions of FIGS. 2 and 4-5. Accordingly, by executing the computer program instructions, the processor 704 performs the method and workflow steps or functions of FIGS. 2 and 4-5. The computer 702 may also include one or more network interfaces 706 for communicating with other devices via a network, and one or more input/output devices 708 (e.g., display, keyboard, mouse, speakers, buttons, etc.) that enable user interaction with the computer 702.
The processor 704 can include both general purpose and special purpose microprocessors, and can be a single processor or one of multiple processors of the computer 702. The processor 704 may include, for example, one or more Central Processing Units (CPUs). The processor 704, data storage 712, and/or memory 710 may include, be supplemented by, or incorporated in one or more application-specific integrated circuits (ASICs), and/or one or more field-programmable gate arrays (FPGAs).
The data storage device 712 and the memory 710 each comprise a tangible, non-transitory computer-readable storage medium. The data storage device 712 and the memory 710 may each comprise high-speed random access memory, such as dynamic random access memory (DRAM), static random access memory (SRAM), double data rate synchronous dynamic random access memory (DDR RAM), or other random access solid state memory devices, and may comprise non-volatile memory, such as one or more magnetic disk storage devices (e.g., internal hard disks and removable disks), magneto-optical disk storage devices, flash memory devices, semiconductor memory devices such as erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), and digital versatile disc read-only memory (DVD-ROM) disks, or other non-volatile solid state storage devices.
Input/output devices 708 may include peripheral devices such as printers, scanners, display screens, and so forth. For example, input/output devices 708 may also include a display device such as a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) monitor, a keyboard, and a pointing device, such as a mouse or trackball, by which a user may provide input to computer 702.
An image acquisition device 714 may be connected to the computer 702 to input image data (e.g., medical images) to the computer 702. The image acquisition device 714 and the computer 702 may be implemented as a single device, or they may communicate wirelessly over a network. In one possible embodiment, the computer 702 may be located remotely with respect to the image acquisition device 714.
Any or all of the systems and devices discussed herein may be implemented using one or more computers, such as computer 702.
Those skilled in the art will appreciate that an actual computer or implementation of a computer system may have other structures and may contain other components as well, and that FIG. 7 is a high-level representation of some of the components of such a computer for purposes of illustration.
The foregoing detailed description is to be understood as being in all respects illustrative and exemplary, rather than limiting, and the scope of the invention disclosed herein is to be determined not from the detailed description, but rather from the claims as interpreted according to the full breadth permitted by the patent laws. It is to be understood that the embodiments shown and described herein are merely illustrative of the principles of this invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention. Various other combinations of features may be implemented by those skilled in the art without departing from the scope and spirit of the invention.

Claims (17)

1. A method, comprising:
detecting one or more landmarks in a first medical image of an anatomical structure and a second medical image of the anatomical structure;
generating a first tree of the anatomical structure from the first medical image based on the one or more landmarks detected in the first medical image, and generating a second tree of the anatomical structure from the second medical image based on the one or more landmarks detected in the second medical image;
mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree; and
computing a transformation based on the mapping to align the first medical image and the second medical image;
wherein the first tree includes the one or more landmarks detected in the first medical image, the second tree includes the one or more landmarks detected in the second medical image, and mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree comprises:
for each respective landmark of the one or more landmarks in the first tree:
computing a set of candidate mappings between the respective landmark and the one or more landmarks in the second tree;
filtering the set of candidate mappings to remove candidate mappings in which descendants of the respective landmark are not mapped to descendants of the particular landmark of the candidate mapping in the second tree; and
selecting a candidate mapping from the filtered set of candidate mappings based on a distance associated with each candidate mapping.
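The tree-based landmark mapping recited above can be sketched in code. The following is a hypothetical illustration only, not the patented implementation: the `Landmark` class, the plain Euclidean distance (standing in for the patent's mapping distance), and the assumption that the two tree roots correspond are all choices made for the sketch. The descendant-consistency filter is enforced by traversing the first tree top-down and restricting each landmark's candidates to the descendants of its parent's match.

```python
class Landmark:
    """A detected landmark (e.g., a coronary bifurcation) in one image."""
    def __init__(self, name, pos, children=()):
        self.name = name
        self.pos = pos                  # 2D image coordinates (x, y)
        self.children = list(children)

    def descendants(self):
        """All landmarks strictly below this one in the tree."""
        nodes, stack = [], list(self.children)
        while stack:
            n = stack.pop()
            nodes.append(n)
            stack.extend(n.children)
        return nodes


def distance(a, b):
    """Stand-in for the distance associated with a candidate mapping."""
    return ((a.pos[0] - b.pos[0]) ** 2 + (a.pos[1] - b.pos[1]) ** 2) ** 0.5


def map_landmarks(root1, root2):
    """Map each landmark of the first tree to one of the second tree.

    Restricting each landmark's candidate set to the descendants of its
    parent's match plays the role of the filtering step: descendants of a
    mapped landmark may only map to descendants of its chosen candidate.
    """
    mapping = {root1.name: root2.name}   # roots assumed to correspond

    def recurse(node1, node2):
        for child in node1.children:
            candidates = node2.descendants()   # filtered candidate set
            if not candidates:
                continue
            # selection step: minimize the associated distance
            best = min(candidates, key=lambda c: distance(child, c))
            mapping[child.name] = best.name
            recurse(child, best)

    recurse(root1, root2)
    return mapping
```

For example, with a one-branch tree A → B → C in the first view and a tree A2 → {B2 → C2, D2} in the second view, the sketch maps A→A2, B→B2, and C→C2, and the spurious branch D2 is rejected by the distance criterion.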
2. The method of claim 1, wherein the set of candidate mappings includes all possible mappings between the respective landmark and the one or more landmarks in the second tree.
3. The method of claim 1, wherein computing a transformation to align the first medical image and the second medical image based on the mapping comprises:
projecting the one or more landmarks detected in the first medical image to respective epipolar lines of the one or more landmarks in the second medical image;
determining a transformation of the second medical image that moves the one or more landmarks in the second medical image toward the closest points on their respective epipolar lines;
applying the transformation to the second medical image to move the one or more landmarks in the second medical image; and
repeating the projecting, the determining, and the applying until a stop condition is met.
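The project/determine/apply loop recited above can be sketched as an iterative translation estimate. This is a minimal, hypothetical sketch under simplifying assumptions not stated in the claims: epipolar lines are given directly as (point, unit direction) pairs in the second image, the per-iteration transformation is a pure 2D translation (the mean step toward each landmark's closest epipolar-line point), and the stop condition is a small-step threshold.

```python
import math


def closest_point_on_line(p, line_pt, line_dir):
    """Orthogonal projection of p onto the line line_pt + t * line_dir
    (line_dir is assumed to be unit length)."""
    dx, dy = p[0] - line_pt[0], p[1] - line_pt[1]
    t = dx * line_dir[0] + dy * line_dir[1]
    return (line_pt[0] + t * line_dir[0], line_pt[1] + t * line_dir[1])


def align_to_epipolar_lines(landmarks, epilines, tol=1e-6, max_iter=100):
    """Iteratively translate image-2 landmarks toward their epipolar lines.

    landmarks : list of (x, y) points detected in the second image
    epilines  : matching list of (point, unit_direction) epipolar lines,
                obtained by projecting the first image's landmarks
    Returns the accumulated (tx, ty) translation.
    """
    tx = ty = 0.0
    pts = [list(p) for p in landmarks]
    for _ in range(max_iter):
        # determine: mean displacement toward the closest epipolar points
        steps = [
            (c[0] - p[0], c[1] - p[1])
            for p, (lp, ld) in zip(pts, epilines)
            for c in (closest_point_on_line(p, lp, ld),)
        ]
        sx = sum(s[0] for s in steps) / len(steps)
        sy = sum(s[1] for s in steps) / len(steps)
        # apply: move the landmarks by the estimated translation
        for p in pts:
            p[0] += sx
            p[1] += sy
        tx += sx
        ty += sy
        if math.hypot(sx, sy) < tol:   # stop condition
            break
    return tx, ty
```

With two landmarks at (1, 0) and (1, 2) and vertical epipolar lines through x = 0, the loop converges in one step to the translation (-1, 0) that puts every landmark exactly on its epipolar line; with noisy detections the mean step converges to a least-squares compromise instead.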
4. The method of claim 1, wherein generating a first tree of the anatomical structure from the first medical image based on the one or more landmarks detected in the first medical image and generating a second tree of the anatomical structure from the second medical image based on the one or more landmarks detected in the second medical image comprises:
generating the first tree to include one or more landmarks detected in the first medical image between a first starting point and a first ending point selected by a user; and
generating the second tree to include one or more landmarks detected in the second medical image between a second starting point and a second ending point selected by the user.
5. The method of claim 1, wherein the anatomical structure is a coronary artery.
6. The method of claim 5, wherein detecting one or more landmarks in the first medical image of the anatomical structure and the second medical image of the anatomical structure comprises:
detecting one or more bifurcations of the coronary artery in the first medical image and the second medical image.
7. The method of claim 1, wherein the first medical image and the second medical image are different views of the anatomical structure.
8. The method of claim 1, wherein the first medical image and the second medical image are x-ray angiographic images.
9. The method of claim 1, further comprising:
detecting one or more landmarks in one or more additional medical images of the anatomical structure; and
generating a tree of the anatomical structure for each respective image of the one or more additional medical images based on the one or more landmarks detected in the respective image;
wherein mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree comprises:
mapping the one or more landmarks detected in the first medical image with the one or more landmarks detected in the second medical image and the one or more landmarks detected in the one or more additional medical images; and
wherein computing a transformation based on the mapping to align the first medical image and the second medical image comprises:
computing the transformation based on the mapping to align the first medical image, the second medical image, and the one or more additional medical images.
10. An apparatus, comprising:
means for detecting one or more landmarks in a first medical image of an anatomical structure and a second medical image of the anatomical structure;
means for generating a first tree of the anatomical structure from the first medical image based on the one or more landmarks detected in the first medical image, and generating a second tree of the anatomical structure from the second medical image based on the one or more landmarks detected in the second medical image;
means for mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree; and
means for computing a transformation based on the mapping to align the first medical image and the second medical image;
wherein the first tree includes the one or more landmarks detected in the first medical image, the second tree includes the one or more landmarks detected in the second medical image, and the means for mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree comprises:
means for computing, for each respective landmark of the one or more landmarks in the first tree, a set of candidate mappings, each set of candidate mappings comprising candidate mappings between the respective landmark and the one or more landmarks in the second tree;
means for filtering the set of candidate mappings for each respective landmark in the first tree to remove candidate mappings in which descendants of the respective landmark are not mapped to descendants of the particular landmark of the candidate mapping in the second tree; and
means for selecting, for each respective landmark in the first tree, a candidate mapping from the filtered set of candidate mappings based on the distance associated with each candidate mapping.
11. The apparatus of claim 10, wherein the set of candidate mappings includes all possible mappings between the respective landmark and the one or more landmarks in the second tree.
12. The apparatus of claim 10, wherein means for computing a transformation to align a first medical image and a second medical image based on the mapping comprises:
means for projecting the one or more landmarks detected in the first medical image to respective epipolar lines of the one or more landmarks in the second medical image;
means for determining a transformation of the second medical image that moves the one or more landmarks in the second medical image toward the closest points on their respective epipolar lines;
means for applying the transformation to the second medical image to move the one or more landmarks in the second medical image; and
means for repeating the projecting, the determining, and the applying until a stop condition is met.
13. The apparatus of claim 10, wherein the means for generating a first tree of the anatomical structure from the first medical image based on the one or more landmarks detected in the first medical image and generating a second tree of the anatomical structure from the second medical image based on the one or more landmarks detected in the second medical image comprises:
means for generating the first tree to include one or more landmarks detected in the first medical image between a first starting point and a first ending point selected by a user; and
means for generating the second tree to include one or more landmarks detected in the second medical image between a second starting point and a second ending point selected by the user.
14. A non-transitory computer-readable medium storing computer program instructions that, when executed by a processor, cause the processor to perform operations comprising:
detecting one or more landmarks in a first medical image of an anatomical structure and a second medical image of the anatomical structure;
generating a first tree of the anatomical structure from the first medical image based on the one or more landmarks detected in the first medical image, and generating a second tree of the anatomical structure from the second medical image based on the one or more landmarks detected in the second medical image;
mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree; and
computing a transformation based on the mapping to align the first medical image and the second medical image;
wherein the first tree includes the one or more landmarks detected in the first medical image, the second tree includes the one or more landmarks detected in the second medical image, and mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree comprises:
for each respective landmark of the one or more landmarks in the first tree:
computing a set of candidate mappings between the respective landmark and the one or more landmarks in the second tree;
filtering the set of candidate mappings to remove candidate mappings in which descendants of the respective landmark are not mapped to descendants of the particular landmark of the candidate mapping in the second tree; and
selecting a candidate mapping from the filtered set of candidate mappings based on a distance associated with each candidate mapping.
15. The non-transitory computer-readable medium of claim 14, wherein computing a transformation to align a first medical image and a second medical image based on the mapping comprises:
projecting the one or more landmarks detected in the first medical image to respective epipolar lines of the one or more landmarks in the second medical image;
determining a transformation of the second medical image that moves the one or more landmarks in the second medical image toward the closest points on their respective epipolar lines;
applying the transformation to the second medical image to move the one or more landmarks in the second medical image; and
repeating the projecting, the determining, and the applying until a stop condition is met.
16. The non-transitory computer-readable medium of claim 15, wherein detecting one or more markers in the first medical image of the anatomical structure and the second medical image of the anatomical structure comprises:
detecting one or more bifurcations of coronary arteries in the first medical image and the second medical image.
17. The non-transitory computer-readable medium of claim 14, the operations further comprising:
detecting one or more landmarks in one or more additional medical images of the anatomical structure; and
generating a tree of the anatomical structure for each respective image of the one or more additional medical images based on the one or more landmarks detected in the respective image;
wherein mapping the one or more landmarks detected in the first medical image to the one or more landmarks detected in the second medical image based on the first tree and the second tree comprises:
mapping the one or more landmarks detected in the first medical image with the one or more landmarks detected in the second medical image and the one or more landmarks detected in the one or more additional medical images; and
wherein computing a transformation based on the mapping to align the first medical image and the second medical image comprises:
computing the transformation based on the mapping to align the first medical image, the second medical image, and the one or more additional medical images.
CN202110052649.8A 2020-01-16 2021-01-15 Motion correction of angiographic images for 3D reconstruction of coronary arteries Active CN113205459B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/744,295 US11151732B2 (en) 2020-01-16 2020-01-16 Motion correction of angiography images for 3D reconstruction of coronary arteries
US16/744295 2020-01-16

Publications (2)

Publication Number Publication Date
CN113205459A CN113205459A (en) 2021-08-03
CN113205459B true CN113205459B (en) 2024-07-16



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant