CN116188412A - Heart blood vessel branch identification method, system and storage medium

Info

Publication number
CN116188412A
Authority
CN
China
Prior art keywords
information
point
blood vessel
points
centerline
Legal status
Pending
Application number
CN202310120949.4A
Other languages
Chinese (zh)
Inventor
马中柱
余磊
陈辽
金朝汇
谌明
Current Assignee
Zhejiang Herui Medical Technology Co ltd
Original Assignee
Zhejiang Herui Medical Technology Co ltd
Application filed by Zhejiang Herui Medical Technology Co ltd
Priority to CN202310120949.4A
Publication of CN116188412A

Classifications

    • G PHYSICS; G06 COMPUTING; CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection; G06T 7/0012 Biomedical image inspection
    • G06T 7/10 Segmentation; Edge detection; G06T 7/13 Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement; G06T 2207/10 Image acquisition modality; G06T 2207/10072 Tomographic images
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G06T 2207/10088 Magnetic resonance imaging [MRI]
    • G06T 2207/10104 Positron emission tomography [PET]
    • G06T 2207/10108 Single photon emission computed tomography [SPECT]
    • G06T 2207/10132 Ultrasound image
    • G06T 2207/20 Special algorithmic details; G06T 2207/20084 Artificial neural networks [ANN]
    • G06T 2207/30 Subject of image; Context of image processing; G06T 2207/30004 Biomedical image processing; G06T 2207/30048 Heart; Cardiac
    • G06T 2207/30101 Blood vessel; Artery; Vein; Vascular

Abstract

The application discloses a method, a system and a storage medium for identifying cardiac vascular branches. The method comprises the following steps: obtaining a cardiac vessel segmentation result of a target heart image, wherein the cardiac vessel segmentation result at least comprises cardiac vessel centerline information; acquiring a plurality of centerline points of a cardiac vessel based on the cardiac vessel segmentation result, and acquiring point information of the plurality of centerline points, wherein the point information comprises point adjacency information and point feature information, and the point feature information comprises point coordinate information and at least one of the following: distance information between the centerline points and a target cardiac structure, and distance information between the centerline points and the vessel wall; processing the point information of the plurality of centerline points through a target network to obtain vessel branch results corresponding to the centerline points; and obtaining a cardiac vessel branch identification result of the target heart image based on the vessel branch results corresponding to the plurality of centerline points.

Description

Heart blood vessel branch identification method, system and storage medium
Technical Field
The present application relates to the field of medical images, and in particular, to a method, system, and storage medium for identifying cardiac vascular branches.
Background
The identification of cardiac vascular branches in cardiac images is used in various fields such as medical imaging examination, and identifying the cardiac vascular branches in an image can be used to assist in the examination, identification and analysis of the physical condition of a subject such as a patient. The cardiac vessels in a cardiac image include many vessel branches with complex courses, and the cardiac vessel branches of different patients differ, so it is difficult to accurately identify each cardiac vessel branch from the cardiac image according to the characteristics of the cardiac vessel branches.
Therefore, it is necessary to provide a scheme for identifying cardiac vascular branches, which can improve the accuracy of cardiac vascular branch identification.
Disclosure of Invention
One aspect of the present specification provides a method of cardiac vascular branch identification, the method comprising: obtaining a cardiac vessel segmentation result of a target heart image, wherein the cardiac vessel segmentation result at least comprises cardiac vessel centerline information; acquiring a plurality of centerline points of a cardiac vessel based on the cardiac vessel segmentation result, and acquiring point information of the plurality of centerline points, wherein the point information comprises point adjacency information and point feature information, and the point feature information comprises point coordinate information and at least one of the following: distance information between the centerline points and a target cardiac structure, and distance information between the centerline points and the vessel wall; processing the point information of the plurality of centerline points through a target network to obtain vessel branch results corresponding to the centerline points; and obtaining a cardiac vessel branch identification result of the target heart image based on the vessel branch results corresponding to the plurality of centerline points.
Another aspect of the present specification provides a cardiac vascular branch identification system, the system comprising: a segmentation result acquisition module, configured to acquire a cardiac vessel segmentation result of a target heart image, wherein the cardiac vessel segmentation result at least comprises cardiac vessel centerline information; a point information acquisition module, configured to acquire a plurality of centerline points of a cardiac vessel based on the cardiac vessel segmentation result, and acquire point information of the plurality of centerline points, where the point information includes point adjacency information and point feature information, and the point feature information includes point coordinate information and at least one of the following: distance information between the centerline points and a target cardiac structure, and distance information between the centerline points and the vessel wall; a point information processing module, configured to process the point information of the plurality of centerline points through a target network to obtain vessel branch results corresponding to the plurality of centerline points; and an identification module, configured to obtain a cardiac vessel branch identification result of the target heart image based on the vessel branch results corresponding to the plurality of centerline points.
Another aspect of the present description provides a computer-readable storage medium, wherein the storage medium stores computer instructions that, when executed by a processor, implement a method of cardiac vascular branch identification.
Drawings
The present specification will be further elucidated by way of example embodiments, which will be described in detail by means of the accompanying drawings. The embodiments are not limiting, in which like numerals represent like structures, wherein:
FIG. 1 is a diagram of an application scenario of a cardiac vascular branch recognition system according to some embodiments of the present description;
FIG. 2 is an exemplary block diagram of a cardiac vascular branch recognition system according to some embodiments of the present disclosure;
FIG. 3 is an exemplary flow chart of a method of cardiac vascular branch identification according to some embodiments of the present disclosure;
FIG. 4 is an exemplary flow chart of a graph convolutional network processing point information of a plurality of centerline points, according to some embodiments of the present description;
FIG. 5 is an exemplary flow chart of training a target network shown in accordance with some embodiments of the present description;
FIG. 6 is an exemplary schematic diagram of an adjacency matrix for a plurality of centerline points shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It should be appreciated that "system," "apparatus," "unit," and/or "module" as used in this specification is a method for distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this specification and the claims, the terms "a," "an," and/or "the" do not denote the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the preceding or following operations are not necessarily performed in order precisely. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Fig. 1 is a diagram of an application scenario of a cardiac vascular branch recognition system according to some embodiments of the present description.
The cardiac vessel branch identification system 100 may automatically identify branch categories of cardiac vessels by implementing the methods and/or processes disclosed herein. As shown in fig. 1, the cardiac vascular branch identification system 100 may include an imaging device 110, a processing device 120, a terminal device 130, a network 140, and a storage device 150. The components of the cardiac vascular branch identification system 100 may be connected in one or more ways. For example only, as shown in fig. 1, imaging device 110 may be connected to processing device 120 through network 140 or directly.
The imaging device 110 may acquire an image. For example, a target heart image and/or a sample heart image. In some embodiments, the imaging device 110 may include, but is not limited to, a Magnetic Resonance Imaging (MRI) device (also referred to as an MR scanner), a Computed Tomography (CT) device, an ultrasound scanner, a Digital Radiography (DR) scanner, a Digital Subtraction Angiography (DSA), a Positron Emission Tomography (PET) device, a Single Photon Emission Computed Tomography (SPECT) device, or the like, or any combination thereof.
The processing device 120 may process data and/or information acquired from the imaging device 110, the terminal device 130, and/or the storage device 150. For example, the processing device 120 may acquire a target heart image from the imaging device 110 and process the target heart image by a segmentation algorithm to acquire a cardiac vessel segmentation result and/or a cardiac structure segmentation result. For another example, the processing device 120 may obtain point information of a plurality of centerline points based on the target cardiac image, the cardiac vascular segmentation result, and/or the cardiac structure segmentation result through a segmentation algorithm, and process the point information using a target network to obtain a cardiac vascular branch recognition result of the target cardiac image. For another example, the processing device 120 may train the target network based on point information of a plurality of sample centerline points of the cardiac blood vessels of the sample cardiac image and their corresponding vessel branch labels. In some embodiments, processing device 120 may include a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a system on a chip (SoC), a microcontroller unit (MCU), etc., and/or any combination thereof. In some embodiments, processing device 120 may comprise a computer, a user console, a single server or group of servers, or the like. The server farm may be centralized or distributed. In some embodiments, the processing device 120 may be local or remote. For example, processing device 120 may access information and/or data stored in imaging device 110, terminal device 130, and/or storage device 150 via network 140. As another example, processing device 120 may access stored information and/or data directly connected to imaging device 110, terminal device 130, and/or storage device 150. In some embodiments, the processing device 120 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. In some embodiments, the processing device 120 or a portion of the processing device 120 may be integrated into the imaging device 110.
The terminal device 130 may display data to the user, such as a target heart image, a heart vessel segmentation result, and/or a heart structure segmentation result. Terminal device 130 may include a mobile device 131, a tablet computer 132, a notebook computer 133, or the like, or any combination thereof. In some embodiments, the terminal device 130 may be part of the processing device 120.
Network 140 may include any suitable network that facilitates the exchange of information and/or data by the cardiac vascular branch identification system 100. In some embodiments, one or more components of the cardiac vascular branch identification system 100 (e.g., the imaging device 110, the processing device 120, the terminal device 130, the storage device 150) may communicate information and/or data with one or more other components of the cardiac vascular branch identification system 100 over the network 140. In some embodiments, network 140 may be and/or include a public network, a private network, a Wide Area Network (WAN), a wired network, a wireless network, a cellular network, a frame relay network, a virtual private network, a satellite network, a telephone network, a router, a hub, a switch, and the like, or any combination thereof.
Storage device 150 may store data, instructions, and/or any other information. In some embodiments, the storage device 150 may store data acquired from the imaging device 110, the terminal device 130, and/or the processing device 120. For example, the storage device 150 may store the target heart image acquired from the imaging device 110. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, the storage device 150 may execute on a cloud platform. In some embodiments, the storage device 150 may be connected to the network 140 to communicate with one or more other components of the cardiac vascular branch identification system 100 (e.g., the imaging device 110, the processing device 120, the terminal device 130). One or more components of the cardiac vascular branch identification system 100 may access data or instructions stored in the storage device 150 over the network 140. In some embodiments, the storage device 150 may be directly connected to or in communication with one or more other components of the cardiac vascular branch identification system 100 (e.g., the imaging device 110, the processing device 120, the storage device 150, the terminal device 130). In some embodiments, the storage device 150 may be part of the processing device 120.
It should be noted that the above description is provided for illustrative purposes only and is not intended to limit the scope of the present description. Many variations and modifications will be apparent to those of ordinary skill in the art, given the benefit of this disclosure. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. However, such changes and modifications do not depart from the scope of the present specification.
Fig. 2 is a block diagram of a cardiovascular branch identification system according to some embodiments of the present description.
As shown in fig. 2, the cardiac vascular branch recognition system 200 may include a segmentation result acquisition module 210, a point information acquisition module 220, a point information processing module 230, a recognition module 240, and a training module 250.
The segmentation result acquisition module 210 may be configured to acquire a cardiac vessel segmentation result of the target cardiac image. In some embodiments, the cardiac vessel segmentation results may include at least cardiac vessel centerline information.
The point information obtaining module 220 may be configured to obtain a plurality of centerline points of the cardiac vessel based on the cardiac vessel segmentation result, and obtain point information of the plurality of centerline points. In some embodiments, the point information may include point adjacency information and point feature information. In some embodiments, the point feature information may include point coordinate information and at least one of the following: distance information between the centerline point and the target cardiac structure, and distance information between the centerline point and the vessel wall. In some embodiments, the distance information of the centerline points to the target cardiac structure may include: the minimum distance of the centerline point to the boundary of the target cardiac structure. In some embodiments, the target cardiac structure may include one or more of the following: left ventricle, right ventricle, left atrium and right atrium.
The point information processing module 230 may be configured to process point information of a plurality of centerline points through the target network, so as to obtain vessel branch results corresponding to the plurality of centerline points. In some embodiments, the point adjacency information for the plurality of centerline points may be represented in an adjacency matrix. In some embodiments, the point feature information for the plurality of centerline points may be represented in a feature matrix. In some embodiments, the target network may comprise a graph convolutional network. In some embodiments, a graph convolutional network may include at least one graph convolution layer and a linear layer. In some embodiments, the point information processing module 230 may perform one or more of the following: taking the plurality of centerline points as a plurality of nodes, the feature matrix as node information of the plurality of nodes, and the adjacency matrix as edge information of the edges between the nodes, and processing the node information and the edge information through the at least one graph convolution layer to obtain an intermediate result; and processing the intermediate result through the linear layer to obtain the vessel branch results corresponding to the plurality of centerline points.
The recognition module 240 may be configured to obtain a cardiac vessel branch recognition result of the target cardiac image based on vessel branch results corresponding to the plurality of centerline points.
The training module 250 may be used to train the target network. In some embodiments, training module 250 may perform one or more of the following: acquiring point information of a plurality of sample central line points of a heart blood vessel of a sample heart image and corresponding blood vessel branch labels; the target network is trained based on point information of the plurality of sample centerline points and corresponding vessel branch labels. In some embodiments, point information for a plurality of sample centerline points may be input into a target network process and output resulting in vessel branch predictions for the plurality of sample centerline points. In some embodiments, the vessel branch label may be a network output label corresponding to the vessel branch prediction result.
In some embodiments, the segmentation result acquisition module 210, the point information acquisition module 220, the point information processing module 230, the identification module 240, and the training module 250 may be implemented on the same or different processing devices. For example, the segmentation result acquisition module 210, the point information acquisition module 220, the point information processing module 230, and the identification module 240 may be implemented at an application end of the system, and the training module 250 may be implemented at a supply end and/or a design end of the system.
Fig. 3 is an exemplary flow chart of a method of cardiac vascular branch identification according to some embodiments of the present description.
In some embodiments, the process 300 may be implemented by the processing device 120 and/or the cardiac vascular branch identification system 200. As shown in fig. 3, the cardiac vascular branch identification method 300 may include:
in step 310, a cardiac vessel segmentation result of the target cardiac image is acquired. Specifically, step 310 may be performed by the segmentation result acquisition module 210.
The target cardiac image may be a cardiac image requiring vessel branch identification. The target heart image may include a plurality of elements (e.g., voxels or pixels), each of which may correspond to a physical point in the heart. For example, a two-dimensional image of the target heart may comprise 300000×200000 pixels, corresponding to 300000×200000 physical points in the heart, respectively.
In some embodiments, the cardiac image of interest may include, but is not limited to, one or more of an X-ray image, a Computed Tomography (CT) image, a Positron Emission Tomography (PET) image, a Single Photon Emission Computed Tomography (SPECT), a Magnetic Resonance Image (MRI), an Ultrasound Scan (US) image, a Digital Subtraction Angiography (DSA) image, a Magnetic Resonance Angiography (MRA) image, a time-of-flight magnetic resonance image (TOF-MRI), a Magnetoencephalography (MEG), and the like.
In some embodiments, the format of the target heart image may include the Joint Photographic Experts Group (JPEG) image format, Tagged Image File Format (TIFF) image format, Graphics Interchange Format (GIF) image format, Kodak Flash PiX (FPX) image format, Digital Imaging and Communications in Medicine (DICOM) image format, and the like.
In some embodiments, the target heart image may be a two-dimensional (2D) image or a three-dimensional (3D) image. In some embodiments, a three-dimensional image may be composed of a series of two-dimensional slices or layers.
The cardiac vessel segmentation result may be a result of determining whether an element in the target cardiac image belongs to a cardiac vessel and/or a cardiac vessel centerline. In some embodiments, the cardiac vessel segmentation result may include a vessel segmentation map. The vessel segmentation map may comprise a plurality of elements (e.g. voxels or pixels), each of which may correspond to an element in the target heart image and/or a physical point in the heart. Each element of the vessel segmentation map may have a corresponding value. In some embodiments, the value of each element in the vessel segmentation map may represent the probability that the corresponding physical point belongs to a cardiac vessel. For example, the value of each element ranges from 0 to 1, and a higher value indicates a higher probability that the corresponding physical point belongs to the cardiac vessel. For another example, the value of each element may be 0 or 1, with 0 indicating that the corresponding physical point does not belong to a cardiac vessel and 1 indicating that the corresponding physical point belongs to a cardiac vessel. For convenience of description, "the physical point corresponding to an element belongs to the cardiac vessel" may be simply referred to as "the element belongs to the cardiac vessel".
In some embodiments, the segmentation result acquisition module 210 may acquire the cardiac vessel segmentation result from the target cardiac image based on a segmentation algorithm. In some embodiments, the segmentation algorithm may include a combination of one or more of a conventional segmentation algorithm (e.g., thresholding, region growing, edge detection, etc.), an image segmentation algorithm incorporating a specific tool (e.g., genetic algorithm, wavelet analysis, wavelet transform, active contour model, etc.), and a neural network model (e.g., a fully convolutional network (FCN) model, a Visual Geometry Group network (VGG Net) model, a Mask Region-based Convolutional Neural Network (Mask R-CNN) model, etc.).
In some embodiments, the cardiac vessel segmentation results may include at least cardiac vessel centerline information. The heart vessel centerline may be a line segment formed by a center point on the heart vessel cross-section. The cardiac vessel centerline information may include a determination of whether an element in the target cardiac image belongs to a cardiac vessel centerline.
In some embodiments, the segmentation result acquisition module 210 may extract the centerline of the cardiac vessel (i.e., the element belonging to the centerline of the cardiac vessel in the target cardiac image) from the target cardiac image by a centerline extraction algorithm based on the vessel segmentation map.
In some embodiments, the centerline extraction algorithm may include, but is not limited to, at least one or a combination of more of a direct centerline tracking algorithm, a minimum path algorithm, a model-based centerline extraction algorithm, and the like.
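As an illustration of how a vessel segmentation map and a centerline mask might be obtained in practice, the following sketch uses simple thresholding and morphological skeletonization as stand-ins for the segmentation and centerline extraction algorithms listed above; the threshold value, the 2D probability-map input, and the function name are assumptions, not the specific algorithms of this description.

```python
# Minimal sketch only: thresholding and skeletonization stand in for the
# segmentation and centerline-extraction algorithms named above; the 0.5
# threshold and the 2D probability-map input are assumptions.
import numpy as np
from skimage.morphology import skeletonize

def extract_vessel_centerline(vessel_probability_map: np.ndarray,
                              threshold: float = 0.5) -> np.ndarray:
    """Return a boolean mask marking elements on the cardiac vessel centerline."""
    # Step 1: vessel segmentation map (naive thresholding of a probability map;
    # a neural-network segmenter could be used instead).
    vessel_mask = vessel_probability_map > threshold
    # Step 2: centerline extraction (morphological skeletonization as a
    # stand-in for a centerline-tracking or minimum-path algorithm).
    return skeletonize(vessel_mask)
```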
Step 320, acquiring a plurality of centerline points of the cardiac blood vessel based on the cardiac blood vessel segmentation result, and acquiring point information of the plurality of centerline points. Specifically, step 320 may be performed by the point information acquisition module 220.
The centerline points of the cardiac vessel may be elements corresponding to the center points of the cardiac vessel cross-sections. In some embodiments, the point information acquisition module 220 may acquire a plurality of centerline points of the cardiac vessel based on the cardiac vessel centerline information. In some embodiments, the number of centerline points may be a preset number determined according to the accuracy requirements of the target network. For example, the preset number may be 2048, 4096, or the like. A detailed description of the target network is given in step 330 and is not repeated here.
Specifically, the point information acquisition module 220 may first determine, based on the cardiac vessel centerline information, the total number of elements belonging to the cardiac vessel centerline in the target cardiac image. If the total number exceeds the preset number, a sampling rate is obtained based on the ratio of the preset number to the total number, and the preset number of centerline points are extracted from the elements belonging to the cardiac vessel centerline based on the sampling rate. If the total number does not exceed the preset number, all elements belonging to the cardiac vessel centerline may be taken as centerline points; further, a densification rate may be obtained based on the ratio of the preset number to the total number, and densified points may be added between the centerline points, at intervals or without intervals, as additional centerline points based on the densification rate.
For example, the total number is 3072, the preset number is 2048, the point information obtaining module 220 may obtain a sampling rate of 2048/3072=2/3 based on a ratio of the preset number to the total number, and extract 2 out of the consecutive 3 elements belonging to the center line of the cardiac blood vessel as the center line points based on the sampling rate of 2/3, thereby extracting 2048 center line points.
For another example, the total number is 3072, the preset number is 3072, and the point information obtaining module 220 may use all 3072 elements belonging to the center line of the cardiac blood vessel as the center line point.
For another example, if the total number is 3072 and the preset number is 4096, the point information acquisition module 220 may obtain a densification rate of 4096/3072 = 4/3 based on the ratio of the preset number to the total number, and add one densified point for every 3 elements belonging to the cardiac vessel centerline based on the densification rate of 4/3, thereby obtaining 1024 densified points, and use the 1024 densified points together with the 3072 elements as the centerline points.
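A minimal sketch of this downsampling/densification step is given below; uniform index-based resampling and linear interpolation between neighboring elements are assumptions about how the sampling and densification rates might be applied, not the description's prescribed scheme.

```python
import numpy as np

def select_centerline_points(centerline_elements: np.ndarray,
                             preset_number: int) -> np.ndarray:
    """Downsample or densify an ordered array of centerline elements
    (shape: total_number x coord_dim) to exactly preset_number points.
    Uniform index-based resampling is an illustrative choice."""
    total_number = len(centerline_elements)
    idx = np.linspace(0, total_number - 1, preset_number)
    if total_number >= preset_number:
        # Sampling rate preset/total, e.g. 2048/3072 = 2/3: keep a subset.
        return centerline_elements[np.round(idx).astype(int)]
    # Densification rate preset/total, e.g. 4096/3072 = 4/3: interpolate
    # extra points between existing elements.
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, total_number - 1)
    frac = (idx - lo)[:, None]
    return (1 - frac) * centerline_elements[lo] + frac * centerline_elements[hi]
```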
The point information may be information about a centerline point. In some embodiments, the point information may include point adjacency information and point feature information.
The point adjacency information may be information representing adjacency between each centerline point and other centerline points. In some embodiments, the point adjacency information may include whether each centerline point is adjacent to other centerline points.
In some embodiments, the point adjacency information for the plurality of centerline points may be represented in an adjacency matrix. The dimension of the adjacency matrix may be the number of centerline points x the number of centerline points. For example, the dimension of the adjacency matrix corresponding to 2048 centerline points may be 2048×2048. Each element of the adjacency matrix may represent whether two center points are adjacent. For example, the value of an element may be "0" or "1", indicating non-adjacent or adjacent, respectively.
FIG. 6 is an exemplary schematic diagram of an adjacency matrix for a plurality of centerline points shown in accordance with some embodiments of the present description. As shown in fig. 6, the plurality of centerline points may include 2048 centerline points n1, n2, n3, n4, n5, n6, n7, n8, …, n2048, and the corresponding adjacency matrix E0, of dimension 2048×2048, may include 2048×2048 elements. For example, the value "0" of the 1st element of row 3 may indicate that centerline points n3 and n1 are not adjacent, and the value "1" of the 2nd element may indicate that centerline points n3 and n2 are adjacent.
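The following sketch builds such an adjacency matrix from index pairs of adjacent centerline points; how adjacency is decided (for example, consecutive points along the same extracted centerline segment) is an assumption of this sketch and is not fixed by the description above.

```python
import numpy as np

def build_adjacency_matrix(neighbor_pairs, num_points: int) -> np.ndarray:
    """Build a symmetric 0/1 adjacency matrix of dimension
    num_points x num_points (e.g. 2048 x 2048) from index pairs (i, j)
    of adjacent centerline points."""
    adjacency = np.zeros((num_points, num_points), dtype=np.float32)
    for i, j in neighbor_pairs:
        adjacency[i, j] = 1.0  # "1" marks adjacent centerline points
        adjacency[j, i] = 1.0  # "0" elsewhere marks non-adjacent points
    return adjacency

# Example mirroring FIG. 6 (0-based indices: n1 -> 0, n2 -> 1, n3 -> 2):
# n3 is adjacent to n2 but not to n1.
E0 = build_adjacency_matrix([(1, 2)], num_points=2048)
assert E0[2, 0] == 0.0 and E0[2, 1] == 1.0
```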
The point feature information may be information representing each centerline point feature. In some embodiments, the point feature information may include point coordinate information. In some embodiments, the point feature information may also include at least one of the following: distance information between the centerline point and the target cardiac structure, and distance information between the centerline point and the vessel wall.
The point coordinate information may be the coordinates of the centerline point in the target heart image. For example, in a target heart image including 300000×200000 pixels, if the centerline point n1 is the pixel in the 150000th column and 1000th row of the target heart image, with the pixel in the lower left corner as the origin, the length direction of the target heart image as the X axis, the width direction as the Y axis, and one pixel as the unit, then the coordinates of n1 may be (150000, 1000). It will be appreciated that the point coordinate information of a centerline point may reflect the absolute position of the centerline point in the target heart image.
In some embodiments, the point information acquisition module 220 may represent the point coordinate information of the plurality of centerline points with a first feature matrix. The dimension of the first feature matrix may be the number of centerline points × the coordinate dimension. Continuing with the above example, the two-dimensional coordinates of the 2048 centerline points may be represented by a first feature matrix E1 of dimension 2048×2.
The distance information of the centerline points to the target heart structure may be information of the relative positions of the centerline points and the target heart structure in the target heart image.
The target cardiac structure may be a chamber of the heart. In some embodiments, the target cardiac structure may include one or more of the following: left ventricle, right ventricle, left atrium and right atrium.
In some embodiments, the point information acquisition module 220 may acquire the segmentation result of the target cardiac structure based on the target cardiac image. The segmentation result of the target heart structure may be a result of determining whether an element in the target heart image belongs to the target heart structure. In some embodiments, the point information acquisition module 220 may acquire a segmentation result for each target cardiac structure from the target cardiac image based on a segmentation algorithm. For a detailed description of the segmentation algorithm, reference may be made to the relevant description of step 310, which is not repeated here.
It will be appreciated that there is a boundary between the region of the target heart structure in the target heart image and the background region (i.e. the region outside the region of the target heart structure). In some embodiments, the segmentation result of the target cardiac structure may be represented by delineating a boundary between the target cardiac structure region and the background region in the target cardiac image.
In some embodiments, the distance information of the centerline point from the target cardiac structure may include a minimum distance of the centerline point from a boundary of the target cardiac structure.
Illustratively, the distance information of the centerline point n1 from the target cardiac structure may include a minimum distance of the centerline point n1 to the boundary of the left ventricle, e.g., 1000 pixels. Similarly, the distance information of the centerline point n1 from the target cardiac structure may also include a minimum distance of the centerline point n1 to the right ventricular boundary (e.g., 2000 pixels), a minimum distance of the centerline point n1 to the left atrial boundary (e.g., 10000 pixels), and/or a minimum distance of the centerline point n1 to the right atrial boundary (e.g., 15000 pixels).
In some embodiments, the distance information of the centerline point from the target cardiac structure may further include a distance of the centerline point from a center of the target cardiac structure, a maximum distance of the centerline point from a boundary of the target cardiac structure, an average distance of the centerline point from the boundary of the target cardiac structure, and the like, which embodiments of the present disclosure are not limited.
In some embodiments, the point information acquisition module 220 may represent the distance information of the plurality of centerline points from the target cardiac structures with a second feature matrix. The dimension of the second feature matrix may be the number of centerline points × the number of target cardiac structures. For example, the distance information of 2048 centerline points to 4 target cardiac structures may be represented by a second feature matrix E2 of dimension 2048×4, where each row of the second feature matrix may represent the distance information of one centerline point from each target cardiac structure. For example, the first row of E2, (1000, 2000, 10000, 15000), may represent the distance information of the centerline point n1 from the left ventricle, the right ventricle, the left atrium, and the right atrium, respectively.
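One way to compute the minimum distance from each centerline point to a structure boundary, and to pack the results into the second feature matrix, is sketched below. Using a Euclidean distance transform of each structure's boundary mask is an assumption, and distances are expressed in pixels or voxels.

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def distances_to_structure_boundaries(centerline_points: np.ndarray,
                                      structure_masks) -> np.ndarray:
    """Second feature matrix E2 (num_points x num_structures): minimum
    distance from each centerline point to the boundary of each target
    cardiac structure (e.g. left/right ventricle, left/right atrium).
    centerline_points: integer coordinates, shape (num_points, ndim)."""
    columns = []
    for mask in structure_masks:                      # boolean mask per structure
        boundary = mask & ~binary_erosion(mask)       # boundary elements of the mask
        dist_map = distance_transform_edt(~boundary)  # distance to nearest boundary
        columns.append(dist_map[tuple(centerline_points.T)])
    return np.stack(columns, axis=1)                  # e.g. shape 2048 x 4
```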
The distance information between the centerline point and the vessel wall may be radius information of the vessel corresponding to the centerline point in the target heart image. From the foregoing, each centerline point may correspond to a cardiovascular cross section. Thus, the distance information of the centerline point from the vessel wall may be the radius of the heart vessel cross-section corresponding to the centerline point in the target heart image.
For example, the centerline point n1 may correspond to the heart vessel cross-section 1-1, and the distance information of the centerline n1 from the vessel wall may be a radius of the heart vessel cross-section 1-1 in the target heart image, e.g., 500 pixels.
In some embodiments, the point information acquisition module 220 may represent the distance information of the plurality of centerline points from the vessel wall with a third feature matrix. The dimension of the third feature matrix may be the number of centerline points × 1. For example, the distance information of 2048 centerline points to the vessel wall may be represented by a third feature matrix E3 of dimension 2048×1.
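A sketch of one way to obtain this radius-like feature is shown below: the distance from each centerline point to the nearest non-vessel element of the vessel segmentation mask is used as a proxy for the cross-section radius. This distance-transform approach is an assumption, not the specific measurement defined above.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def distances_to_vessel_wall(centerline_points: np.ndarray,
                             vessel_mask: np.ndarray) -> np.ndarray:
    """Third feature matrix E3 (num_points x 1): for each centerline point,
    the distance to the nearest background element of the vessel segmentation
    mask, used here as a proxy for the vessel cross-section radius."""
    dist_to_wall = distance_transform_edt(vessel_mask)        # 0 outside the vessel
    return dist_to_wall[tuple(centerline_points.T)][:, None]  # e.g. shape 2048 x 1
```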
In some embodiments of the present disclosure, the point feature information of each centerline point is constructed from the absolute position of the centerline point in the target heart image, the relative position of the centerline point with respect to the target cardiac structures in the target heart image, and the vessel radius corresponding to the centerline point in the target heart image, so that the point feature information of the centerline points fully fuses the vessel structure information of the cardiac vessels and the anatomical relationship between the vessels and the cardiac structures, thereby improving the accuracy of the subsequent vessel branch identification.
And 330, processing the point information of the plurality of center line points through the target network to obtain the blood vessel branch results corresponding to the plurality of center line points. In particular, step 330 may be performed by the point information processing module 230.
The input to the target network may include point information for a plurality of centerline points and the output may be a vessel branch result for the plurality of centerline points.
In some embodiments, the point information processing module 230 may pre-process the point information of the plurality of centerline points before it enters the target network. Specifically, the point information processing module 230 may perform normalization on the adjacency matrix, the first feature matrix, the second feature matrix, and the third feature matrix, then concatenate the normalized first, second, and third feature matrices into a feature matrix, and use the normalized adjacency matrix and the feature matrix as inputs of the target network. For example, the point information processing module 230 may normalize E1 (2048×2), E2 (2048×4), and E3 (2048×1), concatenate them into a feature matrix E of dimension 2048×7, and use the feature matrix E and the normalized adjacency matrix as inputs to the target network.
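A sketch of this preprocessing is shown below. The description above only states that the matrices are normalized, so min-max normalization of the feature matrices and symmetric degree normalization of the adjacency matrix are assumptions.

```python
import numpy as np

def minmax_normalize(m: np.ndarray) -> np.ndarray:
    """Column-wise min-max normalization of a feature matrix (an assumption;
    the exact normalization scheme is not specified above)."""
    span = m.max(axis=0) - m.min(axis=0)
    return (m - m.min(axis=0)) / np.where(span == 0, 1, span)

def normalize_adjacency(A: np.ndarray) -> np.ndarray:
    """Symmetric degree normalization D^-1/2 (A + I) D^-1/2, a common choice
    for graph-convolution inputs (also an assumption)."""
    A_hat = A + np.eye(A.shape[0], dtype=A.dtype)
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def build_network_input(E0, E1, E2, E3):
    """Concatenate the normalized E1 (2048x2), E2 (2048x4) and E3 (2048x1)
    into the feature matrix E (2048x7) and pair it with the normalized
    adjacency matrix E0 (2048x2048) as the target-network input."""
    E = np.concatenate(
        [minmax_normalize(E1), minmax_normalize(E2), minmax_normalize(E3)], axis=1)
    return normalize_adjacency(E0), E
```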
The vessel branch outcome for the plurality of centerline points may include a determination of a vessel branch category to which the plurality of centerline points belong. Illustratively, the vessel branch categories may include right coronary artery, right coronary artery posterior descending branch, right posterior branch, left coronary artery trunk, left anterior descending branch, diagonal branch, circumflex branch, blunt edge branch, left posterior descending branch atrial branch, and the like.
Specifically, the target network may map the input adjacency matrix and feature matrix into a plurality of values or a plurality of probabilities, and obtain the blood vessel branch results corresponding to the plurality of centerline points based on the plurality of values or the plurality of probabilities.
In some embodiments, the target network may comprise a graph convolutional network. A detailed description of the graph convolutional network can be found in fig. 4 and its description, and is not repeated here. In some embodiments, the target network may also include, but is not limited to, a combination of one or more of a convolutional neural network, a recurrent neural network, a long short-term memory network, and the like, which embodiments of the present disclosure do not limit.
In some embodiments, the target network may be obtained through training. The detailed description of the training target network can be found in fig. 5 and the related description thereof, and will not be repeated here.
Step 340, obtaining a cardiac vessel branch identification result of the target cardiac image based on vessel branch results corresponding to the plurality of centerline points. Specifically, step 340 may be performed by identification module 240.
In some embodiments, the identification module 240 may obtain the branch centerline corresponding to the cardiac vascular branch result based on centerline points of the plurality of centerline points that continuously correspond to the same cardiac vascular branch result. For example, as shown in fig. 6, out of 2048 centerline points, each of consecutive centerline points n1, n2, n3, n5, and n8 corresponds to a blood vessel branch result "left coronary artery trunk", and then the line connecting the centerline points n1, n2, n3, n5, and n8 can be determined as the branch centerline of the "left coronary artery trunk".
In some embodiments, the identification module 240 may determine the corresponding cardiac vessel branch based on the branch centerline corresponding to the cardiac vessel branch result. Specifically, the vessel corresponding to a branch centerline may be identified as the cardiac vessel branch result corresponding to that branch centerline. For example, the blood vessel corresponding to the branch centerline formed by n1 to n8 for the "left coronary artery trunk" may be identified as the "left coronary artery trunk", thereby obtaining the "left coronary artery trunk".
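A minimal sketch of grouping centerline points into branch centerlines is given below; it assumes the points are ordered along the extracted centerline and simply collects consecutive runs that share the same predicted branch result.

```python
from itertools import groupby

def group_branch_centerlines(point_ids, branch_results):
    """Group consecutive centerline points that share the same vessel-branch
    result into branch centerlines. Assumes point_ids is ordered along the
    extracted centerline (an assumption of this sketch)."""
    branches = []
    for label, run in groupby(zip(point_ids, branch_results),
                              key=lambda pair: pair[1]):
        branches.append((label, [pid for pid, _ in run]))
    return branches

# Hypothetical example in the spirit of the description above:
print(group_branch_centerlines(
    ["n1", "n2", "n3", "n4"],
    ["left coronary artery trunk"] * 3 + ["left anterior descending branch"]))
# [('left coronary artery trunk', ['n1', 'n2', 'n3']),
#  ('left anterior descending branch', ['n4'])]
```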
In some embodiments of the present disclosure, the point information of the centerline points is used to fuse the vessel structure information of the cardiac vessels, the anatomical relationship between the vessels and the cardiac structure, and the topological relationship between the branches of the cardiac vessels is fused via the target network, thereby improving the accuracy of identifying the branches of the cardiac vessels.
FIG. 4 is an exemplary flow chart of a graph convolutional network processing point information of a plurality of centerline points, according to some embodiments of the present description.
A graph convolutional network (Graph Convolutional Network, GCN) is a neural network that acts directly on a graph, which is a data structure made up of nodes and edges. In some embodiments, a graph convolutional network may include at least one graph convolution layer and a linear layer.
In some embodiments, the flow 400 may be performed by the point information processing module 230. As shown in fig. 4, the process 400 may include:
in step 410, the plurality of centerline points are taken as a plurality of nodes, the feature matrix is taken as the node information of the plurality of nodes, and the adjacency matrix is taken as the edge information of the edges between the nodes; the node information and the edge information are processed by at least one graph convolution layer to obtain an intermediate result.
The graph convolution layer may cause each node to update its own information by exchanging information with other nodes based on an information propagation mechanism. Each node may determine, based on the edge information, the adjacent nodes used for updating its own information.
The input to the graph convolution layer may be the feature matrix and the adjacency matrix of the plurality of centerline points, and the output may be an updated feature matrix. Specifically, for each node, the graph convolution layer may determine the adjacent nodes used for updating the node information based on the adjacency matrix. For example, as shown in fig. 6, for node n3, the value "0" of the 1st element and the value "1" of the 2nd element in the 3rd row of the adjacency matrix may indicate that the adjacent nodes used for updating the information of node n3 do not include n1 but include n2. Further, the graph convolution layer can update the node information of each node by performing a convolution operation on the node information of its adjacent nodes, obtaining the updated feature matrix. The next graph convolution layer may then update the node information of each node again through a convolution operation based on the node information updated by the previous graph convolution layer (i.e., the updated feature matrix). In some embodiments, the updated node information output by a graph convolution layer may be processed by an activation function before entering the next graph convolution layer. The activation function may be a common ReLU, Sigmoid, etc., and the Dropout method may also be used in the activation processing.
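One common formulation of this per-layer update is the standard graph-convolution rule, given here only as an illustrative assumption (the description above does not fix the exact formula). With adjacency matrix A, \tilde{A} = A + I, degree matrix \tilde{D} with \tilde{D}_{ii} = \sum_j \tilde{A}_{ij}, node feature matrix H^{(l)}, learnable weights W^{(l)}, and activation \sigma:

H^{(l+1)} = \sigma\left(\tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}\,H^{(l)}\,W^{(l)}\right),

where H^{(0)} is the input feature matrix (e.g. the 2048×7 matrix E).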
The intermediate result may be the node information after the plurality of nodes complete their updates. In some embodiments, the intermediate result may be the feature matrix updated by the last graph convolution layer. For example, assuming that the graph convolutional network includes 5 graph convolution layers, the updated feature matrix output by the 5th graph convolution layer is the intermediate result.
And step 420, processing the intermediate results through the linear layer to obtain blood vessel branch results corresponding to the plurality of center line points.
The linear layer can map node information (i.e. intermediate results) after updating of the plurality of nodes into a plurality of values or probabilities, and then obtain blood vessel branch results corresponding to the plurality of nodes based on the plurality of values or the plurality of probabilities.
For example, if there are 15 vessel branch categories, the linear layer may map the updated node information of the 2048 nodes (i.e., the intermediate result) into 2048×15 probabilities; each node corresponds to 15 probabilities, respectively representing the probability that the node belongs to each of the 15 vessel branches, and the vessel branch category corresponding to the maximum of the 15 probabilities of each node is taken as the vessel branch result corresponding to that node.
In some embodiments, the linear layer may be a fully connected neural network.
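The following NumPy sketch puts the pieces of flow 400 together: a few graph convolution layers driven by the normalized adjacency matrix, followed by a linear (fully connected) layer and an argmax over the branch classes. Layer count, hidden width, random weights, and the identity placeholder adjacency are illustrative assumptions, not the trained target network.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def gcn_forward(A_norm, X, conv_weights, linear_weight):
    """Minimal graph-convolutional forward pass: each graph convolution layer
    aggregates neighbor features through the normalized adjacency matrix and
    applies a learned projection plus ReLU; the final linear layer maps each
    node to one score per vessel-branch class."""
    H = X
    for W in conv_weights:        # graph convolution layers (step 410)
        H = relu(A_norm @ H @ W)  # neighbor aggregation + projection
    logits = H @ linear_weight    # linear layer (step 420): nodes x classes
    return logits.argmax(axis=1)  # vessel branch result per centerline point

# Illustrative shapes: 2048 points, 7 features, 15 branch classes, 2 conv layers.
num_points, num_feats, hidden, num_classes = 2048, 7, 64, 15
A_norm = np.eye(num_points, dtype=np.float32)  # placeholder normalized adjacency
X = rng.standard_normal((num_points, num_feats))
conv_ws = [0.1 * rng.standard_normal((num_feats, hidden)),
           0.1 * rng.standard_normal((hidden, hidden))]
linear_w = 0.1 * rng.standard_normal((hidden, num_classes))
branch_ids = gcn_forward(A_norm, X, conv_ws, linear_w)  # shape: (2048,)
```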
In some embodiments of the present disclosure, the graph structure information is constructed through a centerline point on a centerline of a cardiac vessel, and the topological structure relationship between cardiac vessel branches is effectively utilized through a graph neural network, so that the accuracy of identifying cardiac vessel branches can be improved.
Fig. 5 is an exemplary flow chart of training a target network according to some embodiments of the present description. In some embodiments, the process 500 may be performed by the training module 250. As shown in fig. 5, the process 500 may include:
step 510, acquiring point information of a plurality of sample center line points of a cardiac blood vessel of a sample cardiac image and corresponding blood vessel branch labels.
In some embodiments, the training module 250 may obtain a sample cardiac vessel segmentation result and a sample cardiac structure segmentation result from the sample cardiac image, respectively, based on a segmentation algorithm. A detailed description of the acquisition of the cardiac vascular segmentation result of the sample may be referred to in step 310, and a detailed description of the acquisition of the cardiac structural segmentation result of the sample may be referred to in step 320, which will not be repeated here.
Further, in some embodiments, the training module 250 may extract a plurality of sample centerline points from the sample cardiac image based on the sample cardiac vessel segmentation result, and then obtain point information for the plurality of sample centerline points based on the plurality of sample centerline points, the sample cardiac image, the sample cardiac vessel segmentation result, and the sample cardiac structure segmentation result. A detailed description of acquiring the point information of the plurality of sample centerline points may refer to step 320, and will not be described herein.
In some embodiments, the vessel branch labels corresponding to the plurality of sample centerline points may be determined by manual labeling.
Step 520, training the target network based on the point information of the plurality of sample centerline points and the corresponding vessel branch labels.
Specifically, the training module 250 may input point information for a plurality of sample centerline points into the target network, and update parameters of the target network through training. The point information of the plurality of sample center line points can be input into a target network for processing and output to obtain the blood vessel branch prediction results corresponding to the plurality of sample center line points. And the vessel branch label is used as a network output label corresponding to the vessel branch prediction result.
In some embodiments, the training module 250 may construct a loss function based on the vessel branch prediction results and the corresponding vessel branch labels corresponding to the plurality of sample centerline points, and update parameters of the target network based on the loss function, to obtain a trained target network.
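An illustrative training-loop sketch is given below. The description above does not name a framework or a loss function, so PyTorch and a cross-entropy loss over the vessel-branch classes are assumptions; target_network stands for any module mapping (adjacency, features) to per-point class logits.

```python
import torch

def train_target_network(target_network, samples, num_epochs=50, lr=1e-3):
    """samples: iterable of (adjacency, features, branch_labels) tensors, one
    triple per sample cardiac image; branch_labels holds one class index per
    sample centerline point (the manually annotated vessel branch labels)."""
    optimizer = torch.optim.Adam(target_network.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(num_epochs):
        for adjacency, features, branch_labels in samples:
            logits = target_network(adjacency, features)  # points x classes
            loss = loss_fn(logits, branch_labels)  # prediction vs. branch labels
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return target_network
```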
It should be noted that the above description of the flow is provided for illustrative purposes only and is not intended to limit the scope of the present description. Various changes and modifications may be made by one of ordinary skill in the art in light of the description herein. However, such changes and modifications do not depart from the scope of the present specification. The operational schematic of the process presented above is illustrative. In some embodiments, the above-described processes may be accomplished with one or more additional operations not described and/or one or more operations not discussed. For example, the flow may be stored in a storage device (e.g., storage device 150, a memory unit of a system) in the form of a program or instructions that, when executed by processing device 120 and/or image processing system 200, may implement the flow. In addition, the order of the operations of the flows shown in the figures and described above is not limiting.
Possible benefits of embodiments of the present description include, but are not limited to: (1) The central line of the heart blood vessel is extracted, the central line point is used for constructing the graph structure information, the data volume of input information can be reduced, and the shape information of the blood vessel is not lost, so that the recognition efficiency of the heart blood vessel branch is improved; (2) The absolute position of the central line point in the target heart image, the relative positions of the central line point and the target heart structure in the target heart image, and the size of the blood vessel radius corresponding to the central line point in the target heart image are used for constructing point characteristic information of each central line point, so that the point characteristic information of the central line point is fully fused with the blood vessel structure information of the heart blood vessel and the anatomical relation between the blood vessel and the heart structure, and the recognition accuracy of the subsequent blood vessel branches is improved; (3) The graph structure information is constructed through the central line point on the central line of the cardiac blood vessel, and the topological structure relationship between the cardiac blood vessel branches is effectively utilized through the graph neural network, so that the recognition accuracy of the cardiac blood vessel branches can be improved.
It should be noted that, the advantages that may be generated by different embodiments may be different, and in different embodiments, the advantages that may be generated may be any one or a combination of several of the above, or any other possible advantages that may be obtained.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations to the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are intended to be suggested within this specification, and therefore are intended to be included within the spirit and scope of the exemplary embodiments of the present invention.
Meanwhile, this specification uses specific terms to describe its embodiments. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of this specification. Thus, it should be emphasized and appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of this specification may be combined as appropriate.
Furthermore, those skilled in the art will appreciate that aspects of this specification may be illustrated and described in terms of several patentable classes or contexts, including any new and useful process, machine, product, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of this specification may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of this specification may take the form of a computer program product embodied in one or more computer-readable media and comprising computer-readable program code.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, in baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic form, optical form, etc., or any suitable combination thereof. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer storage medium may be propagated over any suitable medium, including radio, electrical cable, fiber optic cable, RF, or the like, or any combination of the foregoing.
The computer program code required for the operation of portions of this specification may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), the connection may be made to an external computer (for example, through the Internet), or the program may be used in a cloud computing environment or provided as a service, such as Software as a Service (SaaS).
Furthermore, unless explicitly recited in the claims, the order of the processing elements and sequences, the use of numbers or letters, or the use of other designations in this specification is not intended to limit the order of the processes and methods of this specification. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments; on the contrary, the claims are intended to cover all modifications and equivalent arrangements that fall within the spirit and scope of the embodiments of this specification. For example, although the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as by installing the described system on an existing processing device or mobile device.
Likewise, it should be noted that, in order to simplify the presentation of this disclosure and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in the claims. Rather, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers are used to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are, in some examples, modified by the qualifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that a variation of 20% of the stated number is allowed. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought to be obtained by the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and apply ordinary rounding. Although the numerical ranges and parameters used to confirm the breadth of their scope are approximations in some embodiments, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, and the like, cited in this specification is hereby incorporated by reference in its entirety, with the exception of any application history documents that are inconsistent with or conflict with the content of this specification, and any documents (currently or later appended to this specification) that limit the broadest scope of the claims of this specification. It is noted that if the description, definition, and/or use of a term in material appended to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (10)

1. A method of cardiac vascular branch identification, the method comprising:
obtaining a cardiac vessel segmentation result of a target cardiac image, wherein the cardiac vessel segmentation result at least comprises cardiac vessel centerline information;
acquiring a plurality of centerline points of the cardiac vessel based on the cardiac vessel segmentation result, and acquiring point information of the plurality of centerline points, wherein the point information comprises point adjacency relation information and point feature information, and the point feature information comprises point coordinate information and at least one of the following: distance information between the centerline points and a target cardiac structure, and distance information between the centerline points and the vessel wall;
processing the point information of the plurality of centerline points through a target network to obtain vessel branch results corresponding to the plurality of centerline points;
and obtaining a cardiac vessel branch identification result of the target cardiac image based on the vessel branch results corresponding to the plurality of centerline points.
2. The method of claim 1, wherein the distance information between the centerline points and the target cardiac structure comprises: a minimum distance from the centerline point to a boundary of the target cardiac structure.
3. The method of claim 1, wherein the target cardiac structure comprises one or more of: a left ventricle, a right ventricle, a left atrium, and a right atrium.
4. The method of claim 1, wherein the target network comprises a graph convolutional network.
5. The method of claim 4, wherein the point adjacency relation information of the plurality of centerline points is represented by an adjacency matrix, and the point feature information of the plurality of centerline points is represented by a feature matrix;
the graph convolutional network comprises at least one graph convolutional layer and a linear layer, and the processing of the point information of the plurality of centerline points through the target network comprises:
taking the plurality of centerline points as a plurality of nodes, the feature matrix as node information of the plurality of nodes, and the adjacency matrix as edge information of edges between the plurality of nodes, and processing the node information and the edge information through the at least one graph convolutional layer to obtain an intermediate result;
and processing the intermediate result through the linear layer to obtain the vessel branch results corresponding to the plurality of centerline points.
6. The method of claim 1, wherein the target network is trained by:
acquiring point information of a plurality of sample centerline points of a cardiac vessel in a sample cardiac image and corresponding vessel branch labels;
training the target network based on the point information of the plurality of sample centerline points and the corresponding vessel branch labels, wherein the point information of the plurality of sample centerline points is input into the target network for processing to output vessel branch prediction results corresponding to the plurality of sample centerline points, and the vessel branch labels are used as network output labels corresponding to the vessel branch prediction results.
7. A cardiac vascular branch identification system, the system comprising:
a segmentation result acquisition module, configured to acquire a cardiac vessel segmentation result of a target cardiac image, wherein the cardiac vessel segmentation result at least comprises cardiac vessel centerline information;
a point information acquisition module, configured to acquire a plurality of centerline points of the cardiac vessel based on the cardiac vessel segmentation result, and acquire point information of the plurality of centerline points, wherein the point information comprises point adjacency relation information and point feature information, and the point feature information comprises point coordinate information and at least one of the following: distance information between the centerline points and a target cardiac structure, and distance information between the centerline points and the vessel wall;
a point information processing module, configured to process the point information of the plurality of centerline points through a target network to obtain vessel branch results corresponding to the plurality of centerline points;
and an identification module, configured to obtain a cardiac vessel branch identification result of the target cardiac image based on the vessel branch results corresponding to the plurality of centerline points.
8. The system of claim 7, wherein the point adjacency relation information of the plurality of centerline points is represented by an adjacency matrix, and the point feature information of the plurality of centerline points is represented by a feature matrix;
the target network comprises a graph convolutional network, the graph convolutional network comprises at least one graph convolutional layer and a linear layer, and the point information processing module is further configured to:
take the plurality of centerline points as a plurality of nodes, the feature matrix as node information of the plurality of nodes, and the adjacency matrix as edge information of edges between the plurality of nodes, and process the node information and the edge information through the at least one graph convolutional layer to obtain an intermediate result;
and process the intermediate result through the linear layer to obtain the vessel branch results corresponding to the plurality of centerline points.
9. The system of claim 7, further comprising a training module configured to:
acquire point information of a plurality of sample centerline points of a cardiac vessel in a sample cardiac image and corresponding vessel branch labels;
train the target network based on the point information of the plurality of sample centerline points and the corresponding vessel branch labels, wherein the point information of the plurality of sample centerline points is input into the target network for processing to output vessel branch prediction results corresponding to the plurality of sample centerline points, and the vessel branch labels are used as network output labels corresponding to the vessel branch prediction results.
10. A computer-readable storage medium storing computer instructions, wherein when a computer reads the computer instructions in the storage medium, the computer performs the cardiac vascular branch identification method of any one of claims 1-6.
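As a non-authoritative illustration of the graph-convolution processing recited in claims 5 and 8, the following sketch shows how centerline points could be classified into vessel branches from a feature matrix and an adjacency matrix; it assumes a PyTorch-style implementation, and the class name CenterlineGCN, the layer sizes, and the symmetric normalization step are illustrative choices not specified in the claims.

import torch
import torch.nn as nn

class CenterlineGCN(nn.Module):
    """Minimal graph convolutional classifier over centerline points.

    Each centerline point is a node: the feature matrix carries the point
    feature information, and the adjacency matrix carries the point
    adjacency relation information along the vessel centerline.
    """

    def __init__(self, in_features, hidden_features, num_branches):
        super().__init__()
        self.gc1 = nn.Linear(in_features, hidden_features)      # first graph convolutional layer
        self.gc2 = nn.Linear(hidden_features, hidden_features)  # second graph convolutional layer
        self.linear = nn.Linear(hidden_features, num_branches)  # final linear layer

    @staticmethod
    def normalize(adjacency):
        # Symmetric normalization D^-1/2 (A + I) D^-1/2, with self-loops added.
        a_hat = adjacency + torch.eye(adjacency.size(0), device=adjacency.device)
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        return deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)

    def forward(self, features, adjacency):
        a_hat = self.normalize(adjacency)
        # Each graph convolution aggregates neighbor features, then transforms them.
        h = torch.relu(self.gc1(a_hat @ features))  # intermediate result
        h = torch.relu(self.gc2(a_hat @ h))
        # The linear layer maps the intermediate result to per-point branch logits.
        return self.linear(h)                       # shape (N, num_branches)

Given an (N, F) feature matrix and an (N, N) adjacency matrix built from N centerline points, the forward pass yields an (N, num_branches) score matrix, from which a branch label can be assigned to each centerline point, for example by taking the argmax per row.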
CN202310120949.4A 2023-01-18 2023-01-18 Heart blood vessel branch identification method, system and storage medium Pending CN116188412A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310120949.4A CN116188412A (en) 2023-01-18 2023-01-18 Heart blood vessel branch identification method, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310120949.4A CN116188412A (en) 2023-01-18 2023-01-18 Heart blood vessel branch identification method, system and storage medium

Publications (1)

Publication Number Publication Date
CN116188412A true CN116188412A (en) 2023-05-30

Family

ID=86445786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310120949.4A Pending CN116188412A (en) 2023-01-18 2023-01-18 Heart blood vessel branch identification method, system and storage medium

Country Status (1)

Country Link
CN (1) CN116188412A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115150A (en) * 2023-10-20 2023-11-24 柏意慧心(杭州)网络科技有限公司 Method, computing device and medium for determining branch vessels
CN117115150B (en) * 2023-10-20 2024-01-26 柏意慧心(杭州)网络科技有限公司 Method, computing device and medium for determining branch vessels

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination