CN112785580A - Method and device for determining blood vessel flow velocity
- Publication number: CN112785580A (application CN202110120550.7A)
- Authority: CN (China)
- Prior art keywords: blood vessel, image, determining, contrast agent, coronary angiography
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G06T7/0012—Biomedical image inspection
- A61B5/026—Measuring blood flow
- G06F18/24—Classification techniques
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/20081—Training; Learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30104—Vascular flow; Blood flow; Perfusion
- G06V40/15—Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
Abstract
Embodiments of the application provide a method, a device, a computer-readable medium and an electronic device for determining blood vessel flow velocity. The method comprises the following steps: predicting, based on a coronary angiography image, position deformation information of the blood vessel in the image at the next acquisition time; determining, based on the position deformation information and the blood vessel in the coronary angiography image, the blood vessel image corresponding to the next acquisition time; predicting the position of the contrast agent in the blood vessel and determining the contrast agent position at the next time; and determining the blood flow velocity in the blood vessel based on the contrast agent positions at a plurality of times. By tracking the front wave of the contrast agent through the predicted blood vessel position at the next acquisition time, the accuracy and efficiency of blood flow velocity detection are improved.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for determining a blood vessel flow rate, a computer-readable medium, and an electronic device.
Background
Many related-art methods determine the blood flow velocity from a centerline obtained through image segmentation and centerline extraction. In practice, however, the calculated vessel length is sensitive to both steps and is easily affected by the segmentation result and the centerline position. Image segmentation is the key step and also the most difficult one; its precision is influenced by many factors and rarely reaches the ideal level. Centerline extraction likewise deviates from the true anatomy because of offset, insufficient smoothness and other adverse factors, which affects the accuracy of the vessel-length calculation and leads to inaccurate determination of the blood flow velocity.
Disclosure of Invention
Embodiments of the present application provide a method, an apparatus, a computer-readable medium and an electronic device for determining a blood flow velocity, which, at least to a certain extent, determine the blood flow velocity by tracking the front wave of a contrast agent, thereby improving the accuracy and efficiency of blood flow velocity detection.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
According to an aspect of an embodiment of the present application, there is provided a method of determining a flow velocity of a blood vessel, including: acquiring a coronary angiography image; predicting position deformation information corresponding to the blood vessel in the coronary angiography image at the next acquisition moment based on the coronary angiography image; determining a blood vessel image corresponding to the next acquisition moment based on the position deformation information and the blood vessel in the coronary angiography image; predicting the position of the contrast agent in the blood vessel based on the blood vessel image, and determining the position of the contrast agent corresponding to the next moment; determining a blood flow velocity in a blood vessel based on the positions of the contrast agent in the blood vessel corresponding to a plurality of time instants.
According to an aspect of the embodiments of the present application, there is provided an apparatus for determining a flow rate of a blood vessel, including: an acquisition unit configured to acquire a coronary angiography image; the deformation unit is used for predicting position deformation information corresponding to the blood vessel in the coronary angiography image at the next acquisition moment based on the coronary angiography image; the blood vessel unit is used for determining a blood vessel image corresponding to the next acquisition moment based on the position deformation information and the blood vessel in the coronary angiography image; the contrast unit is used for predicting the position of the contrast agent in the blood vessel based on the blood vessel image and determining the position of the contrast agent corresponding to the next moment; and the flow rate unit is used for determining the blood flow rate in the blood vessel based on the positions of the contrast agent in the blood vessel corresponding to a plurality of moments.
In some embodiments of the present application, based on the foregoing scheme, the apparatus for determining a blood vessel flow rate further comprises: the identification unit is used for identifying the blood vessel type in the coronary angiography image based on a blood vessel classification network obtained through pre-training; the blood vessel classification network is obtained by training a neural network based on a coronary image sample; the vessel types include the left anterior descending branch, the left circumflex branch, and the right coronary artery.
In some embodiments of the present application, based on the foregoing solution, the deformation unit includes: a deformation model unit, configured to acquire a deformation prediction model corresponding to the blood vessel type based on the blood vessel type, the deformation prediction model being obtained by training a neural network with coronary image samples corresponding to the various blood vessel types; and a model prediction unit, configured to input the coronary angiography image into the deformation prediction model for prediction and output the position deformation information of the blood vessel in the coronary angiography image at the next acquisition time.
In some embodiments of the present application, based on the foregoing scheme, the contrast unit is configured to acquire a front wave tracking model corresponding to the blood vessel type based on the blood vessel type, the front wave tracking model being obtained by training a neural network with contrast agent samples corresponding to each blood vessel type; and to input the blood vessel image into the front wave tracking model for prediction and output the position of the contrast agent corresponding to the next moment.
In some embodiments of the present application, based on the foregoing solution, the device for determining a blood vessel flow rate is further configured to perform a multi-scale white cap operation on the coronary angiography image based on a set structural operator, and extract a bright region in the coronary angiography image; carrying out multi-scale black cap operation on the coronary angiography image based on a set structural operator, and extracting a dark region in the coronary angiography image; and performing image synthesis based on the bright area and the dark area to generate an enhanced coronary angiography image.
In some embodiments of the present application, based on the foregoing, the flow rate unit includes: a curve unit for generating a position curve of the contrast agent based on positions of the contrast agent in the blood vessel corresponding to a plurality of time instants; a slope unit for determining the blood flow velocity in the blood vessel based on the slope of the position linear curve.
In some embodiments of the present application, based on the foregoing scheme, the curve unit is configured to convert the positions of the contrast agent in the blood vessel corresponding to the plurality of time instants into three-dimensional space lengths based on the spatial resolution and the projection ratio of the image; and to generate, with time as the abscissa axis, a position curve of the three-dimensional space length based on the three-dimensional space lengths corresponding to the multiple moments.
In some embodiments of the present application, based on the foregoing scheme, the slope unit includes: a smoothing unit, configured to smooth the position curve and generate a position linear curve based on time; the interval unit is used for identifying the position linear curve and determining a speed calculation interval in the position linear curve; a curve slope unit for determining the blood flow velocity in the blood vessel based on the slope of the curve in the velocity calculation interval.
In some embodiments of the present application, based on the foregoing scheme, the interval unit is configured to obtain local velocities by least-squares fitting of the slope of the position linear curve; and to expand around the target time corresponding to the maximum of the local velocities to obtain the velocity calculation interval for that target time.
According to an aspect of embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method of determining a blood vessel flow velocity as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method of determining a blood vessel flow rate as described in the embodiments above.
According to an aspect of embodiments herein, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method for determining a blood vessel flow rate provided in the various alternative implementations described above.
In the technical solutions provided in some embodiments of the present application, position deformation information of the blood vessel at the next acquisition time is predicted from a coronary angiography image; a blood vessel image corresponding to the next acquisition time is then determined from the position deformation information and the blood vessel in the coronary angiography image; the position of the contrast agent in the blood vessel is predicted and the contrast agent position at the next time is determined; and the blood flow velocity in the blood vessel is determined from the contrast agent positions at a plurality of times. By tracking the front wave of the contrast agent through the predicted blood vessel position at the next acquisition time, the accuracy and efficiency of blood flow velocity detection are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture to which aspects of embodiments of the present application may be applied;
FIG. 2 schematically illustrates a flow chart of a method of determining a blood vessel flow rate according to an embodiment of the present application;
FIG. 3 schematically illustrates a schematic diagram of coronary contrast image enhancement according to an embodiment of the present application;
FIG. 4 schematically illustrates a diagram of predicting a blood vessel type based on a neural network according to an embodiment of the present application;
FIG. 5 schematically illustrates a diagram of predicting contrast agent location based on a neural network according to an embodiment of the present application;
FIG. 6 schematically illustrates a diagram of contrast agent lengths corresponding to a multi-frame coronary angiography, in accordance with an embodiment of the present application;
FIG. 7 schematically illustrates a schematic diagram of a position profile according to an embodiment of the present application;
FIG. 8 schematically illustrates a schematic diagram of a method of determining a blood vessel flow rate according to an embodiment of the present application;
FIG. 9 schematically illustrates a block diagram of an apparatus for determining a flow rate of a blood vessel according to an embodiment of the present application;
FIG. 10 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows an exemplary system architecture to which the technical solution of the embodiments of the present application may be applied. As shown in fig. 1, the system architecture may include a medical image acquisition apparatus 101, a network 102, a server 103 and a terminal device 104. In this embodiment, the acquisition apparatus 101 is used for acquiring a medical image of the aorta and may be a computed tomography (CT) device, a magnetic resonance imaging (MRI) device or the like, which is not limited herein. The network 102 provides a communication link between the terminal device 104 and the server 103 and may include various connection types, such as a wired communication link, a wireless communication link, Bluetooth, a 5G network, etc., which are not limited herein; it is used to transmit the acquired medical image to the blood vessel detection device. The terminal device 104 may be one or more of a smart phone, a tablet computer and a portable computer, and may of course also be a desktop computer or the like, which is not limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 103 may be a server cluster composed of a plurality of servers.
It should be noted that, in this embodiment, the server 103 may have the same function as the terminal device 104, that is, determining the blood vessel flow rate. Specifically, by acquiring a coronary angiography image; predicting position deformation information corresponding to the blood vessel in the coronary angiography image at the next acquisition moment based on the coronary angiography image; determining a blood vessel image corresponding to the next acquisition moment based on the position deformation information and the blood vessel in the coronary angiography image; predicting the position of the contrast agent in the blood vessel based on the blood vessel image, and determining the position of the contrast agent corresponding to the next moment; the blood flow velocity in the blood vessel is determined based on the positions of the contrast agent in the blood vessel corresponding to the plurality of time instants.
According to the above scheme, position deformation information of the blood vessel at the next acquisition time is predicted from the coronary angiography image; the blood vessel image corresponding to the next acquisition time is then determined from the position deformation information and the blood vessel in the coronary angiography image; the position of the contrast agent in the blood vessel is predicted and the contrast agent position at the next time is determined; and the blood flow velocity in the blood vessel is determined from the contrast agent positions at a plurality of times. By tracking the front wave of the contrast agent through the predicted blood vessel position at the next acquisition time, the accuracy and efficiency of blood flow velocity detection are improved.
The implementation details of the technical solution of the embodiment of the present application are set forth in detail below:
fig. 2 shows a flow diagram of a method of determining a blood vessel flow rate according to an embodiment of the present application, which may be performed by a server, which may be the server shown in fig. 1. Referring to fig. 2, the method for determining the blood vessel flow rate at least includes steps S210 to S250, which are described in detail as follows:
in step S210, a coronary angiography image is acquired.
In an embodiment of the present application, the acquired coronary angiography image may be an original two-dimensional coronary angiography medical image sequence, and the image is taken in a region where the coronary artery of the subject is located.
In an embodiment of the present application, after acquiring the coronary angiography image, the method further includes: performing multi-scale white cap operation on the coronary angiography image based on a set structural operator, and extracting a bright area in the coronary angiography image; performing multi-scale black cap operation on the coronary angiography image based on a set structural operator, and extracting a dark region in the coronary angiography image; and performing image synthesis based on the bright area and the dark area to generate an enhanced coronary angiography image. The blood vessel region is enhanced by the image enhancement mode, the background noise is inhibited, and the contrast between the blood vessel and the background is increased.
As shown in fig. 3, the white top-hat (white cap) operation is used to extract the bright regions:
WTH(x, y) = f(x, y) - (f ∘ B)(x, y)
and the black top-hat (black cap) operation is used to extract the dark regions:
BHT(x, y) = (f • B)(x, y) - f(x, y)
where f ∘ B = (f ⊖ B) ⊕ B denotes the opening operation, used to smooth bright regions, and f • B = (f ⊕ B) ⊖ B denotes the closing operation, used to smooth dark regions, with ⊕ denoting dilation and ⊖ denoting erosion; B takes the values B0, B1, B2, ..., Bn, structuring operators of different sizes used to extract features at different scales.
In this embodiment, white top-hat and black top-hat operations are combined at multiple scales using structuring operators of different sizes, which enhances the image, makes the vascular structure clearer and suppresses the background noise.
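As an illustration of the multi-scale top-hat enhancement just described, the following Python sketch combines white and black top-hat responses computed with structuring elements of several sizes; the kernel sizes and the additive fusion rule are assumptions for illustration, not values specified by this application.

```python
import cv2
import numpy as np

def enhance_angiogram(image: np.ndarray, kernel_sizes=(3, 7, 11, 15)) -> np.ndarray:
    """Combine white and black top-hat responses at several scales (sizes assumed)."""
    img = image.astype(np.float32)
    bright = np.zeros_like(img)   # accumulated white top-hat response (bright regions)
    dark = np.zeros_like(img)     # accumulated black top-hat response (dark regions)
    for k in kernel_sizes:
        B = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (k, k))
        bright = np.maximum(bright, cv2.morphologyEx(img, cv2.MORPH_TOPHAT, B))
        dark = np.maximum(dark, cv2.morphologyEx(img, cv2.MORPH_BLACKHAT, B))
    # Assumed fusion rule: boost bright structures, subtract dark background structures.
    enhanced = img + bright - dark
    return np.clip(enhanced, 0, 255).astype(np.uint8)
```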
In an embodiment of the present application, after acquiring the coronary angiography image, the method further includes: identifying the blood vessel type in the coronary angiography image based on a blood vessel classification network obtained through pre-training; the blood vessel classification network is obtained by training a neural network based on a coronary image sample; the vessel types include the left anterior descending branch, the left circumflex branch, and the right coronary artery.
The blood vessel types in this embodiment include Left Anterior Descending (LAD), Left circumflex branch (LCX), Right Coronary Artery (RCA), and the like. In addition, in order to more accurately measure the blood flow velocity corresponding to each blood vessel, in this embodiment, each type of blood vessel has its corresponding neural network model, so as to determine the blood vessel deformation information corresponding to the next acquisition time, or predict the position of the contrast agent.
As shown in fig. 4, the classification task in this embodiment is trained as follows. Step 1: the 2D original coronary angiography image is passed through a deep neural network to obtain a prediction result. Step 2: the prediction result is compared with the manual label and the error is fed back to the neural network. Step 3: the neural network is updated so as to reduce the prediction error. Using a large amount of data and repeating this iteration thousands of times, the final prediction result approaches the manual annotation. The network structure of this scheme has low complexity and is lightweight; it classifies the whole coronary image directly in an end-to-end manner and is fast, simple and accurate.
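The following PyTorch sketch mirrors the three training steps above for the vessel-type classifier. The ResNet-18 backbone, optimizer and hyperparameters are assumptions; only the three-class output (LAD/LCX/RCA) and the compare-and-feed-back loop come from the text.

```python
import torch
import torch.nn as nn
import torchvision

def train_vessel_classifier(loader, num_classes=3, epochs=10, device="cpu"):
    """Train a 3-class (LAD/LCX/RCA) classifier; backbone and hyperparameters are assumptions."""
    model = torchvision.models.resnet18(num_classes=num_classes)
    model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)  # 1-channel angiograms
    model = model.to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    for _ in range(epochs):
        for images, labels in loader:
            logits = model(images.to(device))                 # step 1: forward pass
            loss = criterion(logits, labels.to(device))       # step 2: compare with manual labels
            optimizer.zero_grad()
            loss.backward()                                   # step 3: update toward lower error
            optimizer.step()
    return model
```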
Furthermore, before the neural network is trained, the medical image samples can be augmented and preprocessed to increase the diversity of the training samples and improve the training precision of the neural network. Augmentation of the angiography images comprises data preprocessing and data augmentation; applying both to the original coronary angiography medical image data improves the accuracy and robustness of the deep-learning classification, registration and tracking algorithms. In practice, hospitals use different medical imaging devices, so the same deep learning model can give noticeably different results. The original medical images are therefore whitened, i.e. the gray value of each image pixel is linearly mapped from 0-255 to [0, 1], to improve the robustness of the deep learning model.
Medical image datasets contain few samples, so the data need to be artificially augmented during deep-learning training to improve the robustness of the result. Instead of simply duplicating samples as in the prior art, this embodiment augments the original coronary angiograms with image translation, rotation, mirroring, brightness change, scaling and similar methods; in particular, adding noise that follows a normal distribution noticeably improves the classification result.
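A minimal sketch of the whitening and augmentation steps described above, assuming illustrative parameter ranges for translation, rotation, mirroring, brightness, scaling and Gaussian noise; none of these ranges are taken from this application.

```python
import numpy as np
from scipy.ndimage import rotate, shift, zoom

def augment(image: np.ndarray, rng=None) -> np.ndarray:
    """Whitening plus random augmentation of one angiogram; parameter ranges are assumed."""
    rng = rng or np.random.default_rng()
    img = image.astype(np.float32) / 255.0                    # whitening: 0-255 -> [0, 1]
    img = shift(img, rng.uniform(-10, 10, size=2))            # translation (pixels)
    img = rotate(img, rng.uniform(-15, 15), reshape=False)    # rotation (degrees)
    if rng.random() < 0.5:
        img = np.fliplr(img)                                  # mirror image
    img = np.clip(img * rng.uniform(0.8, 1.2), 0.0, 1.0)      # brightness change
    img = zoom(img, rng.uniform(0.9, 1.1))                    # scaling (resize back in practice)
    img = img + rng.normal(0.0, 0.02, size=img.shape)         # normally distributed noise
    return np.clip(img, 0.0, 1.0)
```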
In addition, in this embodiment, when determining the blood vessel type, the main branch vessels can also be classified automatically using an empirical-angle criterion based on the projection angles commonly used in coronary angiography, namely left anterior oblique (LAO), right anterior oblique (RAO), anteroposterior (AP), caudal (CAU) and cranial (CRA) views. Specifically, the main branch vessel of the current coronary angiogram is judged from the customary angiographic angles: the two shooting angles LAO/RAO and CRA/CAU recorded in the DICOM data are read, and the combination closest to an empirical angle is selected as the final classification basis. The drawback of this method is that when a particular vessel is of special interest, the physician may not follow the empirical angle and may choose the contrast angle arbitrarily, in which case the accuracy of this method is affected.
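The empirical-angle fallback could look roughly like the following sketch, which reads the LAO/RAO and CRA/CAU angles from the standard DICOM positioner-angle attributes and picks the nearest entry in an angle table; the angle table itself is a hypothetical example, not part of this application.

```python
import pydicom

# Hypothetical table of customary projection angles per main branch:
# (LAO(+)/RAO(-) primary angle, CRA(+)/CAU(-) secondary angle) -> vessel label.
EMPIRICAL_ANGLES = {
    (-30.0, 30.0): "LAD",   # e.g. RAO 30, CRA 30
    (-30.0, -25.0): "LCX",  # e.g. RAO 30, CAU 25
    (30.0, 0.0): "RCA",     # e.g. LAO 30
}

def classify_by_angle(dicom_path: str) -> str:
    ds = pydicom.dcmread(dicom_path)
    primary = float(ds.PositionerPrimaryAngle)      # LAO/RAO angle
    secondary = float(ds.PositionerSecondaryAngle)  # CRA/CAU angle
    # Pick the empirical projection closest to the recorded shooting angles.
    nearest = min(EMPIRICAL_ANGLES,
                  key=lambda a: (a[0] - primary) ** 2 + (a[1] - secondary) ** 2)
    return EMPIRICAL_ANGLES[nearest]
```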
In step S220, based on the coronary angiography image, position deformation information corresponding to the blood vessel in the coronary angiography image at the next acquisition time is predicted.
In an embodiment of the present application, predicting, based on a coronary angiography image, position deformation information corresponding to a blood vessel in the coronary angiography image at a next acquisition time includes: based on the blood vessel type, obtaining a deformation prediction model corresponding to the blood vessel type; the deformation prediction model is obtained by training a neural network through coronary image samples corresponding to various blood vessel types; inputting the coronary angiography image into a deformation prediction model for prediction, and outputting position deformation information corresponding to the next acquisition time of the blood vessel in the coronary angiography image.
In an embodiment of the present application, the position deformation information indicates how the position and shape of the blood vessel change from one acquisition time to the next; small adjustments of the vessel position occur under the influence of the heartbeat and of the blood-flow pressure during the cardiac cycle.
As shown in fig. 5, the deep-learning front wave tracking calculation of the main vessel length is divided into two main parts: an image registration part and a front wave tracking regression part. The position deformation information is determined by the image registration part, and the position of the contrast agent is determined by the front wave tracking regression part, as described below:
the former frame image of coronary angiography to be tracked is used as the input of the model, end-to-end training is carried out, the output of the network is a deformation field with the size consistent with that of the angiography image and the channel number of 2, so as to represent the shape change of the blood vessel caused by the heart beating from the former frame image to the current image, and therefore, more definite position information is obtained when the tracking task is carried out.
Specifically, the deformation field in this embodiment includes a displacement parameter of each pixel in the X and Y directions, and the displacement parameter may be the number of pixels.
The deep neural network 1 in fig. 5 is trained as follows. Step 1: the 2D image is registered by the neural network to obtain a prediction result. Step 2: the prediction result is compared with the template image and the error is fed back to the registration network. Step 3: the registration network is updated so as to reduce the prediction error. In this embodiment a large amount of data is used and this iteration is repeated thousands of times, so that the final prediction result approaches the template image. The image registration method of this scheme is unsupervised: no manual annotation is needed, which saves annotation time and cost.
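A condensed, unsupervised training sketch for the registration network (deep neural network 1): the network is assumed to output a 2-channel displacement field in pixels, the previous frame is warped with it, and similarity to the current (template) frame plus a smoothness term drives the update. The loss weights and the warping helper are assumptions.

```python
import torch
import torch.nn.functional as F

def warp(image, flow):
    """Warp image (N,1,H,W) with a dense displacement field flow (N,2,H,W), in pixels."""
    _, _, h, w = image.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid_x = (xs.to(image) + flow[:, 0]) / (w - 1) * 2 - 1   # normalize to [-1, 1] for grid_sample
    grid_y = (ys.to(image) + flow[:, 1]) / (h - 1) * 2 - 1
    grid = torch.stack((grid_x, grid_y), dim=-1)
    return F.grid_sample(image, grid, align_corners=True)

def registration_step(net, optimizer, prev_frame, curr_frame, smooth_weight=0.01):
    flow = net(torch.cat([prev_frame, curr_frame], dim=1))   # 2-channel deformation field
    warped = warp(prev_frame, flow)                          # deform the previous frame
    similarity = F.mse_loss(warped, curr_frame)              # compare with the template (current) image
    smooth = flow.diff(dim=-1).abs().mean() + flow.diff(dim=-2).abs().mean()
    loss = similarity + smooth_weight * smooth               # unsupervised: no manual labels
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```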
In step S230, a blood vessel image corresponding to the next acquisition time is determined based on the position deformation information and the blood vessel in the coronary angiography image.
In an embodiment of the present application, after the position deformation information for the next acquisition time has been determined, the blood vessel image corresponding to the next acquisition time is predicted from the position deformation information and the blood vessel in the coronary angiography image; the blood vessel image mainly reflects attributes such as the position and shape of the vessel. Specifically, the position deformation information is added to the blood vessel position in the coronary angiography image to obtain the blood vessel image corresponding to the next acquisition time, which contains specific information such as the blood vessel position and shape. A sketch of this warping step is given below.
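A NumPy/SciPy sketch of this warping step, mirroring the PyTorch warp above: the 2-channel displacement field (pixels in x and y) is applied to the vessel image to carry it forward to the next acquisition time. The convention that the field maps next-frame coordinates back to current-frame coordinates is an assumption.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_vessel(vessel_image: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """vessel_image: (H, W); flow: (2, H, W) displacements in pixels (dx, dy)."""
    h, w = vessel_image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    sample_y = ys + flow[1]   # where each output pixel samples from in the current frame
    sample_x = xs + flow[0]
    return map_coordinates(vessel_image, [sample_y, sample_x], order=1, mode="nearest")
```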
In step S240, the position of the contrast agent in the blood vessel is predicted based on the blood vessel image, and the position of the contrast agent corresponding to the next time is determined.
In an embodiment of the present application, predicting the location of the contrast agent in the blood vessel based on the blood vessel image and determining the contrast agent location at the next time includes: acquiring a front wave tracking model corresponding to the blood vessel type based on the blood vessel type, the front wave tracking model being obtained by training a neural network with contrast agent samples corresponding to each blood vessel type; and then inputting the blood vessel image into the front wave tracking model for prediction and outputting the position of the contrast agent corresponding to the next moment.
Specifically, in the process of performing the front wave tracking based on the deep learning mode, the processing can be performed through the trained neural network. In the processing process, the result of the registration of the image to be tracked and the previous frame in the previous step is simultaneously used as the input of a front wave tracking network, and the output of the network is the position of the current front wave of the contrast agent.
In addition, in the embodiment of the present application, the front wave tracking model, i.e. the deep neural network 2 in fig. 5, is trained as follows. Step 1: the registration result, combined with the image to be tracked, is input into the front wave tracking network for front wave tracking. Step 2: the predicted result is compared with the ground truth and the error is fed back to the front wave tracking network. Step 3: the front wave tracking network is updated so as to reduce the prediction error. Using a large amount of data and repeating this iteration thousands of times, the final prediction result approaches the manual annotation. The network structure of this scheme has low complexity and is lightweight, fast, simple and accurate.
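At inference time the two networks could be chained frame by frame as in the following sketch; the network interfaces and the (x, y) output format of the tracking network are assumptions.

```python
import torch

@torch.no_grad()
def track_front_wave(frames, registration_net, tracking_net):
    """frames: list of (1, 1, H, W) tensors; returns one assumed (x, y) front position per frame pair."""
    positions = []
    for prev_frame, curr_frame in zip(frames[:-1], frames[1:]):
        flow = registration_net(torch.cat([prev_frame, curr_frame], dim=1))  # registration result
        tracker_input = torch.cat([flow, curr_frame], dim=1)                 # field + image to be tracked
        positions.append(tracking_net(tracker_input).squeeze(0).tolist())    # predicted front position
    return positions
```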
In step S250, the blood flow velocity in the blood vessel is determined based on the positions of the contrast agent in the blood vessel corresponding to the plurality of times.
As shown in fig. 6, in the present embodiment the positions of the contrast agent in the blood vessel are acquired at a plurality of time instants. For each frame of the coronary angiography in fig. 6, front wave tracking is performed on that frame to determine the position of the contrast agent (the gray dots in the figure), which yields the position of the contrast agent front wave point on each frame image. The corresponding length is then determined from the position of the contrast agent, and the blood flow velocity in the blood vessel is determined from that length.
The method for determining the blood flow velocity in a blood vessel based on the positions of a contrast agent corresponding to a plurality of times includes steps S251 to S252:
s251: a position curve of the contrast agent is generated based on the positions of the contrast agent in the blood vessel corresponding to the plurality of times.
As shown in fig. 7, in this embodiment the positions of the contrast agent in the blood vessel at the plurality of time instants are converted into three-dimensional space lengths: the pixel length of the main vessel traced out by the contrast agent front wave points obtained frame by frame is converted into the actual three-dimensional space length using the spatial resolution and the projection ratio of the image, and a position curve of this length is generated with time as the abscissa axis. The horizontal axis is the acquisition time of each coronary angiography image and the vertical axis is the position of the contrast agent front wave point, i.e. the length travelled by the contrast agent.
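A small sketch of this conversion and of plotting the position curve; the pixel spacing, projection (magnification) ratio and frame rate are assumed inputs, typically read from the DICOM header.

```python
import numpy as np
import matplotlib.pyplot as plt

def position_curve(pixel_lengths, pixel_spacing_mm, magnification, frame_rate_hz):
    """Convert per-frame front-wave pixel lengths to mm and plot them against time."""
    lengths_mm = np.asarray(pixel_lengths) * pixel_spacing_mm / magnification  # projection correction
    times_s = np.arange(len(lengths_mm)) / frame_rate_hz                       # acquisition time per frame
    plt.plot(times_s, lengths_mm)
    plt.xlabel("time (s)")
    plt.ylabel("contrast agent travel length (mm)")
    plt.show()
    return times_s, lengths_mm
```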
S252: based on the slope of the positional linear curve, the blood flow velocity in the vessel is determined.
In an embodiment of the present application, after the position curve of the contrast agent has been determined, determining the blood flow velocity in the blood vessel based on the slope of the position linear curve comprises: because the shape and length of the coronary artery shown in the contrast images change continuously with the cardiac cycle, the position curve is first fitted and smoothed to attenuate the noise in the length changes as much as possible; the position linear curve is then analysed to determine a velocity calculation interval; and the blood flow velocity in the vessel is determined from the slope of the curve within the velocity calculation interval.
Optionally, in this embodiment, identifying the position linear curve and determining the velocity calculation interval includes: obtaining local velocities by least-squares fitting of the slope of the position linear curve; and expanding around the target time corresponding to the maximum of the local velocities to obtain the velocity calculation interval for that target time. In this embodiment the local velocity is captured by least-squares fitting of the curve slope with an adjustable local window size; the position of the maximum-velocity point is obtained automatically, the interval is extended forward and backward by half a cardiac cycle around this point, and the least-squares fitting velocity over the extended interval is given automatically.
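The interval selection could be sketched as follows: smooth the curve, fit local slopes by least squares, locate the maximum-slope time, extend by half a cardiac cycle on each side, and fit once more over the extended interval. The smoothing window, local window size and cardiac-cycle estimate are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

def estimate_velocity(times_s, lengths_mm, window=5, half_cycle_s=0.4):
    """Velocity (mm/s) from the smoothed position curve; window sizes are assumptions."""
    smooth = savgol_filter(lengths_mm, window_length=7, polyorder=2)     # smooth the position curve
    slopes = [np.polyfit(times_s[i:i + window], smooth[i:i + window], 1)[0]
              for i in range(len(times_s) - window)]                     # local least-squares slopes
    i_max = int(np.argmax(slopes))                                       # maximum local velocity
    t_max = times_s[i_max + window // 2]
    mask = (times_s >= t_max - half_cycle_s) & (times_s <= t_max + half_cycle_s)
    velocity_mm_s, _ = np.polyfit(times_s[mask], smooth[mask], 1)        # fit over the expanded interval
    return velocity_mm_s
```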
Optionally, in this embodiment, the extended interval may be adjusted manually to reach the desired calculation interval, and the least-squares fitting velocity value is displayed in real time. This manual adjustment offers a good interactive experience and reduces the complexity of operation.
Embodiments of the apparatus of the present application are described below, which may be used to perform the method of determining the flow rate of a blood vessel of the above-described embodiments of the present application. It will be appreciated that the apparatus may be a computer program (comprising program code) running on a computer device, for example an application software; the apparatus may be used to perform the corresponding steps in the methods provided by the embodiments of the present application. For details not disclosed in the embodiments of the device of the present application, please refer to the embodiments of the method for determining the blood vessel flow rate described above in the present application.
As shown in fig. 8, in the above-described scheme, position deformation information of the blood vessel at the next acquisition time is predicted from the coronary angiography image using a machine-learning method; the blood vessel image corresponding to the next acquisition time is then determined from the position deformation information and the blood vessel in the coronary angiography image; contrast agent front wave tracking is performed, also with a machine-learning method, to predict the position of the contrast agent in the blood vessel and determine its position at the next time; a time-vessel-length curve is then generated automatically from the contrast agent positions at a plurality of times, the curve is smoothed automatically, the velocity calculation interval is selected automatically, and the calculation interval can also be adjusted manually to determine the blood flow velocity in the vessel. By tracking the front wave of the contrast agent through the predicted blood vessel position at the next acquisition time, the accuracy and efficiency of blood flow velocity detection are improved.
Fig. 9 shows a block diagram of an apparatus for determining a blood vessel flow rate according to an embodiment of the present application.
Referring to fig. 9, an apparatus 900 for determining a flow rate of a blood vessel according to an embodiment of the present application includes:
an acquisition unit 910 configured to acquire a coronary angiography image; a deformation unit 920, configured to predict, based on the coronary angiography image, position deformation information corresponding to a blood vessel in the coronary angiography image at a next acquisition time; a blood vessel unit 930, configured to determine, based on the position deformation information and a blood vessel in the coronary angiography image, a blood vessel image corresponding to the next acquisition time; a contrast unit 940, configured to predict a location of a contrast agent in a blood vessel based on the blood vessel image, and determine a location of the contrast agent corresponding to the next time; a flow rate unit 950 for determining a blood flow rate in a blood vessel based on the positions of the contrast agent in the blood vessel corresponding to the plurality of time instants.
In some embodiments of the present application, based on the foregoing scheme, the apparatus 900 for determining a blood vessel flow rate further comprises: the identification unit is used for identifying the blood vessel type in the coronary angiography image based on a blood vessel classification network obtained through pre-training; the blood vessel classification network is obtained by training a neural network based on a coronary image sample; the vessel types include the left anterior descending branch, the left circumflex branch, and the right coronary artery.
In some embodiments of the present application, based on the foregoing solution, the deformation unit 920 includes: a deformation model unit, configured to acquire a deformation prediction model corresponding to the blood vessel type based on the blood vessel type, the deformation prediction model being obtained by training a neural network with coronary image samples corresponding to the various blood vessel types; and a model prediction unit, configured to input the coronary angiography image into the deformation prediction model for prediction and output the position deformation information of the blood vessel in the coronary angiography image at the next acquisition time.
In some embodiments of the present application, based on the foregoing solution, the contrast unit 940 is configured to acquire a front wave tracking model corresponding to the blood vessel type based on the blood vessel type, the front wave tracking model being obtained by training a neural network with contrast agent samples corresponding to each blood vessel type; and to input the blood vessel image into the front wave tracking model for prediction and output the position of the contrast agent corresponding to the next moment.
In some embodiments of the present application, based on the foregoing solution, the device 900 for determining a blood vessel flow rate is further configured to perform a multi-scale white cap operation on the coronary angiography image based on a set structural operator, and extract a bright region in the coronary angiography image; carrying out multi-scale black cap operation on the coronary angiography image based on a set structural operator, and extracting a dark region in the coronary angiography image; and performing image synthesis based on the bright area and the dark area to generate an enhanced coronary angiography image.
In some embodiments of the present application, based on the foregoing scheme, the flow rate unit 950 includes: a curve unit for generating a position curve of the contrast agent based on positions of the contrast agent in the blood vessel corresponding to a plurality of time instants; a slope unit for determining the blood flow velocity in the blood vessel based on the slope of the position linear curve.
In some embodiments of the present application, based on the foregoing scheme, the curve unit is configured to convert the positions of the contrast agent in the blood vessel corresponding to the plurality of time instants into three-dimensional space lengths based on the spatial resolution and the projection ratio of the image; and to generate, with time as the abscissa axis, a position curve of the three-dimensional space length based on the three-dimensional space lengths corresponding to the multiple moments.
In some embodiments of the present application, based on the foregoing scheme, the slope unit includes: a smoothing unit, configured to smooth the position curve and generate a position linear curve based on time; the interval unit is used for identifying the position linear curve and determining a speed calculation interval in the position linear curve; a curve slope unit for determining the blood flow velocity in the blood vessel based on the slope of the curve in the velocity calculation interval.
In some embodiments of the present application, based on the foregoing scheme, the interval unit is configured to obtain local velocities by least-squares fitting of the slope of the position linear curve; and to expand around the target time corresponding to the maximum of the local velocities to obtain the velocity calculation interval for that target time.
FIG. 10 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
It should be noted that the computer system 1000 of the electronic device shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 10, the computer system 1000 includes a Central Processing Unit (CPU)1001 that can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage portion 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data necessary for system operation are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other via a bus 1004. An Input/Output (I/O) interface 1005 is also connected to the bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output section 1007 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 1008 including a hard disk and the like; and a communication section 1009 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The driver 1010 is also connected to the I/O interface 1005 as necessary. A removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1010 as necessary, so that a computer program read out therefrom is mounted into the storage section 1008 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication part 1009 and/or installed from the removable medium 1011. When the computer program is executed by the central processing unit (CPU) 1001, the various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with a computer program embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be provided in a processor. The names of these units do not, in some cases, constitute a limitation on the units themselves.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations described above.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the above embodiments.
It should be noted that although several modules or units of the device for performing actions are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present application, the features and functionality of two or more modules or units described above may be embodied in a single module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, or the like) or on a network, and which includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, a network device, or the like) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (10)
1. A method for determining a blood flow velocity in a blood vessel, comprising:
acquiring a coronary angiography image;
predicting, based on the coronary angiography image, position deformation information corresponding to a blood vessel in the coronary angiography image at a next acquisition moment;
determining a blood vessel image corresponding to the next acquisition moment based on the position deformation information and the blood vessel in the coronary angiography image;
predicting a position of a contrast agent in the blood vessel based on the blood vessel image, and determining the position of the contrast agent corresponding to the next moment;
and determining a blood flow velocity in the blood vessel based on the positions of the contrast agent in the blood vessel corresponding to a plurality of moments.
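As a non-limiting illustration of how the steps of claim 1 could be chained together, the following Python sketch assumes hypothetical helpers predict_deformation, warp_vessel, track_front, and fit_velocity that stand in for the models and curve fitting described in the dependent claims; it is not the patented implementation.

```python
# Illustrative pipeline only; the helper callables are hypothetical stand-ins.
import numpy as np

def estimate_blood_flow_velocity(frames, predict_deformation, warp_vessel,
                                 track_front, fit_velocity):
    """frames: coronary angiography images ordered by acquisition moment."""
    front_positions = []
    for t in range(len(frames) - 1):
        # Predict how the vessel deforms toward the next acquisition moment.
        deformation = predict_deformation(frames[t])
        # Determine the blood vessel image expected at the next acquisition moment.
        vessel_next = warp_vessel(frames[t], deformation)
        # Determine the contrast agent position corresponding to the next moment.
        front_positions.append(track_front(vessel_next))
    # Determine the blood flow velocity from positions over a plurality of moments.
    return fit_velocity(np.asarray(front_positions))
```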
2. The method of claim 1, wherein, after the acquiring of the coronary angiography image, the method further comprises:
identifying a blood vessel type in the coronary angiography image based on a blood vessel classification network obtained through pre-training;
wherein the blood vessel classification network is obtained by training a neural network on coronary angiography image samples; the blood vessel types include a left anterior descending branch, a left circumflex branch, and a right coronary artery.
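One plausible (assumed, not disclosed) realization of the blood vessel classification network of claim 2 is a small convolutional classifier over the three coronary branches; the class name VesselClassifier and all layer sizes below are illustrative.

```python
import torch
import torch.nn as nn

class VesselClassifier(nn.Module):
    """Toy 3-class classifier (LAD / LCX / RCA) for single-channel angiography frames."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# Usage: logits = VesselClassifier()(torch.randn(1, 1, 512, 512))
```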
3. The method of claim 2, wherein predicting the position deformation information corresponding to the blood vessel in the coronary angiography image at the next acquisition moment based on the coronary angiography image comprises:
acquiring, based on the blood vessel type, a deformation prediction model corresponding to the blood vessel type; the deformation prediction model is obtained by training a neural network on coronary angiography image samples corresponding to each blood vessel type;
and inputting the coronary angiography image into the deformation prediction model for prediction, and outputting the position deformation information corresponding to the blood vessel in the coronary angiography image at the next acquisition moment.
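A minimal sketch of a deformation prediction model, under the assumption that it outputs a dense per-pixel displacement field which can then be used to warp the current vessel image to the next acquisition moment; the architecture and the helper warp_to_next_moment are assumptions, not the disclosed model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeformationPredictor(nn.Module):
    """Maps an angiography frame (N, 1, H, W) to a per-pixel (dx, dy) field (N, 2, H, W)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),  # channel 0: dx, channel 1: dy (in pixels)
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.net(frame)

def warp_to_next_moment(frame: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Warp the current vessel image by the predicted displacement field."""
    _, _, h, w = frame.shape
    ys, xs = torch.meshgrid(torch.arange(h, dtype=torch.float32),
                            torch.arange(w, dtype=torch.float32), indexing="ij")
    base = torch.stack((xs, ys), dim=-1).unsqueeze(0)   # (1, H, W, 2) pixel grid
    moved = base + flow.permute(0, 2, 3, 1)             # displaced pixel coordinates
    gx = 2.0 * moved[..., 0] / (w - 1) - 1.0            # normalize to [-1, 1] for grid_sample
    gy = 2.0 * moved[..., 1] / (h - 1) - 1.0
    return F.grid_sample(frame, torch.stack((gx, gy), dim=-1), align_corners=True)
```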
4. The method of claim 2, wherein predicting the position of the contrast agent in the blood vessel based on the blood vessel image, and determining the position of the contrast agent corresponding to the next moment, comprises:
acquiring, based on the blood vessel type, a wavefront tracking model corresponding to the blood vessel type; the wavefront tracking model is obtained by training a neural network on contrast agent image samples corresponding to each blood vessel type;
and inputting the blood vessel image into the wavefront tracking model for prediction, and outputting the position of the contrast agent corresponding to the next moment.
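A minimal sketch of the wavefront tracking model of claim 4, assuming it regresses the two-dimensional coordinates of the contrast agent front from the predicted blood vessel image; the name FrontTracker and the architecture are illustrative only.

```python
import torch
import torch.nn as nn

class FrontTracker(nn.Module):
    """Regresses the (x, y) position of the contrast agent front from a vessel image."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(32, 2)  # predicted front coordinates (x, y)

    def forward(self, vessel_image: torch.Tensor) -> torch.Tensor:
        return self.regressor(self.backbone(vessel_image).flatten(1))
```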
5. The method of claim 1, wherein, after the acquiring of the coronary angiography image, the method further comprises:
performing a multi-scale white top-hat operation on the coronary angiography image based on a set structuring element, and extracting bright areas in the coronary angiography image;
performing a multi-scale black top-hat operation on the coronary angiography image based on a set structuring element, and extracting dark areas in the coronary angiography image;
and performing image synthesis based on the bright areas and the dark areas to generate an enhanced coronary angiography image.
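The enhancement of claim 5 maps naturally onto standard morphological operations; the sketch below uses OpenCV's top-hat and black-hat transforms at several assumed kernel scales, and the additive synthesis (original plus bright detail minus dark detail) is one common choice rather than a combination required by the claim.

```python
import cv2
import numpy as np

def enhance_angiogram(img: np.ndarray, scales=(3, 7, 11, 15)) -> np.ndarray:
    """Multi-scale white/black top-hat enhancement of a grayscale angiography frame."""
    img = img.astype(np.float32)
    bright = np.zeros_like(img)   # accumulated bright detail (white top-hat)
    dark = np.zeros_like(img)     # accumulated dark detail (black top-hat)
    for k in scales:
        se = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (k, k))
        bright = np.maximum(bright, cv2.morphologyEx(img, cv2.MORPH_TOPHAT, se))
        dark = np.maximum(dark, cv2.morphologyEx(img, cv2.MORPH_BLACKHAT, se))
    # One possible synthesis: boost bright regions, suppress dark regions.
    enhanced = img + bright - dark
    return np.clip(enhanced, 0, 255).astype(np.uint8)
```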
6. The method of claim 1, wherein determining the blood flow velocity in the blood vessel based on the positions of the contrast agent corresponding to the plurality of moments comprises:
generating a position curve of the contrast agent based on the positions of the contrast agent in the blood vessel corresponding to the plurality of moments;
and determining the blood flow velocity in the blood vessel based on a slope of the position linear curve.
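For claim 6, the slope of a position-versus-time line directly gives the flow velocity; a minimal least-squares sketch (function name and units are illustrative):

```python
import numpy as np

def velocity_from_positions(times_s: np.ndarray, lengths_mm: np.ndarray) -> float:
    """Fit position (mm) against time (s) and return the slope as flow velocity (mm/s)."""
    slope, _intercept = np.polyfit(times_s, lengths_mm, deg=1)
    return float(slope)

# Usage: velocity_from_positions(np.array([0.0, 0.1, 0.2]), np.array([0.0, 2.1, 4.0]))
```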
7. The method of claim 6, wherein generating the position curve of the contrast agent based on the positions of the contrast agent in the blood vessel corresponding to the plurality of moments comprises:
converting the positions of the contrast agent in the blood vessel at the plurality of moments into three-dimensional spatial lengths based on the spatial resolution and the projection ratio of the image;
and generating, with time as the abscissa, a position curve corresponding to the three-dimensional spatial lengths at the plurality of moments.
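For claim 7, one way to turn per-moment front coordinates into physical path lengths, under the assumption that the projection ratio simply rescales the projected 2D length to a 3D length:

```python
import numpy as np

def positions_to_lengths(front_px: np.ndarray, pixel_spacing_mm: float,
                         projection_ratio: float) -> np.ndarray:
    """Convert per-moment front coordinates (N, 2) into cumulative 3D path length (mm)."""
    steps = np.linalg.norm(np.diff(front_px, axis=0), axis=1)   # pixel distance per step
    lengths = np.concatenate(([0.0], np.cumsum(steps)))         # cumulative pixel length
    return lengths * pixel_spacing_mm * projection_ratio        # physical 3D length in mm
```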
8. The method of claim 6, wherein determining the blood flow velocity in the blood vessel based on the slope of the position linear curve comprises:
smoothing the position curve to generate a time-based position linear curve;
identifying the position linear curve and determining a velocity calculation interval in the position linear curve;
and determining the blood flow velocity in the blood vessel based on a slope of the curve within the velocity calculation interval.
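For claim 8, a sketch that smooths the position curve (here with a Savitzky-Golay filter, one possible smoother) and takes the least-squares slope over a given velocity calculation interval; the window length and polynomial order are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

def velocity_in_interval(times_s, lengths_mm, interval, window=7, poly=2):
    """Smooth the position curve, then take the least-squares slope over `interval` (a slice)."""
    smoothed = savgol_filter(np.asarray(lengths_mm, dtype=float),
                             window_length=window, polyorder=poly)
    t, p = np.asarray(times_s, dtype=float)[interval], smoothed[interval]
    slope, _ = np.polyfit(t, p, deg=1)   # mm per second within the velocity interval
    return float(slope)
```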
9. The method of claim 8, wherein identifying the position linear curve and determining the velocity calculation interval therein comprises:
obtaining local velocities by a least-squares method based on the slope of the position linear curve;
and expanding, around a target moment corresponding to the maximum of the local velocities, to obtain the velocity calculation interval corresponding to the target moment.
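For claim 9, a sketch that computes local least-squares velocities in a sliding window, locates the moment of maximum local velocity, and expands an interval around it; the window size and expansion width are assumptions. The returned slice can be passed as the interval argument of the velocity_in_interval sketch above.

```python
import numpy as np

def find_velocity_interval(times_s, lengths_mm, window=5, expand=3):
    """Return a slice around the moment of maximum local velocity on the position curve."""
    t = np.asarray(times_s, dtype=float)
    p = np.asarray(lengths_mm, dtype=float)
    local_v = []
    for i in range(len(t) - window + 1):
        slope, _ = np.polyfit(t[i:i + window], p[i:i + window], deg=1)  # local slope
        local_v.append(slope)
    centre = int(np.argmax(local_v)) + window // 2   # index of the maximum local velocity
    lo, hi = max(0, centre - expand), min(len(t), centre + expand + 1)
    return slice(lo, hi)
```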
10. An apparatus for determining a blood flow velocity in a blood vessel, comprising:
an acquisition unit configured to acquire a coronary angiography image;
a deformation unit configured to predict, based on the coronary angiography image, position deformation information corresponding to a blood vessel in the coronary angiography image at a next acquisition moment;
a blood vessel unit configured to determine a blood vessel image corresponding to the next acquisition moment based on the position deformation information and the blood vessel in the coronary angiography image;
a contrast unit configured to predict a position of a contrast agent in the blood vessel based on the blood vessel image, and determine the position of the contrast agent corresponding to the next moment;
and a flow velocity unit configured to determine a blood flow velocity in the blood vessel based on the positions of the contrast agent in the blood vessel corresponding to a plurality of moments.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110120550.7A (granted as CN112785580B) | 2021-01-28 | 2021-01-28 | Method and device for determining vascular flow velocity |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112785580A | 2021-05-11 |
CN112785580B | 2024-02-02 |
Family
ID=75759501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110120550.7A Active CN112785580B (en) | 2021-01-28 | 2021-01-28 | Method and device for determining vascular flow velocity |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112785580B (en) |
2021-01-28: application CN202110120550.7A filed in CN; granted as CN112785580B (legal status: Active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101779967A (en) * | 2009-01-21 | 2010-07-21 | 重庆医科大学超声影像学研究所 | Method for directly measuring blood flowing velocity by using ultrasound microbubble (microsphere) contrast agent |
US20120072190A1 (en) * | 2010-09-16 | 2012-03-22 | Siemens Corporation | Method and System for Non-Invasive Assessment of Coronary Artery Disease |
JP2015097724A (en) * | 2013-11-20 | 2015-05-28 | 株式会社東芝 | Blood vessel analysis device and blood vessel analysis program |
CN110448319A (en) * | 2018-05-08 | 2019-11-15 | 博动医学影像科技(上海)有限公司 | Based on radiography image and blood flow velocity calculation method coronarius |
US20200022664A1 (en) * | 2018-07-17 | 2020-01-23 | International Business Machines Corporation | Fluid-injector for a simultaneous anatomical and fluid dynamic analysis in coronary angiography |
CN110786842A (en) * | 2019-11-04 | 2020-02-14 | 苏州润迈德医疗科技有限公司 | Method, device, system and storage medium for measuring diastolic blood flow velocity |
Non-Patent Citations (2)
Title |
---|
陈兴新; 骆秉铨; 杨瑞华; 陈莉莉: "Clinical study on measuring coronary blood flow velocity with digital tracking of coronary angiography" (in Chinese), 生物医学工程学杂志 (Journal of Biomedical Engineering), no. 02, pages 295-296 *
骆秉铨, 陈兴新, 钱菊英, 张义勤, 杨瑞华, 石怀林, 夏项, 葛均波: "Correlation analysis between coronary blood flow velocity measured by digital tracking of coronary angiography and Doppler-derived flow velocity" (in Chinese), 中国微循环 (Chinese Journal of Microcirculation), no. 03 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114170114A (en) * | 2021-12-14 | 2022-03-11 | 北京柏惠维康科技有限公司 | Method and device for enhancing spine CT image and spine surgical robot |
Also Published As
Publication number | Publication date |
---|---|
CN112785580B (en) | 2024-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11568533B2 (en) | Automated classification and taxonomy of 3D teeth data using deep learning methods | |
CN110475505B (en) | Automatic segmentation using full convolution network | |
CN107665736B (en) | Method and apparatus for generating information | |
CN110348515B (en) | Image classification method, image classification model training method and device | |
CN107886508B (en) | Differential subtraction method and medical image processing method and system | |
CN112184888B (en) | Three-dimensional blood vessel modeling method and device | |
WO2021136368A1 (en) | Method and apparatus for automatically detecting pectoralis major region in molybdenum target image | |
CN112465834B (en) | Blood vessel segmentation method and device | |
US11684333B2 (en) | Medical image analyzing system and method thereof | |
CN110046627B (en) | Method and device for identifying mammary gland image | |
KR102228087B1 (en) | Method and apparatus for segmentation of specific cartilage in medical image | |
CN113643354B (en) | Measuring device of vascular caliber based on fundus image with enhanced resolution | |
JP2023515367A (en) | Out-of-distribution detection of input instances to model | |
CN110197472B (en) | Method and system for stable quantitative analysis of ultrasound contrast image | |
CN111223158B (en) | Artifact correction method for heart coronary image and readable storage medium | |
CN117809122B (en) | Processing method, system, electronic equipment and medium for intracranial large blood vessel image | |
Tummala et al. | Liver tumor segmentation from computed tomography images using multiscale residual dilated encoder‐decoder network | |
CN116245832A (en) | Image processing method, device, equipment and storage medium | |
CN112801999B (en) | Method and device for determining heart coronary artery dominance | |
CN112785580B (en) | Method and device for determining vascular flow velocity | |
CN117274216B (en) | Ultrasonic carotid plaque detection method and system based on level set segmentation | |
CN117036253B (en) | Method for training a segmentation model for segmenting cerebral vessels and related products | |
Lainé et al. | Carotid artery wall segmentation in ultrasound image sequences using a deep convolutional neural network | |
CN111539926B (en) | Image detection method and device | |
CN113222985A (en) | Image processing method, image processing device, computer equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |