CN110738635A - feature tracking method and device - Google Patents

feature tracking method and device

Info

Publication number
CN110738635A
CN110738635A (application CN201910857479.3A)
Authority
CN
China
Prior art keywords
magnetic resonance
resonance image
feature
cardiac
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910857479.3A
Other languages
Chinese (zh)
Inventor
朱燕杰
梁栋
邹莉娴
柯子文
刘新
郑海荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201910857479.3A priority Critical patent/CN110738635A/en
Publication of CN110738635A publication Critical patent/CN110738635A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10088Magnetic resonance imaging [MRI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30048Heart; Cardiac
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Quality & Reliability (AREA)
  • Computational Linguistics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The application is applicable to the technical field of medical imaging, and provides a feature tracking method and device.

Description

feature tracking method and device
Technical Field
The application belongs to the technical field of medical imaging, and particularly relates to a feature tracking method and device.
Background
With the rapid development of magnetic resonance imaging technology, feature tracking in magnetic resonance images has become an important means of atrial strain assessment. Feature tracking marks feature points in the magnetic resonance image and obtains myocardial strain parameters from the position changes of those feature points; the atrial strain is then evaluated based on the myocardial strain parameter information.
At present, feature points in magnetic resonance images are usually marked manually: medical staff mark the feature points in every frame of magnetic resonance image in a cardiac cine, which is very labor-intensive and time-consuming.
Disclosure of Invention
The embodiments of the application provide a feature tracking method and device, which can solve the problems of low efficiency and high error rate in feature point marking.
In a first aspect, an embodiment of the application provides a feature tracking method, including: acquiring a magnetic resonance image to be tracked; and inputting the magnetic resonance image into a trained neural network for processing to obtain a magnetic resonance image marked with preset feature points, where the neural network is used to mark the feature points on the input magnetic resonance image.
With the feature tracking method provided by the application, the designated feature points in the magnetic resonance image are marked by the trained neural network, without manual marking. This saves labor, improves the efficiency of feature point marking, and avoids the marking errors caused by human error.
Optionally, the magnetic resonance image is a cardiac magnetic resonance image, and the feature points comprise at least the atrioventricular junctions and/or the midpoint of the posterior left atrial wall.
Optionally, the cardiac magnetic resonance image is an image in a cardiac cine, and the method further includes: after feature point marking is completed on each frame of cardiac magnetic resonance image in the cardiac cine, determining myocardial strain parameter information according to the positions of the feature points in each frame.
In a second aspect, the application provides a feature tracking device, which includes an acquiring module configured to acquire a magnetic resonance image to be tracked, and a marking module configured to input the magnetic resonance image acquired by the acquiring module into a trained neural network for processing to obtain a magnetic resonance image marked with preset feature points, where the neural network is configured to mark the feature points on the input magnetic resonance image.
Optionally, the magnetic resonance image is a cardiac magnetic resonance image, and the feature points comprise at least the atrioventricular junctions and/or the midpoint of the posterior left atrial wall.
Optionally, the cardiac magnetic resonance image is an image in a cardiac cine, and the feature tracking device further includes:
a processing module configured to determine the myocardial strain parameter information according to the positions of the feature points in each frame of cardiac magnetic resonance image, after the marking module has marked the feature points in every frame of the cardiac cine.
Based on the first aspect or the second aspect, optionally, the neural network is trained on a preset data set, the data set including a plurality of pre-acquired magnetic resonance images and the same plurality of magnetic resonance images with the feature points manually labeled.
For example, after a plurality of magnetic resonance images are acquired in advance, feature points are manually marked on each image. During training, the pre-acquired magnetic resonance images are used as the input of the neural network, and the feature-point-labeled magnetic resonance images are used as the expected output, as sketched below.
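As an illustration only, such supervised training might look like the following minimal PyTorch sketch. Everything concrete here is an assumption for demonstration: the patent does not specify a framework, the landmark labels are encoded as per-point heatmaps, and the network, image sizes, and hyper-parameters are stand-ins.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical data: 100 single-channel 64x64 MR frames paired with
# 3-channel landmark heatmaps (one channel per feature point).
images = torch.randn(100, 1, 64, 64)
heatmaps = torch.rand(100, 3, 64, 64)
loader = DataLoader(TensorDataset(images, heatmaps), batch_size=8, shuffle=True)

# Stand-in network; the networks named later (U-Net, GAN, RDN) would slot in here.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # learning rate: a tuned hyper-parameter
loss_fn = nn.MSELoss()                                     # loss function: a tuned hyper-parameter

for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)  # predicted vs. manually labeled landmarks
        loss.backward()
        optimizer.step()
```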
Optionally, the neural network is a Residual Dense Network (RDN).
In this optional mode, the RDN fuses local and global features at different depths, so that the features of the whole network are used effectively and the accuracy of feature point marking in the magnetic resonance image is improved.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method according to the first aspect or any optional manner of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program that, when executed by a processor, implements the method according to the first aspect or any optional manner of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product which, when run on a terminal device, causes the terminal device to perform the method according to the first aspect or any optional manner of the first aspect.
It is understood that the beneficial effects of the second to fifth aspects can be found in the related description of the first aspect, and are not repeated here.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a first schematic flowchart of a feature tracking method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a feature point extraction process provided in this application;
FIG. 3 is a processing flow diagram of an RDN provided by an embodiment of the present application;
FIG. 4 is a second schematic flowchart of a feature tracking method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a feature tracking device according to another embodiment of the present application;
FIG. 6 is a schematic structural diagram of a terminal device according to an embodiment of this application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
For example, "A and/or B" means that "A" is present alone, or "B" is present alone, or both.
It should also be understood that references to "one embodiment" or "some embodiments" in the specification of the application mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, the phrases "in one embodiment," "in some embodiments," "in other embodiments," etc. appearing in this specification do not necessarily all refer to the same embodiment; they mean "one or more, but not all, embodiments," unless specifically emphasized otherwise.
In the feature tracking method of this application, a neural network automatically marks the feature points of the magnetic resonance image, which saves labor and improves the efficiency and accuracy of feature point marking.
The following describes an exemplary feature tracking method provided in the present application with reference to specific embodiments.
Referring to FIG. 1, a flowchart of an embodiment of the feature tracking method provided herein includes the following steps.
Step S101: the terminal device acquires a magnetic resonance image to be tracked.
Step S102: the terminal device inputs the magnetic resonance image into a trained neural network for processing, obtaining a magnetic resonance image marked with the preset feature points.
The neural network has been trained to mark designated feature points in the magnetic resonance image.
For example, referring to FIG. 2, suppose the trained neural network marks two atrioventricular junctions and the midpoint of the posterior left atrial wall in a cardiac magnetic resonance image. After the cardiac magnetic resonance image is input into the trained neural network, the network extracts these three feature points and outputs the cardiac magnetic resonance image marked with atrioventricular junction 1, atrioventricular junction 2, and the midpoint of the posterior left atrial wall.
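In practice, a landmark network of this kind often emits one heatmap per feature point, and the marked coordinates are read off the peaks. The following is a minimal sketch under that assumption; the patent itself only states that a marked image is output.

```python
import torch

def heatmap_peaks(heatmaps: torch.Tensor):
    """heatmaps: (3, H, W), one map per feature point -> list of (row, col) peaks."""
    coords = []
    for hm in heatmaps:
        idx = int(torch.argmax(hm))  # index into the flattened map
        coords.append((idx // hm.shape[1], idx % hm.shape[1]))
    return coords

# Hypothetical network output: maps for atrioventricular junctions 1 and 2
# and the midpoint of the posterior left atrial wall.
pred = torch.rand(3, 64, 64)
print(heatmap_peaks(pred))
```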
In the present application, the neural network used may include, but is not limited to, U-Net, generative adversarial networks (GAN), RDN, and the like.
The selected neural network can be trained with a pre-built data set, so that it acquires the function of marking the specified feature points.
Specifically, a predetermined number of magnetic resonance images are acquired, and feature points are manually labeled on the acquired images to obtain a data set consisting of the acquired magnetic resonance images and their feature-point-labeled counterparts.
The pre-acquired magnetic resonance images are used as the input of the neural network, the magnetic resonance images marked with the feature points are used as the expected output, and the neural network is trained. Training involves tuning the hyper-parameters of the neural network, such as the learning rate, optimizer, and loss function, so that the network achieves the best effect. The trained neural network is then deployed on the terminal device, which can use it to mark the feature points in magnetic resonance images.
Step S102 is now described illustratively, taking the RDN as an example. The RDN fuses local and global features at different depths, so that the features extracted throughout the neural network are used effectively. The RDN comprises five parts: a shallow feature extraction network (SFENet), n (n ≥ 2) residual dense blocks (RDB), global feature fusion, global residual learning, and deep feature extraction.
Referring to FIG. 3, a schematic flowchart shows how the terminal device uses the trained RDN to mark feature points in a cardiac magnetic resonance image.
After the cardiac magnetic resonance image is input into the RDN, the RDN first performs shallow feature extraction through the shallow feature extraction network, which consists of two three-dimensional (3D) convolutional layers (shown as Conv in FIG. 3). The first 3D convolutional layer convolves the cardiac magnetic resonance image to extract feature a, and the second 3D convolutional layer convolves feature a to extract feature b.
Second, feature b is input into the n RDBs for local feature fusion. Each RDB comprises several 3D convolutional layers, a concatenation layer (shown as concat in FIG. 3), a 1 × 1 3D convolutional layer, and a residual link. Taking RDB1 as an example and assuming it contains three 3D convolutional layers: feature b is input into the first 3D convolutional layer to produce feature c11; feature b and feature c11 are input into the second 3D convolutional layer to produce feature c12; and feature b, feature c11, and feature c12 are input into the third 3D convolutional layer to produce feature c13. Feature b, feature c11, feature c12, and feature c13 are then concatenated by the concat layer and input into the 1 × 1 3D convolutional layer to complete local feature fusion, and the fusion result is added to feature b through the residual link to give the output of RDB1, feature c1. By analogy, feature c1 is input into RDB2 to obtain feature c2, and so on until RDBn outputs feature cn. A sketch of one such block is given below.
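The following is a minimal sketch of one RDB consistent with the description above, in the style of Zhang et al.'s Residual Dense Network. It is illustrative only: 2D convolutions stand in for the 3D ones in the text, and the channel counts are assumptions.

```python
import torch
from torch import nn

class RDB(nn.Module):
    """One residual dense block: densely connected convs, concatenation,
    a 1x1 fusion conv, and a residual link back to the block input."""
    def __init__(self, channels=32, growth=16, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, 3, padding=1), nn.ReLU()))
            in_ch += growth                        # each layer also sees all earlier features
        self.fuse = nn.Conv2d(in_ch, channels, 1)  # 1x1 conv: local feature fusion

    def forward(self, x):                          # x plays the role of feature b
        feats = [x]
        for layer in self.layers:                  # produce c11, c12, c13, ...
            feats.append(layer(torch.cat(feats, dim=1)))
        return self.fuse(torch.cat(feats, dim=1)) + x  # fusion result + residual link
```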
Next, global feature fusion is performed on the n local feature fusion results output by the RDBs (feature c1, feature c2, …, feature cn). As shown in FIG. 3, feature c1 through feature cn are concatenated through a concat layer and input into a 1 × 1 3D convolutional layer, which completes the fusion of the n RDB outputs and yields the global fusion result, feature d.
Then, global residual learning is performed on feature d: feature d is input into a 3D convolutional layer to obtain feature e, and feature a and feature e are added through a residual connection, completing global residual learning and yielding feature f.
Finally, deep feature extraction is performed on feature f: feature f is input into an upscaling convolution (upscale) layer to obtain the enlarged feature g, and feature g is input into a 3D convolutional layer to obtain the cardiac magnetic resonance image marked with atrioventricular junction 1, atrioventricular junction 2, and the midpoint of the posterior left atrial wall. The full pipeline is sketched below.
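Putting the five parts together, a compact end-to-end sketch might look as follows, continuing the previous sketch and reusing its RDB class. The number of blocks, channel widths, and the 2× upscale factor are illustrative assumptions, not values from the patent.

```python
import torch
from torch import nn

class RDN(nn.Module):
    """Sketch of the pipeline: SFENet -> n RDBs -> global feature fusion
    -> global residual learning -> upscale + final conv."""
    def __init__(self, in_ch=1, channels=32, n_blocks=4):
        super().__init__()
        self.sfe1 = nn.Conv2d(in_ch, channels, 3, padding=1)     # -> feature a
        self.sfe2 = nn.Conv2d(channels, channels, 3, padding=1)  # -> feature b
        self.rdbs = nn.ModuleList([RDB(channels) for _ in range(n_blocks)])
        self.gff = nn.Conv2d(channels * n_blocks, channels, 1)   # fuse c1..cn -> feature d
        self.post = nn.Conv2d(channels, channels, 3, padding=1)  # -> feature e
        self.upscale = nn.Sequential(                            # feature f -> feature g
            nn.Conv2d(channels, channels * 4, 3, padding=1), nn.PixelShuffle(2))
        self.final = nn.Conv2d(channels, 3, 3, padding=1)        # 3 landmark channels

    def forward(self, x):
        a = self.sfe1(x)
        b = self.sfe2(a)
        outs, h = [], b
        for rdb in self.rdbs:                  # local feature fusion in each RDB
            h = rdb(h)
            outs.append(h)
        d = self.gff(torch.cat(outs, dim=1))   # global feature fusion
        f = self.post(d) + a                   # global residual learning (e + a)
        return self.final(self.upscale(f))     # deep feature extraction

out = RDN()(torch.randn(1, 1, 64, 64))         # -> shape (1, 3, 128, 128)
```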
In some embodiments, when the magnetic resonance image is a cardiac magnetic resonance image in a cardiac cine, the terminal device marks the feature points in every frame of cardiac magnetic resonance image in the cardiac cine in the same way, using the trained neural network. Based on FIG. 1, as shown in FIG. 4, the method further includes:
Step S103: after the terminal device has marked the feature points in every frame of cardiac magnetic resonance image in the cardiac cine, it determines the myocardial strain parameter information according to the positions of the feature points in each frame.
For example, the terminal device may extract the motion trajectory of a feature point through the cardiac cine from its position in each frame of cardiac magnetic resonance image, e.g., by calculating the distance between the feature point's positions in adjacent frames, as sketched below.
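A minimal sketch of that adjacent-frame distance calculation follows; the trajectory values and coordinate convention are hypothetical.

```python
import math

def frame_displacements(track):
    """track: per-frame (row, col) positions of one feature point."""
    return [math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)]

# Hypothetical trajectory of one atrioventricular junction across four frames.
trajectory = [(40, 32), (41, 33), (43, 33), (44, 34)]
print(frame_displacements(trajectory))  # motion between adjacent frames
```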
The myocardial strain parameter information may be global and/or local myocardial parameters. For example, it may include strain in different directions (circumferential, radial, and/or longitudinal), strain versus time, and/or strain rate.
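As an illustration of how such parameters can be derived from tracked landmarks, the sketch below computes a strain curve and strain rate using the standard Lagrangian definition, strain(t) = (L(t) − L0) / L0. This definition is a common convention, not taken verbatim from the patent, and the input values are hypothetical.

```python
def strain_curve(lengths, frame_interval_s):
    """lengths: landmark-to-landmark distance per cine frame (e.g., in mm)."""
    l0 = lengths[0]                            # reference (e.g., end-diastolic) length
    strain = [(l - l0) / l0 for l in lengths]  # Lagrangian strain per frame
    rate = [(strain[i + 1] - strain[i]) / frame_interval_s
            for i in range(len(strain) - 1)]   # strain rate (1/s)
    return strain, rate

lengths = [50.0, 52.5, 55.0, 53.0]             # hypothetical distances (mm)
print(strain_curve(lengths, frame_interval_s=0.04))
```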
With the feature tracking method provided by this application, on the one hand, the designated feature points in the magnetic resonance image can be marked automatically and quickly by the trained neural network, without manual marking, which improves the efficiency and accuracy of feature point extraction.
On the other hand, because the feature points in the magnetic resonance image can be marked quickly and in real time, the myocardial strain parameter information can also be acquired in real time based on the positions of the feature points.
Fig. 5 is a block diagram of a feature tracking device according to an embodiment of the present application, corresponding to the feature tracking method of the above embodiments; for convenience of description, only the parts relevant to this embodiment are shown.
Referring to FIG. 5, the feature tracking device 50 includes:
an acquiring module 501, configured to acquire a magnetic resonance image to be tracked; and
a marking module 502, configured to input the magnetic resonance image acquired by the acquiring module 501 into a trained neural network for processing to obtain a magnetic resonance image marked with preset feature points, where the neural network is configured to mark the feature points on the input magnetic resonance image.
Optionally, the magnetic resonance image is a cardiac magnetic resonance image, and the feature points comprise at least the atrioventricular junctions and/or the midpoint of the posterior left atrial wall.
Optionally, the cardiac magnetic resonance image is an image in a cardiac cine, and the feature tracking device 50 further includes:
a processing module 503, configured to determine the myocardial strain parameter information according to the positions of the feature points in each frame of cardiac magnetic resonance image, after the marking module 502 has marked the feature points in every frame of the cardiac cine.
The feature tracking device 50 that performs the feature tracking method of the above embodiments may be a terminal device, a chip in a terminal device, or a functional module integrated in a terminal device.
It should be noted that the implementation of each module in the feature tracking device 50 is based on the same concept as the method embodiments of this application; for their specific functions and technical effects, reference may be made to the method embodiments, and details are not repeated here.
Those skilled in the art will appreciate that the above division into functional modules is merely illustrative. In practice, the functions may be distributed among different functional units and modules as needed; that is, the internal structure of the feature tracking device 50 may be divided into different functional units or modules to complete all or part of the functions described above.
An embodiment of the present application also provides a terminal device 60. The terminal device 60 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device, and may also be a console in a magnetic resonance imaging system. Referring to FIG. 6, the terminal device 60 includes at least one processor 601 and a memory 602, the at least one processor 601 being connected to the memory 602 through a bus 603. The memory 602 stores a computer program 604 executable on the at least one processor 601, and the processor 601 implements the steps of any of the above method embodiments when executing the computer program 604.
The processor 601 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc.
The memory 602 may be an internal storage unit of the terminal device 60, such as a hard disk or memory of the terminal device 60. The memory 602 may also be an external storage device of the terminal device 60, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card. Further, the memory 602 may include both an internal storage unit and an external storage device of the terminal device 60. The memory 602 is used to store an operating system, applications, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program 604; it may also be used to temporarily store data that has been or will be output.
The present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method embodiments.
The present application further provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps of the above method embodiments.
If the integrated unit is implemented as a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. On this understanding, all or part of the processes of the above method embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A feature tracking method, comprising:
acquiring a magnetic resonance image to be tracked;
and inputting the magnetic resonance image into a trained neural network for processing to obtain a magnetic resonance image marked with preset feature points, wherein the neural network is used for marking the feature points on the input magnetic resonance image.
2. The feature tracking method of claim 1, wherein the magnetic resonance image is a cardiac magnetic resonance image;
the feature points comprise at least the atrioventricular junctions and/or the midpoint of the posterior left atrial wall.
3. The feature tracking method of claim 2, wherein the cardiac magnetic resonance image is an image in a cardiac cine, the method further comprising:
after feature point marking is completed on each frame of cardiac magnetic resonance image in the cardiac cine, determining myocardial strain parameter information according to the positions of the feature points in each frame of cardiac magnetic resonance image.
4. The feature tracking method according to any one of claims 1 to 3, wherein the neural network is trained on a preset data set, the data set comprising a plurality of pre-acquired magnetic resonance images and the plurality of magnetic resonance images with the feature points manually labeled.
5. The feature tracking method according to any one of claims 1 to 3, wherein the neural network is a residual dense network (RDN).
6. A feature tracking device, comprising:
an acquiring module, configured to acquire a magnetic resonance image to be tracked; and
a marking module, configured to input the magnetic resonance image acquired by the acquiring module into a trained neural network for processing to obtain a magnetic resonance image marked with preset feature points, wherein the neural network is used for marking the feature points on the input magnetic resonance image.
7. The feature tracking device of claim 6, wherein the magnetic resonance image is a cardiac magnetic resonance image; and
the feature points comprise at least the atrioventricular junctions and/or the midpoint of the posterior left atrial wall.
8. The feature tracking device of claim 7, wherein the cardiac magnetic resonance image is an image in a cardiac cine, the feature tracking device further comprising:
and a processing module, configured to determine myocardial strain parameter information according to the positions of the feature points in each frame of cardiac magnetic resonance image, after the marking module has marked the feature points in every frame of cardiac magnetic resonance image in the cardiac cine.
9. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method according to any one of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, carries out the method according to any one of claims 1 to 5.
CN201910857479.3A 2019-09-11 2019-09-11 feature tracking method and device Pending CN110738635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910857479.3A CN110738635A (en) 2019-09-11 2019-09-11 feature tracking method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910857479.3A CN110738635A (en) 2019-09-11 2019-09-11 feature tracking method and device

Publications (1)

Publication Number Publication Date
CN110738635A true CN110738635A (en) 2020-01-31

Family

ID=69267851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910857479.3A Pending CN110738635A (en) 2019-09-11 2019-09-11 feature tracking method and device

Country Status (1)

Country Link
CN (1) CN110738635A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017001208A1 (en) * 2015-06-30 2017-01-05 Koninklijke Philips N.V. Method for estimating a displacement of an structure of interest and magnetic resonance imaging system
CN109155065A (en) * 2016-03-31 2019-01-04 眼睛有限公司 System and method for diagnostic image analysis and image quality measure
CN107633486A (en) * 2017-08-14 2018-01-26 成都大学 Structure Magnetic Resonance Image Denoising based on three-dimensional full convolutional neural networks
CN109166130A (en) * 2018-08-06 2019-01-08 北京市商汤科技开发有限公司 A kind of image processing method and image processing apparatus
CN109147941A (en) * 2018-10-17 2019-01-04 上海交通大学 Brain robustness appraisal procedure based on structure nuclear magnetic resonance image data
CN109785334A (en) * 2018-12-17 2019-05-21 深圳先进技术研究院 Cardiac magnetic resonance images dividing method, device, terminal device and storage medium
CN110211166A (en) * 2019-06-13 2019-09-06 北京理工大学 Optic nerve dividing method and device in magnetic resonance image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MIKE VAN ZON: ""Automatic cardiac landmark localization by a recurrent neural network"", 《SPIE》 *
PUYOL-ANTON E: ""Fully automated myocardial strain estimation from cine MRI using convolutional neural networks"", 《2018 IEEE 15TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING》 *
YULUN ZHANG: ""Residual Dense Network for Image Super-Resolution"", 《ARXIV》 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200131)