WO2004051571A2 - Image fusion with registration confidence measure - Google Patents


Publication number
WO2004051571A2
WO2004051571A2 · PCT/GB2003/004021 · GB0304021W
Authority
WO
WIPO (PCT)
Prior art keywords
measure
images
confidence
image
transformation
Prior art date
Application number
PCT/GB2003/004021
Other languages
French (fr)
Other versions
WO2004051571A3 (en)
Inventor
Christian Peter Behrenbruch
Jerome Marie Joseph Declerck
Original Assignee
Mirada Solutions Limited
Priority date
Filing date
Publication date
Application filed by Mirada Solutions Limited filed Critical Mirada Solutions Limited
Priority to AU2003267577A priority Critical patent/AU2003267577A1/en
Priority to EP03748269A priority patent/EP1565883B1/en
Priority to US10/502,034 priority patent/US20050238253A1/en
Priority to DE60324455T priority patent/DE60324455D1/en
Publication of WO2004051571A2 publication Critical patent/WO2004051571A2/en
Publication of WO2004051571A3 publication Critical patent/WO2004051571A3/en

Classifications

    • G06T3/153
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods


Abstract

A method of displaying two images in registration with each other in which a visually distinguishable overlay is also displayed to represent the degree of 'confidence' in the registration process. The degree of confidence may be calculated on the basis of the degree of non-rigid deformation needed to register the two images. The visually distinguishable overlay can be in the form of a transparent colour wash whose colour and/or intensity indicate the level of confidence, or a symbol, e.g. a circle, whose size represents the degree of confidence.

Description

IMPROVEMENTS IN OR RELATING TO IMAGE REGISTRATION
The present invention relates to the registration of images, that is to say the process in which two different images are compared to find how they match each other, and are then displayed superimposed one on the other.
The registration of different images (also often called fusion of images) is useful in a variety of fields. The images being compared and superimposed could be images of the same object acquired using different modalities, which thus show up different features of interest. The fact that different features of interest are shown by the two modalities is useful in itself, but the usefulness can be enhanced by displaying the two images in superimposition. Examples of this technique might be the fusion of an infrared image with a visible light image, for instance in a surveillance, mapping or medical situation, or, particularly in the medical field, the combination of two different modality images such as magnetic resonance images, nuclear medicine images, x-ray images, ultrasound images etc. In general this fusion of different images assists the interpretation of the images.
In some situations the two images to be fused are taken at the same time, or nearly the same time, but in other situations it is useful to fuse images taken at different times. For example, in the medical field it may be useful to fuse an image taken during one patient examination with an image taken in a different examination, for instance six months or a year after the first one. This can assist in showing the changes in the patient's condition during that time. The fusion of time-separated images arises also in many other fields, such as surveillance and mapping.
A typical registration (or fusion) technique relies on identifying corresponding points in the two images and calculating a transformation which maps the pixels of one image to the pixels of the other. This may use, for example, the well known block matching techniques in which pixels in a block in one image frame are compared with pixels in corresponding blocks in a search window in the other image frame, and the transformation is calculated which minimises a dissimilarity measure of the intensities in the blocks, such as the sum of squared differences. Other techniques based on identification of corresponding shapes in the two images have also been proposed. Explanations of different registration techniques are found in, for example, US 5,672,877 (ADAC Laboratories), US 5,871,013 (Elscint Limited), and many other text books and published papers.
While such registration techniques are useful, the results can be regarded with suspicion by users. This is particularly true where the transformation which maps features in one image to features in the other involves not only a rigid movement, but also a non-rigid deformation of the image features. Users are typically prepared to accept the validity of a rigid movement, such as a translation and/or rotation, between two different images, but the validity of a shape deformation is much less clear.
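The block-matching step described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the function names (`block_match`, `ssd`) and the use of NumPy are assumptions:

```python
import numpy as np

def ssd(block_a, block_b):
    """Sum of squared differences between two equal-sized pixel blocks."""
    d = block_a.astype(float) - block_b.astype(float)
    return float(np.sum(d * d))

def block_match(frame_a, frame_b, top, left, size, search):
    """Find the displacement (dy, dx) that best maps the square block at
    (top, left) in frame_a onto frame_b, by minimising SSD over a
    (2*search + 1)^2 search window."""
    block = frame_a[top:top + size, left:left + size]
    best, best_cost = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            t, l = top + dy, left + dx
            # skip candidate blocks that fall outside the second frame
            if t < 0 or l < 0 or t + size > frame_b.shape[0] or l + size > frame_b.shape[1]:
                continue
            cost = ssd(block, frame_b[t:t + size, l:l + size])
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best, best_cost
```

Repeating this for every block yields the per-block displacements from which the mapping transformation is assembled.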
Figure 1 of the accompanying drawings illustrates schematically a typical situation. An image feature 1 in image frame (a) is found to match an image feature 1' in another image frame (b). To map the feature 1' to the original feature 1 it is necessary to perform a rigid displacement in the direction of arrow D in image (b), but it can be seen that there is also a shape change required because the right hand side of the object is stretched in image (b). Combining the rigid displacement and the deformation results in the fused image (c), but it can be seen that there is a resulting deformation field in the right hand part of the feature 1" in the fused image (c). Particularly in the medical field the concept of deformation like this is regarded with great scepticism by clinicians as they fear the consequences of an erroneous distortion: a stretched or shrunk functional image could lead the clinician to under- or over-estimate the extent of a diseased area, and lead to an inappropriate treatment with potentially dramatic consequences.
The present invention provides an image registration, or fusion, method in which a confidence measure can also be displayed to the user to give the user an idea of the quality of registration. This confidence measure is calculated from the registration process. The measure of confidence can be, for example, the degree of transformation required to perform the mapping, and preferably be based on the degree of non-rigid deformation in the transformation. Thus the confidence measure may exclude rigid motions and represent only the magnitude of the local deformation. The measure can also be based on the local change of volume implied by the mapping transformation from one image to the other.
The measure may be selectively displayed in response to user input. It may be displayed as a visually distinguishable overlay on the display of the fused images. It may comprise a colour overlay with the colour or intensity (or both) indicating the measure of confidence, or the same could be achieved with a monochrome overlay whose grey level represents the measure. Alternatively, a symbol, such as a circle, can be displayed at any selected point in the fused image, whose size and/or shape, for instance the diameter of the circle, is a measure of the confidence in the registration. Clearly a number or another symbol could be chosen and another attribute, e.g. colour, rather than size, used to indicate the confidence measure. Preferably, to avoid cluttering the display, the symbol is only displayed at a single point selected by the user, for example by setting the cursor at that position, possibly in response to the user "clicking" at the selected point on the screen. However the confidence measure is displayed, it need not be on the fused image, but can be next to it, or in a separate display window, or on a copy of the fused image. For example an error bar or number corresponding to the confidence measure at the cursor position could be displayed alongside the fused image.
The method is particularly applicable to fused medical images, though it is also applicable in other fields where images are registered, such as surveillance, mapping etc.
The invention may conveniently be embodied as a computer program comprising program code means for executing the method, and the invention extends to a storage or transmission medium encoding the program and to an image processing and display apparatus which performs the method.
The invention will be further described by way of example with reference to the accompanying drawings, in which:-
Figure 1 is a schematic diagram of an image registration or fusion process;
Figure 2 is a flow diagram of an embodiment of the invention;
Figure 3 illustrates an original MR image of the brain;
Figure 4 illustrates an original PET image of the brain;
Figure 5 illustrates the result of applying a non-rigid transformation to the PET image of Figure 4 so that it will register with the MR image of Figure 3;
Figure 6 illustrates the original MR image superimposed with the deformed PET image in registration with it;
Figure 7 illustrates the result of applying only a rigid transformation to the PET image of Figure 4 so that it will register with the MR image of Figure 3;
Figure 8 illustrates a display of a fused image in accordance with one embodiment of the invention; and
Figure 9 illustrates a display of a fused image in accordance with another embodiment of the invention.
As indicated in Figure 2 a typical image fusion or registration process involves at step 21 the input of two images. These may be individual image frames in static imaging, or could be image sequences in a dynamic imaging application. The two images are compared in step 22 and the transformation which best maps features from one image onto corresponding features of the other is calculated. This transformation is, in essence, a mapping of pixels in one image to pixels in the other. Taking, as an example, a three dimensional image this may be expressed as :-
(x1, y1, z1) = F(x2, y2, z2)    (1)
where F represents the mapping transformation. Typically this transformation may include a rigid movement and a deformation viz:-
(x1, y1, z1) = RIG(x2, y2, z2) + DEF(x2, y2, z2)    (2)
The rigid part of the movement may be a translation and a rotation, namely :-
(x1, y1, z1) = TRANS(x2, y2, z2) + ROT(x2, y2, z2) + DEF(x2, y2, z2)    (3)
Figures 3, 4, 5 and 6 illustrate an example of this method applied to brain imaging. Figures 3 and 4 illustrate respectively the original MR and PET images of the brain. Figure 5 illustrates the result of applying a non-rigid transformation to the PET image of Figure 4 so that it will register with the MR image of Figure 3 and Figure 6 illustrates the original MR image superimposed with the deformed PET image in registration with it. The advantages of being able to see at a glance the information from both imaging modalities are clear. By way of a comparison, Figure 7 illustrates the result of superimposing a version of the PET image of Figure 4 onto the MR image of Figure 3 with only a rigid transformation. In accordance with this embodiment of the invention the size of the deformable part of the transformation DEF is regarded as a measure of the disagreement between the rigid registration (RIG) and the deformable registration (RIG + DEF). Thus a confidence measure M is calculated from the deformable part of the transformation. As one example the confidence measure M may be simply the magnitude (norm) of the local displacement. That is to say :-
M = ||DEF(x2, y2, z2)||    (4)
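The norm-of-displacement measure just defined can be sketched as follows. This is an illustrative reconstruction assuming NumPy, and assuming the registration step exposes both the full mapping and its rigid part as callables (`full_map` and `rigid_map` are hypothetical names, not the patent's):

```python
import numpy as np

def deformation_confidence(full_map, rigid_map, points):
    """Per-point measure M = ||DEF||: the norm of the non-rigid residual
    left after removing the rigid part of the mapping. A larger M means
    more local deformation and hence less confidence.

    full_map, rigid_map: callables taking an (N, 3) array of points and
    returning the mapped (N, 3) coordinates.
    """
    points = np.asarray(points, dtype=float)
    deformation = full_map(points) - rigid_map(points)  # DEF(x2, y2, z2)
    return np.linalg.norm(deformation, axis=1)          # M at each point
```

Where the two mappings agree (a purely rigid match) M is zero; M grows with the local stretch or shrink the registration had to apply.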
Alternatively, the confidence measure may be calculated as the determinant of the local Jacobian of the transformation, which defines the local stretching (change of volume) at a particular location. So if:-
x1 = Fx(x2, y2, z2)
y1 = Fy(x2, y2, z2)    (5)
z1 = Fz(x2, y2, z2)
Then the measure M becomes :-
M = det(J),  where J is the 3×3 Jacobian matrix of partial derivatives ∂(Fx, Fy, Fz)/∂(x2, y2, z2)    (6)
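The Jacobian-based alternative described above can be sketched numerically: the local Jacobian of the mapping F is estimated by central finite differences and its determinant returned as the local volume-change measure. A minimal sketch assuming NumPy; the function name and step size are illustrative choices, not from the patent:

```python
import numpy as np

def jacobian_determinant(F, point, h=1e-5):
    """Estimate M = det(J), the determinant of the local Jacobian of the
    mapping F at `point`, by central finite differences. A value near 1
    means little local change of volume; values far from 1 flag strong
    local stretching or shrinking.

    F: callable mapping a length-3 coordinate array to a length-3 array.
    """
    point = np.asarray(point, dtype=float)
    J = np.empty((3, 3))
    for j in range(3):
        step = np.zeros(3)
        step[j] = h
        # column j of J holds the partial derivatives with respect to axis j
        J[:, j] = (F(point + step) - F(point - step)) / (2.0 * h)
    return float(np.linalg.det(J))
```

For a purely rigid mapping (translation plus rotation) the determinant stays at 1 everywhere, which is why it isolates the non-rigid, volume-changing part of the registration.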
It is also possible to base the measure on the value of the similarity function such as cross-correlation, mutual information, correlation ratio or the like (see for example A. Roche, X. Pennec, G. Malandain, and N. Ayache. Rigid Registration of 3D Ultrasound with MR Images: a New Approach Combining Intensity and Gradient Information. IEEE Transactions on Medical Imaging, 20(10):1038-1049, October 2001) used in matching local blocks in the images, or to combine these various measures together to form a normalised estimate of the "confidence" in the registration process.
Once the measure has been calculated it can be displayed over the fused image. One way of displaying it is, in response to the user "clicking" at a certain point on the display, to display a circle whose diameter represents the value of the confidence measure. Figure 8 illustrates this applied to the fused image of Figure 6. The user can "point" to different positions on the image using the perpendicularly-intersecting cross-hairs (typically controlled by a pointing device such as a computer mouse), and "clicking" at the selected position causes a circle to be displayed as shown. The larger the circle the more non-rigid deformation has occurred and so the less agreement there is between the rigid and non-rigid registration, and the more careful the clinician should be when reviewing the fusion result at that point. On the other hand, in areas where the circle has a small diameter, the registration has been basically a rigid movement, and thus the result of the fusion can be regarded as more reliable.
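As a sketch of one of the similarity functions mentioned above, normalised cross-correlation between two corresponding local blocks can be computed as follows. This is an illustrative reconstruction assuming NumPy, not the patent's implementation:

```python
import numpy as np

def normalised_cross_correlation(block_a, block_b):
    """Similarity between two equal-sized image blocks, in [-1, 1].
    Values near 1 indicate a good local intensity match, so the value
    itself can serve as (or contribute to) a normalised confidence
    estimate for the registration at that location."""
    a = block_a.astype(float).ravel()
    a -= a.mean()
    b = block_b.astype(float).ravel()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0  # a flat block carries no matching information
    return float(np.dot(a, b) / denom)
```

Unlike the sum of squared differences, this score is invariant to local brightness and contrast offsets, which is why it (and mutual information) suits multi-modality block matching.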
Figure 9 shows an alternative way of displaying the confidence measure on the fused image of Figure 6. In Figure 9 the overlay is a transparent colour wash (though shown in black and white in Figure 9) whose colour and intensity are directly related to the value of the confidence measure. For example, green of low intensity may be used where the confidence is high (i.e. the amount of non-rigid deformation is low) whereas red, growing more intense, can be used as the amount of non-rigid deformation increases. It can be seen that the confidence decreases towards the left of the image where a high non-rigid deformation was required to register the two images.
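The green-to-red colour wash described above can be sketched as a mapping from the confidence-measure field to an RGBA overlay. This is an illustrative reconstruction assuming NumPy; the exact colour ramp and transparency value are assumptions, not specified by the patent:

```python
import numpy as np

def confidence_colour_wash(measure, alpha=0.4):
    """Map a 2-D confidence-measure field to an RGBA overlay: dim green
    where the non-rigid deformation (and hence the measure) is small,
    shading towards increasingly intense red as the measure grows."""
    m = np.asarray(measure, dtype=float)
    span = m.max() - m.min()
    t = (m - m.min()) / span if span > 0 else np.zeros_like(m)  # 0 .. 1
    overlay = np.zeros(m.shape + (4,))
    overlay[..., 0] = t              # red channel grows with the measure
    overlay[..., 1] = 0.5 * (1 - t)  # dim green where confidence is high
    overlay[..., 3] = alpha          # constant transparency for the wash
    return overlay
```

The resulting array can be alpha-composited over the fused greyscale image by any standard display library, giving exactly the kind of transparent wash Figure 9 depicts.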

Claims

1. A method of displaying two images in registration with each other comprising the steps of comparing the two images to each other, calculating a transformation which maps features in one image to corresponding features in the other, displaying the two images in superimposition based on the transformation, and displaying a measure of the confidence in the registration.
2. A method according to claim 1 wherein said measure of confidence is calculated from the degree of transformation required to perform said mapping.
3. A method according to claim 2 wherein said measure of confidence is calculated from the degree of non-rigid deformation in said calculated transformation.
4. A method according to claim 2 or 3 wherein said measure is calculated excluding rigid motions.
5. A method according to claim 2, 3 or 4 wherein said measure of confidence is calculated from the magnitude of the local deformation in said transformation.
6. A method according to claim 2, 3, 4 or 5 wherein said measure of confidence is calculated from the local change of volume implied by the transformation.
7. A method according to any one of the preceding claims wherein the measure is selectively displayed in response to user input.
8. A method according to any one of the preceding claims wherein the confidence measure is displayed overlaid on the two images.
9. A method according to any one of the preceding claims wherein the measure is displayed as a visually distinguishable overlay on the two images, the visual properties of the overlay at any point being based on the said measure.
10. A method according to claim 9 wherein the colour of the visually distinguishable overlay is varied in dependence on said measure.
11. A method according to claim 9 or 10 wherein the intensity of the visually distinguishable overlay is varied in dependence on said measure.
12. A method according to claim 9 wherein the grey-level of the visually distinguishable overlay is varied in dependence on said measure.
13. A method according to claim 8 wherein the confidence measure is displayed next to the displayed superimposed images.
14. A method according to claim 8, 9 or 13, wherein the visually distinguishable overlay comprises a symbol having a property which depends on the value of said measure at a selected display point.
15. A method according to claim 14 wherein the symbol is one of a circle and an error bar whose size depends on the value of said measure at a selected display point.
16. A method according to claim 14 or 15 wherein the symbol is displayed at any time only at a single selected display point.
17. A method according to any one of the preceding claims wherein the images are medical images.
18. A computer program comprising program code means for executing on a programmed computer the method of any one of the preceding claims.
19. A computer-readable storage medium encoding a computer program in accordance with claim 18.
20. An image display apparatus comprising a display, and an image processor adapted to perform the method of any one of claims 1 to 17.
PCT/GB2003/004021 2002-11-29 2003-09-18 Image fusion with registration confidence measure WO2004051571A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
AU2003267577A AU2003267577A1 (en) 2002-11-29 2003-09-18 Image fusion with registration confidence measure
EP03748269A EP1565883B1 (en) 2002-11-29 2003-09-18 Image fusion with registration confidence measure
US10/502,034 US20050238253A1 (en) 2002-11-29 2003-09-18 Image registration
DE60324455T DE60324455D1 (en) 2002-11-29 2003-09-18 BILDFUSION WITH RELIABILITY OF REGISTRATION

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB0227887.7A GB0227887D0 (en) 2002-11-29 2002-11-29 Improvements in or relating to image registration
GB0227887.7 2002-11-29

Publications (2)

Publication Number Publication Date
WO2004051571A2 true WO2004051571A2 (en) 2004-06-17
WO2004051571A3 WO2004051571A3 (en) 2004-10-21

Family

ID=9948779

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2003/004021 WO2004051571A2 (en) 2002-11-29 2003-09-18 Image fusion with registration confidence measure

Country Status (6)

Country Link
US (1) US20050238253A1 (en)
EP (1) EP1565883B1 (en)
AU (1) AU2003267577A1 (en)
DE (1) DE60324455D1 (en)
GB (1) GB0227887D0 (en)
WO (1) WO2004051571A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006107364A1 (en) * 2005-04-04 2006-10-12 Siemens Medical Solutions Usa, Inc. System and method for quantifying the quality of motion correction in image registration
WO2012004742A1 (en) * 2010-07-09 2012-01-12 Koninklijke Philips Electronics N.V. Automatic point-wise validation of respiratory motion estimation
US20140364719A1 (en) * 2004-11-23 2014-12-11 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for localizing an ultrasound catheter
WO2017102468A1 (en) * 2015-12-15 2017-06-22 Koninklijke Philips N.V. Image processing systems and methods

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006013473B4 (en) * 2006-03-23 2009-10-22 Siemens Ag Method for spatially resolved visualization of the reconstruction quality, in particular the coverage, of a three-dimensional target volume to be recorded and reproduced in a three-dimensional reconstruction volume representation
US8195269B2 (en) * 2006-07-07 2012-06-05 Siemens Medical Solutions Usa, Inc. System and method for automatic detection and measurement of malacia in the airways
US8548568B2 (en) * 2006-09-08 2013-10-01 General Electric Company Methods and apparatus for motion compensation
US8223143B2 (en) * 2006-10-27 2012-07-17 Carl Zeiss Meditec, Inc. User interface for efficiently displaying relevant OCT imaging data
ITTO20070620A1 (en) * 2007-08-31 2009-03-01 Giancarlo Capaccio SYSTEM AND METHOD FOR PRESENTING VISUAL DATA DETACHED IN MULTI-SPECTRAL IMAGES, MERGER, AND THREE SPACE DIMENSIONS.
CA2702927A1 (en) * 2007-10-19 2009-04-23 Boston Scientific Scimed, Inc. Display of classifier output and confidence measure in an image
US8965071B2 (en) * 2008-05-30 2015-02-24 Emory University Assessing tumor response to therapy
US20100063400A1 (en) * 2008-09-05 2010-03-11 Anne Lindsay Hall Method and apparatus for catheter guidance using a combination of ultrasound and x-ray imaging
DE102010005265A1 (en) * 2010-01-20 2011-07-21 Testo AG, 79853 Method for overlay representing images form e.g. thermal image camera, involves computing transfer function between images and applying transfer function for overlay representing images, and evaluating energy function by computation values
US9105115B2 (en) * 2010-03-16 2015-08-11 Honeywell International Inc. Display systems and methods for displaying enhanced vision and synthetic images
US8582846B2 (en) * 2010-06-18 2013-11-12 Siemens Aktiengesellschaft Method and system for validating image registration
US9996929B2 (en) * 2010-10-27 2018-06-12 Varian Medical Systems International Ag Visualization of deformations using color overlays
US10152951B2 (en) 2011-02-28 2018-12-11 Varian Medical Systems International Ag Method and system for interactive control of window/level parameters of multi-image displays
DE102011080905B4 (en) * 2011-08-12 2014-03-27 Siemens Aktiengesellschaft Method for visualizing the registration quality of medical image data sets
US8944597B2 (en) 2012-01-19 2015-02-03 Carl Zeiss Meditec, Inc. Standardized display of optical coherence tomography imaging data
US9420945B2 (en) 2013-03-14 2016-08-23 Carl Zeiss Meditec, Inc. User interface for acquisition, display and analysis of ophthalmic diagnostic data
CN105069768B (en) * 2015-08-05 2017-12-29 武汉高德红外股份有限公司 A kind of visible images and infrared image fusion processing system and fusion method
CN114078102A (en) * 2020-08-11 2022-02-22 北京芯海视界三维科技有限公司 Image processing apparatus and virtual reality device
WO2022141531A1 (en) * 2020-12-31 2022-07-07 西安大医集团股份有限公司 Image registration evaluation method and apparatus, and electronic device and readable storage medium
CN116831626A (en) * 2022-03-25 2023-10-03 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic beam synthesis method and equipment
CN116977387B (en) * 2023-09-22 2023-12-15 安徽大学 Deformable medical image registration method based on deformation field fusion

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5871013A (en) * 1995-05-31 1999-02-16 Elscint Ltd. Registration of nuclear medicine images
US5672877A (en) * 1996-03-27 1997-09-30 Adac Laboratories Coregistration of multi-modality data in a medical imaging system
US6016442A (en) * 1998-03-25 2000-01-18 Cardiac Pacemakers, Inc. System for displaying cardiac arrhythmia data
US6944330B2 (en) * 2000-09-07 2005-09-13 Siemens Corporate Research, Inc. Interactive computer-aided diagnosis method and system for assisting diagnosis of lung nodules in digital volumetric medical images
DE10141186A1 (en) * 2001-08-22 2003-03-20 Siemens Ag Device for processing images, in particular medical images
DE10163813A1 (en) * 2001-12-22 2003-07-03 Philips Intellectual Property Method for displaying different images of an examination object
US7050615B2 (en) * 2002-07-25 2006-05-23 Ge Medical Systems Global Technology Company, LLC Temporal image comparison method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BLACKALL J. M.: "Respiratory Motion in Image-Guided Interventions of the Liver", PhD thesis, London, [Online] September 2002 (2002-09), XP002292466. Retrieved from the Internet: URL:http://www-ipg.umds.ac.uk/J.Blackall/jm_blackall_thesis.pdf> [retrieved on 2004-08-12] *
FEDOROV D. V. ET AL.: "Automatic registration and mosaicking system for remotely sensed imagery", Proceedings of the SPIE - The International Society for Optical Engineering, SPIE-Int. Soc. Opt. Eng., USA, [Online] vol. 4885, September 2002 (2002-09), pages 444-451, ISSN 0277-786X, XP002292578. Retrieved from the Internet: URL:http://scitation.aip.org/getpdf/servlet/GetPDFServlet?filetype=pdf&id=PSISDG004885000001000444000001&idtype=cvips> [retrieved on 2004-08-16] *
See also references of EP1565883A2 *
TANNER C. ET AL.: "Validation of volume-preserving non-rigid registration: application to contrast-enhanced MR-mammography", Springer-Verlag, Berlin, Germany, [Online] September 2002 (2002-09), pages 307-314, ISBN 3-540-44224-3, XP002292465. Retrieved from the Internet: URL:http://www.springerlink.com/media/M3T1FWLTWJ6U4D2V9XAW/Contributions/0/1/F/4/01F403A1UWE825VE.pdf> [retrieved on 2004-08-12] *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140364719A1 (en) * 2004-11-23 2014-12-11 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for localizing an ultrasound catheter
US10639004B2 (en) * 2004-11-23 2020-05-05 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for localizing an ultrasound catheter
WO2006107364A1 (en) * 2005-04-04 2006-10-12 Siemens Medical Solutions Usa, Inc. System and method for quantifying the quality of motion correction in image registration
US7295951B2 (en) 2005-04-04 2007-11-13 Siemens Medical Solutions Usa, Inc. System and method for quantifying the quality of motion correction in image registration
WO2012004742A1 (en) * 2010-07-09 2012-01-12 Koninklijke Philips Electronics N.V. Automatic point-wise validation of respiratory motion estimation
CN102985946A (en) * 2010-07-09 2013-03-20 皇家飞利浦电子股份有限公司 Automatic point-wise validation of respiratory motion estimation
US9171377B2 (en) 2010-07-09 2015-10-27 Koninklijke Philips N.V. Automatic point-wise validation of respiratory motion estimation
WO2017102468A1 (en) * 2015-12-15 2017-06-22 Koninklijke Philips N.V. Image processing systems and methods
CN108701360A (en) * 2015-12-15 2018-10-23 皇家飞利浦有限公司 Image processing system and method
US11538176B2 (en) 2015-12-15 2022-12-27 Koninklijke Philips N.V. Image processing systems and methods

Also Published As

Publication number Publication date
EP1565883B1 (en) 2008-10-29
WO2004051571A3 (en) 2004-10-21
GB0227887D0 (en) 2003-01-08
DE60324455D1 (en) 2008-12-11
AU2003267577A8 (en) 2004-06-23
AU2003267577A1 (en) 2004-06-23
EP1565883A2 (en) 2005-08-24
US20050238253A1 (en) 2005-10-27

Similar Documents

Publication Publication Date Title
EP1565883B1 (en) Image fusion with registration confidence measure
Murphy et al. Semi-automatic construction of reference standards for evaluation of image registration
Andronache et al. Non-rigid registration of multi-modal images using both mutual information and cross-correlation
Collins et al. Automatic 3D segmentation of neuro-anatomical structures from MRI
US20170301090A1 (en) Systems and methods for interleaving series of medical images
US20070237372A1 (en) Cross-time and cross-modality inspection for medical image diagnosis
US10225086B2 (en) Image fingerprinting
US7346199B2 (en) Anatomic triangulation
US8705821B2 (en) Method and apparatus for multimodal visualization of volume data sets
US20070160276A1 (en) Cross-time inspection method for medical image diagnosis
Bach Cuadra et al. Atlas-based segmentation
Rohr et al. Elastic registration of electrophoresis images using intensity information and point landmarks
Calmon et al. Automatic measurement of changes in brain volume on consecutive 3D MR images by segmentation propagation
KR102149369B1 (en) Method for visualizing medical image and apparatus using the same
KR20200131737A (en) Method for aiding visualization of lesions in medical imagery and apparatus using the same
Riddle et al. Characterizing changes in MR images with color-coded Jacobians
Buerger et al. Combining deep learning and model-based segmentation for labeled spine CT segmentation
EP2050071B1 (en) A method, apparatus and computer-readable medium for creating a preset map for the visualization of an image dataset
Koh et al. Automatic spinal canal detection in lumbar MR images in the sagittal view using dynamic programming
Liu et al. Multiple sclerosis medical image analysis and information management
Hellier et al. A hierarchical parametric algorithm for deformable multimodal image registration
CA2585054C (en) Virtual grid alignment of sub-volumes
US20060215888A1 (en) Method and apparatus of displaying of a medical image
WO2008002325A2 (en) Cross-time inspection method for medical diagnosis
Wang et al. Novel elastic registration for 2-D medical and gel protein images

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

WWE Wipo information: entry into national phase

Ref document number: 2003748269

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 10502034

Country of ref document: US

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
WWP Wipo information: published in national office

Ref document number: 2003748269

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP