WO2014106747A1 - Methods and apparatus for image processing - Google Patents


Info

Publication number
WO2014106747A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
local
ellipses
interest
fitting
Prior art date
Application number
PCT/GB2014/050010
Other languages
French (fr)
Inventor
Sylvia RUEDA
Julia Alison Noble
Original Assignee
Isis Innovation Limited
Priority date
Filing date
Publication date
Application filed by Isis Innovation Limited filed Critical Isis Innovation Limited
Publication of WO2014106747A1 publication Critical patent/WO2014106747A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Definitions

  • A novel use of edge orientation identifies the contours of the structure of interest, in order to find (coupled) ellipses in the images.
  • the edge orientation is such that it always points outwards from the object of interest, while being derived from structural information, and can be obtained from the monogenic signal, making it robust to intensity changes.
  • a modified Hough transform keeps track of the sign of the dot product between the local orientation and the fitted ellipse normals to decide between the inner and outer contour for coupled ellipses.
  • Figure 1 (a) is a schematic diagram of fetal arm composition, showing that the arm consists of a humerus bone surrounded by muscle, which in turn is surrounded by an adipose tissue layer.
  • Figure 1 (b) is a typical ultrasound image of the cross-section of such an arm, of a 25-week fetus.
  • the ultrasound image of Figure 1 (b) will be referred to herein as the "original" or "first" image. It can be seen that the original image of Figure 1 (b) is prone to speckle, and the objects have fuzzy boundaries. Local phase is better suited to such images than raw intensity, as it is intensity invariant.
  • a local phase representation as shown in Figure 1 (c) is first obtained using the monogenic signal [4] derived from the original image ( Figure 2).
  • Image edge features are detected as points of local phase congruency, obtained from computing the local phase at multiple scales and combined using a feature asymmetry measure, as shown in Figure 1 (d).
  • the feature asymmetry image contains thick edges, whereas a good localisation of the edges is desirable. Therefore, non-maximal suppression is applied in the local orientation direction, also obtained from the monogenic signal.
  • Other ways of obtaining the edges can also be incorporated into this framework. Feature asymmetry (without non-maximal suppression) with local orientation could also be directly used to find the ellipses, but would not be as well localised.
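The thinning step described above can be sketched as follows. This is an illustrative non-maximal suppression along the local orientation direction, using nearest-neighbour sampling for simplicity; the patent does not prescribe a particular implementation, and the function name is ours.

```python
import numpy as np

def non_maximal_suppression(fa, orientation):
    """Thin a feature-asymmetry image by keeping only pixels that are
    local maxima along the local orientation direction.
    `fa` and `orientation` (radians) are same-shaped 2-D arrays."""
    rows, cols = fa.shape
    thinned = np.zeros_like(fa)
    # Unit steps along the local orientation at each pixel
    # (rounded to the nearest neighbour).
    dx = np.rint(np.cos(orientation)).astype(int)
    dy = np.rint(np.sin(orientation)).astype(int)
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            ahead = fa[y + dy[y, x], x + dx[y, x]]
            behind = fa[y - dy[y, x], x - dx[y, x]]
            # Keep the pixel only if it dominates both neighbours
            # along the orientation direction.
            if fa[y, x] >= ahead and fa[y, x] >= behind:
                thinned[y, x] = fa[y, x]
    return thinned
```

Applied to a thick vertical response ridge with horizontal orientations, this keeps only the central column of the ridge, giving the better edge localisation described above.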
  • an oriented edge map as shown in Figures 1 (e)-(f) is then produced by assigning the corresponding local orientation value to each pixel within the edge map of Figure 1(e).
  • local orientation vectors point outwards from the structure of interest.
  • Ellipses, circles or other parametric shapes can then be automatically fitted to the oriented edge map, with the vector information being used to determine whether the fitted shape corresponds to an outer or an inner boundary of the imaged feature. Such a process is illustrated in Figures 4(a) and 4(b), and described in greater detail below. Measurements or other quantitative analysis may then be carried out using the fitted shapes.
  • the local phase and local orientation are given by φ(x,y) = atan2(√((h1*f)² + (h2*f)²), f_b) and θ(x,y) = arctan((h2*f) / (h1*f)), respectively, where f_b denotes the band-pass filtered image and h1*f, h2*f its two odd (Riesz transform) filter responses.
  • Figure 2 illustrates in more detail the operations used to process the monogenic signal derived from the original image, in order to obtain the local phase (e.g. processing the image of Figure 1 (b) to obtain the image of Figure 1 (c)) and local orientation representations.
  • the local phase is estimated using the monogenic signal but can be estimated using other approaches. It is invariant to contrast and describes the structure of the object of interest rather than the magnitude.
  • h1 and h2 provide the quadrature pair of filters required for local phase estimation.
  • these filters are defined in the Fourier domain as H1(u,v) = u/√(u² + v²) and H2(u,v) = v/√(u² + v²); that is, H1 is a cosine and H2 a sine of the angular coordinate of the frequency plane.
  • This pair of filters defines the Riesz transform.
  • the scale selected determines the size of the structures detected.
  • the choice of the band-pass filter used also affects the final result.
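As a concrete illustration, the monogenic signal, local phase and local orientation can be computed in the Fourier domain as sketched below. The log-Gabor band-pass filter and its parameters (`wavelength`, `sigma_on_f`) are illustrative choices, not values taken from the patent; any isotropic band-pass filter could be substituted, and (as noted above) the scale chosen determines the size of the structures detected.

```python
import numpy as np

def monogenic_signal(image, wavelength=20.0, sigma_on_f=0.5):
    """Band-pass even/odd responses, local phase and local orientation
    from the monogenic signal. `wavelength` sets the filter scale."""
    rows, cols = image.shape
    # Frequency grids (cycles per pixel), zero frequency at the corner.
    u = np.fft.fftfreq(cols)[np.newaxis, :]
    v = np.fft.fftfreq(rows)[:, np.newaxis]
    radius = np.sqrt(u**2 + v**2)
    radius[0, 0] = 1.0  # avoid division by zero at DC

    # Riesz transform filters: a "cosine" and a "sine" of the angular
    # frequency coordinate.
    H1 = 1j * u / radius
    H2 = 1j * v / radius

    # Isotropic log-Gabor band-pass filter.
    f0 = 1.0 / wavelength
    log_gabor = np.exp(-(np.log(radius / f0) ** 2)
                       / (2 * np.log(sigma_on_f) ** 2))
    log_gabor[0, 0] = 0.0

    F = np.fft.fft2(image)
    even = np.real(np.fft.ifft2(F * log_gabor))       # band-passed image
    odd1 = np.real(np.fft.ifft2(F * log_gabor * H1))  # Riesz component 1
    odd2 = np.real(np.fft.ifft2(F * log_gabor * H2))  # Riesz component 2

    odd = np.sqrt(odd1**2 + odd2**2)
    local_phase = np.arctan2(odd, even)          # invariant to contrast
    local_orientation = np.arctan2(odd2, odd1)   # structure direction
    return even, odd1, odd2, local_phase, local_orientation
```

Because the phase is a ratio of filter responses, scaling the image intensity leaves the local phase unchanged, which is the contrast invariance relied upon throughout this description.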
  • Image edge features can be detected at points of local phase congruency, obtained by computing the local phase (such as that shown in Figure 1 (c)) at multiple scales and combining the responses using the feature asymmetry (FA) measure defined as
  • FA(x,y) = (1/N) Σ_s ⌊|odd_s(x,y)| − |even_s(x,y)| − T_s⌋ / √(even_s(x,y)² + odd_s(x,y)² + ε)
  • where ⌊·⌋ denotes zeroing of the negative values, even_s and odd_s are the even and odd band-pass filter responses at scale s, N is the total number of scales, T_s is an orientation-dependent threshold that controls spurious responses to noise, and ε is a small constant to avoid division by zero.
  • Feature asymmetry returns a value between 0 and 1.
  • a feature is defined as a point where there is local phase congruency among scales.
  • the feature asymmetry image (e.g. Figure 1 (d)) can be thinned in order to give better localization of the actual edges, resulting in an image such as the one shown in Figure 1 (e).
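A minimal multi-scale feature asymmetry computation, given per-scale even (band-pass) and odd (Riesz magnitude) responses, might look like the following. The scalar noise threshold here is a simplified stand-in for the orientation-dependent threshold T_s described above, and the parameter defaults are illustrative.

```python
import numpy as np

def feature_asymmetry(evens, odds, t_scale=0.25, eps=1e-6):
    """Average, over N scales, the thresholded excess of odd over even
    filter energy, normalised by the local amplitude. `evens` and
    `odds` are lists of same-shaped 2-D response arrays, one per scale.
    Returns values in [0, 1]; high values mark step-edge-like
    (asymmetric) features."""
    fa = np.zeros_like(evens[0])
    for even, odd in zip(evens, odds):
        t = t_scale * np.mean(np.abs(odd))  # simplified noise threshold
        num = np.maximum(np.abs(odd) - np.abs(even) - t, 0.0)
        den = np.sqrt(even**2 + odd**2) + eps
        fa += num / den
    return fa / len(evens)
```

A purely odd (antisymmetric) response yields a value near 1, a purely even (symmetric) response yields 0, matching the property that feature asymmetry returns a value between 0 and 1.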
  • Oriented edge map construction
  • an oriented edge map can then be constructed by adding the local orientation information to the edge features, which can be obtained from feature asymmetry.
  • Figure 1 (f) shows the local orientation vectors added to the feature edges within the dashed square in Figure 1 (e).
  • Local orientation vectors represented by arrows in Figure 1 (f) point outwards from the object of interest. Being derived from structural information, obtained from the monogenic signal, makes this process robust to intensity changes and independent of the intensity gradient.
  • any pixel in the oriented edge map (e.g. as shown in Figure 1 (f)) can be a potential ellipse centre.
  • the parameters of an ellipse (principal axes and orientation) are iteratively varied and, at each iteration, points on the ellipse that match edge features are analysed and a likelihood score obtained as follows: At each point on the ellipse, the scalar product between the local orientation of each oriented edge feature and the normal to the ellipse at that point is calculated and accumulated.
  • Coupled ellipses can be identified simultaneously by retaining the two ellipses such that the inner/outer contours have an opposite scalar product sign and minimise/maximise the accumulator at that ellipse centre.
  • Examples of fitted ellipses are illustrated using dashed lines in Figures 4(a) and 4(b).
  • In Figure 4(a), the scalar product is negative at the image edge features on the inner contour matching the fitted ellipse (the dashed line 42).
  • In Figure 4(b), the scalar product is positive at the image edge features belonging to the outer contour matching the fitted ellipse (the dashed line 44).
  • the scalar product is used to update the accumulator for detecting coupled ellipses on the images. Instead of using intensity gradients, the oriented edge map based on structural information is created. For circle detection only three parameters are necessary, whereas ellipse detection requires five parameters.
  • the scalar product of an ellipse matching the inner contour of the object is negative, since the local orientation of the inner contour and the normal vectors to the fitted ellipse at those edge points are opposite.
  • the scalar product of an ellipse matching the outer contour is positive because the local orientation and the normal vectors to the ellipse at those points are pointing in the same direction.
  • the final coupled ellipses are obtained using a likelihood score, which combines the results of inner and outer contours, so the best ellipses are detected simultaneously.
  • the likelihood score can be defined in a number of ways, by using the information from the accumulator. An example is given in the following: An ellipse is characterised by five parameters, which are the centre (x0, y0), the semi-major axis a, the semi-minor axis b, and the angle of rotation θ. The current implementation is aimed at detecting two concentric ellipses simultaneously.
  • the orientation of both ellipses is set to be the same, while ensuring that the semi-axes of the outer ellipse are bigger than the corresponding semi-axes of the inner ellipse.
  • the likelihood score L(x,y) is maximised to find the centre (x0, y0) of the coupled ellipses that best fit the edges in the oriented edge map.
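The sign convention described above can be sketched as follows. The function name and the boundary tolerance are illustrative, not from the patent; the sketch scores one candidate ellipse against a list of oriented edge features, with a negative accumulated scalar product indicating an inner contour and a positive one an outer contour.

```python
import numpy as np

def score_ellipse(edges, cx, cy, a, b, theta, tol=0.15):
    """Accumulate the scalar product between each edge feature's local
    orientation and the outward ellipse normal, over features lying
    near the candidate ellipse (cx, cy, a, b, theta). Since local
    orientation points outwards from the object, a negative total
    indicates an inner contour and a positive total an outer contour.
    `edges` is an iterable of (x, y, ox, oy) oriented edge features,
    where (ox, oy) is the unit local orientation vector."""
    total = 0.0
    c, s = np.cos(theta), np.sin(theta)
    for x, y, ox, oy in edges:
        # Point expressed in the ellipse's own frame.
        xr = c * (x - cx) + s * (y - cy)
        yr = -s * (x - cx) + c * (y - cy)
        # Implicit-equation residual is ~0 on the ellipse boundary.
        if abs((xr / a) ** 2 + (yr / b) ** 2 - 1.0) < tol:
            # Outward normal: gradient of the implicit equation,
            # rotated back to image coordinates and normalised.
            nx, ny = xr / a ** 2, yr / b ** 2
            gx, gy = c * nx - s * ny, s * nx + c * ny
            norm = np.hypot(gx, gy) + 1e-12
            total += (ox * gx + oy * gy) / norm
    return total
```

In a full search, this score would be evaluated over the accumulator's parameter space (three parameters for circles, five for ellipses), retaining the coupled pair whose inner/outer scores have opposite signs and extremal magnitudes at the same centre.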
  • Parametric shapes other than circles or ellipses may be fitted using this technique.
  • shapes with straight edges e.g. triangles, hexagons, squares or rectangles
  • Three-dimensional structures such as spheres/hollow spheres or cylinders/tubes may also be fitted to three-dimensional images using this technique.
  • measurements or other quantitative analytical calculations may then be carried out using the fitted shapes. For example, linear, area or volume measurements relating to the fitted shapes may be evaluated, or ratios or other functions may be calculated. As those skilled in the art will appreciate, the quantitative results may be presented to the user on a display screen (e.g. superimposed on the images to which they relate), printed on paper, or saved in memory or in a data storage device.
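By way of illustration, once coupled ellipses have been fitted, simple quantities of clinical interest can be derived from their parameters. The functions below are generic ellipse geometry (including Ramanujan's perimeter approximation), not formulas specified in the patent.

```python
import math

def ellipse_area(a, b):
    """Area of an ellipse with semi-axes a and b."""
    return math.pi * a * b

def ellipse_perimeter(a, b):
    """Ramanujan's approximation to the ellipse perimeter."""
    h = ((a - b) / (a + b)) ** 2
    return math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))

def ring_area(outer, inner):
    """Cross-sectional area between coupled ellipses - e.g. a soft
    tissue layer; `outer` and `inner` are (a, b) semi-axis pairs."""
    return ellipse_area(*outer) - ellipse_area(*inner)
```

For example, the soft tissue cross-sectional area of a fetal limb could be estimated as the ring area between the fitted outer and inner ellipses, after converting the semi-axes from pixels to millimetres.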
  • Figures 5(a) to 5(c) illustrate ellipses fitted to edge maps (left hand images) and to ultrasound images (right hand images), with Figure 5(a) showing a fetal mid-arm cross-section at 21 weeks of gestation, Figure 5(b) showing a fetal mid-arm cross-section at 26 weeks of gestation, and Figure 5(c) showing a fetal mid-arm cross-section at 28 weeks of gestation.
  • the dashed line contours in the left hand images and the solid line contours in the right hand images correspond to the fitted ellipses.
  • Figure 6 displays the fitted ellipses (solid line contours) over two manual delineations of the contours (dashed lines) by the same clinical expert.
  • This example shows how difficult it is to manually delineate the contours of the object of interest: the two manual delineations differ considerably from each other, owing to fuzzy regions in the ultrasound image.

Abstract

An image processing method performed by a processor and comprising: receiving a first image containing one or more objects of interest; deriving local phase and local orientation information from the first image; identifying one or more edge features of the object(s) of interest using the local phase information; and fitting one or more parametric shapes to the edge feature(s) of the object(s) of interest using the local orientation information.

Description

METHODS AND APPARATUS FOR IMAGE PROCESSING Field of the Invention
The present invention relates to methods and apparatus for processing images - for example, images of tubular structures and organ/tissue structures. It is particularly applicable, but by no means limited, to the analysis of medical ultrasound images.
Background to the Invention
Imaging techniques, such as those which use ultrasound, are commonly used to study objects of interest within human or animal bodies, or in industrial applications. Once an image has been obtained, it is sometimes necessary to quantify structures/objects within the image, for example in order to measure the objects or to perform some other analysis on them.
Ultrasound is one of the most difficult and challenging medical imaging modalities for quantification, as the quantification of structures faces multiple challenges including signal dropouts, missing boundaries, shadows, and the presence of speckle [1].
In a number of medical imaging applications (including ultrasound, and others), organ/tissue delineation is often essential for underpinning image-based measurements of organ dimensions or tissue region properties. Finding the approximate whereabouts of an object of interest, by fitting ellipses or circles to it, is often sufficient for the purposes of quantification. For example, in the obstetrics field, it is common clinical practice for the size of the head and abdomen of a fetus to be estimated by manually fitting ellipses to the objects of interest within an ultrasound image. This might be the end goal (biometric quantification) or might be a pre-analysis step to identify a region of interest for subsequent analysis (for example myocardial thickness estimation). However, manual delineation of such images is tedious and subjective, and its accuracy is highly dependent on the image characteristics and the expertise of the observer. Automatic image analysis techniques also exist. Examples of automated feature detection or measurement techniques using ultrasound imaging are provided in US 2009/0093717 A1, US 2011/0021915 A1 and US 7,995,820 B2. In particular, automatic ellipse detection is a well-studied problem, but most existing methods require the use of intensity-based image features to perform accurately. Some existing methods use a generalised Hough Transform (HT) [2][3] framework for detecting arbitrary shapes, making use of intensity gradients. However, the contrast within the same object of interest, particularly in ultrasound images, varies considerably, and therefore intensity-gradient-based approaches are not well-suited to extracting image features across the range of image qualities seen in clinical practice (and in the case of fetal analysis a further issue is the change in appearance across gestational age).
The development of automatic methods for quantitative analysis is especially challenging in ultrasound images [1], where object boundaries appear fuzzy or are not visible.
There is therefore a desire to be able to automatically quantify structures in images (such as organ/tissue structures in medical ultrasound images) more effectively and efficiently.
Summary of the Invention
According to a first aspect of the present invention there is provided an image processing method as defined in claim 1 of the appended claims. Thus, there is provided an image processing method performed by a processor and comprising: receiving a first image containing one or more objects of interest; deriving local phase and local orientation information from the first image; identifying one or more edge features of the object(s) of interest using the local phase information; and fitting one or more parametric shapes to the edge feature(s) of the object(s) of interest using the local orientation information. Local phase information advantageously extracts structural image information while being invariant to contrast, and so is well-suited for a variety of image analysis tasks. Moreover, the use of local orientation information advantageously enables the method to distinguish between inner and outer contours when fitting the parametric shapes to the objects of interest.
Each image edge feature may be described by its pixel location and a feature attribute vector - in our case, local orientation and local phase. Other image feature descriptors may be included in the feature attribute vector in practice.
Preferably the local phase information is derived from the first image using the monogenic signal.
Preferably the step of identifying the edge feature(s) comprises computing the local phase at multiple scales and applying a feature asymmetry measure, although other ways are also possible. Feature asymmetry on its own can also be used directly for fitting the shape(s).
Preferably the step of identifying the edge feature(s) further comprises applying non-maximal suppression in the local orientation direction.
Preferably the step of fitting parametric shapes to the edge feature(s) comprises fitting ellipses or circles. Advantageously, coupled ellipses or circles (which may be concentric) may be fitted simultaneously to inner and outer contours of the same object of interest. Alternatively, the step of fitting parametric shapes to the edge feature(s) may comprise fitting other two-dimensional shapes, or three-dimensional shapes.
Preferably the method further comprises performing quantitative analysis of the image using the fitted shapes. The results of the quantitative analysis may then be displayed, for example by superimposing the results on the image(s) to which they relate.
According to a second aspect of the invention there is provided imaging apparatus configured to implement a method in accordance with the first aspect of the invention.
The imaging apparatus may be a medical scanner, such as an ultrasound scanner, a computed tomography (CT) scanner or a magnetic resonance imaging (MRI) scanner. Alternatively, the imaging apparatus may be an industrial scanner, for example of the kind used for the non-destructive testing of aircraft components.
According to a third aspect of the invention there is provided image processing apparatus configured to implement a method in accordance with the first aspect of the invention.
According to a fourth aspect of the invention there is provided a processor configured to implement a method in accordance with the first aspect of the invention.
According to a fifth aspect of the invention there is provided a computer program or set of instruction code for implementing a method in accordance with the first aspect of the invention when executed on a processor. According to a sixth aspect of the invention there is provided a computer-readable medium or physical carrier signal encoding a computer program in accordance with the fifth aspect of the invention.
Brief Description of the Drawings
Embodiments of the invention will now be described, by way of example only, and with reference to the drawings in which:
Figure 1 (a) is a schematic diagram of the composition of a fetal arm; Figure 1 (b) is a typical ultrasound image of an arm cross-section of a 25-week fetus;
Figure 1 (c) is a local phase image derived from the ultrasound image in Figure 1 (b);
Figure 1 (d) is a feature asymmetry image derived from the ultrasound image in Figure 1 (b), and from several scales of the local phase;
Figure 1 (e) is an edge representation derived from the image in Figure 1 (d);
Figure 1 (f) illustrates oriented edges in the selected area (dashed square) of Figure 1 (e);
Figure 2 illustrates principles of monogenic signal processing;
Figure 3 illustrates filters that can be used for calculating the monogenic signal; Figure 4(a) illustrates an ellipse fitted to an inner contour of imaged edge features using the present method;
Figure 4(b) illustrates an ellipse fitted to an outer contour of imaged edge features using the present method;
Figures 5(a) to 5(c) illustrate ellipses fitted to edge maps (left hand images) and to ultrasound images (right hand images) using the present method, with Figure 5(a) showing a fetal mid-arm cross-section at 21 weeks of gestation, Figure 5(b) showing a fetal mid-arm cross-section at 26 weeks of gestation, and Figure 5(c) showing a fetal mid-arm cross-section at 28 weeks of gestation - in each case with manual delineations from the same clinical expert displayed on the ultrasound (right hand) images as dashed lines; and
Figure 6 illustrates ellipses fitted to an ultrasound image (as solid lines) using the present method, and two manual delineations (as dashed lines) obtained by the same clinical expert, showing the intra-expert variability that can exist when delineating the object contour manually.
Detailed Description of Preferred Embodiments
The present embodiments represent the best ways known to the applicants of putting the invention into practice. However, they are not the only ways in which this can be achieved. It is to be emphasised that, although the present embodiments often refer to the soft tissue quantification of tubular structures (such as fetal limb cross-sections) using ultrasound images, the principles described herein are equally applicable to the analysis of other structures in humans or animals, such as left ventricular short-axis myocardial thickness measurement, or vessel thickness estimation. The principles are also applicable to the imaging of man-made objects (e.g. aircraft engine parts) using non-invasive imaging techniques. Moreover, the principles are applicable to any image modality, such as computed tomography (CT) or magnetic resonance imaging (MRI), as well as ultrasound. Embodiments can be used as a research tool or a clinical tool, for example quantifying the amount of soft tissue for a particular application.
Primarily, the present embodiments take the form of a method or algorithm for processing medical (or other) images. The method or algorithm may be incorporated in a computer program or a set of instruction code capable of being executed by a processor. The processor may be that of a conventional (sufficiently high performance) computer, or some other image processing apparatus or computer system. Alternatively, the processor may be incorporated in, or in communication with, a piece of imaging equipment such as an ultrasound scanner or an MRI scanner.
The computer program or set of instruction code may be supplied on a computer-readable medium or data carrier such as a CD-ROM, DVD or solid state memory device. Alternatively, it may be downloadable as a digital signal from a connected computer, either directly or over a local area network or a wide area network such as the Internet. As a further alternative, the computer program or set of instruction code may be hard-coded in the processor (or memory associated therewith) arranged to execute it.
Initial summary
The present embodiment provides automatic oriented feature-based (coupled) ellipse fitting for image-based quantification. The presently preferred embodiment is a tool for automatic quantification of imaged tubular structures, applicable to medical image soft tissue quantification. It includes a novel ellipse/circle fitting method, which can automatically and simultaneously fit one, two or more ellipses/circles to an image using image edge features and local orientation, which can be derived from the monogenic signal, instead of intensity gradients. The use of the monogenic signal renders the present technique very useful for ultrasound imaging.
Since a circle is a special case of an ellipse, we will refer only to ellipses in the remainder of this description. However, the term "ellipse" should be interpreted as encompassing a circle as one possibility.
The presently preferred embodiment automatically quantifies the soft tissue of tubular structures using a clinical image-based measurement tool that employs (coupled) ellipse fitting based on structural information and a modified Hough transform.
By using the image edge feature representation derived from local phase and orientation instead of intensity gradients (gradients having been used in previous ellipse fitting methods), the candidate object boundary locations and orientation can be identified in a robust way in any imaging modality. Oriented image edge feature maps, which can be derived from the monogenic signal, guide the process to find a single ellipse or coupled ellipses simultaneously (inner and outer contours) for quantification of tubular structures.
Key features of the presently preferred embodiment are:
• An image-based quantification tool that fits ellipses, effectively simultaneously, to the inner and outer contours of tubular structures. Other shapes or structures can also be fitted.
• A novel use of edge orientation to identify the contours of the structure of interest and to find (coupled) ellipses in the images. The edge orientation is such that it always points outwards from the object of interest, while being derived from structural information, and can be obtained from the monogenic signal, making it robust to intensity changes.
• A modified Hough transform keeps track of the sign of the dot product between the local orientation and the fitted ellipse normals to decide between the inner and outer contour for coupled ellipses.
Image processing overview
By way of introduction, we illustrate the embodiment with reference to soft tissue quantification. Figure 1(a) is a schematic diagram of fetal arm composition, showing that the arm consists of a humerus bone surrounded by muscle, which in turn is surrounded by an adipose tissue layer. Figure 1(b) is a typical ultrasound image of the cross-section of such an arm, of a 25-week fetus. The ultrasound image of Figure 1(b) will be referred to herein as the "original" or "first" image. It can be seen that the original image of Figure 1(b) is prone to speckle and that the objects have fuzzy boundaries. Local phase is better suited to such images, as it is intensity invariant.
In accordance with the presently preferred embodiment, to obtain a structural representation of the original image, a local phase representation as shown in Figure 1 (c) is first obtained using the monogenic signal [4] derived from the original image (Figure 2). Image edge features are detected as points of local phase congruency, obtained from computing the local phase at multiple scales and combined using a feature asymmetry measure, as shown in Figure 1 (d). The feature asymmetry image contains thick edges, whereas a good localisation of the edges is desirable. Therefore, non-maximal suppression is applied in the local orientation direction, also obtained from the monogenic signal. Other ways of obtaining the edges can also be incorporated into this framework. Feature asymmetry (without non-maximal suppression) with local orientation could also be directly used to find the ellipses, but would not be as well localised.
From the local orientation and the edge map derived from feature asymmetry, an oriented edge map as shown in Figures 1 (e)-(f) is then produced by assigning the corresponding local orientation value to each pixel within the edge map of Figure 1(e). In Figures 1(e)-(f), local orientation vectors point outwards from the structure of interest.
Ellipses, circles or other parametric shapes can then be automatically fitted to the oriented edge map, with the vector information being used to determine whether the fitted shape corresponds to an outer or an inner boundary of the imaged feature. Such a process is illustrated in Figures 4(a) and 4(b), and described in greater detail below. Measurements or other quantitative analysis may then be carried out using the fitted shapes.
Local phase and local orientation estimation
The monogenic signal [4] IM(x,y) of an image I(x,y) generalises the analytic signal to 2D and higher dimensions using a Riesz transform instead of a Hilbert transform. From the monogenic signal, local energy, local phase, and local orientation can be estimated. The image is initially convolved with a band-pass filter b(x,y), to give Ib(x,y) = b(x,y) ⊗ I(x,y), where ⊗ denotes the convolution operation. The monogenic signal is then expressed as

IM(x,y) = (Ib(x,y), h1(x,y) ⊗ Ib(x,y), h2(x,y) ⊗ Ib(x,y)),

where h1 and h2 are the convolution kernels of the Riesz transform, which are represented in Figure 3 and defined as

h1(x,y) = −x / (2π(x² + y²)^(3/2)) and h2(x,y) = −y / (2π(x² + y²)^(3/2)),

respectively.

The local energy E(x,y), local phase φ(x,y), and local orientation θ(x,y) of I(x,y) are derived from IM(x,y) and expressed as

E(x,y) = Ib² + (h1 ⊗ Ib)² + (h2 ⊗ Ib)²,

φ(x,y) = arctan( √((h1 ⊗ Ib)² + (h2 ⊗ Ib)²) / Ib ), and

θ(x,y) = arctan( (h2 ⊗ Ib) / (h1 ⊗ Ib) ),

respectively.
Figure 2 illustrates in more detail the operations used to process the monogenic signal derived from the original image, in order to obtain the local phase (e.g. processing the image of Figure 1 (b) to obtain the image of Figure 1 (c)) and local orientation representations.
The local phase is estimated using the monogenic signal but can be estimated using other approaches. It is invariant to contrast and describes the structure of the object of interest rather than the magnitude.
As shown in Figure 2, h1 and h2 provide the quadrature pair of filters required for local phase estimation. In the frequency domain, these filters are defined as

H1(u,v) = i·u / √(u² + v²) and H2(u,v) = i·v / √(u² + v²),

so that H1(u,v) is a cosine in the Fourier domain and H2(u,v) is a sine. This pair of filters defines the Riesz transform. The scale selected determines the size of the structures detected. The choice of the band-pass filter used also affects the final result.
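By way of illustration, the estimation chain described above can be sketched in the Fourier domain as follows. The log-Gabor band-pass filter and its parameter values are our own illustrative choices (the choice of band-pass filter is left open above), and the function name is ours; this is a sketch, not the patented implementation.

```python
import numpy as np

def monogenic(image, wavelength=8.0, sigma_ratio=0.55):
    """Estimate local energy, phase and orientation from the monogenic signal.

    The image is band-passed with a log-Gabor filter, then the two Riesz
    components are obtained in the Fourier domain via H1 = i*u/|w| and
    H2 = i*v/|w|. Parameter values are illustrative defaults.
    """
    rows, cols = image.shape
    u = np.fft.fftfreq(cols)
    v = np.fft.fftfreq(rows)
    U, V = np.meshgrid(u, v)                 # shape (rows, cols)
    radius = np.sqrt(U**2 + V**2)
    radius[0, 0] = 1.0                       # avoid division by zero at DC

    # Log-Gabor band-pass filter b(x,y), defined in the frequency domain
    f0 = 1.0 / wavelength
    log_gabor = np.exp(-(np.log(radius / f0) ** 2) /
                       (2 * np.log(sigma_ratio) ** 2))
    log_gabor[0, 0] = 0.0

    # Riesz transfer functions (a "cosine" and a "sine" of the angle)
    H1 = 1j * U / radius
    H2 = 1j * V / radius

    F = np.fft.fft2(image)
    Ib = np.real(np.fft.ifft2(F * log_gabor))         # even (band-passed) part
    h1Ib = np.real(np.fft.ifft2(F * log_gabor * H1))  # odd part, x direction
    h2Ib = np.real(np.fft.ifft2(F * log_gabor * H2))  # odd part, y direction

    odd = np.sqrt(h1Ib**2 + h2Ib**2)
    energy = Ib**2 + odd**2                  # local energy E(x,y)
    phase = np.arctan2(odd, Ib)              # local phase, in [0, pi]
    orientation = np.arctan2(h2Ib, h1Ib)     # local orientation
    return Ib, h1Ib, h2Ib, energy, phase, orientation
```

In practice this would be evaluated at several wavelengths to obtain the multi-scale responses used by the feature asymmetry measure below.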
Feature asymmetry processing
Image edge features can be detected at points of local phase congruency, obtained by computing the local phase (such as that shown in Figure 1(c)) at multiple scales and combining the results using the feature asymmetry (FA) measure defined as

FA(x,y) = (1/N) Σs ⌊ |os(x,y)| − |es(x,y)| − Ts ⌋ / ( √(es(x,y)² + os(x,y)²) + ε ),

where N is the total number of scales, es and os are the even (band-passed) and odd (Riesz) responses at scale s, ⌊·⌋ denotes zeroing of negative values, Ts is an orientation dependent threshold that controls spurious responses to noise, and ε = 0.01, to avoid division by zero.
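The FA measure can be sketched as follows, given the even and odd responses at each scale. The threshold value T used here is an illustrative constant; the description fixes only ε = 0.01 and treats Ts as an orientation dependent threshold.

```python
import numpy as np

def feature_asymmetry(even_parts, odd_parts, T=0.18, eps=0.01):
    """2D feature asymmetry combined over N scales.

    even_parts / odd_parts: lists of the band-passed image Ib and the
    odd (Riesz magnitude) response at each scale. Returns values in [0, 1],
    high where there is local phase congruency (an edge feature).
    """
    N = len(even_parts)
    fa = np.zeros_like(even_parts[0])
    for e, o in zip(even_parts, odd_parts):
        num = np.abs(o) - np.abs(e) - T
        num[num < 0] = 0.0                            # floor at zero
        fa += num / (np.sqrt(e**2 + o**2) + eps)
    return fa / N
```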
Feature asymmetry returns a value between 0 and 1. A feature is defined as a point where there is local phase congruency among scales.
Thinning
The feature asymmetry image (e.g. Figure 1(d)) can be thinned in order to give better localisation of the actual edges, resulting in an image such as the one shown in Figure 1(e).
Oriented edge map construction
Having obtained an image with preferably well-localized edges (e.g. by thinning as noted above), such as the one shown in Figure 1 (e), an oriented edge map can then be constructed by adding the local orientation information to the edge features, which can be obtained from feature asymmetry. Figure 1 (f) shows the local orientation vectors added to the feature edges within the dashed square in Figure 1 (e). Local orientation vectors (represented by arrows in Figure 1 (f)) point outwards from the object of interest. Being derived from structural information, obtained from the monogenic signal, makes this process robust to intensity changes and independent of the intensity gradient.
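A minimal sketch of this construction, assuming the thinned edge mask and the monogenic-signal orientation are already available: each edge pixel is simply tagged with its outward-pointing unit orientation vector. The function name is ours.

```python
import numpy as np

def oriented_edge_map(edge_mask, orientation):
    """Attach the local orientation to each edge pixel.

    edge_mask: boolean map of (thinned) feature asymmetry edges.
    orientation: local orientation in radians, from the monogenic signal.
    Returns edge pixel coordinates (x, y) and their outward unit vectors.
    """
    ys, xs = np.nonzero(edge_mask)
    theta = orientation[ys, xs]
    # Unit vectors pointing outwards from the object of interest
    vx, vy = np.cos(theta), np.sin(theta)
    return np.stack([xs, ys], axis=1), np.stack([vx, vy], axis=1)
```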
Shape fitting and vector analysis
Having determined the orientations of the image edge features, parametric shapes such as (but not necessarily limited to) ellipses or circles can then be automatically fitted to the edge features. Any pixel in the oriented edge map (e.g. as shown in Figure 1 (f)) can be a potential ellipse centre. For each potential centre, the parameters of an ellipse (principal axes and orientation) are iteratively varied and, at each iteration, points on the ellipse that match edge features are analysed and a likelihood score obtained as follows: At each point on the ellipse, the scalar product between the local orientation of each oriented edge feature and the normal to the ellipse at that point is calculated and accumulated. For each potential ellipse centre, the best ellipse parameters for the ellipses fitting the inner and outer contours are obtained using the accumulator values. Coupled ellipses can be identified simultaneously by retaining the two ellipses such that the inner/outer contours have an opposite scalar product sign and minimise/maximise the accumulator at that ellipse centre. Examples of fitted ellipses are illustrated using dashed lines in Figures 4(a) and 4(b). In Figure 4(a), the scalar product is negative at the image edge features on the inner contour matching the fitted ellipse (the dashed line 42). Conversely, in Figure 4(b), the scalar product is positive at the image edge features belonging to the outer contour matching the fitted ellipse (the dashed line 44).
The scalar product is used to update the accumulator for detecting coupled ellipses on the images. Instead of using intensity gradients, the oriented edge map based on structural information is created. For circle detection only three parameters are necessary, whereas ellipse detection requires five parameters.
The scalar product of an ellipse matching the inner contour of the object is negative, since the local orientation of the inner contour and the normal vectors to the fitted ellipse at those edge points are opposite. The scalar product of an ellipse matching the outer contour is positive because the local orientation and the normal vectors to the ellipse at those points are pointing in the same direction.
The final coupled ellipses are obtained using a likelihood score, which combines the results of the inner and outer contours, so the best ellipses are detected simultaneously. The likelihood score can be defined in a number of ways, using the information from the accumulator. An example is given in the following: An ellipse is characterised by five parameters, which are the centre (x0,y0), the semi-major axis a, the semi-minor axis b, and the angle of rotation β. The current implementation is aimed at detecting two concentric ellipses simultaneously. In this particular case, the orientation of both ellipses is set to be the same, making sure that the semi-axes of the outer ellipse are bigger than the semi-axes of the inner ellipse. For each potential ellipse centre (x,y) in the image, the best ellipse parameters fitting the inner and outer contours are obtained from the accumulator H as

(a_in, b_in, β_in) = argmin over (a,b,β) of H(x,y,a,b,β), and

(a_out, b_out, β_out) = argmax over (a,b,β) of H(x,y,a,b,β), subject to a_out > a_in and b_out > b_in,

respectively. In this particular application, the orientation of the best-fitted ellipses to the inner and outer contours is assumed to be similar. This criterion can be modified depending on the application, and other criteria, such as preserving the eccentricity, can be used instead. Furthermore, the criteria can be modified to find more than two ellipses, concentric or not.

Once the best parameters for the inner and outer ellipses are obtained at each image pixel (x,y), a likelihood score L(x,y) can be defined as

L(x,y) = H(x,y, a_out, b_out, β_out) − H(x,y, a_in, b_in, β_in).

L(x,y) is maximised to find the centre (x0,y0) of the coupled ellipses that best fit the edges in the oriented edge map.
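The role of the signed scalar product can be sketched as follows for a single candidate ellipse. This is a brute-force illustration of one accumulator entry, not the patented implementation; the function, tolerance, and parameter names are ours. A negative sum indicates the candidate matches the inner contour, a positive sum the outer contour.

```python
import numpy as np

def signed_score(center, a, b, beta, edge_pts, edge_dirs, tol=1.0):
    """Accumulate signed dot products between edge orientations and
    ellipse normals, as in the modified Hough transform described above.

    Edge points lying within `tol` pixels of the candidate ellipse
    contribute the dot product of their outward orientation with the
    outward ellipse normal at that point.
    """
    cx, cy = center
    cosb, sinb = np.cos(beta), np.sin(beta)
    # Rotate edge points into the ellipse frame
    dx, dy = edge_pts[:, 0] - cx, edge_pts[:, 1] - cy
    xe = dx * cosb + dy * sinb
    ye = -dx * sinb + dy * cosb
    # Implicit ellipse value; equals 1 exactly on the contour
    val = (xe / a) ** 2 + (ye / b) ** 2
    on_ellipse = np.abs(np.sqrt(val) - 1.0) * min(a, b) < tol
    # Outward normal (gradient of the implicit function), rotated back
    nx_e, ny_e = xe / a**2, ye / b**2
    nx = nx_e * cosb - ny_e * sinb
    ny = nx_e * sinb + ny_e * cosb
    norm = np.hypot(nx, ny)
    norm[norm == 0] = 1.0
    dots = (edge_dirs[:, 0] * nx + edge_dirs[:, 1] * ny) / norm
    return dots[on_ellipse].sum()
```

A full implementation would evaluate this score over all candidate centres and (a, b, β) combinations to populate the accumulator H.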
Parametric shapes other than circles or ellipses may be fitted using this technique. For example, shapes with straight edges (e.g. triangles, hexagons, squares or rectangles) may be fitted, as may irregular shapes. Three-dimensional structures such as spheres/hollow spheres or cylinders/tubes may also be fitted to three-dimensional images using this technique.
Quantitative analysis
Having fitted the shapes, measurements or other quantitative analytical calculations may then be carried out using the fitted shapes. For example, linear, area or volume measurements relating to the fitted shapes may be evaluated, or ratios or other functions may be calculated. As those skilled in the art will appreciate, the quantitative results may be presented to the user on a display screen (e.g. superimposed on the images to which they relate), printed on paper, or saved in memory or in a data storage device.
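For example, the area of the tissue ring between coupled ellipses (such as the adipose layer in the fetal arm example) follows directly from the fitted semi-axes; the helper name below is ours.

```python
import math

def ring_area(a_out, b_out, a_in, b_in):
    """Area between coupled ellipses: pi*a_out*b_out - pi*a_in*b_in."""
    return math.pi * (a_out * b_out - a_in * b_in)
```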
Experimental demonstration
The above techniques were used on a dataset of ultrasound images of the fetal arm to quantify adipose tissue across gestation, obtaining good results as presented in Figures 5(a) to 5(c). These figures illustrate ellipses fitted to edge maps (left hand images) and to ultrasound images (right hand images), with Figure 5(a) showing a fetal mid-arm cross-section at 21 weeks of gestation, Figure 5(b) showing a fetal mid-arm cross-section at 26 weeks of gestation, and Figure 5(c) showing a fetal mid-arm cross-section at 28 weeks of gestation. The dashed line contours in the left hand images and the solid line contours in the right hand images correspond to the fitted ellipses. The dashed lines on the right hand images correspond to manual delineations by the same trained clinician at two different time points. Figure 6 displays the fitted ellipses (solid line contours) over the manually delineated contours (dashed lines) by the same clinical expert. This example shows how difficult it is to manually delineate the contours of the object of interest, since the two manual delineations have a high variability among themselves, due to fuzzy regions in the ultrasound image. This shows the necessity of automated tools for soft tissue quantification, and how well the present ellipse fitting method performs using the image information.
References
[1] Noble, J.A., Boukerroui, D.: Ultrasound image segmentation: a survey. IEEE Trans. Med. Imaging 25(8), 987-1010 (2006)
[2] Ballard, D.H.: Generalizing the Hough Transform to detect arbitrary shapes. Pattern Recognition 13(2), 111-122 (1981)
[3] Hough, P.V.C.: Method and means for recognizing complex patterns. U.S. Patent 3,069,654 (1962)
[4] Felsberg, M., Sommer, G.: The monogenic signal. IEEE Trans. Signal Process. 49(12), 3136-3144 (2001)
[5] Kovesi, P.: Image features from phase congruency. Videre 1(3) (1999)

Claims

1. An image processing method performed by a processor and comprising: receiving a first image containing one or more objects of interest;
deriving local phase and local orientation information from the first image; identifying one or more edge features of the object(s) of interest using the local phase information; and
fitting one or more parametric shapes to the edge feature(s) of the object(s) of interest using the local orientation information.
2. A method as claimed in claim 1, wherein the local phase information is derived from the first image using the monogenic signal.
3. A method as claimed in claim 1 or claim 2, wherein the step of identifying the edge feature(s) comprises computing the local phase at multiple scales and applying a feature asymmetry measure.
4. A method as claimed in claim 3, wherein the step of identifying the edge feature(s) further comprises applying non-maximal suppression in the local orientation direction.
5. A method as claimed in any preceding claim, wherein the step of fitting parametric shapes to the edge feature(s) comprises fitting ellipses or circles.
6. A method as claimed in claim 5, wherein coupled ellipses or circles are fitted simultaneously to inner and outer contours of the object(s) of interest.
7. A method as claimed in claim 6, wherein the coupled ellipses or circles are concentric.
8. A method as claimed in any of claims 1 to 4, wherein the step of fitting parametric shapes to the edge feature(s) comprises fitting three-dimensional shapes.
9. A method as claimed in any preceding claim, further comprising performing quantitative analysis of the image using the fitted shapes.
10. A method as claimed in claim 9, further comprising displaying the results of the quantitative analysis.
11. A method as claimed in claim 10, further comprising superimposing the results of the quantitative analysis on the image(s) to which they relate.
12. Imaging apparatus configured to implement a method as claimed in any preceding claim.
13. Imaging apparatus as claimed in claim 12, being a medical scanner.
14. Imaging apparatus as claimed in claim 13, being an ultrasound scanner.
15. Imaging apparatus as claimed in claim 13, being a computed tomography scanner or a magnetic resonance imaging scanner.
16. Imaging apparatus as claimed in claim 12, being an industrial scanner.
17. Image processing apparatus configured to implement a method as claimed in any of claims 1 to 11.
18. A processor configured to implement a method as claimed in any of claims 1 to 11.
19. A computer program or set of instruction code for implementing a method as claimed in any of claims 1 to 11 when executed on a processor.
20. A computer-readable medium or physical carrier signal encoding a computer program as claimed in claim 19.
21. An image processing method substantially as herein described with reference to and as illustrated in any combination of the accompanying drawings.
22. Imaging apparatus, image processing apparatus, or a processor substantially as herein described with reference to and as illustrated in any combination of the accompanying drawings.
23. A computer program or set of instruction code substantially as herein described with reference to and as illustrated in any combination of the accompanying drawings.
PCT/GB2014/050010 2013-01-07 2014-01-03 Methods and apparatus for image processing WO2014106747A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1300198.7 2013-01-07
GBGB1300198.7A GB201300198D0 (en) 2013-01-07 2013-01-07 Methods and apparatus for image processing

Publications (1)

Publication Number Publication Date
WO2014106747A1 true WO2014106747A1 (en) 2014-07-10

Family

ID=47748033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2014/050010 WO2014106747A1 (en) 2013-01-07 2014-01-03 Methods and apparatus for image processing

Country Status (2)

Country Link
GB (1) GB201300198D0 (en)
WO (1) WO2014106747A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109753887A (en) * 2018-12-17 2019-05-14 南京师范大学 A kind of SAR image target recognition method based on enhancing nuclear sparse expression
CN110070519A (en) * 2019-03-13 2019-07-30 西安电子科技大学 Stitching image measuring method, image mosaic system based on phase equalization
CN112365538A (en) * 2020-10-13 2021-02-12 西安理工大学 Efficient target detection method of automatic reeling system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007010206A1 (en) * 2005-07-18 2007-01-25 Isis Innovation Limited Method and computer program for spatial compounding of images

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BELAID A ET AL: "Phase-Based Level Set Segmentation of Ultrasound Images", IEEE TRANSACTIONS ON INFORMATION TECHNOLOGY IN BIOMEDICINE, IEEE SERVICE CENTER, LOS ALAMITOS, CA, US, vol. 15, no. 1, 1 January 2011 (2011-01-01), pages 138 - 147, XP011373644, ISSN: 1089-7771, DOI: 10.1109/TITB.2010.2090889 *
FATHIMA SANA ET AL: "A novel local-phase method of automatic atlas construction in fetal ultrasound", MEDICAL IMAGING 2011: IMAGE PROCESSING, SPIE, 1000 20TH ST. BELLINGHAM WA 98225-6705 USA, vol. 7962, no. 1, 3 March 2011 (2011-03-03), pages 1 - 10, XP060009541, DOI: 10.1117/12.878317 *
MARTIN STORATH: "Directional Multiscale Amplitude and Phase Decomposition by the Monogenic Curvelet Transform", SIAM JOURNAL ON IMAGING SCIENCES, vol. 4, no. 1, 1 January 2011 (2011-01-01), pages 57 - 78, XP055109315, DOI: 10.1137/100803924 *
UNSER M ET AL: "Multiresolution Monogenic Signal Analysis Using the Rieszâ Laplace Wavelet Transform", IEEE TRANSACTIONS ON IMAGE PROCESSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 18, no. 11, 1 November 2009 (2009-11-01), pages 2402 - 2418, XP011294582, ISSN: 1057-7149 *

Also Published As

Publication number Publication date
GB201300198D0 (en) 2013-02-20


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14700018; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14700018; Country of ref document: EP; Kind code of ref document: A1)