GB2407636A - Automated measurement in images - Google Patents
- Publication number
- GB2407636A GB2407636A GB0325614A GB0325614A GB2407636A GB 2407636 A GB2407636 A GB 2407636A GB 0325614 A GB0325614 A GB 0325614A GB 0325614 A GB0325614 A GB 0325614A GB 2407636 A GB2407636 A GB 2407636A
- Authority
- GB
- United Kingdom
- Prior art keywords
- image
- intensity
- location device
- edges
- feature location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1075—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0858—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving measuring tissue layers, e.g. skin, interfaces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0866—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Health & Medical Sciences (AREA)
- Geometry (AREA)
- Acoustics & Sound (AREA)
- Computer Networks & Wireless Communication (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
An apparatus 2 for measuring nuchal translucency in foetal ultrasound images comprises an input for receiving an image from an ultrasound scanner 4, an input for receiving a signal indicating a user-defined region of interest in the image, an artificial neural network 26 for locating intensity peaks within the selected region of the image and a processor 22 for determining a measurement of the thickness of the nuchal translucency on the basis of the located intensity peaks.
Description
AUTOMATED MEASUREMENT IN IMAGES
This invention relates to a system and method for automatically measuring dimensions in images.
The invention is applicable to the measurement of nuchal translucency in fetal ultrasound images. However it is not intended that the invention be limited to this field of application. The technique is applicable to any suitable imaging application.
The probability of chromosomal defects in the fetus can be assessed, in conjunction with maternal age, at 11-14 weeks gestation by measuring nuchal translucency thickness in fetal ultrasound images. The following two articles describe this in more detail: Nicolaides K.H., Azar G., Byrne D., Mansur C., Marks K. 'Fetal nuchal translucency: ultrasound screening for chromosomal defects in first trimester of pregnancy' Br. Med. J. 1992; 304:867-9; and Pandya P.P., Kondylios A., Hilbert L., Snijders R.J., Nicolaides K.H. 'Chromosomal defects and outcome in 1015 fetuses with increased nuchal translucency' Ultrasound Obstet. Gynecol., 1995; 5:1:15-9.
Translucency thickness is measured by capturing an image using an ultrasound scanner and then manually positioning cursors on the translucency edges of the B-mode image, using the high-contrast trailing-to-leading echo edges for measurement. However, the finite rise and fall times of ultrasound scanner pulses are generally processed non-linearly by the scanner amplifier, resulting in a small but perceptible variation in echo size with scanner gain and a consequent measurement dependence. Measuring between similar relative positions on the translucency echoes, i.e. leading-to-leading edge or peak-to-peak, minimises the gain dependency, but leading edges can be obscured and the eye is relatively insensitive when discriminating subtle brightness changes at high intensities, such as echo peaks or saturated or near-saturated echoes.
These, and other factors including fetal movement, acquisition of the correct sagittal image plane, a sub-conscious tendency to underestimate thickness at critical threshold levels, scanner process settings, the display contrast and brightness settings, ambient lighting and slight variation in ultrasound pulse length with different scanner types and transducers, combine to make nuchal translucency measurements subjective.
Translucency repeatability studies report inter- and intra-operator repeatability coefficients of up to ±0.88 mm and ±0.70 mm respectively (95% confidence level). Based on these reports the inter-operator range of, for example, a 3.5 mm translucency could be from 2.62 mm to 4.38 mm. Clearly an automated, objective translucency measurement method is desirable.
Ideally a translucency measurement system should be invariant to the factors discussed above, with good repeatability and reliability over the clinical image quality range. Bernardino F., Cardoso R., Montenegro N., Bernardes J., Marques de Sa J. 'Semiautomated Ultrasonographic Measurement of Fetal Nuchal Translucency using a Computer Software Tool' Ultrasound in Med. & Biol. 1998; 24:1:51-54 have reported a semiautomated technique based on analyzing brightness gradients (Sobel gradient function) at the nuchal edges.
The method, however, is scanner gain dependent, performs poorly with noisy images, and requires considerable manual supervision.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements and in which: Figure 1 is a block diagram that illustrates an overview of a system for automatically measuring nuchal translucency in fetal ultrasounds; Figure 2 shows an example of automated translucency edge detection; Figure 3 shows the vertical brightness profile through enlargement a) of Figure 2; Figure 4 shows a composite image showing a nuchal translucency ultrasound phantom and measurement window at three scanner gain settings; Figures 5a and 5b show the effects of gross gain change on measured thickness in images of the nuchal translucency ultrasound phantom for automated and manual measurements respectively; Figure 6 shows the correlation between manual and automated trailing-to-leading edge and automated peak-to-peak nuchal measurements on 193 clinical images; Figure 7 shows the correlation between automated trailing-to-leading edge and peak-to-peak nuchal measurements on 193 clinical images; and Figure 8 shows the differences between automated trailing-to-leading edge and peak-to-peak nuchal measurements on 193 clinical images.
A method and apparatus for automatically measuring nuchal translucency in fetal ultrasounds is described. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
Embodiments are described herein according to the following outline:
1.0 General Overview
2.0 Structural and Functional Overview
3.0 Method of automatically measuring nuchal translucency in fetal ultrasounds
4.0 Extensions and Alternatives

1.0 General Overview
The needs identified in the foregoing Background, and other needs and objects that will become apparent from the following description, are achieved in the present invention, which comprises, in one aspect, a method for automatically measuring nuchal translucency in fetal ultrasounds. There is provided an automated feature measurement apparatus for providing an indication of distance between features within an ultrasound image. The apparatus comprises an input for receiving a signal from an ultrasound scanner, said signal representing an image as captured by the ultrasound scanner, and an input for receiving a signal indicating a selected region of the image. A feature location device is provided for locating features of peak intensity within the selected region of the image. A processor is provided to determine a measurement on the basis of the located features.
In other aspects, the invention encompasses a computer apparatus and a computer-readable medium configured to carry out the foregoing steps.
A PC-based nuchal translucency measurement method is provided which uses artificial neural networks to detect the translucency edges. Measurements are largely invariant to all of the factors previously discussed except the purely clinical: fetal movement and image scan plane. In use it is generally only necessary for an operator to point to the translucency on the image using a cursor device to obtain the measurement. Performance is reported for automated trailing-to-leading edge and peak-to-peak versus manual measurements using simulated (n = 50) and clinical image sets (n = 193).
2.0 Structural and Functional Overview
Figure 1 shows a block diagram that illustrates an overview of a system for automatically measuring nuchal translucency in fetal ultrasounds. Automated nuchal translucency measurement apparatus 2 comprises a processor 22, memory 24 and an artificial neural network 26. Connected to the apparatus 2 is an ultrasound scanner 4 of any suitable type. Also connected to the apparatus 2 is a display 6, such as a cathode ray tube (CRT) display, a thin-film display or any other suitable display device. A selection device 8, such as a computer mouse, a stylus or the like, provides a means of selecting an area of an image displayed on the display 6.
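The component arrangement of Figure 1 can be summarised in code. The following is a minimal sketch only, not the patented implementation: the class name MeasurementApparatus, the callable types and the measure method are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Tuple
import numpy as np

# Hypothetical wiring of the components in Figure 1: the feature location
# device (ANN 26) and the processor (22) operating on a scanner image and a
# user-selected point from the selection device 8.
@dataclass
class MeasurementApparatus:
    locate_edges: Callable[[np.ndarray, Tuple[int, int]], tuple]   # ANN 26
    compute_thickness: Callable[[tuple, float], float]             # processor 22

    def measure(self, image: np.ndarray, click_xy: Tuple[int, int],
                pixels_per_mm: float) -> float:
        """Receive an image and a selected region (here just a click point),
        locate the intensity-peak edges and return a thickness in mm."""
        edges = self.locate_edges(image, click_xy)
        return self.compute_thickness(edges, pixels_per_mm)
```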
Artificial neural networks (ANNs) are simplified models of biological neural networks. Their principal advantages are that they can be trained to discriminate between data classes with good classification performance and that they require little expert knowledge of the discrimination process in their design. The ANNs used here to detect nuchal translucency edges were originally designed to identify brachial artery walls in B-mode ultrasound images, as described in the article by V.R. Newey and D.K. Nassiri "On-line artery diameter measurement in ultrasound images using artificial neural networks" Ultrasound in Medicine and Biology 2002; 28: 2: 209-216.
The rationale for utilizing an existing ANN is that the biological tissue interfaces in both applications, i.e. artery wall and nuchal translucency, produce similar ultrasound image characteristics with a similar measurement dimension i.e. the trailing edge of the proximal echo to the leading edge of the distal echo.
The original networks have also demonstrated successful on-line detection and tracking of brachial and carotid artery walls using clinical images from different ultrasound scanner types over a range of image zoom settings. A detailed account of ANN theory is beyond the scope of this description; however, comprehensive descriptions are available elsewhere.
3.0 Method of measuring the nuchal thickness
A user is required to identify the nuchal translucency in a fetal ultrasound image by pointing at it with a mouse cursor using the selection device 8. The neural networks 26 then effectively conduct a vertical pixel-by-pixel search about this start point to locate, within a search window, the positions with the highest probability of being translucency edges. A problem associated with ANNs is that there is no generally accepted method for establishing the contribution made by individual network inputs to the discrimination process. It would be difficult, therefore, to establish quantitatively any criteria used to identify nuchal edges, such as an edge brightness threshold, and consequently to validate the technique. To overcome this problem the networks are used only to detect the translucency edges; the measurement itself is then refined against a quantitative brightness threshold, as described below.
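A minimal sketch of this column-by-column search is given below. It is illustrative only: edge_probability stands in for the trained neural network's score at a candidate vertical position, the window half-sizes are assumed values, and splitting the search at the clicked row into a proximal (upper) and distal (lower) edge is an assumption, not a detail stated in the description.

```python
import numpy as np

def find_translucency_edges(image, click_row, click_col,
                            edge_probability, half_height=20, half_width=15):
    """For each image column in the search window, scan vertically about the
    user-selected start point and keep the two positions (proximal and distal)
    with the highest edge probability.  edge_probability(profile, row) is a
    stand-in for the neural network's per-position score.  Image boundary
    handling is omitted for brevity."""
    top, bottom = click_row - half_height, click_row + half_height
    upper_edge, lower_edge = [], []
    for col in range(click_col - half_width, click_col + half_width + 1):
        profile = image[top:bottom, col].astype(float)
        scores = np.array([edge_probability(profile, r) for r in range(len(profile))])
        mid = half_height  # proximal edge searched above the click, distal below
        upper_edge.append(top + int(np.argmax(scores[:mid])))
        lower_edge.append(top + mid + int(np.argmax(scores[mid:])))
    return np.array(upper_edge), np.array(lower_edge)
```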
An example of translucency edges detected by the system is shown in Figure 2.
Figure 2 shows an example of automated translucency edge detection within an image 10. An operator uses the selection device 8 to click on the image in the general region of the nuchal translucency (e.g. the centre of the nuchal translucency). A measurement region 12, centred on the chosen point, is then automatically calculated. The user may be able to override this if required.
The vertical size of the area 12 is based on the image resolution (pixels/cm) and the mean value of translucency in the population (say 2.5 mm) plus the axial size of the ultrasound beam. The vertical height is limited to avoid an extensive search, which may erroneously detect tissue interfaces outside the translucency. This means that in some cases the height will be too small, and here it can be increased manually. Generally the operator is only required to point and click on approximately the centre of the nuchal translucency to make an automated measurement. The horizontal size is generally fixed (say 4 mm), but again the operator can manually override this if required. The processor 22 in response initiates the ANN 26 to carry out detection of translucency edges within the selected region 12. (A minimal sketch of this window sizing is given after the list of enlargements below.) Figure 2 also shows five enlargements:
a) The bit map of the measurement region.
b) Neural network detected edges (un-smoothed).
c) Neural network detected edges (smoothed).
and for comparison: d) Sobel gradient/threshold.
e) Manual measurement.
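The window sizing described before the enlargement list can be sketched as follows. The 2.5 mm population mean and 4 mm width are the "say" values quoted above; combining them with the 9.8 pixels/mm zoom and 0.58 mm axial beam size quoted elsewhere in this document for the example call is purely illustrative.

```python
def measurement_window_px(pixels_per_mm, beam_axial_mm,
                          mean_translucency_mm=2.5, window_width_mm=4.0):
    """Return (height_px, width_px) for the measurement region 12: height from
    the population mean translucency plus the beam axial size, width fixed."""
    height_px = round(pixels_per_mm * (mean_translucency_mm + beam_axial_mm))
    width_px = round(pixels_per_mm * window_width_mm)
    return height_px, width_px

# e.g. at the 9.8 pixels/mm zoom used in the phantom tests, with an assumed
# 0.58 mm axial beam size, the window is roughly 30 x 39 pixels
print(measurement_window_px(9.8, 0.58))
```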
With the image shown, it proved impossible to set the threshold in d) to give a continuous lower translucency edge as per the result with the neural network.
In b) and c) the trailing-to-leading edge markers are shown in white and peak-to-peak in black. The horizontal dotted lines in a), b) and c) are the search constraints set by the selected region 12. (The measurement boxes shown here are wider than in practice to demonstrate nuchal edge tracking.) The edges detected by the neural network 26 are then refined by the processor 22 to give measurements based on a quantitative brightness threshold I0. For a given edge, this threshold was defined as follows:

I0 = k (Ipeak − Imin) + Imin

where k is a threshold constant, Ipeak is the translucency edge peak brightness and Imin is the minimum brightness between the translucency edges.

Figure 2a is an expanded view of the smoothed edges detected by the neural network, as shown in Figure 2. Once the neural network has detected these edges (as shown in Figure 2a), the processor 22 operates to calculate I0 and the thickness of the nuchal translucency. The processor operates as follows.
Firstly, the processor estimates a centre line A-A between the edges detected by the neural network. The processor 22 then examines the pixels on a line B-B perpendicular to the tangent of the centre line A-A, working along successive lines B-B perpendicular to the tangent at each point along the centre line A-A within the selected area 12. For instance, considering the image as shown in Figure 2a, the processor 22 estimates a general centre line A-A and then, working from one side of the image to the other, examines pixels along a line B-B normal to tangents of the centre line A-A. Examination of the pixels along the line B-B produces a brightness profile along that cross-section of the selected area.
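A sketch of this centre-line and normal-profile step is shown below, assuming the per-column edge positions from the earlier search. The interpolation routine, profile length and local tangent estimate are implementation choices of this illustration, not details taken from the patent.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def normal_profiles(image, upper_edge, lower_edge, cols, profile_half_len=15):
    """Estimate the centre line A-A midway between the detected edges and
    sample a brightness profile along a line B-B normal to its local tangent
    at each column.  Sub-pixel sampling uses bilinear interpolation."""
    centre = (np.asarray(upper_edge) + np.asarray(lower_edge)) / 2.0
    profiles = []
    for i, col in enumerate(cols):
        # local tangent of the centre line from neighbouring columns
        j0, j1 = max(i - 1, 0), min(i + 1, len(cols) - 1)
        drow = (centre[j1] - centre[j0]) / max(cols[j1] - cols[j0], 1)
        normal = np.array([1.0, -drow]) / np.hypot(1.0, drow)  # (row, col) direction
        t = np.linspace(-profile_half_len, profile_half_len, 2 * profile_half_len + 1)
        rows = centre[i] + t * normal[0]
        cs = col + t * normal[1]
        profiles.append(map_coordinates(image.astype(float), [rows, cs], order=1))
    return centre, profiles
```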
Figure 3 shows an example of a brightness profile through the selected area 12 of Figure 2 (enlargement c), showing proximal and distal nuchal edges along a line B-B as detected by the neural network 26. Inner dashed lines are trailing-to-leading edge points detected automatically by the neural networks at k = 0.65, i.e. 65% of the difference between Ipeak and Imin. Outer dashed lines are the limits of the search for translucency edge peaks.
Ipeak was found by searching beyond the edges selected by the ANN over a distance slightly less than the scanner axial resolution. A value of k = 0.65 was found to locate translucency trailing-to-leading edge positions very similar to those selected by experienced operators, but with greater consistency. However, setting k = 1.0 gives nuchal echo peak-to-peak measurements, which have even greater consistency as a result of lower scanner gain dependence. By this method the nuchal translucency edges were tracked laterally over the search window and the nuchal thickness normal to the translucency centre line measured at each point, from which the mean thickness over the search window width was calculated.
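The threshold refinement on a single brightness profile can be sketched as follows. This is an illustration, not the patented code: the peak search distance in pixels and the walk-towards-the-peak threshold crossing are assumptions consistent with, but not dictated by, the description; k = 0.65 and k = 1.0 are the values quoted above.

```python
import numpy as np

def refine_edges(profile, upper_idx, lower_idx, k=0.65, peak_search_px=5):
    """Refine one pair of ANN-detected edge positions on a brightness profile.
    Ipeak is searched beyond each ANN edge within peak_search_px (a stand-in
    for 'slightly less than the scanner axial resolution'), Imin is the minimum
    between the edges, and the refined edge is where brightness first reaches
    I0 = k * (Ipeak - Imin) + Imin moving out from the translucency centre.
    k = 0.65 gives trailing-to-leading edge points, k = 1.0 peak-to-peak."""
    profile = np.asarray(profile, dtype=float)
    i_min = profile[upper_idx:lower_idx + 1].min()

    def one_edge(idx, direction):
        # search for the echo peak beyond the ANN edge (direction -1 = upwards)
        if direction < 0:
            lo, hi = max(idx - peak_search_px, 0), idx
        else:
            lo, hi = idx, min(idx + peak_search_px, len(profile) - 1)
        peak_idx = lo + int(np.argmax(profile[lo:hi + 1]))
        i0 = k * (profile[peak_idx] - i_min) + i_min
        pos = idx
        while 0 < pos < len(profile) - 1 and profile[pos] < i0:
            pos += direction  # walk towards the peak until the threshold is crossed
        return pos

    upper = one_edge(upper_idx, -1)   # proximal edge: walk towards the upper echo
    lower = one_edge(lower_idx, +1)   # distal edge: walk towards the lower echo
    return upper, lower

# thickness in mm for one profile is (lower - upper) / pixels_per_mm; the mean
# over all profiles in the search window gives the reported measurement
```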
An integrated software package was written for the system using LabVIEW and IMAQ (National Instruments, Austin, Texas, USA) and the ANNs were designed using NeuralWorks Professional II/PLUS (Pittsburgh, PA, USA).
System validation
Validation was carried out to establish: the dependency of measurements on the scanner gain and translucency rotation; automated versus manual measurement repeatability; and the subjective clinical reliability of the technique. This required a series of tests, some of which could only be addressed using simulated images.
For example, it was impossible to explore scanner gain and translucency rotation dependency in a clinical setting, for both practical and ethical reasons.
Therefore, a series of progressively more realistic images was obtained from simple translucency simulations, a translucency ultrasound phantom, a set of 50 realistic simulated fetal images of defined translucency thickness, and finally a representative database of 193 clinical images with known manual measurements. For comparison an experienced operator conducted manual measurements on some simulated images. The repeatability coefficient for this operator was known from an earlier study, representing average ability in a group of thirteen operators. Repeatability coefficients were calculated by plotting pair differences against pair means. All results reported are at the 95% confidence level (standard deviation × 1.96) and Pearson's coefficient was used in all correlation calculations.
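A minimal sketch of these two statistics is given below; the use of the sample standard deviation (ddof = 1) is an assumption of this illustration.

```python
import numpy as np

def repeatability_coefficient(first, second):
    """Bland-Altman style repeatability: 1.96 x the standard deviation of the
    paired differences (95% confidence level), as used for Tables 1 and 2."""
    d = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
    return 1.96 * d.std(ddof=1)

def pearson_r(x, y):
    """Pearson's correlation coefficient, as used for Figures 6 and 7."""
    return float(np.corrcoef(x, y)[0, 1])
```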
Scanner gain dependence
A simple nuchal translucency ultrasound phantom was designed consisting of a thin-wall plastic tube of nominally 4 mm inside diameter embedded in tissue-mimicking materials. The phantom was imaged with an ATL HDI 5000 ultrasound scanner (Eindhoven, Netherlands) and C5-2 transducer using a range of gain settings covering barely perceptible translucency edges to the onset of saturation, at a typical clinical investigation zoom setting of 9.8 pixels per mm. The transducer was clamped to avoid any physical movement. The test was repeated for the scanner C1, C3, and C6 process settings. Figure 4 shows a composite image of the nuchal translucency ultrasound phantom and measurement window at three scanner gain settings, giving barely perceptible (left) to very bright (right) phantom translucencies.
When undertaking comparisons, automated and manual measurements of translucency thickness in these images were made at identical spatial coordinates (as can be seen from Figures 5a and 5b) with identical search window size for the automated measurements. Translucency thickness was therefore dependent upon only the scanner gain and process settings. Figures 5a and 5b show the effects of gross gain change on measured thickness in images of the nuchal translucency ultrasound phantom (Figure 4) with an ATL HDI 5000 scanner at process settings of C1 and C6. Brightness (the two lower curves) was measured by recording the maximum intensity of the proximal translucency edge.
Measurement repeatability
For nuchal translucency measurement in general no adequate quantitative measurement standards are available, as the only data sources are live clinical images or retrospective clinical images recorded with manual measurements at a fixed scanner gain setting. To address this limitation realistic fetal images were simulated using a virtual ultrasound scanner. The technique models the processing steps necessary to create realistic B-mode fetal images of known properties, such as translucency thickness, with user control of scanner gain and time-gain-control and simulation of gain-dependent changes in echo axial sizes. Two sets of fifty realistic fetal images of known translucency thickness and distribution were generated using the method. The sets were at scanner gain extremes, giving a very dim and a very bright set to reflect the brightness range in the scanner gain dependence tests with the translucency ultrasound phantom (Figure 4). Translucency thickness was randomly distributed in the range 1-5 mm at 1 mm intervals, with random clinical effects such as non-horizontal translucencies and translucency artefacts. Translucency was measured in the sets using the automated system and a repeatability coefficient calculated (Table 1).
Measurement method | Repeatability coefficient (mm, 95% confidence level)
---|---
Manual | ±0.42
Automated trailing-to-leading edge | ±0.11
Automated peak-to-peak | ±0.07

Table 1. Repeatability using simulated fetal image sets.
For comparison the inter-operator repeatability coefficient for 13 experienced operators was known (Table 1) for a similar image set with a clinical gain range.
Clinical reliability
The reliability of the automated system in correctly detecting translucency edges was assessed using a representative database of 193 clinical images. This was compiled from a range of scanners (ATL HDI 3000, ATL HDI 5000, Aloka SSD4000PHD, Aloka SSD5500, GE Gateway), transducers (3 to 7 MHz), image zoom (3.9 to 20.5 pixels/mm), nuchal thickness (1.41 mm mean, ±0.43 mm standard deviation), operators and image quality. In this database identical image pairs both with and without manually positioned measurement cursors were available for comparison purposes. Translucency thickness was calculated automatically at two threshold levels, k = 0.65 corresponding to the conventional trailing-to-leading edge dimension, and k = 1.0 corresponding to the peak-to-peak dimension. Agreement between manual and automated methods is shown in Figures 6, 7 and 8. In all simulated and clinical images the brightness profile along a vertical line through the nuchal translucency was examined, as in the example of Figure 3, to confirm the positions of echo peaks detected by the system.
Rotation invariance
To eliminate the influence of other variables the virtual scanner was used to generate a simple translucency image for rotation tests containing a pair of parallel lines of nominally 2 mm separation. By rotating the image through ±15° in 5° increments the rotational dependency of the automated measurement was assessed.
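A sketch of this rotation test is shown below; measure() stands in for the full automated measurement chain, and the interpolation order used for rotation is an assumption.

```python
import numpy as np
from scipy.ndimage import rotate

def rotation_dependency(image, measure, angles_deg=range(-15, 20, 5)):
    """Rotate the simulated two-line translucency image through +/-15 degrees
    in 5 degree steps and record the change in the automated measurement."""
    baseline = measure(image)
    return {a: measure(rotate(image, a, reshape=False, order=1)) - baseline
            for a in angles_deg}
```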
RESULTS
The gain dependency tests (Figures 5a and 5b) show a maximum change of 0.94 mm for manual, 0.64 mm for automated trailing-to-leading edge (edge) and 0.14 mm for automated peak-to-peak (peak) measurements over the practical operational scanner gain range at C1 processing. The corresponding changes for C6 processing were 0.61 mm, 0.33 mm and 0.04 mm. Peak-to-peak measurement, therefore, gave a 15:1 reduction in gain dependency versus manual measurement for the C6 process setting and better than 6:1 for the C1 process. The C3 process results were similar to C1 and have, therefore, been omitted for clarity. Correlation and differences between methods are summarised in table 2.
Method | Automated peak-to-peak | Automated trailing-to-leading edge
---|---|---
Manual trailing-to-leading edge | r = 0.86, P < 0.001; d = ±0.51 mm | r = 0.92, P < 0.001; d = ±0.33 mm
Automated peak-to-peak | r = 1; d = 0 | r = 0.95, P < 0.001; d = ±0.29 mm

Table 2. Correlation (Pearson's r, significance P) and differences (d at 95% confidence) between methods.
Figure 6 shows the correlation (r = 0.92, p < 0.001) between automated and manual trailing-to-leading edge measurements on the clinical image database (n = 193). Automated peak-to-peak values are also shown. This relatively high correlation was expected, as the neural networks were effectively trained using visual measurement criteria, i.e. trailing-to-leading edge measurement points.
The lower correlation and greater differences (r = 0.86, p < 0.001, ±0.51 mm) between manual measurements and automated peak-to-peak measurements (Figure 6) are also consistent, as Figure 5 demonstrates that the peak-to-peak measurement has lower gain dependence than either manual or automated trailing-to-leading edge measurements. The mean vertical separation between the regression lines of Figure 6 is approximately 0.58 mm, representing the mean value of the ultrasound beam axial size in the clinical database. Differences between manual and automated trailing-to-leading edge measurements were within ±0.33 mm, and between manual and automated peak-to-peak measurements within ±0.51 mm (95% confidence level). It is reasonable to attribute a significant proportion of the ±0.51 mm differences to inter-operator repeatability, in view of the reported inter-operator repeatability coefficient of up to ±0.88 mm and the relatively low ±0.29 mm automated peak-to-peak versus trailing-to-leading edge differences (Figure 8).
Figure 7 shows the correlation (r = 0.95, p < 0.001) between automated trailing-to-leading edge and peak-to-peak measurements on the clinical database (n = 193), and Figure 8 shows the corresponding differences (after subtracting from the peak-to-peak measurements the vertical separation between the regression lines in Figure 6). Agreement is within ±0.29 mm (95% confidence level). The maximum errors resulting from rotation of a simulated 2 mm nuchal translucency by ±15° were ±0.06 mm and ±0.01 mm for automated trailing-to-leading edge and peak-to-peak measurements respectively.
The automated peak-to-peak dimension has a lower scanner gain dependence than either manual or automated trailing-to-leading edge dimension (Figure 5).
This is logically consistent, as only the amplitudes, not the positions, of the peaks should change with gain. The peak-to-peak dimension shows a sixfold improvement in repeatability coefficient over experienced operators using simulated fetal images (Table 1) and good correlation with manual and automated trailing-to-leading edge measurements on the clinical database (Figure 6). The clinical set does not represent a measurement 'gold standard', as the true physiological translucency values are unknown; however, some conclusions concerning the likely clinical performance of peak-to-peak measurements can be extrapolated from the results on this set. Clinical repeatability should lie somewhere between the value of ±0.29 mm for this set (Figure 8), which includes some variability due to the scanner gain dependence of the automated trailing-to-leading edge method, and the simulated fetal image repeatability of ±0.07 mm (Table 1). This represents a significant improvement over manual repeatability of up to ±0.88 mm. The vertical separation between the regression lines in Figure 6 approximates the mean ultrasound beam axial dimension over the clinical database. The peak-to-peak dimension can be normalized to conventional trailing-to-leading edge measurements by subtracting a fixed value equal to this axial dimension. This allows established chromosomal defect risk assessment data to be used, but with better repeatability and gain dependence. A further advantage of automation is that saturated translucency edges can be detected. Saturation results in underestimation of translucency thickness, and approximately 8% of the clinical test images had one or both translucency edges saturated, demonstrating that the condition is difficult to detect consistently even for experienced operators.
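The normalization mentioned above amounts to subtracting the beam axial dimension from each peak-to-peak value; a trivial sketch, using the 0.58 mm mean axial size estimated from the regression-line separation in Figure 6, is:

```python
def normalize_peak_to_peak(peak_to_peak_mm, beam_axial_mm=0.58):
    """Relate a peak-to-peak measurement to the conventional trailing-to-leading
    edge dimension by subtracting the ultrasound beam axial size; 0.58 mm is the
    mean value estimated from the clinical database of Figure 6."""
    return peak_to_peak_mm - beam_axial_mm
```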
Automated translucency measurements on the clinical database provide a subjective indicator of system reliability, as the database was compiled from a range and combination of scanner types, image zoom levels, transducer frequencies, operators and image quality. It was possible in this database to compare the automatically detected edges with manually positioned cursors, and the translucency edge peaks with corresponding visually identifiable peaks.
The neural networks detected the same nuchal translucency edges as those chosen by experienced operators in 191 of the 193 clinical images, with good performance on relatively noisy images. In two clinical images the translucency edges were not correctly identified due to a combination of poor edge contrast and relatively bright secondary interfaces within the translucency.
A manual measurement capability was provided for this situation. 190 of the clinical images had echo peaks correctly identified automatically for peak-to-peak measurements. The remaining 3 images had echo peaks that were beyond the search scope, and the translucency based on the peak-to-peak dimension was, therefore, underestimated. In practice this condition can be detected automatically and corrected by using an echo peak position estimator based on the ultrasound beam axial size. The neural networks correctly detected translucency edges in all 50 simulated fetal images.
Thus there is provided a robust automated nuchal translucency measurement technique, which is simple to operate: the user is generally only required to point to the nuchal translucency. The validation study demonstrates very good agreement with translucency edges selected by experienced operators and the peak-to-peak dimension gives measurements that are largely independent of a range of factors currently influencing repeatability: visual limitations, scanner gain and process settings, echo saturation, transducer frequency, sub-conscious measurement bias, display brightness and contrast and ambient lighting. This allows the operator to concentrate more effectively on the clinical aspects that also influence repeatability such as fetal movement and scan plane. The lower repeatability coefficient of the peak-to-peak dimension is likely to significantly increase the accuracy and consistency of measurements and we suggest, therefore, that the automated peak-to-peak dimension is adopted and related to earlier studies by subtracting the ultrasound beam axial dimension from measurements. This enables the continued use of established chromosomal defect risk assessment data but with a more reliable measurement method.
4.0 Extensions and Alternatives
The technique is applicable to any suitable imaging application, for instance, in medical applications, the automated measurement of arterial wall thickness, intima thickness measurement (ITM), vascular wall thickness or diameter, fetal skull measurements, etc. It will also be clear to a person skilled in the art that the technique may also be applied to images of a non-medical nature.
Claims (19)
- Claims 1. An automated feature measurement apparatus for providing an indication of distance between features within an ultrasound image, the apparatus comprising: an input for receiving a signal from an ultrasound scanner, said signal representing an image as captured by the ultrasound scanner, an input for receiving a signal indicating a selected region of the image, a feature location device for locating features of peak intensity within the selected region of the image, and a processor for determining a measurement on the basis of the located features.
- 2. Apparatus as claimed in claim 1 wherein the feature location device comprises a neural network system.
- 3. Apparatus as claimed in claim 1 or 2 wherein the feature location device is arranged to locate peaks of intensity representing edges within the selected region of the image.
- 4. Apparatus as claimed in claim 1, 2 or 3 wherein the processor is arranged to provide an estimation of the width of an object within the image by determining the average distance between the features located by the feature location device.
- 5. Apparatus as claimed in any preceding claim wherein the feature location device is arranged to locate peaks of intensity representing edges within the selected region of the image and the processor is arranged to provide an estimation of the width of an object within the image by determining the average distance between the features located by the feature location device, wherein the processor is arranged to estimate a centre line between the edges located by the feature location device and to estimate the distance on a line normal to the centre line between located edges.
- 6. Apparatus as claimed in any of claims 1 to 5 wherein the feature location device is further arranged to locate trailing-to-leading edges of intensity in the selected region of the image.
- 7. Apparatus as claimed in claim 6 wherein the trailing edge of intensity is a fraction of the difference between the peak intensity and minimum intensity in the region of an edge as detected by the feature location device.
- 8. Apparatus as claimed in claim 7 wherein the trailing edge of intensity is approximately 0.65 of the difference between the peak intensity and minimum intensity in the region of an edge as detected by the feature location device.
- 9. Apparatus as claimed in any preceding claim wherein the apparatus is arranged to provide an indication of nuchal translucency measurement from an ultrasound image of a fetal nuchal region.
- 10. An automated feature measurement method for providing an indication of distance between features within an ultrasound image, the method comprising: receiving a signal from an ultrasound scanner, said signal representing an image as captured by the ultrasound scanner, receiving a signal indicating a selected region of the image, locating, using a feature location device, features of peak intensity within the selected region of the image, and determining, using a processor, a measurement on the basis of the located features.
- 11. A method as claimed in claim 10 further comprising using a neural network system to locate the features.
- 12. A method as claimed in claim 10 or 11 further comprising locating peaks of intensity representing edges within the selected region of the image.
- 13. A method as claimed in claim 10, 11 or 12 further comprising providing an estimation of the width of an object within the image by determining the average distance between the features located by the feature location device.
- 14. A method as claimed in any of claims 10 to 13 further comprising locating peaks of intensity representing edges within the selected region of the image and providing an estimation of the width of an object within the image by determining the average distance between the features located by the feature location device, wherein the average distance is determined by estimating a centre line between the edges located by the feature location device and estimating the distance on a line normal to the centre line between located edges.
- 15. A method as claimed in any of claims 10 to 14 further comprising locating trailing-to-leading edges of intensity in the selected region of the image.
- 16. A method as claimed in claim 15 wherein the trailing edge of intensity is a fraction of the difference between the peak intensity and minimum intensity in the region of an edge as detected by the feature location device.
- 17. A method as claimed in claim 16 wherein the trailing edge of intensity is approximately 0.65 of the difference between the peak intensity and minimum intensity in the region of an edge as detected by the feature location device.
- 18. A method as claimed in any of claims 10 to 17 further comprising providing an indication of nuchal translucency measurement from an ultrasound image of a fetal nuchal region.
- 19. A method as claimed in any of claims 10 to 18 further comprising identifying at least two edges of an object within an image, each edge being represented by a substantially continuous line of peak intensity within the image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0325614A GB2407636B (en) | 2003-11-03 | 2003-11-03 | Automated measurement in images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0325614A GB2407636B (en) | 2003-11-03 | 2003-11-03 | Automated measurement in images |
Publications (4)
Publication Number | Publication Date |
---|---|
GB0325614D0 GB0325614D0 (en) | 2003-12-10 |
GB2407636A true GB2407636A (en) | 2005-05-04 |
GB2407636A8 GB2407636A8 (en) | 2005-05-23 |
GB2407636B GB2407636B (en) | 2006-08-23 |
Family
ID=29725842
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0325614A Expired - Fee Related GB2407636B (en) | 2003-11-03 | 2003-11-03 | Automated measurement in images |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2407636B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009136332A2 (en) * | 2008-05-09 | 2009-11-12 | Koninklijke Philips Electronics N.V. | Automatic ultrasonic measurement of nuchal fold translucency |
JP2012105968A (en) * | 2010-10-20 | 2012-06-07 | Toshiba Corp | Ultrasonic diagnostic apparatus, control method, and image processor |
EP2965693A1 (en) * | 2014-07-11 | 2016-01-13 | Samsung Medison Co., Ltd. | Imaging apparatus and control method thereof |
WO2017149027A1 (en) * | 2016-03-01 | 2017-09-08 | Koninklijke Philips N.V. | Automated ultrasonic measurement of nuchal fold translucency |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100803328B1 (en) | 2006-07-06 | 2008-02-14 | 이화여자대학교 산학협력단 | Method for automated measurement of nuchal translucency in a fetal ultrasound image |
CN103263278B (en) * | 2013-01-23 | 2015-05-13 | 珠海艾博罗生物技术有限公司 | Image processing method for automatically measuring thickness of fetal nuchal translucency from ultrasonic image |
EP3215868B1 (en) * | 2014-11-07 | 2018-10-03 | Tessonics Corp. | An ultrasonic adaptive beamforming method and its application for transcranial imaging |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2273357A (en) * | 1992-12-09 | 1994-06-15 | Nicholas John Wald | Non-invasive medical scanning |
US5838592A (en) * | 1996-10-14 | 1998-11-17 | Diasonics Ultrasound Inc. | Method and apparatus for measuring fetus head and abdominal circumference |
EP1026516A2 (en) * | 1999-02-05 | 2000-08-09 | Animal Ultrasound Services, Inc. | Method and apparatus for analyzing an ultrasonic image of a carcass |
EP1030187A2 (en) * | 1999-02-19 | 2000-08-23 | The John P. Robarts Research Institute | Automated segmentation method for 3-dimensional ultrasound |
EP1324070A1 (en) * | 1996-04-15 | 2003-07-02 | Olympus Optical Co., Ltd. | Diagnostic ultrasonic imaging system |
US20030171668A1 (en) * | 2002-03-05 | 2003-09-11 | Kabushiki Kaisha Toshiba | Image processing apparatus and ultrasonic diagnosis apparatus |
-
2003
- 2003-11-03 GB GB0325614A patent/GB2407636B/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2273357A (en) * | 1992-12-09 | 1994-06-15 | Nicholas John Wald | Non-invasive medical scanning |
EP1324070A1 (en) * | 1996-04-15 | 2003-07-02 | Olympus Optical Co., Ltd. | Diagnostic ultrasonic imaging system |
US5838592A (en) * | 1996-10-14 | 1998-11-17 | Diasonics Ultrasound Inc. | Method and apparatus for measuring fetus head and abdominal circumference |
EP1026516A2 (en) * | 1999-02-05 | 2000-08-09 | Animal Ultrasound Services, Inc. | Method and apparatus for analyzing an ultrasonic image of a carcass |
EP1030187A2 (en) * | 1999-02-19 | 2000-08-23 | The John P. Robarts Research Institute | Automated segmentation method for 3-dimensional ultrasound |
US20030171668A1 (en) * | 2002-03-05 | 2003-09-11 | Kabushiki Kaisha Toshiba | Image processing apparatus and ultrasonic diagnosis apparatus |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009136332A2 (en) * | 2008-05-09 | 2009-11-12 | Koninklijke Philips Electronics N.V. | Automatic ultrasonic measurement of nuchal fold translucency |
WO2009136332A3 (en) * | 2008-05-09 | 2010-03-25 | Koninklijke Philips Electronics N.V. | Automatic ultrasonic measurement of nuchal fold translucency |
JP2012105968A (en) * | 2010-10-20 | 2012-06-07 | Toshiba Corp | Ultrasonic diagnostic apparatus, control method, and image processor |
EP2965693A1 (en) * | 2014-07-11 | 2016-01-13 | Samsung Medison Co., Ltd. | Imaging apparatus and control method thereof |
US10298849B2 (en) | 2014-07-11 | 2019-05-21 | Samsung Medison Co., Ltd. | Imaging apparatus and control method thereof |
WO2017149027A1 (en) * | 2016-03-01 | 2017-09-08 | Koninklijke Philips N.V. | Automated ultrasonic measurement of nuchal fold translucency |
JP2019510557A (en) * | 2016-03-01 | 2019-04-18 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Automatic ultrasonic measurement of transmission image of eyelids |
US11553892B2 (en) | 2016-03-01 | 2023-01-17 | Koninklijke Philips N.V. | Automated ultrasonic measurement of nuchal fold translucency |
Also Published As
Publication number | Publication date |
---|---|
GB2407636B (en) | 2006-08-23 |
GB0325614D0 (en) | 2003-12-10 |
GB2407636A8 (en) | 2005-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11344278B2 (en) | Ovarian follicle count and size determination using transvaginal ultrasound scans | |
Krivanek et al. | Ovarian ultrasound image analysis: Follicle segmentation | |
KR101121353B1 (en) | System and method for providing 2-dimensional ct image corresponding to 2-dimensional ultrasound image | |
KR101121396B1 (en) | System and method for providing 2-dimensional ct image corresponding to 2-dimensional ultrasound image | |
Collins et al. | Computer-assisted edge detection in two-dimensional echocardiography: comparison with anatomic data | |
Gustavsson et al. | A dynamic programming procedure for automated ultrasonic measurement of the carotid artery | |
US20110196236A1 (en) | System and method of automated gestational age assessment of fetus | |
EP1913874B1 (en) | Ultrasound diagnostic apparatus and method for measuring a size of a target object | |
KR101175426B1 (en) | Ultrasound system and method for providing three-dimensional ultrasound image | |
US10572987B2 (en) | Determination of localised quality measurements from a volumetric image record | |
Aarnink et al. | A practical clinical method for contour determination in ultrasonographic prostate images | |
JP7266523B2 (en) | Prenatal ultrasound imaging | |
CN110811691A (en) | Method and device for automatically identifying measurement items and ultrasonic imaging equipment | |
US20200202551A1 (en) | Fetal development monitoring | |
CN115429325A (en) | Ultrasonic imaging method and ultrasonic imaging equipment | |
Sindhwani et al. | Semi‐automatic outlining of levator hiatus | |
MacGillivray et al. | The resolution integral: visual and computational approaches to characterizing ultrasound images | |
Khan et al. | Semiautomatic quantification of carotid plaque volume with three-dimensional ultrasound imaging | |
GB2407636A (en) | Automated measurement in images | |
JP2020103883A (en) | Ultrasound imaging system and method for displaying target object quality level | |
WO2006117497A2 (en) | Automated measurement in images | |
CN106204523A (en) | A kind of image quality evaluation method and device | |
Lakshmi et al. | Automated screening for trisomy 21 by measuring nuchal translucency and frontomaxillary facial angle | |
Thomas et al. | Automatic measurements of fetal long bones. A feasibility study. | |
KR101024857B1 (en) | Ultrasound system and method for performing color modeling processing on three-dimensional ultrasound image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20131103 |