WO2016042146A1 - A portable ultrasound system for use in veterinary applications


Info

Publication number
WO2016042146A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
feature
user
image
portable
Prior art date
Application number
PCT/EP2015/071481
Other languages
French (fr)
Inventor
Tarik CHOWDHURY
Daniel Ryan
John Mallon
Original Assignee
Reproinfo Ltd.
Priority date
Filing date
Publication date
Application filed by Reproinfo Ltd. filed Critical Reproinfo Ltd.
Publication of WO2016042146A1 publication Critical patent/WO2016042146A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0012 - Biomedical image inspection
    • G06T 7/10 - Segmentation; Edge detection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 - Displaying means of special interest
    • A61B 8/462 - Displaying means of special interest characterised by constructional features of the display
    • A61B 8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0866 - Detecting organic movements or changes involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • A61B 8/44 - Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4427 - Device being portable or laptop-like
    • A61B 8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 - Devices using data or image processing involving processing of medical diagnostic data
    • A61B 8/5223 - Devices using data or image processing for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 8/56 - Details of data transmission or power supply
    • A61B 8/565 - Details of data transmission or power supply involving data transmission via a network
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/107 - Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B 5/1075 - Measuring physical dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • A61B 2503/00 - Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/40 - Animals
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 - ICT for calculating health indices; for individual health risk assessment
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10132 - Ultrasound image
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20112 - Image segmentation details
    • G06T 2210/00 - Indexing scheme for image generation or computer graphics
    • G06T 2210/41 - Medical

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physiology (AREA)
  • Gynecology & Obstetrics (AREA)
  • Pregnancy & Childbirth (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Quality & Reliability (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The present application relates to portable ultrasound scanning devices primarily used in veterinary applications, for example with bovines, horses, pigs, sheep and camels. The application provides a means for the device operator to easily indicate a feature type and its general location, whereupon the system returns a fully identified feature, from which physiological data pertaining to the desired feature may be computed by the device to assist in determining a diagnosis.

Description

A PORTABLE ULTRASOUND SYSTEM FOR USE IN VETERINARY APPLICATIONS
Field of the Application
The present application pertains to portable ultrasonic devices used in veterinary applications.
Background
Ultrasonography, or sonography, is an ultrasound-based diagnostic imaging technique used to visualise internal structures such as muscles, tendons and organs. Conventional ultrasound imaging systems are large and bulky and operate in fixed, constrained environments.
In contrast, ultrasonography in veterinary applications typically requires a light, portable, battery-operated device and a convenient means of viewing the machine output, such as disclosed in WO2000070366, while retaining peripheral vision. Often examination time must be kept to a minimum, e.g. a minute or two. Characteristic features of existing portable devices, such as those disclosed in US5957846, EP2749228, US7141020 and US20140024939, include water-resistance, ease of cleaning, impact resistance and robustness, light weight (as the device has to be carried by the operator for prolonged periods), battery power with a long running time, a range of transducer types such as sector and linear, and a range of viewing devices such as wearable monitors and headsets.
This application refers to ultrasound data as the output data from a device, typically in image frame format, but not limited to such a format. This data can be acquired in varying coordinate spaces, for example with linear probes or with sectorial probes. Data can also be presented in varying coordinate spaces, often called modes. B-Mode is commonly used and represents the echoes of an array of transducers captured simultaneously, with the ultrasound plotted in 2D on the screen as a function of depth. Doppler mode makes use of the Doppler effect to measure and visualise flow; it can be used in conjunction with B-Mode. The present application relates specifically to B-Mode and Doppler mode, and variations thereof, but is not limited to these named modalities.
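As an illustration of the coordinate-space point above, the following is a minimal sketch of scan conversion, mapping sector-probe samples acquired in polar coordinates (beam angle, depth) onto a Cartesian B-Mode grid. The probe geometry, array shapes and nearest-neighbour sampling are all assumptions made for illustration, not details taken from any particular device.

```python
# Illustrative scan conversion: polar (depth, beam angle) -> Cartesian B-Mode.
# Geometry values (field of view, maximum depth) are assumed placeholders.
import numpy as np

def scan_convert(polar_frame, fov_deg=60.0, max_depth_mm=150.0, out_px=512):
    """polar_frame: 2D array, rows = depth samples, cols = beam angles."""
    n_depth, n_beams = polar_frame.shape
    angles = np.deg2rad(np.linspace(-fov_deg / 2, fov_deg / 2, n_beams))
    ys, xs = np.mgrid[0:out_px, 0:out_px]
    half_width_mm = max_depth_mm * np.sin(angles[-1])
    x_mm = (xs - out_px / 2) * (2 * half_width_mm / out_px)
    y_mm = ys * (max_depth_mm / out_px)
    r = np.sqrt(x_mm ** 2 + y_mm ** 2)
    theta = np.arctan2(x_mm, y_mm)          # 0 rad points straight down the beam axis
    valid = (r <= max_depth_mm) & (np.abs(theta) <= angles[-1])
    r_idx = np.clip((r / max_depth_mm * (n_depth - 1)).astype(int), 0, n_depth - 1)
    t_idx = np.clip(((theta - angles[0]) / (angles[-1] - angles[0])
                     * (n_beams - 1)).astype(int), 0, n_beams - 1)
    out = np.zeros((out_px, out_px), dtype=polar_frame.dtype)
    out[valid] = polar_frame[r_idx[valid], t_idx[valid]]  # nearest-neighbour lookup
    return out
```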
Existing portable ultrasonic scanning devices typically offer intuitive user interaction, with a minimal number of large, easy-to-locate buttons and a graphical user interface on the video display terminal to provide user feedback and navigation control over menus etc. Commercially available devices provide the ability to conveniently power cycle the machine, adjust the brightness/contrast of the display device, open and navigate menus, increase/decrease the measured field depth from the transducer head, freeze on the current video frame/image, store images and video to a file system memory, recall stored images/video from the file system, and compare stored data against live video.
Many commercially available devices offer an on-screen keyboard to enter details regarding the examination. Navigation is via arrow/up-down key combinations but can be time-consuming. Where animals are fitted with RFID tags, an RFID reader can be connected to the device (wirelessly or wired) and the animal ID can be automatically stored in the scanner memory and displayed on the graphical user interface.
These features have developed in response to the operating conditions in which the devices are used. In most applications strong animal-handling equipment is required due to the likelihood of injury. For example, head-mounted viewing devices must also provide the operator with excellent peripheral vision because of the dangerous conditions. Ultrasound examinations may be conducted outdoors in chutes, where environmental conditions can vary widely. Equipment must be resistant to extremes of hot/cold, wet/dry and sandy/muddy, and must tolerate the physical nature of the activity, where throughput can reach up to 50 examinations per hour per machine.
The role of the operator is to use the ultrasound device to locate a biological feature or features of interest based on their echogenic representation and to offer a diagnosis regarding the desired state of the animal. Thus the echogenic representation of relevant and non-relevant features alike must be known to the operator. A second challenge for the operator is to visually classify the echogenic representation of a particular feature based on characteristics such as its shape, size, position and echogenic intensity, in whole or at a particular location within the feature. The range and variation of features encountered require the operator to be experienced and well trained. As with fixed, high-powered ultrasound equipment used in medical applications, such as detailed in US5605155, some tools exist to partially assist the operator in this role. It will be appreciated that this is difficult in a hospital setting, but incredibly so in a farmyard setting.
Existing portable ultrasound scanning devices provide a "sizing" guide for determining the age of a foetus. These linear (ruler-based) guides offer the operator a reference ruler on the graphical user interface against which the feature size can be compared visually. The guide is based on the crown-rump length or trunk diameter scales. Similarly, the operator can use on-screen callipers to mark the feature at the crown and rump locations, or at either side of the trunk, and the foetus age is calculated by the machine using the linear scale. These latter functions, however, require precisely locating two positions on the video display, which is difficult for the operator given the time required, the high intensity of the work environment and the limited interface on the machine. It will be appreciated that using an input device whilst sitting at a desk is easy; doing so one-handed in a farmyard with animals is not.
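By way of a worked illustration of the calliper-based calculation just described, the sketch below converts two operator-marked points into a crown-rump length and then into an age estimate. The linear coefficients a and b are illustrative placeholders only, not values from this application or from any published growth table.

```python
# Hypothetical crown-rump length (CRL) age estimate; coefficients are placeholders.
import math

def foetus_age_days(crown, rump, mm_per_px, a=0.5, b=20.0):
    """crown, rump: (x, y) pixel locations marked with the on-screen callipers."""
    crl_mm = math.dist(crown, rump) * mm_per_px  # calliper distance scaled to mm
    return a * crl_mm + b                        # assumed linear model: age = a*CRL + b
```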
Another foetus age determination algorithm exists in which an ellipse is fitted by the user to the approximate foetus boundary and the age is determined from the ellipse parameters. However, determining foetus boundary locations in real time remains a challenge in the field of portable ultrasound scanning devices.
A device feature exists implementing an automatic detection process for particularly discernible features that are completely fluid-filled and thus appear as a black connected area in the ultrasound data. The number and type of physiological feature characteristics that can be encountered is much broader than this subset, and thus these devices are limited to a small subset of features. Therefore there is a need for a portable ultrasound system for use in veterinary applications that offers advantages over the prior art.
Description of Drawings
The present application will now be described with reference to the attached figures in which:
Figure 1 is an exemplary ultrasound image of an ovary;
Figure 2 is the ultrasound image of Figure 1 with a Corpus Luteum feature highlighted;
Figure 3 is an ultrasound image of an ovary;
Figure 4 is the ultrasound image of Figure 3 with the Corpus Luteum structure highlighted;
Figure 5 is an ultrasound image of a foetus;
Figure 6 is the ultrasound image of Figure 5 with the location of the foetus highlighted;
Figure 7 is an ultrasound image of a foetus;
Figure 8 is the ultrasound image of Figure 7 with the foetus highlighted;
Figure 9 is an image showing the cross-section of a uterine horn in bovines;
Figure 10 is the ultrasound image of Figure 9 showing the cross-section of the uterine horn with the approximate region boundary highlighted;
Figure 11 is an ultrasound image of a uterine horn in a bovine;
Figure 12 is the ultrasound image of Figure 11 showing the uterine horn with the horn region highlighted;
Figure 13 is an exemplary data framework flowchart illustrating the steps conducted during the examination of an animal;
Figure 14 is an exemplary flowchart illustrating the process of semi-automated data extraction for a physiological feature;
Figure 15 is an example of output from a system of the present application based on the input frame shown in Fig. 5 and Fig. 6, where the small white cross-hair indicates the selected seeding point and the white pixel line represents the boundary of the foetus tissue, and in which an age estimation of the foetus is displayed to aid the operator;
Figure 16 is an example of output based on the input frame shown in Fig. 7 and Fig. 8, where the small white cross-hair indicates the selected seed point and the white pixel line represents the foetus boundary; the age estimation of the foetus is displayed to the operator;
Figure 17 is an example of an early stage foetus, where the small white cross-hair represents the seed point and the white pixel line represents the foetus boundary; note the boundary is not complete in this example as the foetus is lying on the uterine wall; the age estimation is presented to the operator;
Figure 18 is an example of output based on the input frame shown in Fig. 1 and Fig. 2 for an Ovary Corpus Luteum feature, where the white cross-hair indicates the selected seed point and the white pixels represent the boundary of the CL within the ovary tissue; the data presented to the operator in this case is the area in mm² and the maximum diameter;
Figure 19 is an example of output for the frame shown in Fig. 3 and Fig. 4 for an Ovary Corpus Luteum feature, where the white cross-hair represents the selected seed point and the white pixel line represents the CL boundary;
Figure 20 is an example of output for the input frame shown in Fig. 11 for a uterine horn feature; the white cross-hair shows the selected seed position and the white pixel line represents the boundary of the horn against the other elements in the frame; the user in this example is presented with the horn area in mm² and the maximum diameter;
Figure 21 is a flowchart illustrating the image processing pipeline used to identify features in ultrasound data;
Figure 22 shows typical components of existing portable ultrasound scanning devices;
Figure 23 illustrates an exemplary approach suitable for use with existing portable scanning devices, using an auxiliary device to provide the additional functionality and control features;
Figure 24 details possible examples of auxiliary device components;
Figure 25 shows examples of components for the integration of features within the ultrasound device;
Figure 26 is an example illustrating a remote implementation of the present application.
Detailed Description
Interpreting ultrasound images on portable ultrasound devices requires extensive knowledge of how the parameters shape, contour, size, position and echogenicity must each be considered, individually and as a group, to give a consistent, accurate diagnosis.
Attaining the ability to make such complex decisions requires, first of all, extensive knowledge of the three-dimensional shape of the organ in space. The ultrasound operator then needs to know how to position the probe or transducer to attain the most suitable view of the desired feature. In the context of doing this on a farm animal in a farmyard or similar setting, the difficulties, and the difference from a clinical setting with a human patient in a hospital, will be immediately apparent.
The quality of the feature's echogenic representation depends on the operator's understanding of the interactions between the ultrasound wave and the organ tissue, as well as on the proper use of the instrument's controls. The ultrasound operator must form a diagnosis through the correct interpretation of shape, size, position and variations in echogenic intensity.
In addition, the operator must consider the current status or position of the animal in its own production cycle, as the expected normal physiological state constantly changes. This status can be interpreted by the operator from bioinformatics data pertaining to the animal. Based on combinations of these criteria a final diagnosis is formed. It is widely considered that gaining interpretation proficiency requires extensive periods of training and experience.
For example, FIG 1 shows a typical ultrasound representation of an ovary. Within the ovary there are two structures of interest, the Corpus Luteum and a Follicle. FIG 2 highlights the Corpus Luteum location. Similarly, FIG 3 shows another example of an ovary, with the Corpus Luteum highlighted in FIG 4. Note the difference in size of the Corpus Luteum in FIG 1 compared with that depicted in FIG 3. The Corpus Luteum size and intensity may be important for the operator in this case. FIG 5 and FIG 6 show a foetus which, when compared with FIG 7 and FIG 8, demonstrates a difference in foetus size. The size of the foetus at this stage of development is widely considered proportional to its age. In non-pregnant states, the condition of the uterine horn is often assessed as part of the diagnosis. FIG 9 and FIG 10 show a uterine horn that is typical of post-partum animals, or of delayed involution due to infection. FIG 11 and FIG 12 show a cross-section of a uterine horn from a bovine, with varying levels of echogenicity within the horn, which may be important in converging on a diagnosis for the animal. In general these examples demonstrate the variation and detail in physiological features and the inter-feature variation that can be encountered. Also, the "normal" or expected state of these features changes during the production cycle, which can add to the complexity of interpretation.
The present invention may operate within a data infrastructure framework that has been developed to assist the ultrasound operator. This system is described as context for the application, though the application is not dependent on it. The data infrastructure system provides a means for electronic recording of ultrasound diagnosis results and for reviewing animal bioinformatics during the ultrasound examination process. Using a small rugged handheld device (a Personal Digital Assistant, PDA) or similar, the herd information containing the animal IDs, their bioinformatics data, and their previous ultrasound examination results or diagnoses is made available.
The mode of operation of a system according to the present application is generally illustrated in Figure 13, with descriptions of the corresponding apparatus following below. The process begins with the arrival of an animal for testing. Optionally, an ID for the animal is entered. It will be appreciated that in most circumstances this is preferable, as it allows for the subsequent storage of the recorded data for the animal. It also allows for subsequent review of this information at a later date. Indeed, as generally shown in Figure 13, the next step is the retrieval of previously stored information for the animal from a database using the animal ID.
The animal ID may be entered, 202, manually using a keyboard. It will be understood that, by law in Europe, animals such as cows must have a unique identity tag which is used to track them. The tag ID may be used as the animal ID. It is also possible that the animals may be provided with electronic identifiers, e.g. RFID tags, in which case an electronic reader connected to the system may be employed to obtain the animal ID. It will be appreciated that the electronic identifier may need to be correlated with entries in a database to obtain an animal ID from the electronic identifier ID.
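A minimal sketch of the correlation step just mentioned follows, assuming a local herd database with a table mapping RFID tag IDs to statutory animal IDs; the schema and column names are invented for illustration.

```python
# Hypothetical lookup: electronic (RFID) tag ID -> statutory animal ID.
import sqlite3

def animal_id_from_rfid(db_path, rfid_tag):
    with sqlite3.connect(db_path) as conn:
        row = conn.execute(
            "SELECT animal_id FROM herd WHERE rfid_tag = ?",  # assumed schema
            (rfid_tag,),
        ).fetchone()
    return row[0] if row else None  # None when the tag is unknown to the herd
```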
The information which is retrieved, 203, using the animal ID may guide the operator in the performance of the ultrasound examination, 204. Once the ultrasound process has been completed, the user may enter, 205, a diagnosis or other information which in turn may be stored in the database along with the obtained ultrasound image or images.
Using either a communication channel with the ultrasound scanner, or through an auxiliary video overlay device, a user interface is established to provide feedback and a means of presenting data to the operator via the operator's chosen display device. The ultrasound scanner is, as previously described, a portable ultrasound scanner for use in veterinary applications and is generally wearable by the user. The ultrasound scanner is suitably powered by a portable power source (typically batteries, which may be housed within the scanner or provided as a portable battery pack with a connecting power lead). At the same time, the system provides the user with a display device on which the user can view the ultrasound images captured using the scanner. The images may be viewed as individual image frames or as live or replayed video comprising a sequence of image frames.
The scanner may have the ability to store the frames for subsequent playback or to provide the frames to an external device for subsequent playback or review. The system may be connected to a rugged printer or a remote printer, so that the operator may print the results of the examination(s) on site.
The system may also connect with a database via a suitable network connection (e.g. WiFi or mobile internet) to synchronize herd data and upload newly created ultrasound diagnosis data. This upload may happen live, or the material may be stored for upload once a data connection becomes available. Equally, other methods of upload may be provided for, including removable memory devices connected to the system, which may be removed and used to upload the data onto a server. The system suitably provides a range of administrative functions to manage herds, provide various reports and share data with other databases.
Whilst these systems are generally advantageous and offer significant advancements, the operator is still required to make numerous comparisons between the observed ultrasound data and those retained in his/her memory through training and past experience. While size measurements can be assisted with grids and device features such as ruler/linear scale references, these are excessively cumbersome, and many physiological features cannot be analysed in this way.
In this context, it must be remembered that the measurements are being made in the field in a rushed situation, where the entire process with an individual animal, from its arrival at the scanning location to its departure, may take only a minute or a matter of minutes, and where the operating environment is far from laboratory conditions.
This results in a degree of inaccuracy in diagnosis and is a barrier to expanding the range and depth of physiological features from which the operator can confidently acquire information on which to base a diagnosis.
The present application provides a user interface which allows a user to control a point of interest to select a location, i.e. a section of the ultrasound image or video frame. The section may be identified as a singular point or as a cross-hair, but in simple terms it is generally a pixel or group of pixels in an image. The point of interest may be visualised as a cross-hair or similar pointing reference with which the operator can easily identify a location that is within a feature of interest. It will be appreciated that a user input navigation device is provided to allow the user to create or move the point of interest; for example, the user input navigation device may be a joystick or a touch-sensitive input device.
Once the user has used the navigation device to move the point of interest to identify a feature of interest in the ultrasound image, an image analysis engine uses the selected section (which may correspond to a pixel or group of pixels) as a seeding section for performing an analysis of the selected frame to identify a feature in that frame. It will be appreciated that feature identification may generally be performed on an entire image; however, the use of a seed significantly improves the process and reduces the computational time and power required.
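To make the seeding idea concrete, the following is a sketch of one well-known seeded technique, region growing; the application is not limited to this method, and the tolerance value is an assumed placeholder.

```python
# Seeded region growing: flood outward from the operator's seed while
# neighbouring intensities stay within a tolerance of the seed value.
import numpy as np
from collections import deque

def grow_from_seed(img, seed, tol=25):
    """img: 2D uint8 ultrasound frame; seed: (row, col) chosen by the operator."""
    h, w = img.shape
    ref = int(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and not mask[nr, nc]
                    and abs(int(img[nr, nc]) - ref) <= tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask  # True for pixels grown from the seed; its edge is the boundary
```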
Once the image analysis engine has identified a feature in an image, the identified feature may be presented to the user as an outline on the selected image. The outline may be a different colour to the generally black and white ultrasound imagery, e.g. a red border showing the outline of the feature.
Furthermore, whilst the selection of a seed point improves the speed of processing generally, the present application provides a further refinement which increases speed further and again reduces the power requirement.
More particularly, the user interface of the system may present the user with a predefined list of choices of pre-assigned feature types to identify the purpose of the ultrasound, i.e. which type of feature of the animal the user is trying to capture, or indeed at what stage in a biological process the user is trying to capture a feature.
These choices may, for example, be presented to the user as a drop-down list or as a series of drop-down lists, e.g. a first drop-down list with categories and then a second drop-down list, dependent on the outcome of the first, providing the user with a series of choices to select from. Each choice in the list has a corresponding set of previously defined parameters. These parameters are employed by the image analysis engine in the processing of the image using the seed. The parameters may, for example, identify a particular type of pre-filtering to be performed on the ultrasound image prior to feature identification. Indeed, there may be a combination of different pre-filtering steps, and the parameters may specify the nature of the filtering to be performed at each of these steps.
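The structure of such per-feature parameter sets might look as follows; every name and value in this sketch is an assumed placeholder used to show the shape of the data, not a calibrated setting from the application.

```python
# Hypothetical per-feature parameter sets keyed by the operator's list choice.
FEATURE_PARAMS = {
    "foetus": {
        "prefilter": ["median_5x5", "speckle_reduction"],  # pre-filtering steps
        "threshold_window": 31,      # neighbourhood size for adaptive threshold
        "segmentation": "region_growing",
        "output": "age_days",
    },
    "corpus_luteum": {
        "prefilter": ["median_3x3"],
        "threshold_window": 21,
        "segmentation": "watershed",
        "output": "area_mm2_and_max_diameter",
    },
    "uterine_horn": {
        "prefilter": ["median_5x5", "morphological_close"],
        "threshold_window": 41,
        "segmentation": "level_sets",
        "output": "area_mm2_and_max_diameter",
    },
}

def params_for(feature_type):
    return FEATURE_PARAMS[feature_type]  # drives filtering and segmentation choices
```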
Thus, once the seed and feature type have been selected by the user, the image analysis engine starts an internal process that distinguishes the boundary pixels of the feature(s) from the other pixels in the image and provides output data related to the feature. The boundary and summary statistics/data on the feature are presented to the operator for validation and consideration towards the final diagnosis. The process is outlined generally in FIG 14, which also includes steps that allow the user to select the image they wish to process. This additional step further reduces the amount of processing required, as it allows a user to select, 212, the image which they feel has best captured the feature of interest, so that the image analysis engine need only operate on a single image (e.g. a frame of video) rather than a plurality of images.
At the same time, when the image analysis engine has identified the boundary of the feature and presented it to the user, optionally with measurement or other data, the operator may decide, 213, that the image analysis engine has simply got it wrong. In this case, the user may elect to change, 220, the point of reference or the video frame being used, and the image analysis engine repeats the analysis. In the case of the video frame being changed, the user may have an input allowing them to rewind or play forward images on their display so that the optimal image can be selected.
An important aspect of the application is that the processing can be carried out on portable low-power processors such as those found within portable ultrasound devices or rugged portable computing devices. To achieve this, the system uses the operator and their skill to determine the feature's general location against the background data. The application then resolves the remaining challenge of conducting precision measurements on this feature, which is difficult for the operator.
With reference to FIG 14, the ultrasound operator using a portable ultrasound scanning device selects, 212, a physiological feature of interest in the normal fashion. The operator, using their skill and understanding, seeks to attain the view that best describes the particular feature. For example, if the feature is a foetus, the operator will strive to capture the view where the foetus is of maximal size for its presentation position. If the operator wishes to examine a follicle, they will position the view to capture the largest cross-section of the follicle.
The operator indicates the frame within which these criteria are satisfied, but due to possible delays in indicating this selection to the device, a process of navigation between frames is optionally invoked to adjust for any mis-selection. Thus, for example, the system may allow, 213, a user to press a button or other input when they believe they have captured the image to be used. However, they can subsequently scroll, 220, through images about the selected image to optimally make their selection.
The operator, using a suitable pointing reference such as a cross-hair or other reference, identifies, 214, the location of a point that is within the desired feature region in the selected image or video frame. The skilled operator can easily distinguish the feature from the background data. The operator then selects the feature type, 215, both to indicate to the device the appropriate algorithm to conduct and to initiate this algorithm. Alternatively, the feature type may have been preselected before performing the ultrasound. Using the seeding point and the feature type, the image analysis engine employs an algorithm, 216, which seeks to segment the feature from the background data. This algorithm uses the seed point to determine the boundary pixels of the feature. Where possible, the segmented feature pixels are then transformed, 217, into physiologically meaningful data, and both the segmented feature and the computed data are presented to the operator, 218. The transformation of the data depends on the level of known data about the feature representation. This data transformation process can also include an estimate of the feature characteristic compared with the "normal" trajectory for the feature, based on pre-computed models and the bioinformatics data on the animal.
The segmented feature and its associated data are communicated to the operator for their own validation. This data can then be used towards the final diagnosis by the operator. In the event of a segmentation failure, the operator can quickly judge visually the under- or over-segmentation and the level of this inaccuracy. The operator can optionally reselect, 221, a seed point if it was not chosen correctly within the feature region, or navigate to select, 220, a different target frame, and proceed as before.
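Pulled together, the FIG 14 flow can be summarised in control-flow form as below. The helper names (ui, segment_feature, to_physiological_data) are hypothetical stand-ins for the steps described above, not interfaces defined by the application.

```python
# Control-flow sketch of FIG 14; numbered comments map to the reference numerals.
def examine_feature(frames, ui):
    while True:
        frame = ui.select_frame(frames)        # 212/220: pick (or re-pick) a frame
        seed = ui.pick_seed(frame)             # 214/221: rough point inside the feature
        feature_type = ui.pick_feature_type()  # 215: selects parameters and algorithm
        boundary = segment_feature(frame, seed, params_for(feature_type))   # 216
        data = to_physiological_data(boundary, feature_type)                # 217
        ui.show(frame, boundary, data)         # 218: present for validation
        if ui.accepted():                      # operator judges the segmentation
            return boundary, data              # used towards the final diagnosis
```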
For the example where the feature is a foetus, the operator may be presented with the foetus boundary and a biological measurement, for example an estimate of the foetus age. FIG 15, FIG 16 and FIG 17 illustrate a typical output for a selected frame for a foetus feature, while FIG 18 and FIG 19 illustrate a selected seed point and output for a selected frame for an Ovarian Corpus Luteum feature. FIG 20 shows the selected seed location and output for a uterine horn cross-section feature. Note that the computed data may be further transformed into more meaningful physiological data in the examples of FIG 18, FIG 19 and FIG 20.
FIG 15, FIG 16, FIG 17, FIG 18, FIG 19 and FIG 20 demonstrate that the selected seed does not require precise locating on the frame. Roughly locating the seed point within the feature region is sufficient. This allows the operator to complete the location easily and quickly in comparison with existing device features that require precise locating of multiple points.
An exemplary image processing pipeline performed by the image analysis engine is outlined in FIG 21. The input to the image processing stage, 232, comes from user interaction and comprises the frame/image of interest, the selected feature type and a general location for this feature, as indicated by the selected reference point.
Ultrasound data is extremely noisy, so a number of filtering steps, 233, may be applied to enable standard image processing tools to be used. The intensity of ultrasound data also varies from scanner to scanner, with the depth of the object and with the scanner's gain settings.
A process of adaptive thresholding, 234, is applied, using data from the local neighbourhood around the selected reference point. This aims to counteract the inherent intensity variations and gain a better representation of the target feature.
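A minimal sketch of such a seed-local adaptive threshold follows; the window size, offset and polarity are assumptions, since whether the target feature is brighter or darker than its surroundings depends on its echogenicity.

```python
# Threshold adapted to the intensity statistics around the operator's seed.
import numpy as np

def seeded_threshold(img, seed, window=31, offset=10, feature_is_bright=True):
    r, c = seed
    half = window // 2
    patch = img[max(r - half, 0):r + half + 1, max(c - half, 0):c + half + 1]
    # Local mean sets the cut level; sign of the offset follows the polarity.
    level = float(patch.mean()) + (offset if feature_is_bright else -offset)
    fg = (img.astype(np.float32) >= level if feature_is_bright
          else img.astype(np.float32) <= level)
    return fg.astype(np.uint8) * 255  # binary image handed to segmentation
```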
Following the threshold process, a standard image processing segmentation algorithm, 235, is applied.
It will be appreciated that the choice of segmentation algorithm is not in itself important and that the image analysis engine may use any one of a number of different methods including, for example, Watershed, Level-Sets, Region Growing and many more.
However, equally, it will be appreciated that different segmentation algorithms may perform better depending on the feature in question. Accordingly, one of the previously referred to parameters associated with the choice of feature may be a selection of the segmentation algorithm to be used. Equally, the parameters may include settings for that particular algorithm.
Following segmentation, contours are determined, 236, from the segmentation result, and these contours are then examined for their relative and internal geometrical properties by a shape analysis process, 237.
Finally, the processed contours form the foundation for the feature boundary, 238, and the image processing stage is terminated. It will be appreciated that, depending on the feature in question, the contours may define an enclosed shape or a shape in which one part is open.
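One plausible realisation of steps 235 to 238 with standard tools is sketched below using OpenCV: contours are extracted from the binary segmentation, the contour enclosing the seed is kept, and simple geometric properties are computed for the shape analysis. This is an illustration, not the application's exact pipeline.

```python
# Contour extraction and basic shape analysis around the operator's seed.
import cv2

def feature_boundary(binary, seed):
    """binary: uint8 segmentation mask; seed: (row, col). Returns contour stats."""
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:
        # pointPolygonTest expects (x, y); the seed is stored as (row, col)
        if cv2.pointPolygonTest(cnt, (float(seed[1]), float(seed[0])), False) >= 0:
            area_px = cv2.contourArea(cnt)            # enclosed area in pixels
            _, radius = cv2.minEnclosingCircle(cnt)   # radius of enclosing circle
            return cnt, area_px, 2.0 * radius         # boundary, area, max diameter
    return None, 0.0, 0.0                             # seed fell outside all contours
```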
Based on the selected feature type, the parameters used in the filtering, 233, segmentation selection, 235, and shape analysis, 237, steps are set to calculate a meaningful estimate of the feature boundaries. The estimate yields a measurement of a dimension of the identified feature. This may be a linear dimension (e.g. a height or width) or it may be an area.
The dimension may be converted into a biological measurement for the state of the animal, e.g. the age of a foetus. It will be appreciated that the biological measurement displayed may vary with the feature type selected. As a result, the form and manner of calculating the biological measurement, and the display of it, may be dependent on the feature type selected. The shape analysis suitably converts the estimated dimension (e.g. length, width or area) into a biological measurement using either a reference table or a conversion formula, the choice of which may be dependent on the feature type selected. Equally, the biological measurement may be obtained with reference to one or more thresholds, where the dimensional measurement is compared with the threshold(s) to ascertain whether it falls within a particular range identifying a particular condition, and this condition is presented to the user as the state of the animal, e.g. ready for insemination / not ready for insemination.
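A sketch of both conversion routes described above follows: a reference-table conversion with interpolation, and a threshold-based condition check. All table values and thresholds are illustrative placeholders, not clinical reference data.

```python
# Dimension -> biological measurement, via lookup table or threshold comparison.
import numpy as np

CRL_MM = np.array([10.0, 25.0, 45.0, 70.0, 100.0])   # placeholder reference lengths
AGE_DAYS = np.array([35.0, 50.0, 65.0, 80.0, 95.0])  # placeholder matching ages

def length_to_age_days(length_px, mm_per_px):
    """Reference-table route: interpolate foetal age from a measured length."""
    return float(np.interp(length_px * mm_per_px, CRL_MM, AGE_DAYS))

def cl_condition(diameter_mm, threshold_mm=20.0):
    """Threshold route: map a CL diameter onto a condition shown to the operator."""
    return ("ready for insemination" if diameter_mm >= threshold_mm
            else "not ready for insemination")
```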
The application will now be described with reference to several different implementations. It will be appreciated that, whilst the different features may be provided differently between the different implementations, they provide generally the same functionality.
One implementation of this application is via an auxiliary device that connects to an existing portable ultrasound scanning device, 241. The basic components of existing ultrasound devices are shown in FIG 22, where a probe, 242, is connected to driving and decoding electronic circuits and firmware inside the device to provide ultrasound image data. This data is transformed to form a visualisation, 243, and typically sent to a display device, 245, within the device or as a separate attached display (e.g. a headset display device). The teaching of the present application may be provided within an auxiliary device, 256, taking power, 254, from the ultrasound scanner or, more typically, from its own independent battery. This configuration is illustrated in FIG 23, including a separate input device to provide pointing control and button control, 255. The auxiliary device, 256, provides both the additional user interface, 255, required for the user and the image processing and control to implement the features. It will be appreciated that in this configuration the auxiliary device, 256, operates on a video feed from the portable ultrasound device, 251. Accordingly, the auxiliary device may present the ultrasound images to a user on a display device, 257, rather than the portable ultrasound device doing so. A user input/interface device, 255, may be provided on or connected to the auxiliary device to allow a user to provide inputs in order to perform the method described above. Possible internal components of such an auxiliary device, 256, are depicted in FIG 24, which, it will be appreciated, shares a number of components in common with a general computing device, including input devices and associated interfaces (keypad/control buttons and pointing devices, 261). Additionally, however, the auxiliary device provides a video acquisition module, 262, for receiving a video output from the portable ultrasound machine. The video acquisition module, 262, is required to digitise the ultrasound data content. These images are stored in memory (image buffers), 264, to facilitate navigation around a feature of interest by the user. The functionality of the previously described user interface and image analysis engine is performed generally by the image processing unit, 263. At the same time, the ultrasound video is overlaid, in an image overlay unit, 267, with detail from the user interface and image analysis engine, and in turn is passed to a video generator module, 268, to provide a compatible output for the connected visualization/display device employed by the user.
In one arrangement, the visualization device is that previously employed by the portable ultrasound device, and in effect the auxiliary device is interposed between the ultrasound device and the display device, so that the system generally appears to function the same, with of course additional functionality and data obtained and presented to the user.
The selected key frame and seed point are managed by the control module, 265, based on user input, which initiates the image processing unit, 263, and a data translation unit, 266. The output from these units is fed to the live video path. The electronics and components are chosen to maximise speed of computation for real-time operation and to conserve power.
Another implementation is depicted in FIG 25, where the necessary processors and memory components are incorporated as part of the ultrasound device. Additional components to support frame navigation, image processing and data translation for recovered features may be required for this implementation. FIG 25 shows the use of an external keypad/pointing device, 273, although this may be incorporated within the portable ultrasound device unit by suitable means. It will be appreciated that the functionality is generally the same as that of Figures 23 and 24, except that the functionality is combined in one unit. Possible components of such a configured unit include a probe, 271, an ultrasound controller, 272, an ultrasound signal path and processing unit, 274, memory (image buffers), 275, an image processing unit, 276, a user interface and control module, 277, a video generation module, 278, a data translation unit, 280, and a display device, 279.
Another implementation of this application is where the step of acquiring, 302, the images using the portable ultrasound device, 303, is performed at a separate location, 301, from the performance of the analysis, 307. In this arrangement, the captured ultrasound imagery may be stored, 304, for subsequent processing. For example, a device operator can choose to store, 304, a chosen frame or a video of a desired feature. This stored frame or video may then be uploaded to a subsequent device, 315, for processing. This may happen immediately, for example if a data communication link is available. Otherwise, the information may be stored, 306, on a removable memory, 305, which may be placed in an external reader, 308, for transferring the information to another device, 314. The image or video is then loaded, 309, from the external reader, 308, into memory (an image buffer), 310, and passed to a second computing device, 312, to carry out the steps outlined in FIG 14, where data from the analysis is transformed, 311, into physiologically meaningful data, and both the segmented feature and the computed data, as disclosed in the implementation of FIG 14, are presented to the operator, 313. The means of transferring the image or video from one device, 300, to another device, 315, are many and varied, and may include transfer via a dedicated memory device physically relocated from the ultrasound scanning device to the host computing device, transfer via a local area network, transfer via the internet, or other means. This arrangement is outlined generally in FIG 26.
The above examples utilise an input device, 314, to achieve pointing and selection capability. One form of input device may be implemented as a wearable mouse/keypad combination. By wearable is meant a device that can be mechanically fastened to the operator's clothes, or attached to the user by other means. The input device is suitably lightweight. During the ultrasound examination procedure the operator can use a hands-free headset to view the portable ultrasound device output. With the scanning probe in one hand, the free hand can be used to operate the wearable keypad/mouse pointing device. This provides for simultaneous use of the ultrasound machine and interaction with the system. The connections between the input device and the system may be wired or wireless.
Another form of input device is possible by the inclusion of appropriate pointing controls on the ultrasound device itself. Another form of input device is via voice activation. Another form of input device is via eye-focus tracking or interest-point tracking.

Claims

1. A portable ultrasound system for use in veterinary applications where an operator is performing an ultrasound examination on an animal, the ultrasound system comprising: a battery powered portable ultrasound device, the ultrasound device having an ultrasound probe and a headset display device with the probe being configured to capture ultrasound images with the device presenting the captured images to the operator live on the headset display device, the ultrasound device being further configured to store one or more of the captured plurality of ultrasound images;
a user interface allowing a user to view the captured images on the headset display device and to select one of the one or more stored ultrasound images; a user input navigation device for receiving an input from a user to allow a user to select a point in the selected image;
an image analysis engine operating upon the selected image, the image analysis engine using the selected point as a seeding section for performing an analysis of the selected frame for identifying a feature in the selected image, wherein the user interface provides the user with a predefined list of choices to identify the purpose of the ultrasound of the animal and allows a user to select one choice from the predefined list, wherein the image analysis engine is configured to employ the selected choice to retrieve previously defined image processing parameters for identifying the feature in the selected image and wherein the identified feature is presented to the user as an outline visible with the selected image viewed on the headset display.
2. A portable ultrasound system according to claim 1, wherein the user interface allows the user to accept the identified feature, whereupon the system is configured to store the selected image and the associated identified feature in a database, where the selected image and associated identified feature are stored in the database with an identifier for the animal.
3. A portable ultrasound system according to claim 2, wherein the feature identified is a feature defined by an enclosed boundary.
4. A portable ultrasound system according to claim 2, wherein the feature identified is defined by an open boundary.
5. A portable ultrasound system according to any preceding claim, wherein the image analysis engine is configured to calculate a measurement of a dimension of the identified feature in the frame.
6. A portable ultrasound system according to claim 5, wherein the image analysis engine is configured to convert the measurement into a biological measurement for the state of the animal using a reference table or algorithm selected based on the selected choice.
7. A portable ultrasound system according to any preceding claim, wherein the image analysis engine is provided on a portable computing device separate from the portable ultrasound device.
8. A portable ultrasound system according to any one of claims 1 to 6, wherein the image analysis engine is provided on the ultrasound device and wherein the user interface is presented on the user headset.
9. A portable ultrasound system according to any preceding claim wherein the images are captured as video, with each image being a frame of the video.
PCT/EP2015/071481 2014-09-18 2015-09-18 A portable ultrasound system for use in veterinary applications WO2016042146A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1416542.7A GB2530491A (en) 2014-09-18 2014-09-18 A Portable ultrasound system for use in veterinary Applications
GB1416542.7 2014-09-18

Publications (1)

Publication Number Publication Date
WO2016042146A1 true WO2016042146A1 (en) 2016-03-24

Family

ID=51869142

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/071481 WO2016042146A1 (en) 2014-09-18 2015-09-18 A portable ultrasound system for use in veterinary applications

Country Status (2)

Country Link
GB (1) GB2530491A (en)
WO (1) WO2016042146A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3513731A1 (en) * 2018-01-23 2019-07-24 Koninklijke Philips N.V. Device and method for obtaining anatomical measurements from an ultrasound image

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5605155A (en) * 1996-03-29 1997-02-25 University Of Washington Ultrasound system for automatically measuring fetal head size
US6126608A (en) * 1999-05-18 2000-10-03 Pie Medical Equipment B.V. Portable ultrasound diagnostic system with handsfree display
US20050228281A1 (en) * 2004-03-31 2005-10-13 Nefos Thomas P Handheld diagnostic ultrasound system with head mounted display
EP2387949A1 (en) * 2010-05-17 2011-11-23 Samsung Medison Co., Ltd. Ultrasound system for measuring image using figure template and method for operating ultrasound system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6385332B1 (en) * 1999-02-19 2002-05-07 The John P. Roberts Research Institute Automated segmentation method for 3-dimensional ultrasound
US20080267499A1 (en) * 2007-04-30 2008-10-30 General Electric Company Method and system for automatic detection of objects in an image
US9314225B2 (en) * 2012-02-27 2016-04-19 General Electric Company Method and apparatus for performing ultrasound imaging
KR20140093359A (en) * 2013-01-15 2014-07-28 삼성전자주식회사 User interaction based image segmentation apparatus and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5605155A (en) * 1996-03-29 1997-02-25 University Of Washington Ultrasound system for automatically measuring fetal head size
US6126608A (en) * 1999-05-18 2000-10-03 Pie Medical Equipment B.V. Portable ultrasound diagnostic system with handsfree display
US20050228281A1 (en) * 2004-03-31 2005-10-13 Nefos Thomas P Handheld diagnostic ultrasound system with head mounted display
EP2387949A1 (en) * 2010-05-17 2011-11-23 Samsung Medison Co., Ltd. Ultrasound system for measuring image using figure template and method for operating ultrasound system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3513731A1 (en) * 2018-01-23 2019-07-24 Koninklijke Philips N.V. Device and method for obtaining anatomical measurements from an ultrasound image
WO2019145147A1 (en) 2018-01-23 2019-08-01 Koninklijke Philips N.V. Device and method for obtaining anatomical measurements from an ultrasound image
US11931201B2 (en) 2018-01-23 2024-03-19 Koninklijke Philips N.V. Device and method for obtaining anatomical measurements from an ultrasound image

Also Published As

Publication number Publication date
GB2530491A (en) 2016-03-30
GB201416542D0 (en) 2014-11-05

Similar Documents

Publication Publication Date Title
KR102288308B1 (en) Ultrasonic Diagnostic Apparatus
US7857765B2 (en) Protocol-driven ultrasound examination
US20120108960A1 (en) Method and system for organizing stored ultrasound data
KR102491757B1 (en) Echo Window Artifact Classification and Visual Indicators for Ultrasound Systems
US11931201B2 (en) Device and method for obtaining anatomical measurements from an ultrasound image
WO2014086191A1 (en) Ultrasound system, and method and apparatus for associating detection information of the same
CN111971688A (en) Ultrasound system with artificial neural network for retrieving imaging parameter settings of relapsing patients
CN102028498A (en) Ultrasonic diagnosis apparatus and ultrasonic image processing apparatus
US20140153358A1 (en) Medical imaging system and method for providing imaging assitance
CN111214254A (en) Ultrasonic diagnostic equipment and section ultrasonic image acquisition method and readable storage medium thereof
KR20200080906A (en) Ultrasound diagnosis apparatus and operating method for the same
JP2007167116A (en) Ultrasonic diagnosis apparatus
US20220087644A1 (en) Systems and methods for an adaptive interface for an ultrasound imaging system
JP6258026B2 (en) Ultrasonic diagnostic equipment
US11896434B2 (en) Systems and methods for frame indexing and image review
WO2021034981A1 (en) Ultrasound guidance dynamic mode switching
WO2016042146A1 (en) A portable ultrasound system for use in veterinary applications
US20220273267A1 (en) Ultrasonic imaging method and ultrasonic imaging system
JP2024501181A (en) Ultrasound image acquisition, tracking, and review
CN114601494A (en) Ultrasonic diagnostic system and operation support method
WO2016105972A1 (en) Report generation in medical imaging
KR20200099910A (en) Apparatus and method for displaying ultrasound image and computer program product
US11844654B2 (en) Mid-procedure view change for ultrasound diagnostics
JP4843728B2 (en) Ultrasonic diagnostic equipment
US11413019B2 (en) Method and apparatus for displaying ultrasound image of target object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15771068

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WPC Withdrawal of priority claims after completion of the technical preparations for international publication

Ref document number: 1416542.7

Country of ref document: GB

Date of ref document: 20170307

Free format text: WITHDRAWN AFTER TECHNICAL PREPARATION FINISHED

122 Ep: pct application non-entry in european phase

Ref document number: 15771068

Country of ref document: EP

Kind code of ref document: A1