WO2020086811A1 - Methods and apparatus for performing measurements on an ultrasound image - Google Patents

Methods and apparatus for performing measurements on an ultrasound image


Publication number
WO2020086811A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
icon
ellipse
vertex
touch
Prior art date
Application number
PCT/US2019/057809
Other languages
English (en)
Inventor
David Elgena
Matthew De Jonge
Cristina SHIN
Original Assignee
Butterfly Network, Inc.
Priority date
Filing date
Publication date
Application filed by Butterfly Network, Inc.
Publication of WO2020086811A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/465 Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B 8/468 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means allowing annotation or message recording
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5207 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5223 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data

Definitions

  • Ultrasound systems may be used to perform diagnostic imaging and/or treatment, using sound waves with frequencies higher than those audible to humans.
  • Ultrasound imaging may be used to see internal soft tissue body structures, for example to find a source of disease or to exclude any pathology.
  • pulses of ultrasound are transmitted into tissue (e.g., by using a pulser in an ultrasound imaging device)
  • sound waves are reflected off the tissue, with different tissues reflecting varying degrees of sound.
  • These reflected sound waves may then be recorded and displayed as an ultrasound image to the operator.
  • the strength (amplitude) of the sound signal and the time it takes for the wave to travel through the body provide information used to produce the ultrasound image.
  • Many different types of images can be formed using ultrasound systems, including real-time images. For example, images can be generated that show two-dimensional cross-sections of tissue, blood flow, motion of tissue over time, the location of blood, the presence of specific molecules, the stiffness of tissue, or the anatomy of a three-dimensional region.
  • a method includes displaying, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device: an ultrasound image, a movable measurement tool, and an icon that maintains a fixed distance from a portion of the measurement tool, where the icon is configured to modify the measurement tool, and the icon does not overlap the measurement tool.
  • the measurement tool comprises a line, the icon maintains the fixed distance from an endpoint of the line, and the icon is configured to control a position of the endpoint of the line.
  • the measurement tool comprises an ellipse, the icon maintains the fixed distance from a vertex of the ellipse, and the icon is configured to control a length of an axis of the ellipse that includes the vertex.
  • the measurement tool comprises an ellipse, the icon maintains the fixed distance from a vertex of the ellipse, and the icon is configured to control a rotation of the ellipse.
  • a method includes displaying, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device: an ultrasound image, a line extending between a first endpoint and a second endpoint, and an icon located a fixed distance from the first endpoint along a direction defined by the line; detecting a dragging movement covering a distance in a horizontal direction and/or a distance in a vertical direction across the touch-sensitive display screen, wherein the dragging movement begins on or within a threshold distance of the icon; displaying the first endpoint at a new location on the touch-sensitive display screen that is removed from the endpoint’s previous location by the distance in the horizontal direction and/or the distance in the vertical direction; displaying the icon at a new location on the touch-sensitive display screen that is removed from the new location of the first endpoint by the fixed distance along the direction defined by the line; and performing a measurement on the ultrasound image based on the line.
  • a method includes displaying, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device, an ultrasound image and a line extending between a first endpoint and a second endpoint; detecting a dragging movement covering a distance in a horizontal direction and/or a distance in a vertical direction across the touch-sensitive display screen, wherein the dragging movement begins on or within a threshold distance of the line; displaying the first endpoint and the second endpoint of the line at new locations on the touch-sensitive display screen that are removed from their previous locations by the distance in the horizontal direction and/or the distance in the vertical direction; and performing a measurement on the ultrasound image based on the line.
  • a method includes displaying, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device: an ultrasound image; an ellipse having an axis that is either a major axis or a minor axis of the ellipse, wherein the axis extends between a first vertex and a second vertex of the ellipse; and an icon located a fixed distance from the first vertex along a direction defined by the axis; detecting a dragging movement covering a distance along the direction defined by the axis of the ellipse across the touch-sensitive display screen, wherein the dragging movement begins on or within a threshold distance of the icon; displaying the first vertex at a new location on the touch-sensitive display screen that is removed from the first vertex’s previous location by the distance along the direction defined by the axis of the ellipse; displaying the second vertex at a new location on the touch-sensitive display screen that is removed from the second vertex’s previous location by the distance along the direction defined by the axis of the ellipse; and performing a measurement on the ultrasound image based on the ellipse.
  • a method includes displaying, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device: an ultrasound image; an ellipse having an axis that is either a major axis or a minor axis of the ellipse, wherein the axis extends between a first vertex and a second vertex of the ellipse; and an icon located a fixed distance from the first vertex along a direction defined by the axis; detecting a dragging movement covering a distance along and/or a distance orthogonal to the direction defined by the axis of the ellipse across the touch-sensitive display screen, wherein the dragging movement begins on or within a threshold distance of the icon;
  • a method includes displaying, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device: an ultrasound image; an ellipse having an axis that is either a major axis or a minor axis of the ellipse, wherein the axis extends between a first vertex and a second vertex of the ellipse; and an icon located a fixed distance from the first vertex along a direction defined by the axis; detecting a dragging movement covering a distance in a horizontal direction and/or a distance in a vertical direction across the touch-sensitive display screen, wherein the dragging movement begins in an interior of the ellipse or within a threshold distance of a boundary of the ellipse; displaying the first vertex and the second vertex at new locations on the touch-sensitive display screen that are removed from their previous locations by the distance in the horizontal direction and/or the distance in the vertical direction; and performing a measurement on the ultrasound image based on the ellipse.
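The ellipse-resizing step in the methods above can be sketched as follows. The claims only require both vertices to move; the symmetric resize about the ellipse's center shown here is an assumption, as is the signed drag-distance convention and the function name.

```python
import math

def resize_axis(vertex1, vertex2, drag_distance_along_axis):
    """Resize an ellipse axis after a drag of `drag_distance_along_axis`
    pixels (signed, measured along the axis direction from vertex2 toward
    vertex1).

    One plausible scheme: vertex1 moves outward by the drag distance and
    vertex2 moves by the same amount in the opposite direction, so the
    ellipse stays centered. This symmetry is an assumption, not a claim
    of the patent.
    """
    v1x, v1y = vertex1
    v2x, v2y = vertex2
    length = math.hypot(v1x - v2x, v1y - v2y)
    # Unit vector along the axis, pointing from vertex2 toward vertex1.
    ux, uy = (v1x - v2x) / length, (v1y - v2y) / length
    d = drag_distance_along_axis
    return (v1x + d * ux, v1y + d * uy), (v2x - d * ux, v2y - d * uy)
```

Dragging the icon 5 pixels outward along a horizontal major axis lengthens that axis by 10 pixels while keeping the center fixed.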
  • a method of operating a processing device configured to display ultrasound images includes displaying an ultrasound image on a display screen of the processing device; displaying a measurement tool overlay on the ultrasound image, the measurement tool overlay comprising a target point; displaying, on the display screen, a touch-sensitive measurement tool control icon corresponding to the target point; and in response to receiving touch input to the display screen, moving the target point and the touch- sensitive measurement tool control icon while maintaining a fixed distance between them.
  • the touch-sensitive measurement tool control icon does not overlap the measurement tool overlay.
  • Some aspects include at least one non-transitory computer-readable storage medium storing processor-executable instructions that, when executed by at least one processor, cause the at least one processor to perform the above aspects and embodiments.
  • Some aspects include an ultrasound system having a processing device configured to perform the above aspects and embodiments.
  • FIG. 1 illustrates an example graphical user interface (GUI) that may be displayed on a touch-sensitive display screen of a processing device in an ultrasound system, in accordance with certain embodiments described herein.
  • the GUI includes a line for performing a measurement on an ultrasound image
  • FIG. 2 illustrates another example of the graphical user interface of FIG. 1, in accordance with certain embodiments described herein;
  • FIG. 3 illustrates another example of the graphical user interface of FIG. 1, in accordance with certain embodiments described herein;
  • FIG. 4 illustrates another example of the graphical user interface of FIG. 1, in accordance with certain embodiments described herein;
  • FIG. 5 illustrates another example of the graphical user interface of FIG. 1, in accordance with certain embodiments described herein;
  • FIG. 6 illustrates an example graphical user interface that may be displayed on a touch-sensitive display screen of a processing device in an ultrasound system, in accordance with certain embodiments described herein.
  • the GUI includes an ellipse for performing a measurement on an ultrasound image;
  • FIG. 7 illustrates another example of the graphical user interface of FIG. 6, in accordance with certain embodiments described herein;
  • FIG. 8 illustrates another example of the graphical user interface of FIG. 6, in accordance with certain embodiments described herein;
  • FIG. 9 illustrates another example of the graphical user interface of FIG. 6, in accordance with certain embodiments described herein;
  • FIG. 10 illustrates another example of the graphical user interface of FIG. 6, in accordance with certain embodiments described herein;
  • FIG. 11 illustrates a method for determining how much to rotate an ellipse based on a dragging movement, in accordance with certain embodiments described herein;
  • FIG. 12 illustrates an example GUI that may be shown when ultrasound data is being collected, in accordance with certain embodiments described herein;
  • FIG. 13 illustrates an example GUI that may be shown upon selection of a freeze option from the GUI of FIG. 12, in accordance with certain embodiments described herein;
  • FIG. 14 illustrates an example GUI that may be shown upon selection of a
  • FIG. 15 illustrates an example process for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein;
  • FIG. 16 illustrates an example process for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein;
  • FIG. 17 illustrates an example process for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein;
  • FIG. 18 illustrates an example process for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein;
  • FIG. 19 illustrates an example process for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein;
  • FIG. 20 is a schematic block diagram illustrating aspects of an example ultrasound system upon which various aspects of the technology described herein may be practiced.
  • FIG. 21 is a schematic block diagram illustrating aspects of another example ultrasound system upon which various aspects of the technology described herein may be practiced.
  • Such imaging devices may include ultrasonic transducers monolithically integrated onto a single semiconductor die to form a monolithic ultrasound device. Aspects of such ultrasound-on-a-chip devices are described in U.S. Patent Application No. 15/415,434 titled “UNIVERSAL ULTRASOUND DEVICE AND RELATED APPARATUS AND METHODS,” filed on January 25, 2017 (and assigned to the assignee of the instant application) and published as U.S. Pat. Pub. No. US-2017-0360397-A1, which is incorporated by reference herein in its entirety.
  • Such an ultrasound device may be in operative communication with a processing device, such as a smartphone or a tablet, having a touch-sensitive display screen.
  • the processing device may display ultrasound images generated from ultrasound data collected by the ultrasound device.
  • Performing measurements may include modifying the position, orientation, and/or shape of a measurement tool such as a line or ellipse displayed on the ultrasound image to perform calculations of spatial length or spatial area represented by the ultrasound image.
  • the technology includes icons that are displayed a fixed distance from certain portions of a line or an ellipse, and which in some embodiments do not overlap with any portion of the line or ellipse. The icons may be used to modify the measurement tool.
  • a user may perform a dragging movement across the touch-sensitive display screen that begins on an icon located a fixed distance from the endpoint.
  • the processing device may change the location of the endpoint by a distance corresponding to the distance covered by the dragging movement.
  • the processing device may update, based on the dragging movement, the location of the endpoint at a sufficiently high rate such that the endpoint appears to follow the dragging movement as the dragging movement proceeds.
  • the endpoint may appear to follow the user’s finger.
  • the endpoint may be removed from the user’s finger by the fixed distance as the user drags his/her finger across the touch-sensitive display screen.
  • the endpoint may be visible to the user, and the user may be able to determine when the endpoint has moved to the desired location and release his/her finger from the touch-sensitive display to cause the endpoint to remain in the desired location.
  • the icon may not overlap with any portion of the line, which may further help the user to determine, as s/he drags his/her finger, when the line has been positioned as desired.
  • a processing device may implement different methods in order to cause the same result to occur.
  • code designed to cause the result to occur may implement a different method to cause the result to occur than those described.
  • FIGs. 1-9 illustrate example graphical user interfaces that may be displayed on a touch-sensitive display screen of a processing device in an ultrasound system, in accordance with certain embodiments described herein.
  • FIGs. 1-5 illustrate example GUIs that include a line for performing a measurement on an ultrasound image.
  • FIGs. 6-10 illustrate example GUIs that include an ellipse for performing a measurement on an ultrasound image.
  • the processing device may be in operative communication with an ultrasound device. Ultrasound systems and devices are described in more detail with reference to FIGs. 20-21.
  • FIG. 1 illustrates an example GUI 100 that includes a line 102, a first icon 104, a second icon 106, a first crosshairs 108, a second crosshairs 110, a measurement value indicator 112, a delete option 114, and an ultrasound image 120.
  • the line 102 extends between a first endpoint 116 and a second endpoint 118.
  • the first crosshairs 108 may help to visually highlight the location of the first endpoint 116 and the second crosshairs 110 may help to visually highlight the location of the second endpoint 118.
  • the line 102 is superimposed on the ultrasound image 120 and may be used to perform a length measurement on the ultrasound image 120.
  • the processing device may perform a calculation of the spatial length represented by the ultrasound image 120 between the first endpoint 116 and the second endpoint 118.
  • the processing device may receive information from the ultrasound device indicating that the ultrasound image 120 was collected from an area having a certain size.
  • the processing device may use this information to determine the spatial size represented by each pixel and thereby determine the spatial length represented by the line 102. (Similar methods may be used for measurements of spatial length and area using an ellipse, as described below).
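The per-pixel conversion described above can be sketched in a few lines. This is a minimal illustration assuming a linear mapping from the pixel grid to a known imaged area; the parameter names (`field_width_mm`, `field_depth_mm`, etc.) are hypothetical, not terms from the patent.

```python
import math

def spatial_length_mm(p1, p2, field_width_mm, field_depth_mm,
                      image_width_px, image_height_px):
    """Convert a pixel-space line into the physical length it represents.

    Each pixel is assumed to span a uniform physical size derived from the
    imaged area reported by the ultrasound device (an illustrative model of
    the conversion, not the patent's exact method).
    """
    mm_per_px_x = field_width_mm / image_width_px
    mm_per_px_y = field_depth_mm / image_height_px
    dx_mm = (p2[0] - p1[0]) * mm_per_px_x
    dy_mm = (p2[1] - p1[1]) * mm_per_px_y
    return math.hypot(dx_mm, dy_mm)
```

A 100-pixel horizontal line on a 500-pixel-wide image of a 50 mm field would report 10 mm. The same per-pixel scale factors could feed an area computation for an ellipse.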
  • the spatial length represented by the ultrasound image 120 between the first endpoint 116 and the second endpoint 118 is depicted by the measurement value indicator 112.
  • the user may cause the processing device to modify the locations of the first endpoint 116 and/or the second endpoint 118 on the GUI 100.
  • the user may cause the processing device to modify the locations of the first endpoint 116 and/or the second endpoint 118 to coincide with endpoints of a particular anatomical structure visible in the ultrasound image 120 if the user desires to measure the distance between the endpoints of the anatomical structure.
  • the processing device may update the measurement value indicator 112 based on the new distance between the first endpoint 116 and the second endpoint 118.
  • the processing device may remove the line 102, the first icon 104, and the second icon 106 from the touch-sensitive display in response to a user selection of the delete option 114.
  • the first icon 104 and the second icon 106 are circular, although other forms are possible. Additionally, in FIG. 1, no portion of the first icon 104 or the second icon 106 overlaps the line 102. However, in some embodiments, a portion of the first icon 104 or the second icon 106 may overlap the line 102.
  • the inventors have developed technology for assisting a user in modifying the locations of the first endpoint 116 and/or the second endpoint 118 (and thereby modifying the position and/or orientation of the line 102) using a touch-sensitive display screen.
  • the technology includes display of the first icon 104 and the second icon 106.
  • the first icon 104 is positioned a fixed distance 122 from the first endpoint 116.
  • the second icon 106 is positioned the fixed distance 122 from the second endpoint 118.
  • the fixed distance 122 may be a predetermined distance.
  • the fixed distance 122 may be a default distance.
  • the fixed distance 122 may be selected by a user.
  • an icon being positioned a fixed distance from some feature may mean that the center of the icon is positioned the fixed distance from the feature.
  • the fixed distance between the first icon 104 and the first endpoint 116 and the fixed distance between the second icon 106 and the second endpoint 118 may not be the same.
  • the processing device may change the location of the first endpoint 116 based on a dragging movement on the touch-sensitive display screen that begins on or within a threshold distance of the first icon 104.
  • a dragging movement may include, for example, a user touching his/her finger to the touch-sensitive display and dragging his/her finger to a different location on the touch-sensitive display screen.
  • the processing device may change the location of the second endpoint 118 based on a dragging movement on the touch-sensitive display screen that begins on or within a threshold distance of the second icon 106.
  • the processing device may change the location of the first endpoint 116 by that same distance in the horizontal direction and/or distance in the vertical direction.
  • a drag that covers a certain distance in a certain direction need not mean that the drag actually proceeded along that direction, but rather that the drag had a component along that direction.
  • (For example, a drag in an arbitrary direction across a touch-sensitive display screen may have a component along the horizontal direction and a component along the vertical direction of the touch-sensitive display screen.)
  • the processing device may change the location of the second endpoint 118 by that same distance in the horizontal direction and/or distance in the vertical direction.
  • each pixel having a location that is x pixels in the horizontal direction and a location that is y pixels in the vertical direction, where x and y are measured from an origin (e.g., a corner of the touch-sensitive display screen).
  • as an example, if the first endpoint 116 is initially located at (e1x, e1y) and a dragging movement proceeds from (d1x, d1y) to (d2x, d2y), the processing device may change the location of the first endpoint 116 such that the first endpoint 116 is displayed at (e1x + (d2x - d1x), e1y + (d2y - d1y)).
  • the processing device may similarly change the location of the second endpoint 118 based on a drag that begins at or within a threshold distance of the second icon 106.
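The endpoint update above reduces to adding the drag's offset to the endpoint's coordinates. A sketch, using the (e1x, e1y) / (d1x, d1y) / (d2x, d2y) naming from the text; the function name is illustrative:

```python
def updated_endpoint(endpoint, drag_start, drag_end):
    """Move an endpoint by the horizontal and vertical distances covered
    by a dragging movement.

    With endpoint (e1x, e1y) and a drag from (d1x, d1y) to (d2x, d2y),
    the new location is (e1x + (d2x - d1x), e1y + (d2y - d1y)).
    """
    e1x, e1y = endpoint
    d1x, d1y = drag_start
    d2x, d2y = drag_end
    return (e1x + (d2x - d1x), e1y + (d2y - d1y))
```

Note that only the drag's net offset matters, so the endpoint tracks the finger even though the drag begins on the icon, a fixed distance away.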
  • the processing device may display the rest of the line 102 between the first endpoint 116 and/or the second endpoint 118.
  • the processing device may use the Cartesian equation for a line to determine locations for points along the line that are not endpoints.
  • the processing device may update, based on a dragging movement, the location of the first endpoint 116 at a sufficiently high rate such that the first endpoint 116 appears to follow the dragging movement as the dragging movement proceeds.
  • the first endpoint 116 may appear to follow the user’s finger. Because changing the location of the first endpoint 116 may be initiated in this example by the user touching his/her finger to the first icon 104, which may be located a fixed distance away from the first endpoint 116, the first endpoint 116 may be removed from the user’s finger by the fixed distance as the user drags his/her finger across the touch-sensitive display screen.
  • the first endpoint 116 may be visible to the user, and the user may be able to determine when the first endpoint 116 has moved to the desired location and release his/her finger from the touch-sensitive display to cause the first endpoint 116 to remain in the desired location.
  • the same discussion applies to the second endpoint 118 and the second icon 106.
  • the processing device may change the location of the first icon 104 such that the first icon 104 is displayed a fixed distance from the first endpoint 116 along a direction defined by the line 102.
  • the processing device may change the location of the second icon 106 such that the second icon 106 is displayed a fixed distance from the second endpoint 118 along a direction defined by the line 102 (i.e., the direction defined by the line 102 after the location of the first endpoint 116 and/or the location of the second endpoint 118 has changed).
  • the first endpoint 116 is located at (e1x, e1y), the second endpoint 118 is located at (e2x, e2y), and the fixed distance is d.
  • the processing device may similarly change the location of the second icon 106 based on a new position of the second endpoint 118.
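Placing an icon the fixed distance from its endpoint "along a direction defined by the line" amounts to extending the segment beyond that endpoint. A sketch under one plausible reading, in which the icon sits on the line's extension away from the other endpoint (so it never overlaps the segment); the names are illustrative:

```python
import math

def icon_position(endpoint, other_endpoint, fixed_distance):
    """Place a control icon `fixed_distance` beyond `endpoint`, on the
    extension of the segment from `other_endpoint` through `endpoint`."""
    ex, ey = endpoint
    ox, oy = other_endpoint
    length = math.hypot(ex - ox, ey - oy)
    if length == 0:
        # Degenerate (zero-length) line: fall back to an arbitrary direction.
        return (ex, ey + fixed_distance)
    # Unit vector pointing from the other endpoint toward this endpoint.
    ux, uy = (ex - ox) / length, (ey - oy) / length
    return (ex + fixed_distance * ux, ey + fixed_distance * uy)
```

Re-running this after every endpoint update keeps each icon the fixed distance from its endpoint as the line moves or rotates.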
  • the processing device may change the location of the icon by the distance in the horizontal direction and/or the distance in the vertical direction equivalent to the distance in the horizontal direction and/or a distance in the vertical direction covered by the dragging movement, and change the location of the corresponding endpoint to be a fixed distance from the icon’s new position along a direction defined by the line.
  • the processing device may change the location of both the endpoint and the icon by the distance in the horizontal direction and/or the distance in the vertical direction equivalent to the distance in the horizontal direction and/or a distance in the vertical direction covered by the dragging movement.
  • the processing device may remove the first icon 104 from display during a dragging movement that begins at the first icon 104 and remove the second icon 106 from display during a dragging movement that begins at the second icon 106. This may help the user to understand that the measurement will be performed based on the line 102 and not based on either the first icon 104 or the second icon 106. In other words, this may help the user to understand that the line 102 does not extend to the first icon 104 or the second icon 106. However, in other embodiments, the processing device may continue to display the first icon 104 and the second icon 106 during a dragging movement that begins at the first icon 104 or the second icon 106, respectively.
  • FIG. 2 illustrates the example graphical user interface (GUI) 100 after a dragging movement beginning on or within a threshold distance of the first icon 104.
  • the processing device has changed the location of the first endpoint 116 from its location in FIG. 1.
  • the processing device may have changed the location of the first endpoint 116 by a distance in the horizontal direction and/or a distance in the vertical direction equivalent to the distance in the horizontal direction and/or the distance in the vertical direction covered by the dragging movement.
  • the processing device has displayed the rest of the line 102 between the new location of the first endpoint 116 and the previous location of the second endpoint 118.
  • the processing device has also changed the location of the first icon 104 from its location in FIG. 1 to be the fixed distance 122 away from the first endpoint 116 along a direction defined by the line 102. It should be noted that the processing device has changed the measurement value depicted by the measurement value indicator 112 in FIG. 2 from that shown in FIG. 1 based on the change in length of the line 102 from FIG. 1 to FIG. 2.
  • FIG. 3 illustrates the example graphical user interface (GUI) 100 after a dragging movement beginning on or within a threshold distance of the second icon 106.
  • the processing device has changed the location of the second endpoint 118 from its location in FIG. 1.
  • the processing device may have changed the location of the second endpoint 118 by a distance in the horizontal direction and/or a distance in the vertical direction equivalent to the distance in the horizontal direction and/or the distance in the vertical direction covered by the dragging movement.
  • the processing device has displayed the rest of the line 102 between the new location of the second endpoint 118 and the previous location of the first endpoint 116.
  • the processing device has also changed the location of the second icon 106 from its location in FIG. 2 to be the fixed distance 122 away from the second endpoint 118 along a direction defined by the line 102. It should be noted that the processing device has changed the measurement value depicted by the measurement value indicator 112 in FIG. 3 from that shown in FIG. 2 based on the change in length of the line 102 from FIG. 2 to FIG. 3.
  • the processing device may change the position of both the first endpoint 116 and the second endpoint 118 based on a dragging movement that begins on or within a threshold distance of any portion of the line 102.
  • the processing device may change the locations of both the first endpoint 116 and the second endpoint 118 by a distance of (d2x - d1x, d2y - d1y).
  • the processing device may also change the locations of the first icon 104 and the second icon 106 such that they are the fixed distance 122 away from the first endpoint 116 and the second endpoint 118, respectively, along the direction of the line 102.
  • the processing device may display the rest of the line 102 between the new locations of the first endpoint 116 and the second endpoint 118.
  • the processing device may use the Cartesian equation for a line to determine locations for points along the line 102 between the first endpoint 116 and the second endpoint 118.
• the processing device may change the locations of all displayed points along the line 102 by a distance of (d2x-d1x, d2y-d1y).
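The whole-line drag described above can be sketched as follows (a minimal illustration with hypothetical helper names; the drag is assumed to run from (d1x, d1y) to (d2x, d2y)):

```python
def translate_line(p1, p2, drag_start, drag_end):
    """Shift both endpoints of a line by the drag delta (d2x-d1x, d2y-d1y)."""
    dx = drag_end[0] - drag_start[0]
    dy = drag_end[1] - drag_start[1]
    new_p1 = (p1[0] + dx, p1[1] + dy)
    new_p2 = (p2[0] + dx, p2[1] + dy)
    return new_p1, new_p2
```

The same delta would be applied to the icons and to any intermediate points displayed along the line.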
  • FIG. 4 illustrates the example graphical user interface (GUI) 100 after a dragging movement beginning on or within a threshold distance of the line 102.
  • the processing device has changed the location of the line from its location in FIG. 3.
  • the processing device may have changed the location of the first endpoint 116 and the second endpoint 118 by a distance in the horizontal direction and/or a distance in the vertical direction equivalent to the distance in the horizontal direction and/or the distance in the vertical direction covered by the dragging movement.
  • the processing device has displayed the rest of the line 102 between the new locations of the first endpoint 116 and the second endpoint 118.
• the processing device has also changed the locations of the first icon 104 and the second icon 106 from their locations in FIG. 3 to be the fixed distance 122 away from the first endpoint 116 and the second endpoint 118, respectively, along a direction defined by the line 102.
  • FIG. 5 illustrates the example graphical user interface (GUI) 100 during a dragging movement beginning on or within a threshold distance of the first icon 104.
  • the GUI 100 in FIG. 5 is similar to that shown in FIG. 2, with the addition of an inset 524.
  • the inset 524 depicts a magnification of a portion 526 of the ultrasound image 120.
  • the inset 524 depicts a portion 526 of the ultrasound image 120 that is proximal to the first endpoint 116.
  • the inset 524 further depicts the first endpoint 116, the first crosshairs 108, and a portion of the line 102 that is within the portion 526 of the ultrasound image 120.
• the processing device may display the inset 524 when the user begins a dragging movement and continue to display the inset 524 as the user continues the dragging movement. Because the inset 524 illustrates the magnified portion 526 of the ultrasound image 120 that is proximal to the first endpoint 116, the user may use the inset 524 to determine how to perform the dragging movement in order to change the location of the first endpoint 116 to the desired location on the ultrasound image 120, and also to determine when the first endpoint 116 is at the desired location. If the user begins a dragging movement on or within a threshold distance of the second icon 106, the processing device may display the inset 524 and show a magnified portion (not shown in FIG. 5) of the ultrasound image 120 that is proximal to the second endpoint 118.
  • the processing device does not display the first icon 104 during the dragging movement that began on or within a threshold distance of the first icon 104.
• the processing device may display the first icon 104 during the dragging movement.
  • the processing device may not display the inset 524 during a dragging movement.
  • certain portions of the GUI 100 may be absent.
  • the first crosshairs 108, the second crosshairs 110, and/or the delete option 114 may be absent.
  • the measurement value indicator 112 may have a different form than shown and/or be located at a different location on the touch-sensitive display screen.
  • the GUI 100 shows certain other features that are not described herein (e.g., certain buttons or indicators), in some embodiments such features may be absent or different.
  • FIG. 6 illustrates an example graphical user interface (GUI) 600 that includes an ellipse 638, a first icon 640, a second icon 662, a first measurement value indicator 656, a second measurement value indicator 658, a delete option 660, and an ultrasound image 120.
  • the second icon 662 includes a first arrow 652 and a second arrow 654.
  • the ellipse 638 includes a center location 664, a first axis 674, and a second axis 676.
  • the first axis 674 extends between two endpoints, namely a first vertex 642 and a second vertex 644 of the ellipse 638.
  • the second axis 676 extends between two endpoints, namely a third vertex 646 and a fourth vertex 648 of the ellipse 638.
  • the first axis 674 and the second axis 676 may be equivalent to the major axis and the minor axis of the ellipse, or vice versa. It should be appreciated that the ellipse 638 may be a circle.
  • the ellipse 638 is superimposed on the ultrasound image 120 and may be used by the processing device to perform a measurement on the ultrasound image 120.
  • the processing device displays the value of the spatial length represented by the ultrasound image 120 along the circumference of the ellipse 638 with the first measurement value indicator 656 and the processing device displays the value of the spatial area represented by the ultrasound image 120 within the ellipse 638 with the second measurement value indicator 658.
  • the user may cause the processing device to modify the ellipse (e.g., the position, orientation, and/or shape of the ellipse).
  • the user may cause the processing device to modify the ellipse to coincide with a particular anatomical structure visible in the ultrasound image 120 if the user desires to measure the circumference or area of the anatomical structure as depicted by the ultrasound image 120.
  • the inventors have developed technology for assisting a user in modifying the position, orientation, and shape of the ellipse 638 using a touch-sensitive display screen.
  • the technology includes display of the first icon 640 and the second icon 662.
  • the processing device displays the first icon 640 a fixed distance 650 from the first vertex 642.
  • the processing device displays the second icon 662 the fixed distance 650 from the fourth vertex 648.
  • the fixed distance 650 may be a predetermined distance.
  • the fixed distance 650 may be a default distance.
  • the fixed distance 650 may be selected by a user.
  • an icon being positioned a fixed distance from some feature may mean that the center of the icon is positioned the fixed distance from the feature.
• the fixed distance between the first icon 640 and the first vertex 642 and the fixed distance between the second icon 662 and the fourth vertex 648 may not be the same.
  • the first icon 640 and the second icon 662 are circular, although other forms are possible. Additionally, in FIG. 6, no portion of the first icon 640 or the second icon 662 overlaps the ellipse 638. However, in some embodiments, a portion of the first icon 640 or the second icon 662 may overlap the ellipse 638.
  • the processing device may change the length of the first axis 674 based on a dragging movement on the touch-sensitive display screen that begins on or within a threshold distance of the first icon 640.
  • the processing device may change the locations of the first vertex 642 and the second vertex 644 by that same distance away from the ellipse along the direction defined by the first axis 674.
  • the processing device may expand the first axis 674 of the ellipse 638 by two times the distance along the direction defined by the first axis 674.
• the processing device may change the locations of the first vertex 642 and the second vertex 644 by that same distance toward the ellipse along the direction defined by the first axis 674. In other words, the processing device may contract the first axis 674 of the ellipse 638 by two times the distance along the direction defined by the first axis 674.
  • the processing device may similarly change the length of the second axis 676 based on a dragging movement on the touch-sensitive display screen that begins on or within a threshold distance of the second icon 662.
• the processing device may display other points along the ellipse 638 based on new lengths of the first axis 674 and/or the second axis 676. For example, the processing device may determine new locations for other points along the ellipse 638 based on the Cartesian equation for an ellipse.
  • the processing device may only use the center location 664 of the ellipse, one of the first vertex 642 and the second vertex 644, and one of the third vertex 646 and the fourth vertex 648.
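One way to regenerate the displayed points of an axis-aligned ellipse from just its center and one vertex per axis is parametric sampling. The patent only says the Cartesian equation for an ellipse may be used, so this is a sketch with hypothetical names:

```python
import math

def ellipse_points(center, vertex_x, vertex_y, n=360):
    """Sample n points of an axis-aligned ellipse given its center, a vertex
    on the horizontal axis, and a vertex on the vertical axis. The remaining
    vertices follow by symmetry about the center."""
    cx, cy = center
    a = math.hypot(vertex_x[0] - cx, vertex_x[1] - cy)  # semi-axis toward vertex_x
    b = math.hypot(vertex_y[0] - cx, vertex_y[1] - cy)  # semi-axis toward vertex_y
    return [(cx + a * math.cos(2 * math.pi * k / n),
             cy + b * math.sin(2 * math.pi * k / n)) for k in range(n)]
```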
• the touch-sensitive display screen having an array of pixels, each pixel having a location that is x pixels in the horizontal direction and a location that is y pixels in the vertical direction, where x and y are measured from an origin (e.g., a corner of the touch-sensitive display screen).
  • the first axis 674 of the ellipse 638 is parallel to the vertical direction of the touch-sensitive display and the second axis 676 of the ellipse is parallel to the horizontal direction of the touch-sensitive display.
• the first vertex 642 is located at (v1x, v1y)
• the second vertex 644 is located at (v2x, v2y)
• the first icon is located at (i1x, i1y).
• the processing device may change the location of the first vertex 642 such that the first vertex 642 is displayed at (v1x, v1y+(d2y-d1y)).
• the processing device may also change the location of the second vertex 644 such that the second vertex 644 is displayed at (v2x, v2y-(d2y-d1y)).
  • the processing device may display other points along the ellipse 638 based on the new locations of the first vertex 642 and the second vertex 644.
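The vertical-axis example above (a drag running from (d1x, d1y) to (d2x, d2y)) can be sketched as:

```python
def resize_vertical_axis(v1, v2, drag_start, drag_end):
    """Move the dragged vertex by the vertical drag delta and mirror the
    opposite vertex, so the axis length changes by twice the delta."""
    dy = drag_end[1] - drag_start[1]
    new_v1 = (v1[0], v1[1] + dy)   # (v1x, v1y + (d2y - d1y))
    new_v2 = (v2[0], v2[1] - dy)   # (v2x, v2y - (d2y - d1y))
    return new_v1, new_v2
```

Here a 10-pixel drag away from the ellipse lengthens the axis by 20 pixels, matching the "two times the distance" behavior described above.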
  • the processing device may update, based on a dragging movement, the location of the first vertex 642 at a sufficiently high rate such that the first vertex 642 appears to follow the dragging movement as the dragging movement proceeds.
  • the first vertex 642 may appear to follow the user’s finger. Because changing the location of the first vertex 642 may be initiated in this example by the user touching his/her finger to the first icon 640, which may be located a fixed distance away from the first vertex 642, the first vertex 642 may be removed from the user’s finger by the fixed distance as the user drags his/her finger across the touch-sensitive display screen.
  • the first vertex 642 may be visible to the user, and the user may be able to determine when the first vertex 642 has moved to the desired location and release his/her finger from the touch-sensitive display to cause the first vertex 642 to remain in the desired location.
  • the same discussion applies to the fourth vertex 648 and the second icon 662.
• the processing device may change the location of the first icon 640 such that the first icon 640 is displayed a fixed distance from the first vertex 642 along the direction defined by the first axis 674. For example, consider that after the dragging movement, the first vertex 642 is located at (v1x, v1y), the second vertex 644 is located at (v2x, v2y), and the fixed distance is d.
  • the direction defined by the first axis 674 is along the vertical direction of the touch-sensitive display, but in the general case where the direction defined by the first axis 674 is rotated to an angle relative to the vertical direction of the touch-sensitive display, the expressions above may be modified to account for such rotation.
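One way the rotation-aware expression might look (a hypothetical sketch: the icon is placed a distance d beyond the first vertex along the axis direction, pointing away from the ellipse):

```python
import math

def icon_location(v1, v2, d):
    """Place the icon a distance d past vertex v1, along the axis direction
    defined by the two vertices (pointing away from the ellipse)."""
    ax, ay = v1[0] - v2[0], v1[1] - v2[1]   # axis vector from v2 toward v1
    length = math.hypot(ax, ay)
    ux, uy = ax / length, ay / length       # unit direction along the axis
    return (v1[0] + d * ux, v1[1] + d * uy)
```

For a vertical axis this reduces to the simple expressions above; for a rotated axis the unit vector accounts for the rotation.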
  • the processing device may change the location of the third vertex 646, the fourth vertex 648, and the second icon 662 based on a dragging movement that begins on or within a threshold distance of the second icon 662 and covers a certain distance away from/toward the ellipse 638 along the direction defined by the second axis 676.
  • FIG. 7 illustrates the example graphical user interface (GUI) 600 after a dragging movement beginning on or within a threshold distance of the first icon 640 and covering a distance towards the ellipse 638 along the direction of the first axis 674.
  • the GUI 600 may have appeared as shown in FIG. 6.
  • the processing device has changed the locations of the first vertex 642 and the second vertex 644 from their locations in FIG. 6. (In other words, the processing device has changed the length of the first axis 674.)
  • the processing device may have changed the locations of the first vertex 642 and the second vertex 644 by the distance covered by the dragging movement along the direction defined by the first axis 674.
• the processing device has also changed the location of the first icon 640 from its location in FIG. 6 to be the fixed distance 650 away from the first vertex 642 along a direction defined by the first axis 674. It should be noted that the processing device has changed the measurement values depicted by the first measurement value indicator 656 and the second measurement value indicator 658 from those shown in FIG. 6 based on the change in length of the first axis 674 from FIG. 6 to FIG. 7.
  • FIG. 8 illustrates the example graphical user interface (GUI) 600 after a dragging movement beginning on or within a threshold distance of the second icon 662 and covering a distance towards the ellipse 638 along the direction of the second axis 676.
  • the processing device has also changed the location of the second icon 662 from its location in FIG. 7 to be the fixed distance 650 away from the fourth vertex 648 along a direction defined by the second axis 676. It should be noted that the processing device has changed the measurement values depicted by the first measurement value indicator 656 and the second measurement value indicator 658 from that shown in FIG. 7 based on the change in length of the second axis 676 from FIG. 7 to FIG. 8.
  • the processing device may change the position of the ellipse 638 based on a dragging movement that begins in the interior of the ellipse 638, on the boundary of the ellipse 638, or within a threshold distance of the boundary of the ellipse 638.
• the processing device may change the locations of every point on the ellipse 638, as well as the first icon 640 and the second icon 662, by a distance of (d2x-d1x, d2y-d1y).
• the processing device may change the locations of the center 664 of the ellipse 638, the first vertex 642, the second vertex 644, the third vertex 646, and the fourth vertex 648 by the specific distance and display the rest of the ellipse 638 based on these new locations using the Cartesian equation for an ellipse.
  • FIG. 9 illustrates the example graphical user interface (GUI) 600 after a dragging movement beginning in the interior of the ellipse 638 or within a threshold distance of the boundary of the ellipse 638.
  • the processing device has changed the position (but not the orientation or shape) of the ellipse 638 by the distance covered by the dragging movement.
• the processing device has also changed the location of the first icon 640 from its location in FIG. 8 to be the fixed distance 650 away from the first vertex 642 along a direction defined by the first axis 674, and the location of the second icon 662 from its location in FIG. 8 to be the fixed distance 650 away from the fourth vertex 648 along a direction defined by the second axis 676.
  • the processing device may rotate the ellipse 638 based on a dragging movement that begins on or at the second icon 662 and covers a distance along and/or a distance orthogonal to the direction of the second axis 676 of the ellipse 638.
• FIG. 10 illustrates the example graphical user interface (GUI) 600 after a dragging movement beginning on or within a threshold distance of the second icon 662 and covering a distance along and/or a distance orthogonal to the direction of the second axis 676.
• the GUI 600 may have appeared as shown in FIG. 9.
  • the processing device has rotated the locations of every point of the ellipse 638 based on the drag distance along and/or the drag distance orthogonal to the direction of the second axis 676.
  • the processing device has also changed the location of the first icon 640 from its location in FIG.
  • FIG. 11 illustrates a method for determining how much to rotate the ellipse 638 based on the dragging movement, in accordance with certain embodiments described herein.
  • the left side of FIG. 11 shows the ellipse 638 before the dragging movement and the right side of FIG. 11 shows the ellipse 638 after the dragging movement.
  • the center location 664 is labeled C
  • the second vertex 644 is labeled A
  • the first vertex 642 is labeled B
  • the center location 664 is labeled C’
  • the location of the second vertex 644 is labeled A’
  • the location of the first vertex 642 is labeled B’
  • the location of the second icon 662 is labeled D’.
  • the dragging movement begins at the location of the second icon 662 and ends at a location separated from the previous location by a vector V (where V may have components along and/or orthogonal to the second axis 676).
  • the processing device may determine the location of C’ to be the same as C, namely, the center location 664 may not change.
  • the processing device may determine the location of A’ to be A+V, in other words, the previous location of the second vertex 644 plus the vector of the dragging movement.
• the processing device may determine the location of B’ to be C+normal(A’C)*length(BC).
  • the new location of the first vertex 642 may be the center location 664 plus a vector that has a length equal to the distance between the center location 664 and the previous location of the first vertex 642, and a direction that is perpendicular to a vector between the center location 664 and the new location of the second vertex 644.
  • the processing device may determine new locations for the rest of the points on the ellipse 638 based on the new locations for the first vertex 642 and the second vertex 644.
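The A’/B’/C’ construction of FIG. 11 can be sketched as follows (normal(·) taken here as a 90° rotation of the unit vector from C to A’; which of the two perpendicular directions is used is a design choice, and the names are hypothetical):

```python
import math

def rotate_ellipse_vertices(center, a, b, drag_vector):
    """Given center C, second vertex A, first vertex B, and drag vector V,
    return A' = A + V and B' = C + normal(A'C) * length(BC)."""
    cx, cy = center
    a2 = (a[0] + drag_vector[0], a[1] + drag_vector[1])  # A' = A + V
    vx, vy = a2[0] - cx, a2[1] - cy                      # vector from C to A'
    norm = math.hypot(vx, vy)
    nx, ny = -vy / norm, vx / norm                       # unit normal to A'C
    r = math.hypot(b[0] - cx, b[1] - cy)                 # length(BC)
    b2 = (cx + r * nx, cy + r * ny)                      # B'
    return a2, b2
```

Note that B’ keeps its original distance from the center and stays perpendicular to the (possibly lengthened) second axis, so the drag may rotate and resize the ellipse at once.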
  • rotations of the ellipse 638 may be controlled both by components of a drag distance beginning at or within a threshold distance of the second icon 662 (in other words, the components of the vector V) that are along the direction of the second axis 676 and orthogonal to the direction of the second axis 676.
  • a dragging movement beginning at or within a threshold distance of the second icon 662 and covering a distance along the direction of the second axis 676 may also control the length of the second axis 676.
  • a dragging movement beginning at or within a threshold distance of the second icon 662 and having only a component along the direction of the second axis 676 may only modify the length of the second axis 676.
  • a dragging movement beginning at or within a threshold distance of the second icon 662 and having components both along and orthogonal to the direction of the second axis 676 may modify both the length of the second axis 676 and the rotation of the ellipse 638.
  • the description of FIG. 11 may apply both to the general case of a dragging movement having components both along and orthogonal to the direction of the second axis 676, as well as the special case of a dragging movement having a component only along the direction of the second axis 676.
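Splitting the drag vector V into its components along and orthogonal to the second axis 676 is a standard projection; a sketch with hypothetical names:

```python
import math

def decompose_drag(v, axis):
    """Split drag vector v into its scalar component along the axis
    direction and its scalar component orthogonal to it."""
    ax, ay = axis
    norm = math.hypot(ax, ay)
    ux, uy = ax / norm, ay / norm
    along = v[0] * ux + v[1] * uy    # dot(v, unit axis)
    ortho = -v[0] * uy + v[1] * ux   # dot(v, unit normal)
    return along, ortho
```

A drag with ortho == 0 would only resize the axis; a nonzero ortho component would also rotate the ellipse.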
  • the processing device may use a different method for determining how to rotate an ellipse than the method illustrated by FIG. 11.
  • the first arrow 652 and the second arrow 654 may serve to indicate to a user that the second icon 662 (as opposed to the first icon 640) can be used to rotate the ellipse 638.
  • the positioning of the first arrow 652 and the second arrow 654 may change as the shape of the ellipse 638 changes so that the arrows approximate the curvature of the ellipse 638.
• certain portions of the GUI 600 may be absent.
• the first arrow 652, the second arrow 654, the first measurement value indicator 656, the second measurement value indicator 658, and/or the delete option 660 may be absent.
• the first measurement value indicator 656 and/or the second measurement value indicator 658 may have different forms than shown and/or be located at different locations on the touch-sensitive display screen.
  • the GUI 600 shows certain other features that are not described herein (e.g., certain buttons or indicators), in some embodiments such features may be absent or different.
  • a processing device may perform certain calculations using pixels, in some embodiments the processing device may perform calculations using points. It should be noted that certain calculations described herein may produce fractional pixel results. In some embodiments, fractional pixel results may be rounded to a whole pixel. In some embodiments, the processing device may use antialiasing to interpret pixel values for a fractional pixel result (e.g., to interpret pixel values for pixels (1, 1) and (2, 1) when a calculation indicates that something should be displayed at pixel (1.5, 1)).
  • the processing device may change the location of one feature of a measurement tool (e.g., a line or an ellipse) based on a dragging movement that begins on or within a threshold distance of a certain feature.
• the distance may be measured in pixels (e.g., 30 pixels). While the above description has described a touch-sensitive display screen, in some embodiments the screen may not be a touch-sensitive display screen, and a click and drag of a cursor (e.g., using a mouse) may be the equivalent of a dragging movement.
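The "on or within a threshold distance" test (e.g., 30 pixels) amounts to a simple distance check; a minimal sketch:

```python
import math

def hit_test(touch, icon_center, threshold_px=30):
    """Return True if a touch lands on or within threshold_px of the icon."""
    return math.hypot(touch[0] - icon_center[0],
                      touch[1] - icon_center[1]) <= threshold_px
```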
  • FIG. 12 illustrates an example GUI 1200 that may be shown when ultrasound data is being collected, in accordance with certain embodiments described herein.
  • the GUI 1200 depicts the most recent ultrasound image 120 collected by the processing device from the ultrasound device. As further ultrasound images 120 are collected, the processing device may continuously update the GUI 1200 to depict the most recent ultrasound image 120 collected.
  • the GUI 1200 further includes a freeze option 1226.
  • FIG. 13 illustrates an example GUI 1300 that may be shown upon selection of the freeze option 1226, in accordance with certain embodiments described herein.
  • the GUI 1300 depicts the most recent ultrasound image 120 collected by the processing device from the ultrasound device when the freeze option 1226 was selected.
• the processing device freezes the most recent ultrasound image 120 on the GUI 1300, and the processing device may not update the GUI 1300 with ultrasound images 120 that are collected subsequently.
  • the freeze option 1226 can have a different color or pattern, which may indicate that the GUI 1300 is currently showing a frozen ultrasound image 120.
  • the GUI 1300 depicts a measurement option 1328.
  • FIG. 14 illustrates an example GUI 1400 that may be shown upon selection of the measurement option 1328, in accordance with certain embodiments described herein.
  • the GUI 1400 can depict the freeze option 1226 having a different color or pattern as explained with respect to FIG. 13, as well as a label option 1430, an ellipse measurement option 1432, a line measurement option 1434, and a menu close option 1436.
• the processing device may display a GUI enabling a user to place labels on the ultrasound image 120.
  • the processing device may display the GUI 600, with the ellipse 638, the first icon 640, and the second icon 662 shown in default positions.
  • the processing device may display the GUI 100, with the line 102, the first icon 104, and the second icon 106 shown in default positions.
  • the processing device may display the GUI 1300 (i.e., remove from display the label option 1430, the ellipse measurement option 1432, and the line measurement option 1434).
  • FIGs. 15-19 illustrate example processes for performing measurements on an ultrasound image, in accordance with certain embodiments described herein.
  • the processes may be performed by a processing device in an ultrasound system.
  • the processing device may be, for example, a mobile phone, tablet, or laptop in operative communication with an ultrasound probe.
• the ultrasound probe and the processing device may communicate over a wired communication link (e.g., over Ethernet, a Universal Serial Bus (USB) cable, or a Lightning cable) or over a wireless communication link (e.g., over a BLUETOOTH, WiFi, or ZIGBEE wireless communication link).
  • FIG. 15 illustrates an example process 1500 for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein. Further description of the process 1500 may be found with reference to FIGs. 1-5.
• the processing device displays, on a touch-sensitive display screen, (1) an ultrasound image, (2) a line extending between a first endpoint and a second endpoint, and (3) an icon located a fixed distance from the first endpoint along a direction defined by the line.
  • the process 1500 proceeds from act 1502 to act 1504.
  • act 1504 the processing device detects a dragging movement covering a distance in the horizontal direction and/or a distance in the vertical direction across the touch-sensitive display screen, where the dragging movement begins on or within a threshold distance of the icon.
  • the process 1500 proceeds from act 1504 to act 1506.
  • act 1506 the processing device displays the first endpoint at a new location on the touch-sensitive display screen that is removed from the endpoint’s previous location by the distance in the horizontal direction and/or the distance in the vertical direction covered by the dragging movement.
  • the process 1500 proceeds from act 1506 to act 1508.
  • act 1508 the processing device displays the icon at a new location on the touch- sensitive display screen that is removed from the new location of the first endpoint by the fixed distance along the direction defined by the line.
  • the process 1500 proceeds from act 1508 to act 1510.
  • the processing device performs a measurement on the ultrasound image based on the line. For example, the processing device may perform a calculation of the spatial length represented by the ultrasound image between the first endpoint and the second endpoint of the line. In some embodiments, the processing device may display the result of the measurement.
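Converting the on-screen line into a spatial length requires knowing the physical extent each pixel of the ultrasound image represents; a sketch, where mm_per_pixel is a hypothetical calibration value derived from the imaging depth:

```python
import math

def line_length_mm(p1, p2, mm_per_pixel):
    """Spatial length represented between the two endpoints of the line."""
    pixels = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return pixels * mm_per_pixel
```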
  • certain acts of the process 1500 may be absent.
  • act 1510 may be absent.
  • acts 1504-1510 may be absent.
  • acts 1504-1508 may be absent.
  • other combinations of acts may be absent.
  • FIG. 16 illustrates an example process 1600 for performing measurements on an ultrasound image based on a line, in accordance with certain embodiments described herein. Further description of the process 1600 may be found with reference to FIG. 4.
  • the processing device displays, on a touch-sensitive display screen, (1) an ultrasound image and (2) a line extending between a first endpoint and a second endpoint.
  • the process 1600 proceeds from act 1602 to act 1604.
  • act 1604 the processing device detects a dragging movement covering a distance in the horizontal direction and/or a distance in the vertical direction across the touch-sensitive display screen, where the dragging movement begins on or within a threshold distance of the line.
  • the process 1600 proceeds from act 1604 to act 1606.
  • act 1606 the processing device displays both the first endpoint and the second endpoint of the line at new locations on the touch-sensitive display screen that are removed from their previous locations by the distance in the horizontal direction and/or the distance in the vertical direction.
  • the process 1600 proceeds from act 1606 to act 1608.
  • the processing device performs a measurement on the ultrasound image based on the line. For example, the processing device may perform a calculation of the spatial length represented by the ultrasound image between the first endpoint and the second endpoint of the line. In some embodiments, the processing device may display the result of the measurement.
  • certain acts of the process 1600 may be absent.
  • act 1608 may be absent.
  • acts 1604-1608 may be absent.
  • acts 1604-1606 may be absent.
  • other combinations of acts may be absent.
  • FIG. 17 illustrates an example process 1700 for performing measurements on an ultrasound image based on an ellipse, in accordance with certain embodiments described herein. Further description of the process 1700 may be found with reference to FIGs. 6-8.
  • the processing device displays, on a touch-sensitive display screen, (1) an ultrasound image, (2) an ellipse having an axis that is either the major or minor axis of the ellipse, where the axis extends between a first vertex and a second vertex; and (3) an icon located a fixed distance from the first vertex along a direction defined by the axis.
  • the process 1700 proceeds from act 1702 to act 1704.
  • act 1704 the processing device detects a dragging movement covering a distance along the direction defined by the axis of the ellipse across the touch-sensitive display screen, where the dragging movement begins on or within a threshold distance of the icon.
  • the process 1700 proceeds from act 1704 to act 1706.
  • act 1706 the processing device displays the first vertex at a new location on the touch-sensitive display screen that is removed from the first vertex’s previous location by the distance along the direction defined by the axis of the ellipse covered by the dragging movement.
  • the process 1700 proceeds from act 1706 to act 1708.
  • act 1708 the processing device displays the second vertex at a new location on the touch-sensitive display screen that is removed from the second vertex’s previous location by the distance along the direction defined by the axis of the ellipse covered by the dragging movement.
  • the process 1700 proceeds from act 1708 to act 1710.
  • act 1710 the processing device displays the icon at a new location on the touch- sensitive display screen that is removed from the first vertex’s new location by the fixed distance along the direction defined by the axis of the ellipse.
  • the process 1700 proceeds from act 1710 to act 1712.
  • the processing device performs a measurement on the ultrasound image based on the ellipse. For example, the processing device may perform a calculation of the spatial length represented by the ultrasound image along the circumference of the ellipse or a calculation of the spatial area represented by the ultrasound image within the ellipse. In some embodiments, the processing device may display the result of the measurement.
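The area within an ellipse with semi-axes a and b is exactly π·a·b, while the circumference has no closed form, so an approximation is typically used. The patent does not specify a formula; the sketch below uses Ramanujan's first approximation (with a and b already converted to spatial units):

```python
import math

def ellipse_area(a, b):
    """Area enclosed by an ellipse with semi-axes a and b."""
    return math.pi * a * b

def ellipse_circumference(a, b):
    """Ramanujan's first approximation to the ellipse circumference.
    Exact for a circle (a == b), very accurate for mild eccentricity."""
    return math.pi * (3 * (a + b) - math.sqrt((3 * a + b) * (a + 3 * b)))
```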
  • certain acts of the process 1700 may be absent.
  • act 1712 may be absent.
  • acts 1704-1712 may be absent.
  • acts 1704-1710 may be absent.
  • other combinations of acts may be absent.
  • FIG. 18 illustrates an example process 1800 for performing measurements on an ultrasound image based on an ellipse, in accordance with certain embodiments described herein. Further description of the process 1800 may be found with reference to FIGs. 10-11.
  • the processing device displays, on a touch-sensitive display screen, (1) an ultrasound image, (2) an ellipse having an axis that is either the major or minor axis of the ellipse, where the axis extends between a first vertex and a second vertex; and (3) an icon located a fixed distance from the first vertex along a direction defined by the axis.
  • the process 1800 proceeds from act 1802 to act 1804.
  • act 1804 the processing device detects a dragging movement covering a distance along and/or a distance orthogonal to the direction defined by the axis of the ellipse across the touch-sensitive display screen, where the dragging movement begins on or within a threshold distance of the icon.
  • the process 1800 proceeds from act 1804 to act 1806.
  • act 1806 the processing device displays the first vertex and the second vertex at new locations on the touch-sensitive display screen that are rotated from their previous locations based on the distance along and/or the distance orthogonal to the direction defined by the axis of the ellipse that is covered by the dragging movement.
  • the process 1800 proceeds from act 1806 to act 1808.
  • In act 1808, the processing device displays the icon at a new location on the touch-sensitive display screen that is removed from the first vertex’s new location by the fixed distance along the direction defined by the axis of the ellipse.
  • the process 1800 proceeds from act 1808 to act 1810.
  • the processing device performs a measurement on the ultrasound image based on the ellipse. For example, the processing device may perform a calculation of the spatial length represented by the ultrasound image along the circumference of the ellipse or a calculation of the spatial area represented by the ultrasound image within the ellipse. In some embodiments, the processing device may display the result of the measurement.
  • certain acts of the process 1800 may be absent.
  • act 1810 may be absent.
  • acts 1804-1810 may be absent.
  • acts 1804-1808 may be absent.
  • other combinations of acts may be absent.
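One way to realize the vertex rotation of acts 1804-1806 is to convert the drag into a change of angle about the ellipse's center. The sketch below is not the patent's implementation: the function name `rotate_vertices` and the choice to track the angle of the drag point directly are assumptions made for illustration.

```python
import math

def rotate_vertices(v1, v2, icon, drag_end):
    """Rotate the axis vertices about the ellipse center by the angle
    swept out between the icon's position and the drag's end point."""
    cx, cy = (v1[0] + v2[0]) / 2, (v1[1] + v2[1]) / 2   # ellipse center
    before = math.atan2(icon[1] - cy, icon[0] - cx)     # icon angle pre-drag
    after = math.atan2(drag_end[1] - cy, drag_end[0] - cx)
    d = after - before                                  # swept angle

    def rot(p):
        # standard 2D rotation of p about (cx, cy) by angle d
        x, y = p[0] - cx, p[1] - cy
        return (cx + x * math.cos(d) - y * math.sin(d),
                cy + x * math.sin(d) + y * math.cos(d))

    return rot(v1), rot(v2)
```

The icon's new location then follows as in act 1808: it is placed the fixed distance from the rotated first vertex along the rotated axis direction.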
  • FIG. 19 illustrates an example process 1900 for performing measurements on an ultrasound image based on an ellipse, in accordance with certain embodiments described herein. Further description of the process 1900 may be found with reference to FIG. 9.
  • the processing device displays, on a touch-sensitive display screen, (1) an ultrasound image, (2) an ellipse having an axis that is either the major or minor axis of the ellipse, wherein the axis extends between a first vertex and a second vertex; and (3) an icon located a fixed distance from the first vertex along a direction defined by the axis.
  • the process 1900 proceeds from act 1902 to act 1904.
  • In act 1904, the processing device detects a dragging movement covering a distance in the horizontal direction and/or a distance in the vertical direction across the touch-sensitive display screen, where the dragging movement begins in the interior of the ellipse or within a threshold distance of the boundary of the ellipse.
  • the process 1900 proceeds from act 1904 to act 1906.
  • In act 1906, the processing device displays the first vertex and the second vertex at new locations on the touch-sensitive display screen that are removed from their previous locations by the distance in the horizontal direction and/or the distance in the vertical direction covered by the dragging movement.
  • the process 1900 proceeds from act 1906 to act 1908.
  • the processing device performs a measurement on the ultrasound image based on the ellipse. For example, the processing device may perform a calculation of the spatial length represented by the ultrasound image along the circumference of the ellipse or a calculation of the spatial area represented by the ultrasound image within the ellipse. In some embodiments, the processing device may display the result of the measurement.
  • certain acts of the process 1900 may be absent.
  • act 1908 may be absent.
  • acts 1904-1908 may be absent.
  • acts 1904-1906 may be absent.
  • other combinations of acts may be absent.
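Acts 1904-1906 can be sketched as a hit test followed by a translation of both vertices by the drag deltas. Note the grown-ellipse check is only an approximation of "within a threshold distance of the boundary," and both helper names are invented for illustration.

```python
def drag_starts_on_ellipse(point, center, a, b, threshold):
    """Approximate test: True if `point` is inside the ellipse or within
    roughly `threshold` pixels of its boundary (ellipse grown by threshold)."""
    nx = (point[0] - center[0]) / (a + threshold)
    ny = (point[1] - center[1]) / (b + threshold)
    return nx * nx + ny * ny <= 1.0

def translate_ellipse(v1, v2, dx, dy):
    """Move both axis vertices by the horizontal/vertical drag distances."""
    return (v1[0] + dx, v1[1] + dy), (v2[0] + dx, v2[1] + dy)
```

Translating the two vertices while leaving the axis lengths unchanged moves the whole ellipse rigidly, matching the behavior described for act 1906.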
  • one or more of the icons described above may be absent, and a user may modify measurement tools through dragging movements that begin on or within a threshold distance of a region of the touch-sensitive display screen that is located the fixed distance from the portion of the measurement tool, even though the region does not contain an icon.
  • the processing device may modify measurement tools based on taps.
  • a user may tap an icon and then another location on the touch-sensitive display screen.
  • the processing device may then modify the measurement tool based on the distance in the horizontal and/or vertical direction between the two tapped locations.
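The tap-based variant above can be sketched as a two-tap gesture: the first tap must land on or within a threshold distance of the icon, and the offset to the second tap drives the modification. The threshold check and the function name are assumptions made for illustration.

```python
import math

def taps_to_offset(icon, tap1, tap2, threshold):
    """Return the (dx, dy) modification implied by two taps, or None
    if the first tap is not on or within `threshold` of the icon."""
    if math.hypot(tap1[0] - icon[0], tap1[1] - icon[1]) > threshold:
        return None
    return (tap2[0] - tap1[0], tap2[1] - tap1[1])
```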
  • inventive concepts may be embodied as one or more processes, of which examples have been provided.
  • the acts performed as part of each process may be ordered in any suitable way.
  • embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
  • one or more of the processes may be combined and/or omitted, and one or more of the processes may include additional steps.
  • FIG. 20 is a schematic block diagram illustrating aspects of an example ultrasound system 2000 upon which various aspects of the technology described herein may be practiced.
  • the ultrasound system 2000 includes processing circuitry 2001, input/output devices 2003, ultrasound circuitry 2005, and memory circuitry 2007.
  • the ultrasound circuitry 2005 may be configured to generate ultrasound data that may be employed to generate an ultrasound image.
  • the ultrasound circuitry 2005 may include one or more ultrasonic transducers monolithically integrated onto a single semiconductor die.
  • the ultrasonic transducers may include, for example, one or more capacitive micromachined ultrasonic transducers (CMUTs), one or more CMOS ultrasonic transducers (CUTs), one or more piezoelectric micromachined ultrasonic transducers (PMUTs), and/or one or more other suitable ultrasonic transducer cells.
  • the ultrasonic transducers may be formed on the same chip as other electronic components in the ultrasound circuitry 2005 (e.g., transmit circuitry, receive circuitry, control circuitry, power management circuitry, and processing circuitry) to form a monolithic ultrasound imaging device.
  • the processing circuitry 2001 may be configured to perform any of the functionality described herein.
  • the processing circuitry 2001 may include one or more processors (e.g., computer hardware processors). To perform one or more functions, the processing circuitry 2001 may execute one or more processor-executable instructions stored in the memory circuitry 2007.
  • the memory circuitry 2007 may be used for storing programs and data during operation of the ultrasound system 2000.
  • the memory circuitry 2007 may include one or more storage devices such as non-transitory computer-readable storage media.
  • the processing circuitry 2001 may control writing data to and reading data from the memory circuitry 2007 in any suitable manner.
  • the processing circuitry 2001 may include specially-programmed and/or special-purpose hardware such as an application-specific integrated circuit (ASIC).
  • the processing circuitry 2001 may include one or more graphics processing units (GPUs) and/or one or more tensor processing units (TPUs).
  • TPUs may be ASICs specifically designed for machine learning (e.g., deep learning).
  • the TPUs may be employed to, for example, accelerate the inference phase of a neural network.
  • the input/output (I/O) devices 2003 may be configured to facilitate communication with other systems and/or an operator.
  • Example I/O devices 2003 that may facilitate communication with an operator include: a keyboard, a mouse, a trackball, a microphone, a touch-sensitive display screen, a printing device, a display screen, a speaker, and a vibration device.
  • Example I/O devices 2003 that may facilitate communication with other systems include wired and/or wireless communication circuitry such as BLUETOOTH, ZIGBEE, Ethernet, WiFi, and/or USB communication circuitry.
  • the ultrasound system 2000 may be implemented using any number of devices.
  • the components of the ultrasound system 2000 may be integrated into a single device.
  • the ultrasound circuitry 2005 may be integrated into an ultrasound imaging device that is communicatively coupled with a processing device that includes the processing circuitry 2001, the input/output devices 2003, and the memory circuitry 2007.
  • FIG. 21 is a schematic block diagram illustrating aspects of another example ultrasound system 2100 upon which various aspects of the technology described herein may be practiced.
  • the ultrasound system 2100 includes an ultrasound imaging device 2114 in wired and/or wireless communication with a processing device 2102.
  • the processing device 2102 includes an audio output device 2104, an imaging device 2106, a display screen 2108, a processor 2110, a memory 2112, and a vibration device 2109.
  • the processing device 2102 may communicate with one or more external devices over a network 2116.
  • the processing device 2102 may communicate with one or more workstations 2120, servers 2118, and/or databases 2122.
  • the ultrasound imaging device 2114 may be configured to generate ultrasound data that may be employed to generate an ultrasound image.
  • the ultrasound imaging device 2114 may be constructed in any of a variety of ways.
  • the ultrasound imaging device 2114 includes a transmitter that transmits a signal to a transmit beamformer which in turn drives transducer elements within a transducer array to emit pulsed ultrasonic signals into a structure, such as a patient.
  • the pulsed ultrasonic signals may be back-scattered from structures in the body, such as blood cells or muscular tissue, to produce echoes that return to the transducer elements. These echoes may then be converted into electrical signals by the transducer elements, and the electrical signals are received by a receiver.
  • the electrical signals representing the received echoes are sent to a receive beamformer that outputs ultrasound data.
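The receive beamformer's delay-and-sum principle can be illustrated with a toy sketch. This is not the patent's beamformer; the element geometry, sound speed (1540 m/s), and sampling rate are assumed values, and delays are rounded to whole samples for simplicity.

```python
import numpy as np

def delay_and_sum(rf, element_x, focus, c=1540.0, fs=40e6):
    """Toy receive beamformer: align each element's echo signal to a
    focal point by its travel-time delay, then sum across elements.

    rf:        (n_elements, n_samples) received echo signals
    element_x: (n_elements,) lateral element positions in meters
    focus:     (x, z) focal point in meters
    """
    fx, fz = focus
    dist = np.sqrt((element_x - fx) ** 2 + fz ** 2)  # element-to-focus distances
    delays = dist / c                                # receive delays in seconds
    delays -= delays.min()                           # make delays relative
    shifts = np.round(delays * fs).astype(int)       # delays in whole samples
    n_samples = rf.shape[1]
    out = np.zeros(n_samples)
    for channel, s in zip(rf, shifts):
        out[: n_samples - s] += channel[s:]          # advance and accumulate
    return out
```

Echoes arriving from the focal point add coherently across channels, while off-focus echoes tend to cancel; real beamformers additionally use sub-sample interpolation and apodization.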
  • the processing device 2102 may be configured to process the ultrasound data from the ultrasound imaging device 2114 to generate ultrasound images for display on the display screen 2108.
  • the processing may be performed by, for example, the processor 2110.
  • the processor 2110 may also be adapted to control the acquisition of ultrasound data with the ultrasound imaging device 2114.
  • the ultrasound data may be processed in real-time during a scanning session as the echo signals are received.
  • the displayed ultrasound image may be updated at a rate of at least 5 Hz, at least 10 Hz, or at least 20 Hz; at a rate between 5 and 60 Hz; or at a rate of more than 20 Hz.
  • ultrasound data may be acquired even as images are being generated based on previously acquired data and while a live ultrasound image is being displayed. As additional ultrasound data is acquired, additional frames or images generated from more-recently acquired ultrasound data are sequentially displayed. Additionally, or alternatively, the ultrasound data may be stored temporarily in a buffer during a scanning session and processed in less than real-time.
  • the processing device 2102 may be configured to perform any of the processes described herein (e.g., using the processor 2110).
  • the processing device 2102 may be configured to automatically determine an anatomical feature being imaged and automatically select, based on the anatomical feature being imaged, an ultrasound imaging preset corresponding to the anatomical feature.
  • the processing device 2102 may include one or more elements that may be used during the performance of such processes.
  • the processing device 2102 may include one or more processors 2110 (e.g., computer hardware processors) and one or more articles of manufacture that include non-transitory computer-readable storage media such as the memory 2112.
  • the processor 2110 may control writing data to and reading data from the memory 2112 in any suitable manner.
  • the processor 2110 may execute one or more processor-executable instructions stored in one or more non-transitory computer-readable storage media (e.g., the memory 2112), which may serve as non-transitory computer-readable storage media storing processor-executable instructions for execution by the processor 2110.
  • the processing device 2102 may include one or more input and/or output devices such as the audio output device 2104, the imaging device 2106, the display screen 2108, and the vibration device 2109.
  • the audio output device 2104 may be a device that is configured to emit audible sound such as a speaker.
  • the imaging device 2106 may be configured to detect light (e.g., visible light) to form an image such as a camera.
  • the display screen 2108 may be configured to display images and/or videos such as a liquid crystal display (LCD), a plasma display, and/or an organic light emitting diode (OLED) display.
  • the display screen 2108 may be a touch-sensitive display screen.
  • the vibration device 2109 may be configured to vibrate one or more components of the processing device 2102 to provide tactile feedback. These input and/or output devices may be communicatively coupled to the processor 2110 and/or under the control of the processor 2110.
  • the processor 2110 may control these devices in accordance with a process being executed by the processor 2110 (such as the processes shown in FIGs. 15-19).
  • the processor 2110 may control the audio output device 2104 to issue audible instructions and/or control the vibration device 2109 to change an intensity of tactile feedback (e.g., vibration) to issue tactile instructions.
  • the processor 2110 may control the imaging device 2106 to capture non-acoustic images of the ultrasound imaging device 2114 being used on a subject to provide an operator of the ultrasound imaging device 2114 an augmented reality interface.
  • the processing device 2102 may be implemented in any of a variety of ways.
  • the processing device 2102 may be implemented as a handheld device such as a mobile smartphone or a tablet.
  • an operator of the ultrasound imaging device 2114 may be able to operate the ultrasound imaging device 2114 with one hand and hold the processing device 2102 with another hand.
  • the processing device 2102 may be implemented as a portable device that is not a handheld device such as a laptop.
  • the processing device 2102 may be implemented as a stationary device such as a desktop computer.
  • the processing device 2102 may communicate with one or more external devices via the network 2116.
  • the processing device 2102 may be connected to the network 2116 over a wired connection (e.g., via an Ethernet cable) and/or a wireless connection (e.g., over a WiFi network).
  • these external devices may include servers 2118, workstations 2120, and/or databases 2122.
  • the processing device 2102 may communicate with these devices to, for example, off-load computationally intensive tasks.
  • the processing device 2102 may send an ultrasound image over the network 2116 to the server 2118 for analysis (e.g., to identify an anatomical feature in the ultrasound) and receive the results of the analysis from the server 2118.
  • the processing device 2102 may communicate with these devices to access information that is not available locally and/or update a central information repository. For example, the processing device 2102 may access the medical records of a subject being imaged with the ultrasound imaging device 2114 from a file stored in the database 2122. In this example, the processing device 2102 may also provide one or more captured ultrasound images of the subject to the database 2122 to add to the medical record of the subject.
  • For further description of ultrasound imaging devices and systems, see U.S. Patent Application Publication No. US20170360397A1, titled “UNIVERSAL ULTRASOUND DEVICE AND RELATED APPARATUS AND METHODS.”
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • the terms “approximately” and “about” may be used to mean within ±20% of a target value in some embodiments, within ±10% of a target value in some embodiments, within ±5% of a target value in some embodiments, and within ±2% of a target value in some embodiments.
  • the terms “approximately” and “about” may include the target value.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Human Computer Interaction (AREA)
  • Vascular Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

Aspects of the present technology include a processing device configured to display, on a touch-sensitive display screen of a processing device in operative communication with an ultrasound device, an ultrasound image, a movable measurement tool, and an icon that maintains a fixed distance from a portion of the measurement tool. The icon may be configured to modify the measurement tool, and the icon may not overlap the measurement tool.
PCT/US2019/057809 2018-10-25 2019-10-24 Methods and apparatus for performing measurements on an ultrasound image WO2020086811A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862750348P 2018-10-25 2018-10-25
US62/750,348 2018-10-25

Publications (1)

Publication Number Publication Date
WO2020086811A1 true WO2020086811A1 (fr) 2020-04-30

Family

ID=70327800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/057809 WO2020086811A1 (fr) 2018-10-25 2019-10-24 Methods and apparatus for performing measurements on an ultrasound image

Country Status (2)

Country Link
US (2) US11638572B2 (fr)
WO (1) WO2020086811A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10586618B2 (en) * 2014-05-07 2020-03-10 Lifetrack Medical Systems Private Ltd. Characterizing states of subject
USD934288S1 (en) * 2019-11-27 2021-10-26 Bfly Operations, Inc. Display panel or portion thereof with graphical user interface
USD934289S1 (en) * 2019-11-27 2021-10-26 Bfly Operations, Inc. Display panel or portion thereof with graphical user interface
KR20230021665A (ko) * 2020-05-08 2023-02-14 Paige.AI, Inc. Systems and methods for processing electronic images to determine salient information in digital pathology

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170124700A1 (en) * 2015-10-30 2017-05-04 General Electric Company Method and system for measuring a volume from an ultrasound image
US20180015256A1 (en) * 2016-07-14 2018-01-18 C. R. Bard, Inc. Automated Catheter-To-Vessel Size Comparison Tool And Related Methods
US20180088694A1 (en) * 2014-09-19 2018-03-29 Samsung Electronics Co., Ltd. Ultrasound diagnosis apparatus and method and computer-readable storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110055447A1 (en) 2008-05-07 2011-03-03 Signostics Limited Docking system for medical diagnostic scanning using a handheld device
JP2012019824A (ja) * 2010-07-12 2012-02-02 Hitachi Aloka Medical Ltd Ultrasound diagnostic apparatus
US11096668B2 (en) * 2013-03-13 2021-08-24 Samsung Electronics Co., Ltd. Method and ultrasound apparatus for displaying an object
US9538985B2 (en) * 2014-04-18 2017-01-10 Fujifilm Sonosite, Inc. Hand-held medical imaging system with improved user interface for deploying on-screen graphical tools and associated apparatuses and methods
KR102293915B1 (ko) * 2014-12-05 2021-08-26 Samsung Medison Co., Ltd. Ultrasound image processing method and ultrasound apparatus therefor
US11712221B2 (en) 2016-06-20 2023-08-01 Bfly Operations, Inc. Universal ultrasound device and related apparatus and methods
US10856840B2 (en) 2016-06-20 2020-12-08 Butterfly Network, Inc. Universal ultrasound device and related apparatus and methods
CN109310396B (zh) 2016-06-20 2021-11-09 Butterfly Network, Inc. Automatic image acquisition for assisting a user to operate an ultrasound device
USD846749S1 (en) 2017-10-27 2019-04-23 Butterfly Network, Inc. Ultrasound probe
USD846128S1 (en) 2018-01-12 2019-04-16 Butterfly Network, Inc Ultrasound probe housing

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180088694A1 (en) * 2014-09-19 2018-03-29 Samsung Electronics Co., Ltd. Ultrasound diagnosis apparatus and method and computer-readable storage medium
US20170124700A1 (en) * 2015-10-30 2017-05-04 General Electric Company Method and system for measuring a volume from an ultrasound image
US20180015256A1 (en) * 2016-07-14 2018-01-18 C. R. Bard, Inc. Automated Catheter-To-Vessel Size Comparison Tool And Related Methods

Also Published As

Publication number Publication date
US11937983B2 (en) 2024-03-26
US11638572B2 (en) 2023-05-02
US20200129155A1 (en) 2020-04-30
US20230329676A1 (en) 2023-10-19

Similar Documents

Publication Publication Date Title
US11937983B2 (en) Methods and apparatus for performing measurements on an ultrasound image
US10893850B2 (en) Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
US20200214682A1 (en) Methods and apparatuses for tele-medicine
US20200046322A1 (en) Methods and apparatuses for determining and displaying locations on images of body portions based on ultrasound data
US20200214672A1 (en) Methods and apparatuses for collection of ultrasound data
US20200129156A1 (en) Methods and apparatus for collecting color doppler ultrasound data
US11559279B2 (en) Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
WO2020146244A1 (fr) Procédés et appareils pour la collecte de données ultrasonores
US20200037986A1 (en) Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
US11596382B2 (en) Methods and apparatuses for enabling a user to manually modify an input to a calculation performed based on an ultrasound image
US11727558B2 (en) Methods and apparatuses for collection and visualization of ultrasound data
KR20150114285A (ko) Ultrasound diagnosis apparatus and operating method thereof
JP2023549093A (ja) Robust segmentation with high-level image understanding
CN109069105B (zh) Ultrasound medical detection device, imaging control method, imaging system, and controller
US20210330296A1 (en) Methods and apparatuses for enhancing ultrasound data
US20210196237A1 (en) Methods and apparatuses for modifying the location of an ultrasound imaging plane
US10146908B2 (en) Method and system for enhanced visualization and navigation of three dimensional and four dimensional medical images
US10788964B1 (en) Method and system for presenting function data associated with a user input device at a main display in response to a presence signal provided via the user input device
US11631172B2 (en) Methods and apparatuses for guiding collection of ultrasound images
US20220401080A1 (en) Methods and apparatuses for guiding a user to collect ultrasound images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19876040

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19876040

Country of ref document: EP

Kind code of ref document: A1