US20090012394A1 - User interface for ultrasound system - Google Patents
- Publication number
- US20090012394A1 (application Ser. No. 12/112,946)
- Authority
- US
- United States
- Prior art keywords
- ultrasound
- ultrasound system
- virtual
- control member
- display elements
- Prior art date
- Legal status: Abandoned (status is an assumption, not a legal conclusion)
Classifications
- A61B8/13—Tomography
- A61B8/4405—Device being mounted on a trolley
- A61B8/4427—Device being portable or laptop-like
- A61B8/462—Displaying means characterised by constructional features of the display
- A61B8/463—Displaying multiple images or images and diagnostic data on one display
- A61B8/465—Displaying means adapted to display user selection data, e.g. icons or menus
- A61B8/466—Displaying means adapted to display 3D data
- A61B8/467—Devices with special arrangements for interfacing with the operator or the patient, characterised by special input means
- A61B8/469—Special input means for selection of a region of interest
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B8/56—Details of data transmission or power supply
- G01S7/52084—Constructional features related to particular user interfaces
Abstract
A user interface for an ultrasound system is provided. The ultrasound system includes the user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display. A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.
Description
- This application claims priority to and the benefit of the filing date of U.S. Provisional Application No. 60/914,893, filed Apr. 30, 2007 for “PORTABLE 3D/4D ULTRASOUND,” which is hereby incorporated by reference in its entirety.
- This invention relates generally to ultrasound systems and, more particularly, to a user interface for controlling ultrasound imaging systems, especially portable ultrasound medical imaging systems.
- Ultrasound systems typically include ultrasound scanning devices, such as ultrasound probes having transducers that allow for performing various ultrasound scans (e.g., imaging a volume or body). The ultrasound probes are typically connected to an ultrasound system for controlling the operation of the probes. The ultrasound system usually includes a control portion (e.g., a control console or portable unit) that provides interfaces for interacting with a user, such as receiving user inputs. For example, different buttons, knobs, etc. can be provided to allow a user to select different options and control the scanning of an object using the connected ultrasound probe.
- When using volume probes, for example, three-dimensional (3D) or four-dimensional (4D) probes, certain procedures may require multiple steps and adjustments that are controlled by different controllers, for example, several rotatable control members (commonly referred to as rotaries) used to adjust different settings. As a result, numerous control members of each of several different types can be included as part of the control portion. The control members are often mode dependent, such that each control member controls a different function or allows adjusting a different setting based on the mode of operation, for example, a visualization or rendering mode of operation.
- As the size of ultrasound systems continues to decrease, the space available for the various controls on the control portion is limited. Moreover, as processing power continues to increase, portable ultrasound systems, which have increasingly smaller footprints, often include an entire ultrasound system (e.g., processing components, etc.) embodied within a housing having the dimensions of a typical laptop computer or smaller. Thus, the same functionality is often now available in portable systems as in larger systems. However, with the reduced space available in a compact unit, the reduced number of available control members can make it difficult or complex to control certain procedures or adjust different parameters. In some instances, these portable ultrasound systems may not have enough controls to let a user operate all of the functions that would be available on a larger system but that are still desirable in a portable system.
- In accordance with one embodiment, an ultrasound system is provided that includes a user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display. A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.
- In accordance with another embodiment, an ultrasound system is provided that includes an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data, and a portable control unit having a user interface and a display. The ultrasound volume probe is connected to the portable control unit, and manipulation of one of the 3D ultrasound data and the 4D ultrasound data is provided by a single user control member of the user interface.
- In accordance with yet another embodiment, a method for controlling an ultrasound probe using a portable ultrasound system is provided. The method includes receiving a user input selecting one of a plurality of virtual display elements on a display on the portable ultrasound system. The method further includes configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.
- FIG. 1 is a block diagram of an ultrasound system formed in accordance with an exemplary embodiment of the inventive arrangements.
- FIG. 2 is a block diagram of the ultrasound processor module of FIG. 1 formed in accordance with an exemplary embodiment of the inventive arrangements.
- FIG. 3 is a top perspective view of a portable ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.
- FIG. 4 is a top plan view of a user interface of the portable ultrasound imaging system of FIG. 3.
- FIG. 5 is an elevation view of a back end of the portable ultrasound imaging system of FIG. 3.
- FIG. 6 is a side elevation view of the portable ultrasound imaging system of FIG. 3.
- FIG. 7 is a perspective view of a case for the portable ultrasound imaging system of FIG. 3.
- FIG. 8 is a perspective view of a movable cart capable of supporting the portable ultrasound imaging system of FIG. 3.
- FIG. 9 is a top view of a hand-carried or pocket-sized ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.
- FIG. 10 is a screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
- FIGS. 11-20 are further screenshots of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
- The foregoing summary, as well as the following detailed description of certain embodiments of the inventive arrangements, will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. Thus, for example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general-purpose signal processor, random access memory, hard disk, or the like). Similarly, the programs may be stand-alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
- As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the inventive arrangements are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property.
- It should be noted that although the various embodiments may be described in connection with an ultrasound system, the methods and systems described herein are not limited to ultrasound imaging. In particular, the various embodiments may be implemented in connection with different types of medical imaging, including, for example, magnetic resonance imaging (MRI) and computed tomography (CT) imaging. Further, the various embodiments may be implemented in other non-medical imaging systems, for example, non-destructive testing systems.
- Exemplary embodiments of ultrasound systems provide a user interface for an ultrasound system. A plurality of virtual display elements (e.g., display icons) are selectable by a user to change the function controlled by a particular user control member. The selection of the virtual display elements reconfigures one or more of the user control members for controlling certain parameters, settings, etc. based on the selected virtual display element.
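The reconfiguration scheme described above, in which a selected virtual display element determines which function a given physical control member adjusts, can be sketched as a small dispatch table. This is an illustrative sketch only; the class name, element identifiers, and settings below are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch: a selected virtual display element decides
# which function a single physical control member (e.g. a rotary
# or trackball) adjusts at any given moment.
class ReconfigurableControl:
    def __init__(self):
        self._handlers = {}   # virtual element id -> callable(delta)
        self._active = None   # currently selected virtual element

    def register(self, element_id, handler):
        """Bind a virtual display element to a parameter-adjusting handler."""
        self._handlers[element_id] = handler

    def select_element(self, element_id):
        """Called when the user selects a virtual display element."""
        self._active = element_id

    def turn(self, delta):
        """Called when the physical control member is actuated."""
        if self._active in self._handlers:
            self._handlers[self._active](delta)

# Hypothetical settings adjusted by the same physical knob.
state = {"zoom": 1.0, "rotation_deg": 0.0}

rotary = ReconfigurableControl()
rotary.register("zoom", lambda d: state.update(zoom=state["zoom"] + d))
rotary.register("rotate", lambda d: state.update(rotation_deg=state["rotation_deg"] + d))

rotary.select_element("zoom")
rotary.turn(0.5)              # the knob now adjusts zoom
rotary.select_element("rotate")
rotary.turn(90)               # the same knob now adjusts rotation
```

Selecting a different virtual element rebinds the same physical control to a different setting, which is the core idea behind offering full functionality with only a few physical controls on a compact unit.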
-
FIG. 1 illustrates a block diagram of an ultrasound system 20 formed in accordance with various embodiments of the inventive arrangements. The ultrasound system 20 includes a transmitter 22 that drives an array of elements 24 (e.g., piezoelectric crystals) within a transducer 26 to emit pulsed ultrasonic signals into a body or volume. A variety of geometries may be used, and the transducer 26 may be provided as part of, for example, different types of ultrasound probes. For example, the ultrasound probe may be a volume probe, such as a three-dimensional (3D) probe or a four-dimensional (4D) probe, wherein the array of elements 24 can be mechanically moved. The array of elements 24 may be swept or swung about an axis powered by a motor 25. In these embodiments, movement of the array of elements 24 is controlled by a motor controller 27 and motor driver 29. However, it should be noted that the ultrasound system 20 may have connected thereto an ultrasound probe that is not capable of mechanical movement of the array of elements 24. In such embodiments, the motor controller 27 and motor driver 29 may or may not be provided and/or may be deactivated. Accordingly, the motor controller 27 and motor driver 29 are optionally provided. - The emitted pulsed ultrasonic signals are back-scattered from structures in a body, for example, blood cells or muscular tissue, to produce echoes that return to any of the
elements 24. The echoes are received by a receiver 28. The received echoes are provided to a beamformer 30 that performs beamforming and outputs an RF signal. The RF signal is then provided to an RF processor 32 that processes the RF signal. Alternatively, the RF processor 32 may include a complex demodulator (not shown) that demodulates the RF signal to form IQ data pairs representative of the echo signals. The RF or IQ signal data may then be provided directly to a memory 34 for storage (e.g., temporary storage). - The
ultrasound system 20 also includes a processor module 36 to process the acquired ultrasound information (e.g., RF signal data or IQ data pairs) and prepare frames of ultrasound information for display on a display 38. The processor module 36 is adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the acquired ultrasound information. Acquired ultrasound information may be processed in real-time during a scanning session as the echo signals are received. Additionally or alternatively, the ultrasound information may be stored temporarily in the memory 34 during a scanning session and processed in less than real-time in a live or off-line operation. An image memory 40 is included for storing processed frames of acquired ultrasound information that are not scheduled to be displayed immediately. The image memory 40 may comprise any known data storage medium, for example, a permanent storage medium, removable storage medium, etc. - The
processor module 36 is connected to a user interface 42 that controls operation of the processor module 36, as explained below in more detail, and is configured to receive inputs from an operator. The display 38 includes one or more monitors that present patient information, including diagnostic ultrasound images, to the user for review, diagnosis, and/or analysis. The display 38 may automatically display, for example, one or more planes from a 3D ultrasound data set stored in the memory 34 or the image memory 40, based on inputs received at the user interface 42. - The
display 38 also may display one or more virtual display elements 49 that are selectable by a user, as described in more detail below. Based on the selection of a virtual display element 49, one or more corresponding controls of the user interface 42, for example, the operations controlled by a trackball and/or the like (not shown), may be reconfigured. - In operation, the
ultrasound system 20 acquires data, for example, volumetric data sets, by various techniques (e.g., 3D scanning, real-time 3D imaging, volume scanning, 2D scanning with transducers having positioning sensors, freehand scanning using a voxel correlation technique, scanning using 2D or matrix array transducers, etc.). The data may be acquired by mechanically moving the array of elements 24 of the transducer 26, for example, by performing a sweeping type of scan. The transducer 26 also may be moved manually, such as along a linear or arcuate path, while scanning a region of interest (ROI). At each linear or arcuate position, the transducer 26 obtains scan planes that are stored in the memory 34.
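As noted above, the RF processor 32 may include a complex demodulator that forms IQ data pairs from the RF signal. A minimal sketch of such a demodulation step follows, assuming a real-valued RF line that is mixed down by the transducer centre frequency and low-pass filtered with a crude moving average; the function name and parameters are illustrative assumptions, not the patent's implementation.

```python
import cmath
import math

def rf_to_iq(rf, fs, f0, taps=31):
    """Demodulate one real-valued RF line into complex IQ samples.

    Mix the RF samples down by the centre frequency f0 (sampling rate
    fs), then low-pass filter with a simple moving average (a stand-in
    for a properly designed FIR filter) to keep the baseband component.
    """
    mixed = [s * cmath.exp(-2j * math.pi * f0 * n / fs)
             for n, s in enumerate(rf)]
    half = taps // 2
    iq = []
    for n in range(len(mixed)):
        window = mixed[max(0, n - half):n + half + 1]
        iq.append(sum(window) / len(window))   # crude low-pass
    return iq

# Example: a pure 5 MHz echo sampled at 40 MHz demodulates to a
# roughly constant baseband magnitude near 0.5.
fs, f0 = 40e6, 5e6
rf = [math.cos(2 * math.pi * f0 * n / fs) for n in range(1024)]
iq = rf_to_iq(rf, fs, f0)
```

In a real system the low-pass stage would typically be a designed FIR filter with decimation, but the structure (mix to baseband, filter, keep the complex samples as IQ pairs) is the same.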
FIG. 2 illustrates an exemplary block diagram of the processor module 36 of FIG. 1. The processor module 36 is illustrated conceptually as a collection of sub-modules, but it may also be implemented utilizing any combination of dedicated hardware boards, digital signal processors (DSPs), processors, etc. Alternatively, the sub-modules of FIG. 2 may be implemented utilizing an off-the-shelf PC with a single processor or multiple processors, with functional operations distributed between the processors. As a further option, the sub-modules of FIG. 2 may be implemented utilizing a hybrid configuration, in which certain modular functions are performed utilizing dedicated hardware, while the remaining modular functions are performed utilizing an off-the-shelf PC and/or the like. The sub-modules also may be implemented as software modules within a processing unit. - The operations of the sub-modules illustrated in
FIG. 2 may be controlled by a local ultrasound controller 50 or by the processor module 36. The sub-modules 52-68 perform mid-processor operations. The ultrasound processor module 36 may receive ultrasound data 70 in one of several forms. In the embodiment of FIG. 2, for example, the received ultrasound data 70 constitutes IQ data pairs representing the real and imaginary components associated with each data sample. The IQ data pairs are provided, for example, to one or more of a color-flow sub-module 52, a power Doppler sub-module 54, a B-mode sub-module 56, a spectral Doppler sub-module 58, and an M-mode sub-module 60. Other sub-modules may also be included, such as an Acoustic Radiation Force Impulse (ARFI) sub-module 62, a strain sub-module 64, a strain rate sub-module 66, and a Tissue Doppler (TDE) sub-module 68, among others. - Each of the sub-modules 52-68 is configured to process the IQ data pairs in a corresponding manner to generate color-flow data 72, power Doppler data 74, B-mode data 76, spectral Doppler data 78, M-mode data 80, ARFI data 82, echocardiographic strain data 84, echocardiographic strain rate data 86, and tissue Doppler data 88, all of which may be stored in a memory 90 (or the memory 34 or image memory 40 shown in FIG. 1) temporarily before subsequent processing. The data 72-88 may be stored, for example, as sets of vector data values, where each set defines an individual ultrasound image frame. The vector data values are generally organized based on the polar coordinate system. - A
scan converter sub-module 92 accesses and obtains from the memory 90 the vector data values associated with an image frame and converts the set of vector data values to Cartesian coordinates to generate an ultrasound image frame 93 formatted for display. The ultrasound image frames 93 generated by the scan converter sub-module 92 may be provided back to the memory 90 for subsequent processing or may be provided to the memory 34 or image memory 40. - Once the
scan converter sub-module 92 generates the ultrasound image frames 93 associated with the data, the image frames may be re-stored in the memory 90 or communicated over a bus 96 to a database (not shown), the memory 34, the image memory 40, and/or to other processors (not shown). - A 2D
video processor sub-module 94 may be used to combine one or more of the frames generated from the different types of ultrasound information. For example, the 2D video processor sub-module 94 may combine different image frames by mapping one type of data to a gray map and mapping the other type of data to a color map for video display. In the final displayed image, the color pixel data is superimposed on the gray-scale pixel data to form a single multi-mode image frame 98 that is again re-stored in the memory 90 or communicated over the bus 96. Successive frames of images may be stored as a cine loop in the memory 90 or memory 40 (shown in FIG. 1). The cine loop represents a first-in, first-out circular image buffer to capture image data that is displayed in real-time to the user, such as one or more heart cycles. The user may freeze the cine loop by entering a freeze command at the user interface 42. The user interface 42 may include, for example, a keyboard, mouse, trackball, and/or all other input controls associated with inputting information into the ultrasound system 20 (shown in FIG. 1), which input controls may be reconfigured automatically based on selection of a virtual display element 49 (shown in FIG. 1) by the user. - A
3D processor sub-module 100 is also controlled by the user interface 42 and accesses the memory 90 to obtain spatially consecutive groups of ultrasound image frames (which may be acquired, for example, by a sweeping ultrasound scan) and to generate three-dimensional image representations thereof, such as through volume rendering or surface rendering algorithms, as are known. The three-dimensional images may be generated utilizing various imaging techniques, such as ray-casting, maximum intensity pixel projection, and/or the like. Additionally, the three-dimensional images may be displayed over time, thereby providing four-dimensional operation, as is known. - Various embodiments of the inventive arrangements can also be implemented in a miniaturized ultrasound imaging system, for example, a portable
ultrasound imaging system 110 as shown in FIG. 3. The portable ultrasound imaging system 110 may be, for example, a Voluson i compact 4D ultrasound system available from G.E. Healthcare in Waukesha, Wis. The portable ultrasound imaging system 110 controls a probe (not shown) connected to the portable ultrasound imaging system 110 via a probe connector 112 that may be locked to the portable ultrasound imaging system 110 using a probe locking handle 114. The user interface 42 includes a plurality of user inputs and/or controls, which may be of different types, and which are configured to receive commands from a user or operator. For example, the user interface 42 may include a plurality of “soft” buttons 116, for example, toggle buttons, and a keyboard 118, for example, an alphanumeric keyboard. Additionally, a functional keyboard portion 120 may be provided that includes other user-selectable buttons and controls. Other user controls also may be provided, such as a trackball 122 having a trackball ring 124 and a plurality of associated buttons 126, which may be activated by the fingers of a user when operating the trackball 122. A plurality of sliding control members 128 (e.g., time gain control potentiometers) may also be provided, for example, adjacent the keyboard 118. - The portable
ultrasound imaging system 110 also includes a display 130, for example, an integrated LCD display with a display latch 132 provided to lock the display 130 to the user interface 42. A power button 134 is provided to power the portable ultrasound imaging system 110 on and off. The portable ultrasound imaging system 110 with the user interface 42 and the display defines a portable control unit. - It should be noted that as used herein, “miniaturized” generally means that the
ultrasound system 110 is a handheld or hand-carried device and/or is configured to be carried in a person's hand, pocket, briefcase-sized case, backpack, and/or the like. For example, the ultrasound system 110 may be a hand-carried device having the size of a typical laptop computer, for instance, having dimensions of approximately 2.5 inches in depth, approximately 14 inches in width, and approximately 12 inches in height. The ultrasound system 110 may weigh about ten pounds or less, and is thus easily portable by the operator. The display 130 is configured to display, for example, a medical image and virtual display elements, as described below. - It further should be noted that ultrasonic data from the portable
ultrasound imaging system 110 may be sent to an external device (not shown), such as a printer or display, via a wired or wireless network (or a direct connection, for example, via a serial or parallel cable or USB port). In some embodiments, the external device may be a computer or a workstation having a display. Alternatively, the external device may be a separate external display or a printer capable of receiving image data from the portable ultrasound imaging system 110 and of displaying or printing images that may have greater resolution than the display 130. - With particular reference to the
user interface 42, and as shown in more detail in FIG. 4, a plurality of user controls may be provided as part of the user interface 42. For example, the “soft” buttons 116 may include a first menu button 140, a second menu button 142, a third menu button 144, and a fourth menu button 146, each capable of movement in four directions. A plurality of imaging buttons 148 may also be provided to select different imaging functions or operations. A plurality of mode selection buttons 136 also may be provided to select different scanning modes, for example, 2D, 4D, pulsed wave Doppler (PW), color flow mode (CFM), etc. The functional keyboard portion 120 also includes other user-selectable buttons and controls, such as buttons that allow for obtaining saved information, storing information, manipulating information or displayed images, calculating measurements relating to displayed images, changing a display format, etc. - The portable
ultrasound imaging system 110 also includes internal and external connections on a back end 160, as shown in FIG. 5, and on a side portion 170, as shown in FIG. 6. For example, the back end 160 may include a VGA connector 162 (for connection, for example, to an external monitor), an RGB connector 164 (for connection, for example, to a printer), and a power supply input 166. A network connector 168, for example, an Ethernet LAN input/output, also may be provided, and one or more USB connectors 169 may be provided. On the side portion 170, for example, a probe connection 172 for connection to a probe may be provided, along with the probe locking handle 114. It should be noted that different or additional connectors may be provided as desired or known, for example, based on the scanning applications for the portable ultrasound imaging system 110. - The portable
ultrasound imaging system 110 also may be transported, stored, or operated in a case 180, as shown in FIG. 7. The case 180 may be, for example, a padded case to protect the portable ultrasound imaging system 110. - The portable
ultrasound imaging system 110 also may be configured to be mounted on or supported by a movable base 190, for example, a movable cart as shown in FIG. 8. The movable base 190 includes a support portion 192 for receiving and supporting the portable ultrasound imaging system 110 and a tray portion 194 that may be used, for example, to store peripherals. The movable base 190 also may include one or more probe holders 196 for supporting and holding therein one or more ultrasound probes, for example, one probe connected to the portable ultrasound imaging system 110 and other probes configured to be connected to the portable ultrasound imaging system 110. A foot rest 198 also may be provided. Accordingly, the portable ultrasound imaging system 110 may be configured to appear like a console-based ultrasound imaging system. - However, it should be noted that the various embodiments may be implemented in connection with ultrasound systems having different sizes and shapes. For example, a hand-carried or pocket-sized
ultrasound imaging system 200 may be provided as shown in FIG. 9. In such a system 200, the display 130 and user interface 42 can form a single unit. By way of example, the pocket-sized ultrasound imaging system 200 may be a pocket-sized or hand-sized ultrasound system approximately 2 inches wide, approximately 4 inches in length, and approximately ½ inch in depth, and/or may weigh less than 3 ounces. The display 130 may be, for example, a 320×320 pixel color LCD display (on which a medical image 210 can be displayed). A typewriter-like keyboard 202 of buttons 203 may optionally be included in the user interface 42. It should be noted that the various embodiments may be implemented in connection with a pocket-sized ultrasound system 200 having different dimensions, weights, and/or power consumption. -
Multi-function controls 204 may each be assigned functions in accordance with the mode of system operation. Therefore, each of the multi-function controls 204 may be configured to provide a plurality of different actions. Label display areas 206 associated with the multi-function controls 204 may be included as necessary on the display 130. The system 200 may also have additional keys and/or controls 208 for special-purpose functions, which may include, but are not limited to, “freeze,” “depth control,” “gain control,” “color-mode,” “print,” and “store.” - Various embodiments of the inventive arrangements provide virtual display elements (e.g., display icons) that are selectable by a user to change the function controlled by a particular user control. The selection of a virtual display element reconfigures one or more of the user controls for controlling certain parameters, settings, etc. based on the selected virtual display element. In general, and as shown in
FIG. 10, a user is presented with a plurality of virtual display elements 220a-220e that may be displayed, for example, on a screen 222, such as the display 38 (shown in FIG. 1). The virtual display elements 220a-220e are displayed on the screen 222 adjacent (e.g., surrounding) or proximate an image that is selected by a user. For example, when a user places a virtual pointer 224 (e.g., virtual cross-hairs) over a particular image 225, the virtual display elements 220a-220e are displayed on the screen 222. It should be noted that the virtual display elements 220a-220e disappear once the image 225 is no longer selected, for example, when the virtual pointer 224 is moved away from the image 225, when another image 226 is selected or the virtual pointer 224 is moved over that image, or when another user control member is activated (e.g., depressed). However, the virtual display elements 220a-220e may continue to be displayed in connection with the image 225 for a predetermined period of time (e.g., 2 seconds) even after the image 225 is no longer selected. - With the virtual display elements 220a-220e displayed on the
screen 222, a user may select one of the virtual display elements 220a-220e. Upon selecting one of the virtual display elements 220a-220e, the corresponding function represented by that virtual display element 220a-220e is then adjusted or controlled by one of the controls of the user interface 42 (shown in FIG. 4), for example, the trackball 122. Accordingly, when a virtual display element 220a-220e is selected, the operation of the trackball 122 is reconfigured and the control thereof remapped, for example, as shown in Table 1 below. -
TABLE 1

Screen Ctrl Icon | Meaning
---|---
Rot X | Rotate around X-axis when in ref image A (respective axis in B, C, 3D).
Rot Y | Rotate around Y-axis when in ref image A (respective axis in B, C, 3D).
Rot Z | Rotate around Z-axis when in ref image A (respective axis in B, C, 3D).
Parallel Shift | Shift in Z-direction.
Curved Render Start | Move curved render start.
Move | Move the data around.
Borders of Renderbox | Can be selected to resize the Renderbox.
Home3D | Switch 3D rendered image back to initial 3D position.
Home3Dflat | Toggle between 3D view and flat 3D view of the rendered image.

- Accordingly, and for example, if a particular
virtual display element 220a is selected, which may be done by using the trackball 122 to move the virtual pointer 224 to the image 225 and pressing one of the buttons 126 (shown in FIG. 4), then the trackball 122 is reconfigured to control or adjust the parameter, function, etc. corresponding to that virtual display element 220a, which, in the embodiment shown in Table 1, is to control rotation around the X-axis of the image 225. Thus, the operation of the trackball 122 is reconfigured to control or adjust the X-axis rotation. A user may then click one of the buttons 126 to return the trackball 122 to controlling movement of the virtual pointer 224, allowing selection of one of the other virtual display elements 220b-220e. Alternatively, another one of the buttons of the user interface 42 may deselect the operation corresponding to a virtual display element 220a and allow the selection of one of the other virtual display elements 220b-220e. - It should be noted that the virtual display elements 220a-220e may be configured as different icons and correspond to different functions or operations than those illustrated in Table 1. It also should be noted that the selection of one of the virtual display elements 220a-220e may, instead of reconfiguring the
trackball 122, reconfigure another user control of the user interface 42 or an external user control (e.g., a connected mouse). - Moreover, other information or selectable elements may be displayed on the
screen 222. For example, a plurality of selectable elements 230 may be provided to allow for the selection of a particular visualization mode. - Referring now to
FIGS. 11-20, which illustrate exemplary screenshots 232 including the virtual display elements 220a-220e, a render visualization mode (which may be selected using the selectable elements 230) for a 4D real-time acquisition is shown. Specifically, as shown in FIG. 11, the virtual pointer 224, illustrated as a mouse pointer, is moved, for example, using the trackball 122 (shown in FIG. 4), over a side 240 of a render box 242 (identifying the region of the image 244 to be rendered). The side 240 may be highlighted (e.g., highlighted by a color) when the virtual pointer 224 is placed over the side 240. When the side 240 is selected, the virtual display elements 220a-220d and 220f (e.g., icons) are displayed. It should be noted that the virtual display element 220e is not displayed in this screenshot, but it may be displayed in some embodiments. - As shown in
FIG. 12, the virtual pointer 224 has now been placed over virtual display element 220f (the icon shaped as a dot), which corresponds to a curved render start function. When the virtual display element 220f is selected, or when the virtual pointer 224 is moved over the virtual display element 220f, the virtual display element 220f may be highlighted (e.g., highlighted or shadowed in yellow). Once the virtual display element 220f is selected, the trackball 122 is reconfigured to adjust the curved render start function as shown in FIG. 13. Once the virtual display element 220f is selected, the virtual display element 220f may be highlighted differently (e.g., highlighted in a different color, such as red) and a curved render start portion 244 of the render box 240 is displayed. It should be noted that once the virtual display element 220f is selected, the other virtual display elements 220a-220d disappear, and when the trackball 122 is moved, the curved render start portion 244 is changed, for example, curved as adjusted by the trackball 122 instead of straight as shown in FIG. 12. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the other virtual display elements 220a-220d appear again. - It also should be noted that a
virtual representation 246 of the trackball 122 may be displayed on the display 130 to indicate the functions corresponding to the trackball 122 and the buttons 126 in the current active display mode. - As shown in
FIG. 14, another side 248 (or border) of the render box 240 may be selected, which reconfigures the functionality of the trackball 122 to allow adjustment of the size of the render box 240. The side 248 may be highlighted (e.g., highlighted in red) and all of the virtual display elements 220a-220d and 220f disappear. When the trackball 122 is now moved, the size of the render box 240 is changed. For example, the render box 240 is smaller in FIG. 14 than in FIGS. 11-13. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220a-220d and 220f appear again. - In the
screenshot 232 of FIG. 15, the virtual display element 220a has been selected and the other virtual display elements 220b-220d and 220f have disappeared. The selected virtual display element 220a corresponds to a rotation around the y-axis, which now may be adjusted by the trackball 122, which is reconfigured to control this operation. The selected virtual display element 220a may be highlighted (e.g., highlighted in red), and when the trackball 122 is moved, the volume data displayed is rotated around or about the y-axis, with the content of the displayed images changing accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220b-220d and 220f appear again. - In
FIG. 16, the virtual pointer 224 has been moved over the image 245. When the virtual pointer 224 is moved over the image 245, a different set of virtual display elements 220a-220d and now 220g appears. In particular, the virtual display element 220g now appears and is configured as a “house” icon. It should be noted that a render box 241 may now appear on the image 245. The virtual display element 220g, when selected, changes the 3D display. In particular, as shown in FIG. 17, the rendered image, specifically the image 245, is now displayed at an angle, the render box 241 is displayed as a three-dimensional box, and the virtual display element 220g changes shape, for example, the “house” icon is rotated. If the virtual display element 220g is again selected, the image 245 will again appear as shown in FIG. 16 and the shape of the virtual display element 220g will return to the “house” icon as shown in FIG. 16. -
FIG. 18 illustrates a quad-view mode in which the visualization mode shows sectional planes. The virtual pointer 224 is now shown as moved over a center dot 252 in the image 250, and the dot is marked, for example, with a marker 254, such as cross-hairs that may be highlighted, for example, highlighted in yellow. The user may then select the marker 254, which may, for example, change color to red, and a move center dot function is then assigned to the trackball 122. All of the other virtual display elements 220a-220d also disappear. When the trackball 122 is moved, the center dot 252 is moved and the content of the image 250, as well as the other displayed images, changes accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220a-220d appear again. - As shown in
FIG. 20, the virtual pointer 224 has been moved over the image 260 and not over any of the virtual display elements 220a-220d. The virtual pointer 224 now has a different shape, for example, a hand instead of an arrow or pointer. In this mode, if one of the buttons 126 is selected (e.g., pressed by a user), a move image functionality is selected and assigned to the trackball 122. When the trackball 122 is moved, the image 260 is moved on the display. - Thus, a single user control member can be used to manipulate, for example, 3D or 4D ultrasound data. For example, by assigning different operations to the single user control member based on selecting from a plurality of virtual display elements, the single user control member is reconfigured to control different operations or adjust different settings, parameters, etc.
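The selection behavior described above, in which a single trackball is remapped when a virtual display element is selected and restored when a button is pressed, can be sketched as follows. This is a hypothetical illustration only, not the patented implementation; the class, method names, and default "move pointer" function are assumptions, while the element names follow Table 1:

```python
# Hypothetical sketch of a single user control member (a trackball) whose
# function is remapped by selecting a virtual display element.
# Element names come from Table 1; everything else is illustrative.

TABLE_1 = {
    "Rot X": "rotate around X-axis",
    "Rot Y": "rotate around Y-axis",
    "Rot Z": "rotate around Z-axis",
    "Parallel Shift": "shift in Z-direction",
    "Curved Render Start": "move curved render start",
    "Move": "move the data around",
}

class TrackballController:
    """Tracks which function the single trackball currently controls."""

    def __init__(self):
        # By default the trackball steers the virtual pointer.
        self.active_function = "move pointer"
        self.visible_elements = []

    def point_at_image(self):
        # Placing the pointer over a selectable image reveals the icons.
        self.visible_elements = list(TABLE_1)

    def select_element(self, name):
        # Selecting an icon remaps the trackball; the other icons disappear.
        self.active_function = TABLE_1[name]
        self.visible_elements = [name]

    def press_button(self):
        # A button press ends the adjustment and restores pointer control,
        # and the full set of icons is shown again.
        self.active_function = "move pointer"
        self.visible_elements = list(TABLE_1)

ctrl = TrackballController()
ctrl.point_at_image()
ctrl.select_element("Rot X")
print(ctrl.active_function)  # the trackball now rotates the data about the X-axis
```

A subsequent `press_button()` call would return the trackball to pointer control, mirroring the button-press behavior described for FIGS. 11-15.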
- It should be noted that some (or all) of the virtual display elements 220a-220g may be displayed in different imaging modes, for example, a tomographic ultrasound imaging (TUI) mode or a SonoVCAD mode. However, different virtual display elements corresponding to different operations or functions may be displayed in addition to or instead of some or all of the virtual display elements 220a-220g. Also, it should be noted that in some modes, only a specific image or images can be adjusted, and accordingly, the virtual display elements only appear when the
virtual pointer 224 is moved over those images. It also should be noted that only a single image may be displayed instead of the multiple images as illustrated. - Accordingly, the various embodiments automatically reconfigure the operation of a user control member (e.g., a trackball) based on a selected virtual display element such that the control operations performed by the user control member are remapped. The user control member is thereby used to adjust or control different functions based on the virtual display element selected. In one embodiment, based on the selected virtual display element corresponding to a particular function, setting, parameter, etc., the movement of the user control member is remapped to, for example, allow that particular function, setting, parameter, etc. to be adjusted or changed based on the movement of the user control member. For example, a table or database is accessed and the corresponding motion of the user control member is mapped for the particular function, setting, parameter, etc. Thereafter, the relative movement of the user control member adjusts the particular function, setting, parameter, etc. corresponding to the selected virtual display element.
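The table-lookup remapping just described, where relative control-member movement is translated into an adjustment of the currently mapped parameter, might be sketched as follows. The function names, parameter names, and scale factors are assumptions made for the example, not values from the patent:

```python
# Illustrative sketch of remapping relative control-member movement to a
# parameter adjustment via a lookup table. The entries below are assumed
# for the example (hypothetical functions, parameters, and scales).

REMAP = {
    # function          (parameter, units per count of control movement)
    "rotate_x":         ("angle_x", 0.5),
    "rotate_y":         ("angle_y", 0.5),
    "parallel_shift":   ("depth_z", 0.1),
}

def apply_movement(state, selected_function, delta_counts):
    """Adjust the parameter mapped to the selected function by the
    relative movement (in counts) of the user control member."""
    parameter, scale = REMAP[selected_function]
    state[parameter] = state.get(parameter, 0.0) + delta_counts * scale
    return state

state = apply_movement({}, "rotate_x", 10)  # 10 counts of trackball motion
print(state)                                # {'angle_x': 5.0}
```

Selecting a different virtual display element would simply change `selected_function`, so the same physical movement adjusts a different parameter, which is the essence of the remapping described above.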
- At least one technical effect of the various embodiments of the inventive arrangements is automatically changing the control function or operation of a user control member based on the selection of a virtual display element. The user control member is reconfigured or reassigned to control or adjust a different operation or function based on the selected virtual display element.
- Some embodiments of the inventive arrangements provide a machine-readable medium or media having instructions recorded thereon for a processor or computer to operate an imaging apparatus to perform one or more embodiments of the methods described herein. The medium or media may be any type of CD-ROM, DVD, floppy disk, hard disk, optical disk, flash RAM drive, and/or other type of computer-readable medium, and/or a combination thereof.
- The various embodiments and/or components, for example, the processors, or components and controllers therein, may also be implemented as part of one or more computers or processors. Such a computer or processor may include a computing device, an input device, a display unit, and/or an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and/or Read Only Memory (ROM). The computer or processor may further include a storage device, which may be a hard disk drive or a removable storage drive, such as a floppy disk drive, optical disk drive, and/or the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
- As used herein, the term “computer” may include any processor-based or microprocessor-based system, including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only and thus not intended to limit in any way the definition and/or meaning of the term “computer.”
- The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired and/or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
- The set of instructions may include various commands that instruct the computer or processor as a processing machine to perform specific operations, such as the methods and processes of the various embodiments of the inventive arrangements. The set of instructions may be in the form of a software program. The software may be in various forms, such as system software or application software. In addition, the software may be in the form of a collection of separate programs, a program module within a larger program, or a portion of a program module. The software may also include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to user commands, or in response to results of previous processing, or in response to a request made by another processing machine.
- As used herein, the terms “software” and “firmware” are interchangeable and include any computer program stored in memory for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and/or non-volatile RAM (NVRAM) memory. The above memory types are exemplary only and are thus not limiting as to the types of memory usable for storage of a computer program.
- It is to be understood that the above description is intended to be illustrative and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the inventive arrangements without departing from their scope. For example, the steps recited in a method need not be performed in a particular order unless explicitly stated or implicitly required (e.g., one step requires the results or a product of a previous step to be available). While some of the dimensions and types of materials described herein are intended to define the parameters of the inventive arrangements, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing and understanding the above description. The scope of the inventive arrangements should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and they are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
- This written description uses examples to disclose the inventive arrangements, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices and/or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Claims (20)
1. An ultrasound system, comprising:
a user interface having at least one user control member; and
a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display and wherein a function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.
2. The ultrasound system of claim 1 , wherein the at least one user control member is configured to be operated to select one of the plurality of virtual display elements using the virtual pointer.
3. The ultrasound system of claim 1 , wherein the at least one user control member comprises a trackball.
4. The ultrasound system of claim 1 , wherein the display is configured to display only the selected one of the virtual display elements.
5. The ultrasound system of claim 4 , wherein the display is configured to display the other display elements when one of (i) an adjustment using the at least one user control member is completed and (ii) a button corresponding to the user control member is activated.
6. The ultrasound system of claim 1 , wherein the plurality of virtual display elements comprise icons representative of the corresponding controlled function.
7. The ultrasound system of claim 1 , wherein the plurality of virtual display elements are displayed only when the virtual pointer is positioned over an image that can be changed.
8. The ultrasound system of claim 1 , wherein the plurality of virtual display elements are changed based on one of a mode of operation and a mode of visualization.
9. The ultrasound system of claim 1 , wherein the display automatically displays the plurality of virtual display elements when the virtual pointer is positioned over the image.
10. The ultrasound system of claim 1 , wherein the virtual display elements are displayed adjacent the image.
11. The ultrasound system of claim 1 , wherein the selected one of the plurality of virtual display elements is highlighted.
12. The ultrasound system of claim 1 , wherein the function controlled by the at least one user control member comprises an adjustment.
13. The ultrasound system of claim 1 , wherein an icon representing the selected one of the plurality of virtual display elements changes based on an input from the at least one user control member.
14. The ultrasound system of claim 1 , wherein the display is configured to display a virtual representation of the at least one user control member along with at least one indicated function corresponding to at least one user control member and one or more buttons associated with the at least one user control member.
15. The ultrasound system of claim 1 , further comprising:
a portable ultrasound unit including the user interface and the display.
16. An ultrasound system, comprising:
an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data; and
a portable control unit having a user interface and a display, the ultrasound volume probe connected to the portable control unit, and wherein manipulation of one of the 3D ultrasound data and 4D ultrasound data is provided by a single user control member of the user interface.
17. The ultrasound system of claim 16 , wherein the single user control member comprises a trackball.
18. The ultrasound system of claim 16 , wherein the display is configured to display a plurality of selectable virtual display elements and a type of manipulation provided by the single user control member is determined based on a selected one of the plurality of selectable virtual display elements.
19. The ultrasound system of claim 16 , wherein the user interface does not include rotary controls and the manipulation is performed without the use of the rotary controls.
20. A method for controlling an ultrasound probe using a portable ultrasound system, comprising:
receiving a user input selecting one of a plurality of virtual display elements on a display on the portable ultrasound system; and
configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/112,946 US20090012394A1 (en) | 2007-04-30 | 2008-04-30 | User interface for ultrasound system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US91489307P | 2007-04-30 | 2007-04-30 | |
US12/112,946 US20090012394A1 (en) | 2007-04-30 | 2008-04-30 | User interface for ultrasound system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090012394A1 (en) | 2009-01-08
Family
ID=40222014
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/112,946 Abandoned US20090012394A1 (en) | 2007-04-30 | 2008-04-30 | User interface for ultrasound system |
US12/112,911 Expired - Fee Related US8038619B2 (en) | 2007-04-30 | 2008-04-30 | Motor driver for ultrasound system |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/112,911 Expired - Fee Related US8038619B2 (en) | 2007-04-30 | 2008-04-30 | Motor driver for ultrasound system |
Country Status (1)
Country | Link |
---|---|
US (2) | US20090012394A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8721553B2 (en) * | 2007-05-15 | 2014-05-13 | General Electric Company | Fluid-fillable ultrasound imaging catheter tips |
JP2009011711A (en) * | 2007-07-09 | 2009-01-22 | Toshiba Corp | Ultrasonic diagnosis apparatus |
CN102368955B (en) * | 2009-04-01 | 2014-07-16 | Analogic Corporation | Ultrasound probe |
US8647279B2 (en) * | 2010-06-10 | 2014-02-11 | Siemens Medical Solutions Usa, Inc. | Volume mechanical transducer for medical diagnostic ultrasound |
US8684933B2 (en) * | 2010-08-17 | 2014-04-01 | Imsonic Medical, Inc. | Handheld ultrasound color flow imaging system with mechanically scanned, mechanically focused multi-element transducers |
JP6069848B2 (en) * | 2012-02-24 | 2017-02-01 | セイコーエプソン株式会社 | Probe head, ultrasonic probe, electronic device and diagnostic device |
GB201204831D0 (en) | 2012-03-20 | 2012-05-02 | Netscientific Ltd | Programmable medical devices |
US10517569B2 (en) | 2012-05-09 | 2019-12-31 | The Regents Of The University Of Michigan | Linear magnetic drive transducer for ultrasound imaging |
US11406415B2 (en) | 2012-06-11 | 2022-08-09 | Tenex Health, Inc. | Systems and methods for tissue treatment |
US9414810B2 (en) * | 2013-01-24 | 2016-08-16 | B-K Medical Aps | Ultrasound imaging system |
JP2015136569A (en) * | 2014-01-24 | 2015-07-30 | Hitachi Metals, Ltd. | Ultrasonic probe |
US9962181B2 (en) | 2014-09-02 | 2018-05-08 | Tenex Health, Inc. | Subcutaneous wound debridement |
CN213156021U (en) | 2019-09-20 | 2021-05-11 | Bard Access Systems, Inc. | Ultrasound system for accessing the vascular system of a patient |
EP4203799A1 (en) * | 2020-09-10 | 2023-07-05 | Bard Access Systems, Inc. | Ultrasound probe with pressure measurement capability |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6039047A (en) * | 1998-10-30 | 2000-03-21 | Acuson Corporation | Method and system for changing the appearance of a control region of a medical device such as a diagnostic medical ultrasound system |
US20050168488A1 (en) * | 2004-02-03 | 2005-08-04 | Montague Roland W. | Combination tool that zooms in, zooms out, pans, rotates, draws, or manipulates during a drag |
US6961905B1 (en) * | 2000-06-23 | 2005-11-01 | Microsoft Corporation | Method and system for modifying an image on a web page |
US7010761B2 (en) * | 2001-10-18 | 2006-03-07 | Sony Computer Entertainment America Inc. | Controller selectable hyperlinks |
US7242387B2 (en) * | 2002-10-18 | 2007-07-10 | Autodesk, Inc. | Pen-mouse system |
US20090054768A1 (en) * | 2007-08-24 | 2009-02-26 | Menachem Halmann | Method and apparatus for voice recording with ultrasound imaging |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5159931A (en) * | 1988-11-25 | 1992-11-03 | Riccardo Pini | Apparatus for obtaining a three-dimensional reconstruction of anatomic structures through the acquisition of echographic images |
US7534211B2 (en) * | 2002-03-29 | 2009-05-19 | Sonosite, Inc. | Modular apparatus for diagnostic ultrasound |
US7282878B1 (en) * | 2006-04-28 | 2007-10-16 | Rakov Mikhail A | Systems for brushless DC electrical drive control |
- 2008-04-30 US US12/112,946 patent/US20090012394A1/en not_active Abandoned
- 2008-04-30 US US12/112,911 patent/US8038619B2/en not_active Expired - Fee Related
Cited By (120)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9272162B2 (en) | 1997-10-14 | 2016-03-01 | Guided Therapy Systems, Llc | Imaging, therapy, and temperature monitoring ultrasonic method |
US9907535B2 (en) | 2000-12-28 | 2018-03-06 | Ardent Sound, Inc. | Visual imaging system for ultrasonic probe |
US10039938B2 (en) | 2004-09-16 | 2018-08-07 | Guided Therapy Systems, Llc | System and method for variable depth ultrasound treatment |
US9114247B2 (en) | 2004-09-16 | 2015-08-25 | Guided Therapy Systems, Llc | Method and system for ultrasound treatment with a multi-directional transducer |
US9011336B2 (en) | 2004-09-16 | 2015-04-21 | Guided Therapy Systems, Llc | Method and system for combined energy therapy profile |
US9095697B2 (en) | 2004-09-24 | 2015-08-04 | Guided Therapy Systems, Llc | Methods for preheating tissue for cosmetic treatment of the face and body |
US11590370B2 (en) | 2004-09-24 | 2023-02-28 | Guided Therapy Systems, Llc | Rejuvenating skin by heating tissue for cosmetic treatment of the face and body |
US10864385B2 (en) | 2004-09-24 | 2020-12-15 | Guided Therapy Systems, Llc | Rejuvenating skin by heating tissue for cosmetic treatment of the face and body |
US10328289B2 (en) | 2004-09-24 | 2019-06-25 | Guided Therapy Systems, Llc | Rejuvenating skin by heating tissue for cosmetic treatment of the face and body |
US9895560B2 (en) | 2004-09-24 | 2018-02-20 | Guided Therapy Systems, Llc | Methods for rejuvenating skin by heating tissue for cosmetic treatment of the face and body |
US10960236B2 (en) | 2004-10-06 | 2021-03-30 | Guided Therapy Systems, Llc | System and method for noninvasive skin tightening |
US9283410B2 (en) | 2004-10-06 | 2016-03-15 | Guided Therapy Systems, L.L.C. | System and method for fat and cellulite reduction |
US11883688B2 (en) | 2004-10-06 | 2024-01-30 | Guided Therapy Systems, Llc | Energy based fat reduction |
US11717707B2 (en) | 2004-10-06 | 2023-08-08 | Guided Therapy Systems, Llc | System and method for noninvasive skin tightening |
US11697033B2 (en) | 2004-10-06 | 2023-07-11 | Guided Therapy Systems, Llc | Methods for lifting skin tissue |
US20100022922A1 (en) * | 2004-10-06 | 2010-01-28 | Guided Therapy Systems, L.L.C. | Method and system for treating stretch marks |
US11400319B2 (en) | 2004-10-06 | 2022-08-02 | Guided Therapy Systems, Llc | Methods for lifting skin tissue |
US11338156B2 (en) | 2004-10-06 | 2022-05-24 | Guided Therapy Systems, Llc | Noninvasive tissue tightening system |
US11235179B2 (en) | 2004-10-06 | 2022-02-01 | Guided Therapy Systems, Llc | Energy based skin gland treatment |
US11235180B2 (en) | 2004-10-06 | 2022-02-01 | Guided Therapy Systems, Llc | System and method for noninvasive skin tightening |
US8932224B2 (en) | 2004-10-06 | 2015-01-13 | Guided Therapy Systems, Llc | Energy based hyperhidrosis treatment |
US9283409B2 (en) | 2004-10-06 | 2016-03-15 | Guided Therapy Systems, Llc | Energy based fat reduction |
US11207547B2 (en) | 2004-10-06 | 2021-12-28 | Guided Therapy Systems, Llc | Probe for ultrasound tissue treatment |
US9320537B2 (en) | 2004-10-06 | 2016-04-26 | Guided Therapy Systems, Llc | Methods for noninvasive skin tightening |
US11179580B2 (en) | 2004-10-06 | 2021-11-23 | Guided Therapy Systems, Llc | Energy based fat reduction |
US9421029B2 (en) | 2004-10-06 | 2016-08-23 | Guided Therapy Systems, Llc | Energy based hyperhidrosis treatment |
US9427601B2 (en) | 2004-10-06 | 2016-08-30 | Guided Therapy Systems, Llc | Methods for face and neck lifts |
US9427600B2 (en) | 2004-10-06 | 2016-08-30 | Guided Therapy Systems, L.L.C. | Systems for treating skin laxity |
US9440096B2 (en) | 2004-10-06 | 2016-09-13 | Guided Therapy Systems, Llc | Method and system for treating stretch marks |
US11167155B2 (en) | 2004-10-06 | 2021-11-09 | Guided Therapy Systems, Llc | Ultrasound probe for treatment of skin |
US10888718B2 (en) | 2004-10-06 | 2021-01-12 | Guided Therapy Systems, L.L.C. | Ultrasound probe for treating skin laxity |
US10888716B2 (en) | 2004-10-06 | 2021-01-12 | Guided Therapy Systems, Llc | Energy based fat reduction |
US9522290B2 (en) | 2004-10-06 | 2016-12-20 | Guided Therapy Systems, Llc | System and method for fat and cellulite reduction |
US10888717B2 (en) | 2004-10-06 | 2021-01-12 | Guided Therapy Systems, Llc | Probe for ultrasound tissue treatment |
US10610706B2 (en) | 2004-10-06 | 2020-04-07 | Guided Therapy Systems, Llc | Ultrasound probe for treatment of skin |
US9533175B2 (en) | 2004-10-06 | 2017-01-03 | Guided Therapy Systems, Llc | Energy based fat reduction |
US10610705B2 (en) | 2004-10-06 | 2020-04-07 | Guided Therapy Systems, L.L.C. | Ultrasound probe for treating skin laxity |
US9694211B2 (en) | 2004-10-06 | 2017-07-04 | Guided Therapy Systems, L.L.C. | Systems for treating skin laxity |
US9694212B2 (en) | 2004-10-06 | 2017-07-04 | Guided Therapy Systems, Llc | Method and system for ultrasound treatment of skin |
US9700340B2 (en) | 2004-10-06 | 2017-07-11 | Guided Therapy Systems, Llc | System and method for ultra-high frequency ultrasound treatment |
US9707412B2 (en) | 2004-10-06 | 2017-07-18 | Guided Therapy Systems, Llc | System and method for fat and cellulite reduction |
US9713731B2 (en) | 2004-10-06 | 2017-07-25 | Guided Therapy Systems, Llc | Energy based fat reduction |
US10603519B2 (en) | 2004-10-06 | 2020-03-31 | Guided Therapy Systems, Llc | Energy based fat reduction |
US10603523B2 (en) | 2004-10-06 | 2020-03-31 | Guided Therapy Systems, Llc | Ultrasound probe for tissue treatment |
US10532230B2 (en) | 2004-10-06 | 2020-01-14 | Guided Therapy Systems, Llc | Methods for face and neck lifts |
US9827449B2 (en) | 2004-10-06 | 2017-11-28 | Guided Therapy Systems, L.L.C. | Systems for treating skin laxity |
US9827450B2 (en) | 2004-10-06 | 2017-11-28 | Guided Therapy Systems, L.L.C. | System and method for fat and cellulite reduction |
US9833639B2 (en) | 2004-10-06 | 2017-12-05 | Guided Therapy Systems, L.L.C. | Energy based fat reduction |
US9833640B2 (en) | 2004-10-06 | 2017-12-05 | Guided Therapy Systems, L.L.C. | Method and system for ultrasound treatment of skin |
US10525288B2 (en) | 2004-10-06 | 2020-01-07 | Guided Therapy Systems, Llc | System and method for noninvasive skin tightening |
US8915870B2 (en) | 2004-10-06 | 2014-12-23 | Guided Therapy Systems, Llc | Method and system for treating stretch marks |
US10265550B2 (en) | 2004-10-06 | 2019-04-23 | Guided Therapy Systems, L.L.C. | Ultrasound probe for treating skin laxity |
US8915853B2 (en) | 2004-10-06 | 2014-12-23 | Guided Therapy Systems, Llc | Methods for face and neck lifts |
US9974982B2 (en) | 2004-10-06 | 2018-05-22 | Guided Therapy Systems, Llc | System and method for noninvasive skin tightening |
US10252086B2 (en) | 2004-10-06 | 2019-04-09 | Guided Therapy Systems, Llc | Ultrasound probe for treatment of skin |
US10010724B2 (en) | 2004-10-06 | 2018-07-03 | Guided Therapy Systems, L.L.C. | Ultrasound probe for treating skin laxity |
US10010725B2 (en) | 2004-10-06 | 2018-07-03 | Guided Therapy Systems, Llc | Ultrasound probe for fat and cellulite reduction |
US10010721B2 (en) | 2004-10-06 | 2018-07-03 | Guided Therapy Systems, L.L.C. | Energy based fat reduction |
US10010726B2 (en) | 2004-10-06 | 2018-07-03 | Guided Therapy Systems, Llc | Ultrasound probe for treatment of skin |
US10245450B2 (en) | 2004-10-06 | 2019-04-02 | Guided Therapy Systems, Llc | Ultrasound probe for fat and cellulite reduction |
US10238894B2 (en) | 2004-10-06 | 2019-03-26 | Guided Therapy Systems, L.L.C. | Energy based fat reduction |
US10046181B2 (en) | 2004-10-06 | 2018-08-14 | Guided Therapy Systems, Llc | Energy based hyperhidrosis treatment |
US10046182B2 (en) | 2004-10-06 | 2018-08-14 | Guided Therapy Systems, Llc | Methods for face and neck lifts |
US9039619B2 (en) | 2004-10-06 | 2015-05-26 | Guided Therapy Systems, L.L.C. | Methods for treating skin laxity |
US11724133B2 (en) | 2004-10-07 | 2023-08-15 | Guided Therapy Systems, Llc | Ultrasound probe for treatment of skin |
US11207548B2 (en) | 2004-10-07 | 2021-12-28 | Guided Therapy Systems, L.L.C. | Ultrasound probe for treating skin laxity |
US9566454B2 (en) | 2006-09-18 | 2017-02-14 | Guided Therapy Systems, Llc | Method and system for non-ablative acne treatment and prevention |
US11717661B2 (en) | 2007-05-07 | 2023-08-08 | Guided Therapy Systems, Llc | Methods and systems for ultrasound assisted delivery of a medicant to tissue |
US9216276B2 (en) | 2007-05-07 | 2015-12-22 | Guided Therapy Systems, Llc | Methods and systems for modulating medicants using acoustic energy |
WO2009097652A1 (en) * | 2008-02-07 | 2009-08-13 | Signostics Pty Ltd | Remote display for medical scanning apparatus |
US10537304B2 (en) | 2008-06-06 | 2020-01-21 | Ulthera, Inc. | Hand wand for ultrasonic cosmetic treatment and imaging |
US20110112405A1 (en) * | 2008-06-06 | 2011-05-12 | Ulthera, Inc. | Hand Wand for Ultrasonic Cosmetic Treatment and Imaging |
US11723622B2 (en) | 2008-06-06 | 2023-08-15 | Ulthera, Inc. | Systems for ultrasound treatment |
US11123039B2 (en) | 2008-06-06 | 2021-09-21 | Ulthera, Inc. | System and method for ultrasound treatment |
US9039617B2 (en) | 2009-11-24 | 2015-05-26 | Guided Therapy Systems, Llc | Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy |
US9345910B2 (en) | 2009-11-24 | 2016-05-24 | Guided Therapy Systems Llc | Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy |
US9504446B2 (en) | 2010-08-02 | 2016-11-29 | Guided Therapy Systems, Llc | Systems and methods for coupling an ultrasound source to tissue |
US9149658B2 (en) * | 2010-08-02 | 2015-10-06 | Guided Therapy Systems, Llc | Systems and methods for ultrasound treatment |
US20120029353A1 (en) * | 2010-08-02 | 2012-02-02 | Guided Therapy Systems, Llc | Systems and methods for ultrasound treatment |
US10183182B2 (en) | 2010-08-02 | 2019-01-22 | Guided Therapy Systems, Llc | Methods and systems for treating plantar fascia |
US8857438B2 (en) | 2010-11-08 | 2014-10-14 | Ulthera, Inc. | Devices and methods for acoustic shielding |
US9452302B2 (en) | 2011-07-10 | 2016-09-27 | Guided Therapy Systems, Llc | Systems and methods for accelerating healing of implanted material and/or native tissue |
US9011337B2 (en) | 2011-07-11 | 2015-04-21 | Guided Therapy Systems, Llc | Systems and methods for monitoring and controlling ultrasound power output and stability |
US9263663B2 (en) | 2012-04-13 | 2016-02-16 | Ardent Sound, Inc. | Method of making thick film transducer arrays |
US10031666B2 (en) | 2012-04-26 | 2018-07-24 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying function of button of ultrasound apparatus on the button |
US11086513B2 (en) | 2012-04-26 | 2021-08-10 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying function of button of ultrasound apparatus on the button |
US11726655B2 (en) | 2012-04-26 | 2023-08-15 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying function of button of ultrasound apparatus on the button |
US9802063B2 (en) | 2012-09-21 | 2017-10-31 | Guided Therapy Systems, Llc | Reflective ultrasound technology for dermatological treatments |
US9510802B2 (en) | 2012-09-21 | 2016-12-06 | Guided Therapy Systems, Llc | Reflective ultrasound technology for dermatological treatments |
US11883242B2 (en) | 2012-12-06 | 2024-01-30 | White Eagle Sonic Technologies, Inc. | System and method for scanning for a second object within a first object using an adaptive scheduler |
US9529080B2 (en) | 2012-12-06 | 2016-12-27 | White Eagle Sonic Technologies, Inc. | System and apparatus having an application programming interface for flexible control of execution ultrasound actions |
US9530398B2 (en) | 2012-12-06 | 2016-12-27 | White Eagle Sonic Technologies, Inc. | Method for adaptively scheduling ultrasound system actions |
US9773496B2 (en) | 2012-12-06 | 2017-09-26 | White Eagle Sonic Technologies, Inc. | Apparatus and system for adaptively scheduling ultrasound system actions |
US10076313B2 (en) | 2012-12-06 | 2018-09-18 | White Eagle Sonic Technologies, Inc. | System and method for automatically adjusting beams to scan an object in a body |
US10235988B2 (en) | 2012-12-06 | 2019-03-19 | White Eagle Sonic Technologies, Inc. | Apparatus and system for adaptively scheduling ultrasound system actions |
US10499884B2 (en) | 2012-12-06 | 2019-12-10 | White Eagle Sonic Technologies, Inc. | System and method for scanning for a second object within a first object using an adaptive scheduler |
US11490878B2 (en) | 2012-12-06 | 2022-11-08 | White Eagle Sonic Technologies, Inc. | System and method for scanning for a second object within a first object using an adaptive scheduler |
US9983905B2 (en) | 2012-12-06 | 2018-05-29 | White Eagle Sonic Technologies, Inc. | Apparatus and system for real-time execution of ultrasound system actions |
US11517772B2 (en) | 2013-03-08 | 2022-12-06 | Ulthera, Inc. | Devices and methods for multi-focus ultrasound therapy |
US10420960B2 (en) | 2013-03-08 | 2019-09-24 | Ulthera, Inc. | Devices and methods for multi-focus ultrasound therapy |
US11969609B2 (en) | 2013-03-08 | 2024-04-30 | Ulthera, Inc. | Devices and methods for multi-focus ultrasound therapy |
US10561862B2 (en) | 2013-03-15 | 2020-02-18 | Guided Therapy Systems, Llc | Ultrasound treatment device and methods of use |
US20150301712A1 (en) * | 2013-07-01 | 2015-10-22 | Samsung Electronics Co., Ltd. | Method and apparatus for changing user interface based on user motion information |
US10095400B2 (en) * | 2013-07-01 | 2018-10-09 | Samsung Electronics Co., Ltd. | Method and apparatus for changing user interface based on user motion information |
US9904455B2 (en) | 2013-07-01 | 2018-02-27 | Samsung Electronics Co., Ltd. | Method and apparatus for changing user interface based on user motion information |
US10558350B2 (en) | 2013-07-01 | 2020-02-11 | Samsung Electronics Co., Ltd. | Method and apparatus for changing user interface based on user motion information |
US20150220259A1 (en) * | 2013-07-01 | 2015-08-06 | Samsung Electronics Co. Ltd. | Method and apparatus for changing user interface based on user motion information |
US9792033B2 (en) * | 2013-07-01 | 2017-10-17 | Samsung Electronics Co., Ltd. | Method and apparatus for changing user interface based on information related to a probe |
US20150272547A1 (en) * | 2014-03-31 | 2015-10-01 | Siemens Medical Solutions Usa, Inc. | Acquisition control for elasticity ultrasound imaging |
US11351401B2 (en) | 2014-04-18 | 2022-06-07 | Ulthera, Inc. | Band transducer ultrasound therapy |
US10603521B2 (en) | 2014-04-18 | 2020-03-31 | Ulthera, Inc. | Band transducer ultrasound therapy |
US20180021015A1 (en) * | 2015-02-09 | 2018-01-25 | Hitachi, Ltd. | Ultrasonic diagnostic device |
US11224895B2 (en) | 2016-01-18 | 2022-01-18 | Ulthera, Inc. | Compact ultrasound device having annular ultrasound array peripherally electrically connected to flexible printed circuit board and method of assembly thereof |
US11241218B2 (en) | 2016-08-16 | 2022-02-08 | Ulthera, Inc. | Systems and methods for cosmetic ultrasound treatment of skin |
US11944849B2 (en) | 2018-02-20 | 2024-04-02 | Ulthera, Inc. | Systems and methods for combined cosmetic treatment of cellulite with ultrasound |
US20200060669A1 (en) * | 2018-08-22 | 2020-02-27 | Covidien Lp | Surgical retractor including three-dimensional (3d) imaging capability |
US10828020B2 (en) * | 2018-08-22 | 2020-11-10 | Covidien Lp | Surgical retractor including three-dimensional (3D) imaging capability |
CN112690825A (en) * | 2019-10-22 | 2021-04-23 | GE Precision Healthcare LLC | Method and system for providing a hand-drawn rendering start line drawing tool and automatic rendering preset selection |
KR102500589B1 (en) * | 2019-10-22 | 2023-02-15 | GE Precision Healthcare LLC | Method and system for providing freehand render start line drawing tools and automatic render preset selections |
KR20210048415A (en) * | 2019-10-22 | 2021-05-03 | GE Precision Healthcare LLC | Method and system for providing freehand render start line drawing tools and automatic render preset selections |
Also Published As
Publication number | Publication date |
---|---|
US8038619B2 (en) | 2011-10-18 |
US20090012401A1 (en) | 2009-01-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090012394A1 (en) | User interface for ultrasound system | |
US9943288B2 (en) | Method and system for ultrasound data processing | |
US20120108960A1 (en) | Method and system for organizing stored ultrasound data | |
US9420996B2 (en) | Methods and systems for display of shear-wave elastography and strain elastography images | |
US20170238907A1 (en) | Methods and systems for generating an ultrasound image | |
US8469890B2 (en) | System and method for compensating for motion when displaying ultrasound motion tracking information | |
US8172753B2 (en) | Systems and methods for visualization of an ultrasound probe relative to an object | |
US7894663B2 (en) | Method and system for multiple view volume rendering | |
US8414495B2 (en) | Ultrasound patch probe with micro-motor | |
US20110255762A1 (en) | Method and system for determining a region of interest in ultrasound data | |
US9848849B2 (en) | System and method for touch screen control of an ultrasound system | |
US8480583B2 (en) | Methods and apparatus for 4D data acquisition and analysis in an ultrasound protocol examination | |
JP5475516B2 (en) | System and method for displaying ultrasonic motion tracking information | |
US20080161688A1 (en) | Portable Ultrasonic Diagnostic Imaging System with Docking Station | |
US9332966B2 (en) | Methods and systems for data communication in an ultrasound system | |
US20120116218A1 (en) | Method and system for displaying ultrasound data | |
US20100249589A1 (en) | System and method for functional ultrasound imaging | |
US9390546B2 (en) | Methods and systems for removing occlusions in 3D ultrasound images | |
US20090153548A1 (en) | Method and system for slice alignment in diagnostic imaging systems | |
US20090187102A1 (en) | Method and apparatus for wide-screen medical imaging | |
WO2006111874A2 (en) | Portable ultrasonic diagnostic imaging system with docking station | |
US8636662B2 (en) | Method and system for displaying system parameter information | |
EP3451932B1 (en) | Ultrasonic imaging system with simplified 3d imaging controls | |
US20110055148A1 (en) | System and method for reducing ultrasound information storage requirements | |
US20170086789A1 (en) | Methods and systems for providing a mean velocity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: GENERAL ELECTRIC COMPANY, NEW YORK; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: HOBELSBERGER, PETRA; DUDA, WALTER; REEL/FRAME: 021035/0496; Effective date: 20080430 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |