US20180153505A1 - Guided navigation of an ultrasound probe - Google Patents

Guided navigation of an ultrasound probe

Info

Publication number
US20180153505A1
US20180153505A1 US15/831,375 US201715831375A
Authority
US
United States
Prior art keywords
probe
pose
ultrasound
ultrasound probe
deviation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/831,375
Inventor
Charles Cadieu
Ha Hong
Kilian Koepsell
Johan Mathe
Martin Wojtczyk
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Caption Health Inc
Original Assignee
Bay Labs Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bay Labs Inc filed Critical Bay Labs Inc
Assigned to Bay Labs, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CADIEU, CHARLES; HONG, HA; KOEPSELL, KILIAN; MATHE, JOHAN; WOJTCZYK, MARTIN
Publication of US20180153505A1 publication Critical patent/US20180153505A1/en
Assigned to Caption Health, Inc. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: Bay Labs, Inc.
Legal status: Abandoned (current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5223 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5292 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/54 Control of the diagnostic device


Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Robotics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the invention provide for the guided navigation of an ultrasound probe. In an embodiment of the invention, an ultrasound navigation assistance method includes acquiring an image by an ultrasound probe of a target organ of a body. The method also includes processing the image in connection with an estimator such as a neural network. The processing in turn determines a deviation of a contemporaneous pose evident from the acquired image from an optimal pose of the ultrasound probe for imaging the target organ. Finally, the method includes presenting the computed deviation to an end user operator of the ultrasound probe.

Description

  • This invention was made with government support under SBIR Phase I: Semantic Video Analysis for Video Summarization and Recommendation, Proposal Number IIP-1416612, awarded by the National Science Foundation (NSF). The United States Government has certain rights in the invention.
  • BACKGROUND OF THE INVENTION Field of the Invention
  • The present invention relates to ultrasound imaging and more particularly to ultrasound image acquisition.
  • Description of the Related Art
  • Ultrasound imaging, also known as sonography, is a medical imaging technique that employs high-frequency sound waves to view three-dimensional structures inside the body of a living being. Because ultrasound images are captured in real time, they also show movement of the internal organs of the body, blood flowing through the blood vessels of the human body and the stiffness of tissue. Unlike x-ray imaging, ultrasound imaging does not involve ionizing radiation, thereby allowing prolonged use of ultrasound imaging without risk of tissue and internal organ damage from prolonged radiation exposure.
  • To acquire ultrasound imagery, during an ultrasound exam, a transducer, commonly referred to as a probe, is placed directly on the skin or inside a body opening. A thin layer of gel is applied to the skin so that the ultrasound waves are transmitted from the transducer through the medium of the gel into the body. The ultrasound image is produced based upon a measurement of the reflection of the ultrasound waves off the body structures. The strength of the ultrasound signal, measured as the amplitude of the detected sound wave reflection, and the time taken for the sound wave to travel through the body provide the information necessary to compute an image.
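  • As a toy illustration of the foregoing time-of-flight computation (the figures below, including the conventional soft-tissue speed of sound of roughly 1540 m/s, are illustrative assumptions rather than part of this disclosure), the depth of a reflecting structure follows directly from the round-trip echo time:

        # Toy example: reflector depth from round-trip echo time, assuming
        # the conventional average speed of sound in soft tissue (~1540 m/s).
        SPEED_OF_SOUND_TISSUE_M_S = 1540.0

        def echo_depth_m(round_trip_time_s: float) -> float:
            """Halve the round-trip distance to obtain the reflector depth."""
            return SPEED_OF_SOUND_TISSUE_M_S * round_trip_time_s / 2.0

        print(f"{echo_depth_m(65e-6) * 100:.1f} cm")  # a 65 us echo -> ~5.0 cm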
  • Compared to other prominent methods of medical imaging, ultrasound presents several advantages to the diagnostician and patient. First and foremost, ultrasound imaging provides images in real-time. As well, ultrasound imaging requires equipment that is portable and can be brought to the bedside of the patient. Further, as a practical matter, the ultrasound imaging equipment is substantially lower in cost than other medical imaging equipment, and as noted, does not use harmful ionizing radiation. Even so, the production of quality ultrasound images remains highly dependent upon a skilled operator.
  • In this regard, depending upon the portion of the body selected for imaging, the skilled operator must know where to initially place the ultrasound probe. Then, the skilled operator must know how to spatially orient the probe and finally, the skilled operator must know where to move the probe so as to acquire the desired imagery. Generally, the ultrasound operator is guided in the initial placement, orientation and movement of the probe based upon the visual feedback provided by the imagery produced during the ultrasound. Thus, essentially, the navigation of the probe is a manual process consisting of iterative trial and error. Plainly, then, the modern process of ultrasound navigation is not optimal.
  • BRIEF SUMMARY OF THE INVENTION
  • Embodiments of the present invention address deficiencies of the art in respect to ultrasound probe navigation and provide a novel and non-obvious method, system and computer program product for the guided navigation of an ultrasound probe. In an embodiment of the invention, an ultrasound navigation assistance method includes acquiring an image by an ultrasound probe of a target organ of a body. The method also includes submitting the image for processing in connection with an estimator formed as a function or programmatic approximator including, by way of example, a classifier, a regressor, a state machine or a neural network. The processing with respect to the estimator produces as an output a deviation between a contemporaneous pose of the ultrasound probe, namely the position and orientation of the ultrasound probe relative to the target organ, and an optimal pose of the ultrasound probe for imaging the target organ. Finally, the method includes presenting the deviation to an end user operator of the ultrasound probe.
  • In one aspect of the embodiment, the contemporaneous pose of the ultrasound probe is additionally improved based upon linear and angular movement data received from an inertial measurement system including at least one of an accelerometer, gyroscope and magnetometer. In another aspect of the embodiment, the computed deviation is presented visually in a display of a computer system coupled to the probe, audibly through a varying of a tone based upon a proximity of the probe to the optimal pose, audibly by varying a frequency of repeatedly audibly presenting a short-duration sound based upon a proximity of the probe to the optimal pose, or haptically through a varying of vibrations of the probe based upon a proximity of the probe to the optimal pose.
  • In another embodiment of the invention, an ultrasound imaging data processing system is configured for ultrasound navigation assistance. The system includes a computer with memory and at least one processor, a display coupled to the computer, beamformer circuitry coupled to the computer and the display, and an ultrasound probe that has an array transducer connected to the beamformer circuitry. The system additionally includes a navigation assistance module executing in the memory of the computer. The module includes program code enabled upon execution by the processor of the computer to acquire an image by the ultrasound probe of a target organ of a body, to submit the image for processing in connection with an estimator, for instance, a neural network, so as to produce a deviation between a contemporaneous pose of the ultrasound probe relative to the target organ and an optimal pose of the ultrasound probe for imaging the target organ, and to present the computed deviation to an end user operator of the ultrasound probe.
  • Additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The aspects of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention. The embodiments illustrated herein are presently preferred, it being understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown, wherein:
  • FIG. 1 is a pictorial illustration of a process for guided navigation of an ultrasound probe;
  • FIG. 2 is a schematic illustration of an ultrasound data processing system configured for guided navigation of an ultrasound probe; and,
  • FIG. 3 is a flow chart illustrating a process for guided navigation of an ultrasound probe.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments of the invention provide for guided navigation of an ultrasound probe. In accordance with an embodiment of the invention, an ultrasound probe is placed on the surface of a body. Then, imagery of a target organ of the body is acquired, and a deviation of a contemporaneous pose of the ultrasound probe, evident from the acquired image, from an optimal pose for the target organ is presented to an end user operator of the ultrasound probe. For example, the deviation is presented visually on a corresponding display, audibly by way of an audible guidance signal or, in the alternative, by way of the text-to-speech presentation of textual instructions, or haptically through the outer shell of the ultrasound probe.
  • In illustration, FIG. 1 is a pictorial illustration of a process for guided navigation of an ultrasound probe. As shown in FIG. 1, an ultrasound probe 120 is placed upon an outer surface of a body 110 such as a human form. Imagery 130 of a target organ is acquired by the operator of the ultrasound probe 120 and the image 130 of the target organ is provided as input to an estimator 140, such as a neural network. The estimator 140 is trained based upon a set of training images 150 of one or more different target organs, each with a known probe pose deviation from the optimal probe pose, so that the input of the contemporaneously acquired image 130 to the estimator 140 produces an output of a deviation 190 of the contemporaneous pose of the ultrasound probe 120 from an optimal pose of the ultrasound probe 120.
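  • By way of a non-limiting sketch only, the estimator 140 could be realized as a small convolutional regressor that maps a single ultrasound frame to a six-component pose deviation. The PyTorch architecture, input size and output parameterization below are assumptions made for illustration, not the estimator prescribed by this disclosure; training such a model would amount to regressing, for example with a mean-squared-error loss, against the known pose deviations of the training images 150.

        import torch
        import torch.nn as nn

        class PoseDeviationEstimator(nn.Module):
            """Illustrative CNN regressor: one B-mode frame -> 6-DoF deviation.

            Assumed output layout: [dx, dy, dz, rx, ry, rz], a translation and
            an axis-angle rotation from the current pose to the optimal pose.
            """
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
                    nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(64, 6)

            def forward(self, frame: torch.Tensor) -> torch.Tensor:
                return self.head(self.features(frame).flatten(1))

        estimator = PoseDeviationEstimator()
        deviation = estimator(torch.randn(1, 1, 128, 128))  # shape (1, 6)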
  • Optionally, the ultrasound probe 120 acquires probe orientation and movement data 180 including magnetometric information 180A, gyroscopic information 180B and accelerometric information 180C indicating an orientation and movement of the ultrasound probe 120, so as to compute the change in the pose of the ultrasound probe 120. Guided navigation logic 160 then processes the probe orientation and movement data 180 so as to better compute the deviation from the optimal pose 190, in consideration not only of the pose deviation 190 output by the estimator 140 in respect to the acquired image 130, but also of the change in the pose of the ultrasound probe 120 determined from the probe orientation and movement data 180.
  • Guided navigation logic 160 then processes the pose deviation 190 of the ultrasound probe 120 and emits feedback 170 in the form of visual feedback, such as a three-dimensionally rendered scene with two probe models showing the current and optimal probe poses along with a suggested maneuver, an indication of the amount of agreement between the current and optimal poses, or red, green or yellow colors indicating how large an adjustment of the orientation of the ultrasound probe 120 is required to approach the optimal pose; audible feedback, such as a tone; or haptic feedback. In regard to the latter, in one aspect of the invention the ultrasound probe 120 may be caused to vibrate more intensely or with greater frequency responsive to the pose deviation 190.
  • In respect to the former, in one aspect of the invention the ultrasound probe 120 may be caused to emit a sound that is more intense, or of a different tone, based upon the magnitude of the pose deviation 190. As well, the ultrasound probe 120 may be caused to emit a short-duration sound such as a click or pop repeatedly, with a frequency related to the magnitude of the pose deviation 190. Alternatively, in another aspect of the invention the ultrasound probe 120 may be caused to vibrate more intensely or with greater frequency when the ultrasound probe 120 is moved in a compliant manner, based upon a magnitude of the pose deviation 190.
  • The process described in connection with FIG. 1 may be implemented in an ultrasound data processing system. In further illustration, FIG. 2 schematically illustrates an ultrasound data processing system configured for guided navigation of an ultrasound probe. The system includes an ultrasound probe 210 coupled to a host computing system 200 of one or more computers, each with memory and at least one processor. The ultrasound probe 210 is enabled to acquire ultrasound imagery by way of a transducer connected to beamformer circuitry in the host computing system 200, and transmit the acquired ultrasound imagery to the beamformer circuitry of the host computing system 200 for display in a display of the host computing system 200 through an ultrasound user interface 290 provided in the memory of the host computing system 200.
  • The ultrasound probe 210 includes an electromechanical vibration generator 230 such as a piezo actuator, and a tone generator 240. The electromechanical vibration generator 230 may be driven in the ultrasound probe 210 to cause the ultrasound probe 210 to vibrate at a specific frequency and for a specific duration as directed by the host computing system 200. As well, the tone generator 240 may be driven in the ultrasound probe 210 to cause the ultrasound probe 210 to emit an audible tone at a specific frequency and amplitude and for a specific duration as directed by the host computing system 200. Optionally, the tone generator 240 may be disposed in the host computing system 200.
  • An image data store 260 stores therein a multiplicity of different ultrasound images previously acquired in a controlled setting where the pose deviation of each image is known. The images of the image store 260 are provided as training images in training an estimator 220, such as a neural network, providing decisioning of a deviation from an optimal probe pose relative to an input image of a target organ of a human form. In this regard, the estimator 220 includes a multiplicity of nodes processing different extracted features of an acquired image so as to decision a pose of the ultrasound probe, providing a deviation of the decisioned pose from a known, optimal pose in imaging a target organ. In this regard, the pose can be represented mathematically in Euclidean space or as an array of numbers.
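  • One concrete numeric encoding consistent with the foregoing (an assumption for illustration; the disclosure requires only that the pose be expressible in Euclidean space or as an array of numbers) represents a pose as a position plus a unit quaternion, the deviation being the translation and rotation carrying the current pose onto the optimal one:

        import numpy as np

        # Assumed pose layout: [x, y, z, qw, qx, qy, qz].
        def quat_conj(q):
            w, x, y, z = q
            return np.array([w, -x, -y, -z])

        def quat_mul(a, b):
            aw, ax, ay, az = a
            bw, bx, by, bz = b
            return np.array([
                aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw,
            ])

        def pose_deviation(current, optimal):
            """Translation and rotation moving the current pose onto the optimal pose."""
            d_pos = optimal[:3] - current[:3]
            d_rot = quat_mul(optimal[3:], quat_conj(current[3:]))
            return np.concatenate([d_pos, d_rot])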
  • Finally, a navigation assistance module 300 is coupled to the ultrasound user interface 290. The navigation assistance module 300 includes program code that when executed in the memory of the host computing system 200 acquires a contemporaneous ultrasound image by the ultrasound probe 210 and processes the acquired ultrasound image in the host computing system 200 utilizing the estimator 220. The program code of the navigation assistance module 300 during execution in the memory of the host computing system 200 then receives, with the assistance of the estimator 220, a computed deviation of a pose evident from the acquired image from an optimal pose of the ultrasound probe 210.
  • Optionally, the ultrasound probe 210 includes an inertial measurement unit 250. The inertial measurement unit 250 includes each of a magnetometer 250A, a gyroscope 250B, and an accelerometer 250C. As such, data acquired by the inertial measurement unit 250 is translated in the host computing system 200 to estimate a change in pose for a given time interval, for instance by measuring linear acceleration and angular velocity of the probe 210. This change in pose can be combined with an estimator-derived pose deviation to obtain a more precise pose estimate. One possible example includes the use of a Kalman filter.
  • For instance, algorithmically, the process utilizing the inertial measurement unit 250 to tune a determined deviation from the estimator 220 can be expressed as follows:
  • 1. Let I(t) be the image acquired by the ultrasound probe 210 at time t.
  • 2. Let f be an estimator, such as the neural network 220, that from an acquired image I(t) outputs the image-estimated pose p(t)img at time t relative to the optimal pose (i.e., the deviation of the pose from the optimal pose). That is, p(t)img=f(I(t)).
  • 3. Between t0 and t1 (>t0), the change of pose Δp(t1, t0)img can be computed as: Δp(t1, t0)img=p(t1)img−p(t0)img. Obviously, p(t1)img=p(t0)img+Δp(t1, t0)img.
  • 4. The change of pose between t0 and t1 is measured from the inertial measurement unit 250 and denoted as Δp(t1, t0)IMU. Δp(t1, t0)img and Δp(t1, t0)IMU are combined to produce a better estimate of the pose change by using a Kalman filter, expressed as Δp(t1, t0)K=K(Δp(t1, t0)img, Δp(t1, t0)IMU; p(t0)img), where K is a Kalman filter that takes the image-based pose change, the inertial measurement unit 250 based pose change, and the pose at t0 as inputs, and Δp(t1, t0)K is the combined pose change that is expected to be more accurate than either Δp(t1, t0)img or Δp(t1, t0)IMU alone.
  • 5. With the more accurate Δp(t1, t0)K, it is then possible to estimate a more accurate absolute pose p(t1)K at time t1: p(t1)K=p(t0)img+Δp(t1, t0)K.
  • 6. For all subsequent time points: p(tj+1)K=p(tj)K+Δp(tj+1, tj)K.
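  • A stripped-down sketch of steps 3 through 6 follows. For brevity it replaces the full Kalman filter K with the per-component inverse-variance weighting to which a Kalman update reduces when two independent measurements of the same pose change are fused; the noise variances below are invented placeholders, not calibrated sensor figures.

        import numpy as np

        VAR_IMG = 4.0  # assumed variance of the image-based pose change
        VAR_IMU = 1.0  # assumed variance of the IMU-based pose change

        def fuse_pose_change(dp_img: np.ndarray, dp_imu: np.ndarray) -> np.ndarray:
            """Step 4: weight each measurement by the other's noise variance."""
            w_img = VAR_IMU / (VAR_IMG + VAR_IMU)
            return w_img * dp_img + (1.0 - w_img) * dp_imu

        def track_pose(p0_img: np.ndarray, dp_img_seq, dp_imu_seq):
            """Steps 5 and 6: accumulate fused pose changes onto the initial pose."""
            p = np.asarray(p0_img, dtype=float).copy()
            for dp_img, dp_imu in zip(dp_img_seq, dp_imu_seq):
                p = p + fuse_pose_change(np.asarray(dp_img), np.asarray(dp_imu))
                yield p.copy()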
  • In any event, based upon the computed deviation, the program code of the navigation assistance module 300 then determines corresponding feedback to be presented through the ultrasound probe 210. For example, the program code of the navigation assistance module 300 may direct the tone generator 240 to emit a particular tone pattern of specific periodicity proportional or inversely proportional to a determined proximity of the ultrasound probe 210 to the optimal pose. As another example, the program code of the navigation assistance module 300 may direct the electromechanical vibration generator 230 of the ultrasound probe 210 to emit a particular vibration of specific intensity proportional or inversely proportional to a determined proximity of the ultrasound probe 210 to the optimal pose.
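  • A hypothetical mapping from the computed deviation to these feedback parameters, with constants chosen purely for illustration, might look as follows:

        import numpy as np

        def feedback_parameters(deviation: np.ndarray) -> dict:
            """Map deviation magnitude to a click rate and a vibration strength."""
            distance = float(np.linalg.norm(deviation))  # scalar proximity proxy
            proximity = 1.0 / (1.0 + distance)           # 1.0 at the optimal pose
            return {
                "click_period_s": max(0.05, 1.0 - 0.95 * proximity),  # faster clicks near the goal
                "vibration_amplitude": min(1.0, distance),            # stronger vibration when far
            }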
  • In yet further illustration of the operation of the navigation assistance module 300, FIG. 3 is a flow chart depicting a process for guided navigation of an ultrasound probe. Beginning in block 310, a target organ within the body is selected in a user interface to an ultrasound application visualizing ultrasound imagery acquired by the ultrasound probe. Subsequently, in block 320 an estimator, such as a neural network, pertaining to the target organ is loaded into memory of a computing system coupled to the ultrasound probe. In block 330, contemporaneous ultrasound imagery is acquired by the ultrasound probe and, optionally, in block 340 probe orientation and movement data is received from an inertial measurement unit of the ultrasound probe. In block 350, the contemporaneous ultrasound imagery is processed in connection with the estimator in the computing system.
  • In block 360, a deviation of a pose of the ultrasound probe, evident from the acquired image, from an optimal pose is determined based upon the application of the estimator to the acquired image. Optionally, the probe orientation and movement data are used to further improve the accuracy of the determined pose deviation. In block 370, corresponding feedback based upon the deviation is determined, such as a graphical representation of the deviation, a particular strength of vibration as part of haptic feedback, or a particular tone of particular frequency, periodicity, amplitude or any combination thereof as part of audible feedback. In block 390, then, the determined feedback is output by the computing system or the coupled ultrasound probe. Finally, in decision block 400, the inertial measurement unit of the ultrasound probe indicates whether or not a threshold change in position or orientation has occurred with respect to the ultrasound probe. If so, or in the case where an inertial measurement unit is not present or active, the process repeats through block 340.
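  • For orientation only, the loop of blocks 330 through 400 can be skeletonized as shown below. Every callable passed in is a hypothetical placeholder for the system's actual interfaces rather than an API defined by this disclosure; the feedback_parameters helper from the preceding sketch fits the corresponding slot.

        import time

        def guidance_loop(estimator, acquire_frame, read_imu,
                          feedback_parameters, render_feedback,
                          threshold_motion_detected):
            while True:
                imu_sample = read_imu()                   # block 340; None if no IMU
                frame = acquire_frame()                   # block 330
                deviation = estimator(frame, imu_sample)  # blocks 350-360
                render_feedback(feedback_parameters(deviation))  # blocks 370-390
                # Decision block 400: with an IMU present, wait for a threshold
                # change in pose before repeating; otherwise repeat immediately.
                while imu_sample is not None and not threshold_motion_detected():
                    time.sleep(0.01)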
  • The present invention may be embodied within a system, a method, a computer program product or any combination thereof. The computer program product may include a computer readable storage medium or media having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention. The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
  • Finally, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
  • Having thus described the invention of the present application in detail and by reference to embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the invention defined in the appended claims as follows:

Claims (20)

We claim:
1. An ultrasound navigation assistance method comprising:
acquiring an image by an ultrasound probe of a target organ of a body;
processing the image in connection with an estimator, the estimator producing a deviation of a contemporaneous pose evident from the image from an optimal pose of the ultrasound probe for imaging the target organ; and,
presenting the computed deviation to an end user operator of the ultrasound probe.
2. The method of claim 1, wherein the estimator comprises a neural network.
3. The method of claim 1, wherein the contemporaneous pose is additionally determined based upon probe orientation and movement data received from an inertial measurement system comprising at least one of an accelerometer, gyroscope and magnetometer.
4. The method of claim 1, wherein the computed deviation is presented visually in a display of a computer system coupled to the probe.
5. The method of claim 1, wherein the computed deviation is presented audibly through a varying of a tone based upon a proximity of the probe to the optimal pose.
6. The method of claim 1, wherein the computed deviation is presented audibly by varying a frequency of repeatedly audibly presenting a short-duration sound based upon a proximity of the probe to the optimal pose.
7. The method of claim 1, wherein the computed deviation is presented haptically through a varying of vibrations of the probe based upon a proximity of the probe to the optimal pose.
8. An ultrasound imaging data processing system configured for ultrasound navigation assistance, the system comprising:
a computer with memory and at least one processor;
a display coupled to the computer;
beamformer circuitry coupled to the computer and the display;
an ultrasound probe comprising a transducer connected to the beamformer circuitry; and,
a navigation assistance module executing in the memory of the computer, the module comprising program code enabled upon execution by the processor of the computer to acquire an image by the ultrasound probe of a target organ of a body, to process with an estimator the acquired image by determining a deviation between a contemporaneous pose of the ultrasound probe relative to the target organ evident from the acquired image, and an optimal pose of the ultrasound probe for imaging the target organ, and to present the computed deviation to an end user operator of the ultrasound probe.
9. The system of claim 8, wherein the estimator is a neural network.
10. The system of claim 8, wherein the contemporaneous pose is additionally determined based upon probe orientation and movement data received from an inertial measurement system comprising at least one of an accelerometer, gyroscope and magnetometer.
11. The system of claim 8, wherein the computed deviation is presented visually in a display of a computer system coupled to the probe.
12. The system of claim 8, wherein the computed deviation is presented audibly through a varying of a tone based upon a proximity of the probe to the optimal pose.
13. The system of claim 8, wherein the computed deviation is presented haptically through a varying of vibrations of the probe based upon a proximity of the probe to the optimal pose.
14. A computer program product for ultrasound navigation assistance, the computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a device to cause the device to perform a method comprising:
acquiring an image by an ultrasound probe of a target organ of a body;
processing the image in connection with an estimator, the estimator producing a deviation of a contemporaneous pose of the ultrasound probe evident from the acquired image from an optimal pose of the ultrasound probe for imaging the target organ; and,
presenting the computed deviation to an end user operator of the ultrasound probe.
15. The computer program product of claim 14, wherein the estimator is a neural network.
16. The computer program product of claim 14, wherein the pose is additionally determined based upon probe orientation and movement data received from an inertial measurement system comprising at least one of an accelerometer, gyroscope and magnetometer.
17. The computer program product of claim 14, wherein the computed deviation is presented visually in a display of a computer system coupled to the probe.
18. The computer program product of claim 14, wherein the computed deviation is presented audibly through a varying of a tone based upon a proximity of the probe to the optimal pose.
19. The computer program product of claim 14, wherein the computed deviation is presented audibly by varying the repetition frequency of a repeatedly presented short-duration sound based upon a proximity of the probe to the optimal pose.
20. The computer program product of claim 14, wherein the computed deviation is presented haptically through a varying of vibrations of the probe based upon a proximity of the probe to the optimal pose.
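Finally, the feedback modalities of claims 4 through 7 and 17 through 20 all vary a stimulus with the probe's proximity to the optimal pose. The sketch below shows one hypothetical mapping from a deviation vector to a tone pitch, a beep repetition rate, and a vibration amplitude; the scale constant and the output ranges are invented for the example.

```python
# Hypothetical mapping from a 6-DOF pose deviation to feedback parameters.
import numpy as np

def proximity(deviation, scale=50.0):
    """Map a deviation vector to a 0..1 proximity score (1 = at the optimal
    pose) via a soft exponential falloff of its Euclidean norm."""
    return float(np.exp(-np.linalg.norm(deviation) / scale))

def feedback_parameters(deviation):
    p = proximity(deviation)
    return {
        "tone_hz": 220.0 + 660.0 * p,   # tone pitch rises toward the target
        "beep_rate_hz": 0.5 + 7.5 * p,  # short sounds repeat faster near target
        "vibration_amp": 1.0 - p,       # probe vibration fades near the target
    }
```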
US15/831,375 2016-12-07 2017-12-04 Guided navigation of an ultrasound probe Abandoned US20180153505A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1662038 2016-12-07
FR1662038A FR3059541B1 (en) 2016-12-07 2016-12-07 GUIDED NAVIGATION OF AN ULTRASONIC PROBE

Publications (1)

Publication Number Publication Date
US20180153505A1 true US20180153505A1 (en) 2018-06-07

Family

ID=58737624

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/831,375 Abandoned US20180153505A1 (en) 2016-12-07 2017-12-04 Guided navigation of an ultrasound probe

Country Status (3)

Country Link
US (1) US20180153505A1 (en)
EP (1) EP3332712A1 (en)
FR (1) FR3059541B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3092241B1 (en) * 2019-01-31 2021-01-01 Bay Labs Inc PRESCRIPTIVE GUIDANCE FOR ULTRASONIC DIAGNOSIS
FR3099985A1 (en) * 2019-08-19 2021-02-26 Bay Labs, Inc. Mid-procedure change of view for ultrasound diagnosis

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9189083B2 (en) * 2008-03-18 2015-11-17 Orthosensor Inc. Method and system for media presentation during operative workflow
EP2807978A1 (en) * 2013-05-28 2014-12-03 Universität Bern Method and system for 3D acquisition of ultrasound images
MX2016012612A (en) * 2014-03-31 2016-12-14 Koninklijke Philips Nv Haptic feedback for ultrasound image acquisition.

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030192557A1 (en) * 1998-05-14 2003-10-16 David Krag Systems and methods for locating and defining a target location within a human body
US20040019270A1 (en) * 2002-06-12 2004-01-29 Takashi Takeuchi Ultrasonic diagnostic apparatus, ultrasonic probe and navigation method for acquisition of ultrasonic image
US20160174934A1 (en) * 2013-09-18 2016-06-23 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method and system for guided ultrasound image acquisition
US20170262982A1 (en) * 2016-03-09 2017-09-14 EchoNous, Inc. Ultrasound image recognition systems and methods utilizing an artificial intelligence network

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11737841B2 (en) 2015-06-09 2023-08-29 Intuitive Surgical Operations, Inc. Configuring surgical system with surgical procedures atlas
US11058501B2 (en) 2015-06-09 2021-07-13 Intuitive Surgical Operations, Inc. Configuring surgical system with surgical procedures atlas
US10912619B2 (en) * 2015-11-12 2021-02-09 Intuitive Surgical Operations, Inc. Surgical system with training or assist functions
US11751957B2 (en) 2015-11-12 2023-09-12 Intuitive Surgical Operations, Inc. Surgical system with training or assist functions
US11129591B2 (en) 2016-04-21 2021-09-28 The University Of British Columbia Echocardiographic image analysis
US11676513B2 (en) 2017-01-24 2023-06-13 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of equipment
US11011078B2 (en) * 2017-01-24 2021-05-18 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of medical equipment
US20200135055A1 (en) * 2017-01-24 2020-04-30 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of equipment
US20210295048A1 (en) * 2017-01-24 2021-09-23 Tienovix, Llc System and method for augmented reality guidance for use of equipment systems
US20200152088A1 (en) * 2017-01-24 2020-05-14 Tietronix Software, Inc. System and method for three-dimensional augmented reality guidance for use of medical equipment
US11017695B2 (en) * 2017-01-24 2021-05-25 Tienovix, Llc Method for developing a machine learning model of a neural network for classifying medical images
US11017694B2 (en) * 2017-01-24 2021-05-25 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of equipment
US10636323B2 (en) 2017-01-24 2020-04-28 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of medical equipment
US20210104178A1 (en) * 2017-01-24 2021-04-08 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of medical equipment
US20210327304 (en) * 2017-01-24 2021-10-21 Tienovix, Llc System and method for augmented reality guidance for use of equipment systems
US10796605B2 (en) * 2017-01-24 2020-10-06 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of equipment
US10818199B2 (en) * 2017-01-24 2020-10-27 Tienovix, Llc System including a non-transitory computer readable program storage unit encoded with instructions that, when executed by a computer, perform a method for three-dimensional augmented reality guidance for use of medical equipment
US20210327303A1 (en) * 2017-01-24 2021-10-21 Tienovix, Llc System and method for augmented reality guidance for use of equipment systems
US11696745B2 (en) * 2017-03-16 2023-07-11 Koninklijke Philips N.V. Optimal scan plane selection for organ viewing
US11857370B2 (en) * 2017-03-20 2024-01-02 National Bank Of Canada Method and system for visually assisting an operator of an ultrasound system
US20200129154A1 (en) * 2017-03-20 2020-04-30 Exact Imaging Inc. Method and system for visually assisting an operator of an ultrasound system
US11646113B2 (en) * 2017-04-24 2023-05-09 Biosense Webster (Israel) Ltd. Systems and methods for determining magnetic location of wireless tools
US11478218B2 (en) * 2017-08-31 2022-10-25 Bfly Operations, Inc. Methods and apparatus for collection of ultrasound data
US20220386990A1 (en) * 2017-08-31 2022-12-08 Bfly Operations, Inc. Methods and apparatus for collection of ultrasound data
US11620740B2 (en) 2017-10-27 2023-04-04 Bfly Operations, Inc. Quality indicators for collection of and automated measurement on ultrasound images
US10628932B2 (en) 2017-10-27 2020-04-21 Butterfly Network, Inc. Quality indicators for collection of and automated measurement on ultrasound images
US10706520B2 (en) 2017-10-27 2020-07-07 Butterfly Network, Inc. Quality indicators for collection of and automated measurement on ultrasound images
EP4218596A1 (en) 2018-05-15 2023-08-02 New York University System and method for orientating capture of ultrasound images
US11593638B2 (en) 2018-05-15 2023-02-28 New York University System and method for orientating capture of ultrasound images
WO2019222478A2 (en) 2018-05-17 2019-11-21 Teratech Corporation Portable ultrasound system
US20200037986A1 (en) * 2018-08-03 2020-02-06 Butterfly Network, Inc. Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
US20210177374A1 (en) * 2018-08-23 2021-06-17 Koninklijke Philips N.V. Biometric measurement and quality assessment
CN112638273A (en) * 2018-08-23 2021-04-09 皇家飞利浦有限公司 Biometric measurement and quality assessment
US10751029B2 (en) 2018-08-31 2020-08-25 The University Of British Columbia Ultrasonic image analysis
CN112888370A (en) * 2018-10-16 2021-06-01 皇家飞利浦有限公司 Ultrasound imaging guidance and associated devices, systems, and methods based on deep learning
US20210369249A1 (en) * 2018-10-16 2021-12-02 Koninklijke Philips N.V. Deep learning-based ultrasound imaging guidance and associated devices, systems, and methods
CN109567865A (en) * 2019-01-23 2019-04-05 上海浅葱网络技术有限公司 Intelligent ultrasonic diagnostic device for non-medical staff
WO2020162989A1 (en) * 2019-02-04 2020-08-13 Google Llc Instrumented ultrasound probes for machine-learning generated real-time sonographer feedback
WO2020188407A1 (en) * 2019-03-21 2020-09-24 Medizinische Universität Wien Acquiring image data of a body part
EP3711674A1 (en) * 2019-03-21 2020-09-23 Medizinische Universität Wien Method for acquiring image data of a body part
US11707255B2 (en) 2019-04-02 2023-07-25 Siemens Medical Solutions Usa, Inc. Image-based probe positioning
CN110070576A (en) * 2019-04-29 2019-07-30 成都思多科医疗科技有限公司 Intelligent ultrasound image-acquisition positioning method and system based on a deep learning network
US11844654B2 (en) 2019-08-19 2023-12-19 Caption Health, Inc. Mid-procedure view change for ultrasound diagnostics
EP4028123A4 (en) * 2019-09-11 2023-11-08 General Electric Company Delivery of therapeutic neuromodulation
CN112773402A (en) * 2019-11-09 2021-05-11 无锡祥生医疗科技股份有限公司 Intelligent auxiliary guidance method, ultrasonic diagnostic apparatus, and storage medium
CN110974294A (en) * 2019-12-19 2020-04-10 上海尽星生物科技有限责任公司 Ultrasonic scanning method and device
CN113116377A (en) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 Ultrasonic imaging navigation method, ultrasonic device and storage medium
US20230094631A1 (en) * 2020-03-05 2023-03-30 Koninklijke Philips N.V. Ultrasound imaging guidance and associated devices, systems, and methods
US11903760B2 (en) 2021-09-08 2024-02-20 GE Precision Healthcare LLC Systems and methods for scan plane prediction in ultrasound images
WO2023118702A1 (en) * 2021-12-20 2023-06-29 Centre National d'Études Spatiales Method and system for guiding a user, holding in one hand an ultrasonic observation probe, towards a pre-recorded acoustic window
FR3130551A1 (en) * 2021-12-20 2023-06-23 Centre National d'Études Spatiales Method and system for guiding a user holding an ultrasound observation probe in one hand towards a pre-recorded acoustic window.

Also Published As

Publication number Publication date
FR3059541A1 (en) 2018-06-08
EP3332712A1 (en) 2018-06-13
FR3059541B1 (en) 2021-05-07

Similar Documents

Publication Publication Date Title
US20180153505A1 (en) Guided navigation of an ultrasound probe
US11730447B2 (en) Haptic feedback for ultrasound image acquisition
Chatelain et al. Confidence-driven control of an ultrasound probe
US10893850B2 (en) Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
US8756033B2 (en) Ultrasonic diagnostic imaging system and control method thereof
CN111035408B (en) Method and system for enhanced visualization of ultrasound probe positioning feedback
TW201923345A (en) Methods and apparatus for configuring an ultrasound device with imaging parameter values
JP7083143B2 (en) Guided navigation of ultrasonic probe
US20210093301A1 (en) Ultrasound system with artificial neural network for retrieval of imaging parameter settings for recurring patient
WO2020028746A1 (en) Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
US10952705B2 (en) Method and system for creating and utilizing a patient-specific organ model from ultrasound image data
WO2019145147A1 (en) Device and method for obtaining anatomical measurements from an ultrasound image
CA2950868A1 (en) Guided navigation of an ultrasound probe
JP2023525741A (en) Automated evaluation of ultrasound protocol trees
JP2023549093A (en) Robust segmentation with high-level image understanding
KR20200096125A (en) Prescriptive guidance for ultrasound diagnostics
CN112515747A (en) Method and system for analyzing ultrasound scenes to provide needle guidance and warning
EP3552554B1 (en) Ultrasonic diagnosis apparatus and method for controlling ultrasonic diagnosis apparatus
KR20180065093A (en) Guided navigation of an ultrasound probe
US11844654B2 (en) Mid-procedure view change for ultrasound diagnostics
EP3773231B1 (en) Ultrasound imaging system and method
EP4260811A1 (en) Graphical user interface for providing ultrasound imaging guidance
CN116612061A (en) Method and system for automatic two-dimensional standard view detection in transesophageal ultrasound images
KR20150059098A (en) Image Display System And Method For Fitting Multiple Models To Image

Legal Events

Date Code Title Description
AS Assignment

Owner name: BAY LABS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CADIEU, CHARLES;HONG, HA;KOEPSELL, KILIAN;AND OTHERS;REEL/FRAME:044292/0630

Effective date: 20161201

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

AS Assignment

Owner name: CAPTION HEALTH, INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:BAY LABS, INC.;REEL/FRAME:058549/0074

Effective date: 20191014

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION