US20230165562A1 - System for acquiring ultrasound images of an organ of a human body - Google Patents


Info

Publication number
US20230165562A1
US20230165562A1
Authority
US
United States
Prior art keywords
scanner
imu
ultrasound
operator
housing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/920,948
Other languages
English (en)
Inventor
Elazar Sonnenschein
Yehuda Albeck
Paz ELIA
Menachem BECHER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pulsenmore Ltd
Original Assignee
Pulsenmore Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pulsenmore Ltd filed Critical Pulsenmore Ltd
Assigned to PULSENMORE LTD reassignment PULSENMORE LTD ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ALBECK, YEHUDA, BECHER, Menachem, ELIA, Paz, SONNENSCHEIN, ELAZAR
Publication of US20230165562A1 publication Critical patent/US20230165562A1/en

Classifications

    • G16H40/67 — ICT specially adapted for the remote operation of medical equipment or devices
    • A61B8/42 — Details of probe positioning or probe attachment to the patient
    • A61B8/4254 — Determining the position of the probe using sensors mounted on the probe
    • A61B8/4263 — Determining the position of the probe using sensors not mounted on the probe, e.g. mounted on an external reference frame
    • A61B8/4281 — Sound-transmitting media or devices for coupling the transducer to the tissue
    • A61B8/429 — Determining or monitoring the contact between the transducer and the tissue
    • A61B8/4427 — Device being portable or laptop-like
    • A61B8/4455 — Features of the external shape of the probe, e.g. ergonomic aspects
    • A61B8/461 — Displaying means of special interest
    • A61B8/463 — Displaying multiple images, or images and diagnostic data, on one display
    • A61B8/465 — Displaying user selection data, e.g. icons or menus
    • A61B8/5276 — Detection or reduction of artifacts due to motion
    • A61B8/56 — Details of data transmission or power supply
    • A61B8/565 — Data transmission via a network
    • A61B8/58 — Testing, adjusting or calibrating the diagnostic device
    • A61B8/585 — Automatic set-up of the device
    • G06T11/00 — 2D [two-dimensional] image generation
    • G16H30/20 — ICT for handling medical images, e.g. DICOM, HL7 or PACS
    • A61B2562/0219 — Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B2562/0247 — Pressure sensors
    • G06T2210/41 — Indexing scheme for image generation: medical
    • G06V10/40 — Extraction of image or video features
    • G06V2201/031 — Recognition of patterns in medical or anatomical images of internal organs

Definitions

  • the invention is from the field of medical devices. Specifically, the invention relates to a system and method for accurately positioning and moving a hand-held ultrasound probe utilizing an inertial measurement unit.
  • Knowing the location of a medical sensor or a medical device relative to a patient's anatomical structure and the speed with which the sensor or device is or should be moving is critical for the functionality of advanced remote control, robotic, autonomous, self-feedback, or other automatic medical procedures.
  • an ultrasound probe is also referred to herein, for simplicity, variously as a “scanner,” an “ultrasound head,” or simply a “probe.”
  • images taken at speeds below or above a certain range may be deleted, or may be subjected to special filtering and image-processing techniques in order to enhance blurred images.
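The speed-gated filtering described above might look like the following sketch; the speed bounds, units, and frame representation are illustrative assumptions, not values from the patent:

```python
def filter_frames_by_speed(frames, speeds, v_min=1.0, v_max=50.0):
    """Keep only frames captured inside an acceptable speed window (mm/s).

    Frames outside the window are dropped here; alternatively they could
    be routed to a deblurring stage. Bounds are illustrative.
    """
    return [f for f, v in zip(frames, speeds) if v_min <= v <= v_max]

frames = ["f0", "f1", "f2", "f3"]
speeds = [0.5, 10.0, 30.0, 80.0]     # probe speed (mm/s) at each capture time
kept = filter_frames_by_speed(frames, speeds)   # -> ["f1", "f2"]
```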
  • instructions may be provided to the operator concerning several aspects of the procedure, such as when to stop moving, how to correct the scan path, and how to change the orientation or tilt of the probe.
  • knowing the two-dimensional or three-dimensional speed at which the ultrasound probe moves is also important to track the overall location, attitude and speed of the gimbal and/or the probe.
  • An extremely common procedure is ultrasound scanning, which, inter alia, nearly every woman undergoes during prenatal visits to her doctor or a clinic.
  • an ultrasound technician is also referred to as a sonographer.
  • the operator, i.e. technician, midwife, doctor, sonographer, etc., knows, based on their experience, the best position and orientation in which the scanner head or probe must be placed in order to image specific structures of the embryo, the right amount of pressure against the belly necessary to keep good coupling of the scanner to the body, the angle of the probe relative to the belly, and the right scanning speed that will allow good imaging.
  • the operator sees the images generated by the probe on a screen in real time and is able to optimize or correct its position.
  • the term “probe” or “ultrasound probe” refers to any useful probe: linear or convex, phased array, HIFU, or another sensor.
  • the term “scanner” should be understood to refer to an element, housing, or device that must move over the surface of a patient's body to acquire data therefrom, e.g., ultrasound images.
  • the invention encompasses a system for acquiring ultrasound images of internal organs of a human body.
  • the system comprises a scanner and at least one inertial measurement unit (IMU) associated therewith.
  • the at least one IMU is one of: a) integral with the scanner; b) connected to the scanner via a plug-in connection; and c) provided in an element associated with the scanner and moving therewith during operation.
  • Some embodiments of the system are configured to issue instructions to the operator of the system that allow scans to be performed also by persons not trained for ultrasound scanning including the patient themselves.
  • the scans are transmitted to a remote location for analysis by a healthcare professional.
  • Some embodiments of the system are configured to allow two-way communication between the operator and a remote individual or non-monitored system, wherein the non-monitored system comprises automated, image analysis circuitry.
  • the two-way communication can be selected from audio, visual, and video communication, and combinations thereof.
  • two-way video communication is enabled between the operator and the health care professional, enabling them to see each other while the operator is carrying out the scanning procedure, to aid the health care professional in interpreting the images and to provide guidance if necessary.
  • the system is configured such that the output of the system is sent directly to a remote healthcare professional and/or to a non-monitored system either in real time or shortly after the images are acquired.
  • Some embodiments of the system are configured to overlay an image of the scanner on top of the ultrasound scans to aid a healthcare professional in interpreting the images.
  • Embodiments of the scanner of the system comprise a housing that is ergonomically designed to be held by an operator and moved across the skin of a person or animal.
  • Some embodiments of the housing comprise, or have associated therewith, at least the minimum number of components of the system that must be located on the patient's body to obtain the ultrasound images.
  • the minimum number of components in, or associated with the housing are: i) an ultrasound probe head; ii) the at least one IMU, which comprises a three-axis accelerometer and a three-axis gyroscope; iii) electronic components for wired or wireless communication with remote terminals, and iv) a power source.
  • the system comprises other components, which may be arranged in many different configurations; in at least some configurations, some of these components are located within the housing.
  • the other components of the system are: v) an Analog Front End (AFE) that transmits and receives ultrasound signals by means of electronic components; vi) a processor containing software; vii) a user interface comprising a display screen and means to accept user's instructions; and viii) at least one memory device to store data and images processed by the software in the processor.
  • the other components that are not located within the housing are located at a location near the patient but separated from the housing.
  • the other components that are not located within the housing are in communication with components located within, or associated with the housing.
  • the electronic components of the AFE comprise transmitters, receivers, amplifiers, and analog-to-digital (A/D) and digital-to-analog (D/A) converters.
  • the software is configured to operate the system and to receive and process ultrasound signals received from the AFE to produce ultrasound images and to receive and process inertial measurement signals received from the IMU.
  • the AFE, IMU, processor, memory devices, and communication components can be provided as separate integrated circuits (ICs) or integrated into one or more ASICs that comprise at least some of the ICs.
  • The processor may be, for example, an FPGA or an MCU, including an MCU with an integrated ADC or DAC.
  • An AFE implemented as an ASIC can also include an ADC and/or a DAC.
  • the additional components comprise at least one of: ix) a remote terminal; x) at least one additional IMU; xi) at least one three-axis magnetometer; xii) at least one pressure sensor; and xiii) a speaker and a microphone for communicating with a remote health care provider.
  • all of the other components v)-viii) are contained within a remote terminal, which is connected to the scanner via a wired or wireless communication link. In other embodiments of the system some of the other components v)-viii) are contained within the scanner and the remainder located at a remote terminal, which is connected to the scanner via a wired or wireless communication link.
  • the remote terminal is a portable communication device.
  • the portable communication device is a smartphone.
  • the portable communication device comprises the display, the IMU, and the processor.
  • the portable communication device fits into a socket in the housing of the scanner.
  • the portable communication device is an integral part of the housing.
  • the portable communication device is not an integral part of the housing, but is fit into the socket in the housing before performing a scan, moved together with the housing during an ultrasound scan, and if desired, later detached for other uses.
  • the portable communication device is connected via a cable or wireless connection to the housing and only the housing is moved.
  • suitable wired communication links include USB, Lightning, and fiber optic, although any other wired communication link is also possible.
  • wireless communication links include, but are not limited to, Wi-Fi, UWB, Bluetooth, and IR.
  • the portable communication device can be any of many suitable devices, for example, a mobile phone, tablet, or laptop.
  • the housing or a device connected therewith may be in communication with apparatus located in the cloud, adapted to receive data generated by, or in association with, the housing.
  • different combinations of one or more IMUs, processing devices and software, memory devices, power sources, and components of the AFE are located either within the housing or in the smartphone.
  • Some embodiments of the system comprise at least one IMU in the smartphone and at least one IMU in the housing.
  • the processor is configured to receive data collected by all sensors.
  • the software is configured to execute at least one of the following: to produce ultrasound images; to analyze the data; to decide which images are of sufficient quality to be displayed on the display screen; to discard low-quality images; to instruct the operator to hold the housing of the scanner in a predetermined manner; to compute the location and attitude of the scanner; to determine if the scanner is being held such that enough pressure is being exerted on the skin to produce an image of sufficient quality; and to provide instructions on how to move the scanner correctly in order to obtain satisfactory images.
  • In some embodiments of the system, instructions to the operator that are generated by the software are provided visually on the display screen or audibly from the speakers. In other embodiments of the system, instructions to the operator are provided visually on the display screen or audibly from the speakers by a trained health care professional located at a remote terminal.
  • the task of computing the navigation is carried out by an Inertial Navigation System (INS) comprising a set of three-axis gyroscopes and three-axis accelerometers in the IMU and other sensors; the processor; and software, which is configured to take initial conditions and calibration data and the output from the IMU and other sensors to compute the Navigation, wherein the other sensors can be at least one of a three-axis magnetometer, a pressure sensor, and a camera.
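As an illustration of the attitude-estimation task the INS performs, the sketch below blends gyroscope integration with an accelerometer gravity observation using a single-axis complementary filter. This is a simplified stand-in for the filter-based navigation described above (the patent's figures refer to an Extended Kalman Filter), and all numeric values are assumptions:

```python
import numpy as np

def complementary_tilt_filter(gyro, accel, dt, alpha=0.98):
    """Estimate pitch by fusing gyro integration with accelerometer tilt.

    gyro:  (N,) pitch-rate samples in rad/s (single axis, for brevity)
    accel: (N, 2) (ax, az) samples in m/s^2 used to observe pitch
    Returns an (N,) array of pitch estimates in radians.
    """
    pitch = 0.0
    out = []
    for w, (ax, az) in zip(gyro, accel):
        pitch_gyro = pitch + w * dt          # propagate with the gyro
        pitch_acc = np.arctan2(ax, az)       # observe tilt from gravity
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
        out.append(pitch)
    return np.array(out)

# Example: probe held still at a 0.1 rad tilt; the estimate converges
# to the accelerometer-observed tilt despite starting from zero.
n, dt, g = 2000, 0.005, 9.81
gyro = np.zeros(n)                           # no rotation
accel = np.column_stack([g * np.sin(0.1) * np.ones(n),
                         g * np.cos(0.1) * np.ones(n)])
est = complementary_tilt_filter(gyro, accel, dt)
```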
  • Some embodiments of the system are configured to generate accurate ultrasound scans on the skin and to ensure images of good diagnostic value by using a combination of a pressure sensor and an IMU, and by selecting only images that meet optimal values of scanning speed and of the pressure of the scanner against the skin.
  • the speed of the scan is calculated from the angular velocity assuming motion perpendicular to the surface of the body.
  • the body is modeled as a sphere, whose radius can be approximated by one or more of the patient's BMI, the stage of the pregnancy, or a visual estimate, e.g. in the range of 20 cm up to 70 cm for obese patients.
  • typical distances for the scans are in the range of several millimeters up to several tens of centimeters. In some embodiments of the system the speed of the scan is between 1 mm per second and several centimeters per second.
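Under the spherical body model described above, the speed computation reduces to v = ω·r; a minimal sketch, in which the radius and angular rate are illustrative values:

```python
def scan_speed_from_angular_velocity(omega_rad_s, body_radius_m):
    """Approximate tangential scan speed over the skin.

    Models the scanned body surface as a sphere whose radius is picked
    from the patient's BMI, the stage of pregnancy, or a visual estimate
    (roughly 20-70 cm per the description): v = omega * r.
    """
    return omega_rad_s * body_radius_m

# Example: probe sweeping at 0.05 rad/s over a 30 cm radius abdomen.
v = scan_speed_from_angular_velocity(0.05, 0.30)   # metres per second
v_mm_s = v * 1000.0                                # 15 mm/s, inside the
                                                   # 1 mm/s - few cm/s range
```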
  • the 3-axis gyroscopes and 3-axis accelerometers of the IMU are calibrated by the manufacturer for offset, scale-factor, cross-axis sensitivity and initial orientation; and MEMS IMUs are calibrated by the user before each scan.
  • a one-step calibration, in which only the offset of the gyroscopes is estimated, is required. The one-step calibration process comprises holding the IMU still for several minutes and recording the output of the sensors; the average output of the gyroscopes is taken to be their offset, and the variance of each sensor is taken to be its noise.
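The one-step calibration above can be sketched as follows; the sample count, bias values, and noise level are illustrative assumptions, not values from the patent:

```python
import numpy as np

def one_step_gyro_calibration(samples):
    """Estimate gyroscope offset and noise from a stationary recording.

    samples: (N, 3) array of raw gyroscope readings (rad/s) captured
    while the IMU is held still. Returns (offset, noise_variance),
    each a length-3 vector, per the averaging rule described above.
    """
    samples = np.asarray(samples, dtype=float)
    offset = samples.mean(axis=0)   # average output -> per-axis bias
    noise = samples.var(axis=0)     # per-axis variance -> noise estimate
    return offset, noise

# Example: simulated stationary gyro with a small constant bias.
rng = np.random.default_rng(0)
true_bias = np.array([0.01, -0.02, 0.005])
raw = true_bias + rng.normal(scale=0.001, size=(6000, 3))
offset, noise = one_step_gyro_calibration(raw)
corrected = raw - offset            # bias-compensated readings
```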
  • the operator performs a seven-phase calibration process, wherein the seven phases of the calibration process in a coordinate system wherein the positive Z-axis points up, the positive Y-axis points towards the right, and the positive X-axis points forward are:
  • if the processor determines, during a scan, that not enough pressure is being exerted on the skin, an instruction to increase the pressure is issued to the operator either visually on the display screen, e.g. by displaying a downward-pointing arrow, and/or audibly from the speakers.
  • the processor can determine that not enough pressure is being exerted on the skin by at least one of:
  • the processor contains software configured to determine if an insufficient quantity of water-based gel is interposed between the ultrasound probe head and the skin and to issue an alert to the operator either visually on the display screen and/or audibly from the speakers.
  • the software can determine if an insufficient quantity of water based gel is interposed between the ultrasound probe head and the skin by determining if there is weakening of the signals returning to the probe or weakening of the resulting ultrasound image.
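A minimal sketch of such a coupling check, assuming the “weakening of the resulting ultrasound image” test is implemented as a mean-intensity threshold on the B-mode frame; the threshold value is an assumption, not from the patent:

```python
import numpy as np

def coupling_alert(frame, min_mean_intensity=20.0):
    """Heuristic coupling check.

    Poor gel coverage or insufficient pressure weakens the returning
    echoes, darkening the B-mode frame. Returns True when an
    "add gel / press harder" alert should be raised.

    frame: 2-D array of pixel intensities (0-255).
    """
    return float(np.mean(frame)) < min_mean_intensity

good = np.full((64, 64), 80.0)   # well-coupled frame (bright echoes)
bad = np.full((64, 64), 5.0)     # decoupled frame (weak echoes)
alert_good = coupling_alert(good)
alert_bad = coupling_alert(bad)
```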
  • the invention encompasses a method for allowing an operator not trained in ultrasound scanning to obtain and process ultrasound images of internal organs of a human body.
  • the method comprises:
  • the system is the system of the first aspect of the invention.
  • the instructions issued by the system are the instructions issued by the processor and software of the system of the first aspect of the invention.
  • the invention encompasses a method for acquiring ultrasound images of internal organs of a human body.
  • the method comprises providing a scanner and at least one inertial measurement unit (IMU) associated therewith, and instructions for an untrained user to operate said scanner.
  • Some embodiments of the third aspect of the method comprise issuing instructions to the operator of the system that allow scans to be performed also by persons not trained for ultrasound scanning including the patient themselves. Some embodiments of the method of the third aspect comprise transmitting acquired ultrasound images to a remote location for analysis by a healthcare professional. Some embodiments of the method of the third aspect comprise providing circuitry adapted to perform two-way communication between the user and a remote individual or non-monitored system. In some embodiments of the third aspect of the method the non-monitored system comprises automated, image analysis circuitry and the output of an automated analysis is provided to the user and/or to a healthcare professional. In some embodiments of the third aspect of the method the two-way communication is selected from audio, visual, and video communication, and combinations thereof.
  • the scans are performed by untrained operators and the system enables two way video communications between the operator and a health care professional.
  • the output of the system is sent directly to a remote healthcare professional and/or to a non-monitored system, either in real time or shortly after the images are acquired.
  • the system enables overlaying an image of the scanner on top of the ultrasound scans to aid a healthcare professional in interpreting the images.
  • Some embodiments of the third aspect of the method comprise performing a calibration process consisting of seven phases, in a coordinate system wherein the positive Z-axis points up, the positive Y-axis points towards the right, and the positive X-axis points forward, which are:
  • determining whether enough pressure is being exerted on the skin can be done by at least one of:
  • Some embodiments of the third aspect of the method comprise determining through software analysis if an insufficient quantity of water-based gel is interposed between the ultrasound probe head and the skin and issuing an alert to the operator either visually on the display screen and/or audibly from the speakers if an insufficiency of gel is found.
  • the software can determine if an insufficient quantity of water based gel is interposed between the ultrasound probe head and the skin by determining if there is weakening of the signals returning to the probe or weakening of the resulting ultrasound image.
  • FIG. 1 shows four columns each containing plots of data relating to the calibration process
  • FIG. 2 shows the results of estimating the orientation of the scanner by applying an Extended Kalman Filter to the calibrated gyroscope and the accelerometer data;
  • FIG. 3 repeats a test similar to the one shown in FIG. 2 , but the measurements are fed into the EKF without calibration;
  • FIG. 4 shows the angular velocity with which the scan is being carried out, and the tangential velocity that is derived from the angular velocity
  • FIG. 5 schematically shows an embodiment in which a smartphone comprising components of the system fits into a socket in the housing of the scanner of the system;
  • FIG. 6 schematically shows a typical scene on the screen of a smartphone during a scan with the embodiment of the system shown in FIG. 5 ;
  • FIGS. 7 A- 7 C are screenshots showing the effect on the images of insufficient coupling between the ultrasound probe head and the patient's body
  • FIG. 8 is a screen shot showing the results of a blood pressure measurement overlaid on the scan
  • FIG. 9 illustrates movements of a scanner relative to the patient's body
  • FIG. 10 is a flow chart of a coupling alert process.
  • the invention will be described in detail as a system and method that allow a patient to perform ultrasound scans by themselves. While a detailed example for ob-gyn is provided, a person skilled in the art can easily adapt it to other conditions and other organs, for example, cardiovascular, lungs, kidney, thyroid, liver, prostate, bladder, and for other sensors. Moreover, although conceived as a system for self-use by a person in a home environment, because of its portable nature, the system can also be effectively employed by persons not fully trained for ultrasound scanning, for example by a family member, in ambulances, or in the field by an untrained soldier. Needless to say that trained persons may also derive benefits from using the invention as a first approximation, before operating other, more sophisticated equipment available to them.
  • the other sensors referred to above can include any type of sensor that generates data useful for improving, and/or adding relevant information to, the data acquired through ultrasound images.
  • One example of such a sensor is a blood pressure sensor, the importance of which in the context of the invention will be further discussed below.
  • Another example is a proximity sensor, which can be used to alert the user if not enough pressure is applied with the housing against the body, which may result in defective readings.
  • An additional example of a sensor useful in the context of the invention is an image acquisition element, which can be used, independently of IMU components, to alert the user of coupling problems (e.g., due to insufficient pressure of the device against the body or insufficient gel), or if the user is scanning too fast to generate a good quality image.
  • image processing can be performed locally in the housing or remotely by a connected device.
  • the scans can be performed by the patient themselves and then transmitted to a remote location for analysis by a health care professional or a non-monitored system, which comprises automated, image analysis circuitry.
  • Some embodiments of the system are configured to allow the use of two-way video communication, i.e. telemedicine, enabling the patient and a sonographer to see each other while the patient is carrying out the scanning procedure.
  • the invention also encompasses a system for obtaining and processing ultrasound images of internal organs of a human body.
  • the system comprises many components that can be arranged in many different configurations, examples of which will be described herein.
  • the component of the system that is essential to all configurations is called herein a “scanner,” which comprises components of the system that are moved by an operator over the surface of a patient's body to acquire the ultrasound images.
  • FIG. 9 illustrates the possible forms of movement of the scanner relative to the patient's body.
  • the scanner comprises a housing that is ergonomically designed to be held by an operator and to be moved across the skin of a person or animal.
  • the housing comprises at least the minimum number of components of the system that must be located on the patient's body to obtain the ultrasound images.
  • the term “associated with” should be interpreted as meaning that the elements or components to which it refers need not necessarily be integral with the housing, but must be in useful cooperation therewith. For instance, where an accelerometer is discussed, it must move together with the housing, and where a communication component is discussed, it must be in communication with any other component located within the housing with which it must exchange data, or from which it must receive data.
  • These components are: i) an ultrasound probe head, i.e. an array of ultrasound elements; ii) electronic components for wired or wireless communication with remote terminals, and iii) a power source, e.g.
  • iv) at least one Inertial Measurement Unit (IMU).
  • the inertial sensors are not integral with the housing. Instead, the inertial sensors of the smartphone or similar portable device (that will be discussed later) can be used, or add-on inertial sensors can be connected to the housing prior to use.
  • the housing can be a “docking housing,” i.e., a housing that only comprises components essential for connecting functional components such as sensors of various types thereto, and said sensors can be connected to the docking housing as needed. This embodiment allows selecting appropriate kinds of sensors for a given use, which can be added as “plug and play” components to the housing.
  • v) an Analog Front End (AFE);
  • vi) a processor containing software configured to operate the system and to receive and process ultrasound signals received from the AFE to produce ultrasound images, and to receive and process inertial measurement signals received from the IMU;
  • vii) a user interface comprising a display screen and means to accept the user's instructions, e.g. a keyboard or touch screen; and
  • viii) a memory device or devices to store data and images processed by the software in the processor.
  • some or all of these components may be located within the housing of the scanner or at a location near the patient but separated from the housing. There are many options for arranging these components, which will be easily appreciated by the skilled person.
  • the electronic components, i.e. the AFE, IMU, processor, memory devices, and communication components, can be provided as separate integrated circuits (ICs) or integrated into one or more ASICs that comprise all or some of the ICs.
  • Optional components of the system include: ix) a remote terminal, e.g. a smartphone, tablet, PC, or similar communication and computing device, located near the operator or far from the operator, e.g. in a clinic or doctor's office; x) one or more additional IMUs, e.g. a triangulation sensor; xi) at least one three-axis magnetometer; xii) at least one pressure sensor; and xiii) a speaker and microphone for communicating with a remote health care provider.
  • all the components v)-viii) are contained within (or on, in the case of the display) the housing of the scanner.
  • all the components v)-viii) are contained within a remote terminal, which is connected to the scanner via a wired or wireless communication link; the wireless link can be formed using any known technology, e.g. cellular, Wi-Fi, or Bluetooth.
  • some of the components v)-viii), e.g. some or all of the components of the AFE, are contained within the scanner and the remainder in the remote terminal, which is connected to the scanner via a wired or wireless communication link.
  • FIG. 5 schematically shows an embodiment in which the display 10, an IMU 12, and the processor 14 are contained in a smartphone 16, which fits into a socket 18 in the housing 20 that contains the other components of the scanner.
  • the smartphone 16 is not necessarily an integral part of the housing 20 but may be fitted into the socket 18 before performing a scan, moved as an integral part of the housing 20 during an ultrasound scan, and later detached for other uses.
  • the smartphone 16 is electrically connected to the housing 20 by a connector 22 in the socket 18, which fits into a standard port on the smartphone 16.
  • Seen in FIG. 5 is the ultrasound probe head 24 at the bottom of housing 20.
  • smartphone refers to any portable communication device for which a fitting seat can be created in a housing such as housing 20 of FIG. 5 , and is not intended to limit the invention to any particular type of communication device, existing or to be developed.
  • the smartphone was chosen in this example, to illustrate the invention only, since it is a widespread device available to most people.
  • the smartphone is connected via a cable or a wireless connection to the housing and only the housing or the probe itself is moved, i.e., the smartphone does not necessarily have to move in unison with the ultrasound probe.
  • In other embodiments, different combinations of one or more IMUs, processing devices and software, memory devices, power sources, and components of the AFE are located either within the housing or in the smartphone.
  • Since the IMU is, on the one hand, very noisy and, on the other hand, relatively inexpensive, in some embodiments it is advantageous to use several of them in one scanner, e.g. one IMU in the smartphone and another in the housing, or two or more IMUs in the housing. This increases the accuracy of the positioning and motion measurements and improves the signal-to-noise (S/N) ratio of the received ultrasound signals.
  • the processor is configured to receive data collected by all sensors and contains software that is configured, inter alia, to produce ultrasound images; to analyze the data; and, in some embodiments, to decide which images are of sufficient quality to be displayed on the display screen; to compute the location and attitude of the scanner; to discard low-quality images; to instruct the operator to hold the housing of the scanner in a predetermined manner, e.g. such that the display screen (or a designated symbol on the housing surface, in embodiments in which the display is remotely located) always faces her/him; to determine whether the scanner is being held such that enough pressure is being exerted on the skin to produce an image of sufficient quality; and to provide effective instructions on how to move the scanner correctly in order to obtain satisfactory images, by means of an intuitive graphical cue presented on the display screen.
  • the instructions to the operator are provided visually or audibly on the display screen and speakers or by a trained health care professional located at a remote terminal.
  • FIG. 6 schematically shows a typical scene on the screen of a smartphone 16 during a scan with the embodiment of the system shown in FIG. 5 .
  • the blank area 26 on the screen is reserved, in this illustrative embodiment, for instructions to the user from the system.
  • Typical instructions include, for example:
  • the task of computing the scanner's location, orientation, and time derivatives of them is carried out by an Inertial Navigation System (INS).
  • the INS is comprised of the IMU, i.e. a set of three-axis gyroscopes and three-axis accelerometers, and other sensors, usually a three-axis magnetometer and a pressure sensor; the processor; and software configured to take initial conditions, calibration data, and the output from the IMU and other sensors to compute the navigation solution.
  • In a mobile phone there is a front camera that points towards the user and a rear camera that points towards objects in the room.
  • the rear camera points towards a particular object in the room.
  • the rear camera moves with the housing and the movement relative to the object in the image can be tracked using an optical flow method, thereby providing another piece of information to the navigation algorithm that can be used to correct errors.
  • the system can be configured to generate accurate ultrasound scans and to ensure images of good diagnostic value by using a combination of a pressure sensor and an IMU and selecting only images acquired at optimal values of scanning speed and of pressure of the scanner against the skin.
  • the sensors of the inertial measurement unit (IMU) or inertial navigation system (INS) can be implemented as a single-chip ASIC that contains all or some of them, as discrete chips that each implement one sensor, or as combinations of sensors.
  • the IMU provides several types of data:
  • the IMU is not perfect. IMU errors, upon integration, form drift, an error that increases over time, and therefore, the error of the computed location and orientation quickly propagates over time.
  • An example can best illustrate the problem. Assume that due to measurement noise and other imperfections, the orientation of the device is known with an error of one milli-radian. This error is considered very small given the quality of, for example, a typical smartphone's IMU. Given this error, the processor misinterprets the accelerometer readings, and interprets the projection of the gravitation as a horizontal acceleration of approximately one cm/sec². This small acceleration error results in a location error of 18 meters over one minute, clearly well beyond the acceptable error. Thus, the processor must have some additional information, and must assume some restrictions, in order to provide meaningful navigation.
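The arithmetic in this example can be verified with a short calculation; this is only a sketch, using the one-milli-radian error and one-minute interval quoted above:

```python
import math

g = 9.81            # gravitational acceleration, m/s^2
tilt_err = 1e-3     # orientation error of one milli-radian

# A 1 mrad tilt error projects gravity onto the horizontal axes:
a_err = g * math.sin(tilt_err)   # ~0.0098 m/s^2, i.e. ~1 cm/s^2

# Double integration of a constant acceleration error over one minute:
t = 60.0
pos_err = 0.5 * a_err * t ** 2   # ~17.7 m, matching the ~18 m quoted above
```

This confirms that an apparently negligible attitude error dominates the position solution within a minute.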
  • MEMS Micro Electro-Mechanical Systems
  • the IMU of a motionless device still produces measurements as if the device is rotating and accelerating.
  • a calibration procedure must be performed.
  • No calibration is perfect, and some residual bias always remains.
  • Noise, albeit random, only sums to zero after an infinite number of measurements.
  • the expected value of the accumulated noise is the square root of the number of measurements times the standard deviation of the noise.
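This square-root growth can be checked numerically; the noise level and sample count below are arbitrary illustrative values, not figures from the system:

```python
import random

random.seed(0)
sigma = 0.01      # per-sample noise standard deviation (arbitrary units)
n = 2500          # number of integrated measurements
trials = 400      # repeat the accumulation to estimate the expected value

# Sum n zero-mean noise samples many times and look at the spread of the sums.
sums = []
for _ in range(trials):
    s = sum(random.gauss(0.0, sigma) for _ in range(n))
    sums.append(s)

rms = (sum(s * s for s in sums) / trials) ** 0.5
expected = sigma * n ** 0.5   # sqrt(N) * sigma, as stated above (0.5 here)
# rms and expected should agree to within a few percent
```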
  • the IMUs installed in smartphones are all MEMS based, subject to strict limits of cost, size and energy consumption, and therefore, are very similar to each other. Their noise and bias figures are in principle the same.
  • the navigation process must integrate more measurements, and utilize some prior assumptions, in order to mitigate the IMU errors.
  • the distances moved are small and the scanning speed is relatively slow, which frequently results in the noise generated in the IMU being larger than the signal.
  • Typical distances for these scans are in the range of several millimeters and up to several tens of centimeters and typical speeds of 1 mm/sec to several centimeters per second.
  • Bias errors are calibrated at the manufacturing level. However, some biases vary over time and must be calibrated prior to use. In the case of the scanner described herein, the calibration process is limited to simple steps that the user can easily perform. A prior assumption that can be made is that the user cooperates by holding the scanner on a horizontal table such that she/he faces the display.
  • the measured acceleration along the vertical axis should equal 9.81 m/sec² downward, so if a value different from 9.81 is measured, the processor can calibrate the offset and add it to each measurement. If the user is required to calibrate the IMU, then, after the system is activated and before the beginning of a scanning session, the user is prompted, either by the software in the processor or by a remotely located technician, how to calibrate the gyroscopes and accelerometers. IMUs, especially those made with MEMS technology, must be calibrated before every use, as the calibration values vary from day to day and each time they are turned on.
  • a calibration procedure comprising seven phases will now be described. This procedure is one of many that can be used with the scanner and is only meant to illustrate the principles involved. The inventors have used other calibration procedures comprising fewer than seven phases and anticipate that other procedures involving, for example a different order of the phases or more or less than seven phases, can be devised and used, and the selection of the actual calibration method is not essential as long as it yields the required calibration result. In many situations, especially when only slow motion is allowed and the user keeps the screen toward her within several degrees, a one-step calibration, in which only the offset of the gyroscopes is estimated, provides excellent results. In this protocol, the IMU is held still for some time, and the output of the sensors is recorded. The average output of the gyroscopes is taken to be their offset, and the variance of each sensor is taken to be its noise. The Earth rotation, approximately 15 degrees per hour, is usually negligible compared to the gyroscopes offset.
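The one-step protocol described above, in which the average stationary output is taken as the offset and the variance as the noise, can be sketched as follows; synthetic data stand in for the recorded gyroscope samples, and the offset and noise values are illustrative only:

```python
import random

random.seed(1)

# Simulated stationary gyroscope output: a constant offset plus noise,
# standing in for the T seconds of recorded samples described above.
true_offset = [0.12, -0.18, 0.02]   # rad/sec, illustrative values
noise_sigma = 0.01
n_samples = 2000

samples = [[random.gauss(o, noise_sigma) for o in true_offset]
           for _ in range(n_samples)]

# The average output of each gyro is taken to be its offset ...
offset = [sum(s[i] for s in samples) / n_samples for i in range(3)]
# ... and the variance of each sensor is taken to be its noise.
variance = [sum((s[i] - offset[i]) ** 2 for s in samples) / n_samples
            for i in range(3)]

# Subsequent measurements are corrected by subtracting the estimated offset.
corrected = [samples[0][i] - offset[i] for i in range(3)]
```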
  • a coordinate system is selected.
  • the positive Z-axis points up, the positive Y-axis points towards the right, and the positive X-axis points forward.
  • the letter T is used for the duration of the calibration; it can be, for example, 1, 3, 5, or 10 seconds or longer, depending on the type of the IMU.
  • the value of T is a compromise between the degree of accuracy and the patience of the user.
  • the procedure has the following seven phases:
  • the data from the three accelerometers and three gyroscopes are collected by the electronics and transferred to the processor during these seven phases.
  • An example of gyroscopes data is shown in FIG. 1 .
  • FIG. 1 shows four columns each containing plots of data relating to the calibration process.
  • the three rows refer to the three gyroscopes: x, y and z.
  • the horizontal axis is the time of measurement and the vertical axis is the measurement taken from the gyroscopes or the error of this measurement.
  • Vertical lines mark the borders between the seven phases, which are labeled as follows: the 1st S0, the 2nd RY, the 3rd SY, the 4th RX, the 5th SX, the 6th RZ, and the 7th SZ.
  • the first letter, either S or R, refers to a “stationary” or “rotating” situation, respectively.
  • the second letter, X, Y or Z, refers to the axis around which the rotation is taken, or the axis around which a rotation was taken prior to the stationary situation.
  • the leftmost column contains the data collected from the gyroscopes.
  • S0 at time 0-5 seconds, the device is stationary and the gyroscopes output their offset and any constant rotation, e.g., the Earth rotation.
  • RY: at time 5 to 10 seconds; during the first 2.5 seconds a 180-degree rotation around the y-axis is performed, and the y-gyro shows a large signal. And so on for the other phases.
  • the next column shows the error of the measurements. Note that in this case the error is known, since every gyroscope senses either zero rotation or a known angular velocity of 180 degrees over a 2.5-second period, i.e. approximately 1.26 rad/sec. In this column one can see three features of the signal: the offset is more clearly seen; there is a signal at the time of rotation on axes other than the axis of rotation; and the output of the rotating gyro differs from the expected value. The latter two phenomena result from cross-axis measurement and scale-factor errors.
  • the first phase, S0, in this example between 0 and 10 seconds, is stationary, and therefore, apart from a small contribution of the rotation of the Earth, should produce zero. However, it can be seen that, in this example, the measurements are offset at approximately [0.12, −0.18, 0.02] rad/sec, including less than 10⁻⁴ rad/sec for the rotation of the Earth.
  • the SY data are taken from the same sensors after the sensor was rotated 180 degrees around the y-axis; in this case the contribution of the rotation of the Earth for the x- and z-axes is inverted. Thus, averaging the data at S0 and SY gives an estimate of the offset of the x- and z-gyros. A similar protocol applies to the other axes, using other rotations.
  • the next column shows the same data as in the previous two columns after the computed offset is removed. Accordingly, stationary states now show 0 rad/sec plus some noise.
  • the output of the x-gyro during RY phase should have been zero, and is actually approximately 0.013 rad/sec.
  • the ratio between the average output of the x-gyro, approximately 0.013 rad/sec, and the average output of the y-gyro, approximately 0.126 rad/sec produces the cross-axis effect between y and x axes, which is approximately 0.01.
  • the scale factor can also be computed from the data in this column by comparing the error to the expected result. For example, an error of 0.125 rad/sec is seen at the second row at phase RY. This error is approximately 0.1 of the signal, and therefore, the scale factor is 1.1.
  • the scale factor and the cross-axis can be combined into a matrix. Multiplying the original results by the inverse of this matrix and subtracting the original data produces the results on the last column, which only contains noise. This noise is conveniently used to estimate the detector noise required by the Extended Kalman Filter.
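This matrix correction can be sketched with the illustrative numbers quoted above (the 1.1 scale factor on the y-gyro and the 0.01 cross-axis term between the y- and x-axes); `numpy` is assumed available, and the matrix values are from the example, not from a real calibration:

```python
import numpy as np

# Correction matrix built from the calibration estimates quoted above:
# diagonal terms are scale factors, off-diagonal terms cross-axis effects.
C = np.array([
    [1.0, 0.01, 0.0],   # x-gyro picks up ~0.01 of the y rotation
    [0.0, 1.1,  0.0],   # y-gyro over-reads by a factor of ~1.1
    [0.0, 0.0,  1.0],
])

true_rate = np.array([0.0, 1.26, 0.0])   # 180 deg over 2.5 s, rad/sec

# What the (offset-free) gyros would report during the RY phase:
measured = C @ true_rate                 # x-gyro shows ~0.0126 rad/sec

# Multiplying by the inverse of C recovers the true angular velocity.
recovered = np.linalg.inv(C) @ measured
```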
  • the assumption of a time-constant angular velocity is made only for the clarity of the example. Replacing every computation by its average over the duration of the phase produces the same results.
  • the algorithm used by the software in the processor for calibrating the gyroscopes' cross-axis sensitivity is based on the matrix C:
  • the algorithm used by the software in the processor for computing the three projections of the gravity on the three accelerometers in initial body coordinates is:
  • EKF Extended Kalman Filter
  • the state vector at time k, x̂_k, has seven members: three components of the angular velocity in body-frame coordinates, ω̂_k^b, and four components of the quaternion, q̂_k, representing the orientation of the scanner, i.e. the rotation of the scanner's body coordinates relative to the room's assumed inertial coordinates.
  • the transition function, predicting the next step state vector is:
  • the measurement vector is:
  • ǎ_k^b is the output of the accelerometer triad at time k, which in turn equals:
  • C_a is a 3×3 matrix whose diagonal consists of the scale factors of the accelerometer triad and whose off-diagonal components are the cross-axis sensitivities of the accelerometers; B_a is the biases of the accelerometers; a_k^b is the specific force per unit mass applied due to real accelerations; g is gravity in body coordinates; and u_a is noise.
  • g is taken from A_ref, computed at the calibration process.
  • the predicted measurement is:
  • C_a and B_a are those calculated at the calibration process.
  • this filter assumes that the specific force a_k^b (acceleration without gravity) is very small compared to the gravitation, and therefore the accelerometer output vector points down. This places a strong limit on the rotation error, thus restraining the gyroscope drift.
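The way a gravity reference caps the rotation error can be illustrated with a simple one-axis complementary filter; this is a sketch of the principle only, not the EKF described above, and the bias, gain, and sample-rate values are arbitrary assumptions:

```python
dt = 0.01
gyro_bias = 0.005            # rad/sec of uncorrected gyroscope drift
alpha = 0.98                 # weight given to the integrated gyro estimate

theta = 0.0                  # estimated tilt about one axis, rad
true_theta = 0.0             # the scanner is actually held still

for _ in range(6000):        # one minute of samples at 100 Hz
    gyro = 0.0 + gyro_bias   # gyro reports drift even though motion is zero
    # With negligible specific force the measured acceleration vector points
    # down, so the accelerometer directly provides a tilt reference.
    accel_tilt = true_theta
    # Blend: integrate the gyro, but pull the estimate toward gravity.
    theta = alpha * (theta + gyro * dt) + (1 - alpha) * accel_tilt

# Pure integration would have drifted by 0.005 * 60 = 0.3 rad; the gravity
# reference bounds the error to a small steady-state value instead.
drift_unaided = gyro_bias * 60.0
```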
  • FIG. 2 shows the results of estimating the orientation of the scanner by applying an Extended Kalman Filter to the calibrated gyroscope and the accelerometer data.
  • the left most column shows the output of the gyroscope with a dotted line, the Extended Kalman Filter (EKF) estimation of the rotation with a broken line, and the true rotation with a solid line.
  • EKF Extended Kalman Filter
  • Each row depicts one axis: x, y and z.
  • the left column relates to the angular velocity as measured at body-fixed coordinates. Looking for example at the x-gyro at the top, the true rotation is zero.
  • the sensor produces approximately −0.18 rad/sec, which results from offset.
  • the true rotation and the calibrated signal are close together near zero.
  • the rightmost column shows the four elements of the quaternion used in the EKF to estimate the orientation. Again, solid lines and broken lines are used for the real and the estimated quaternion, and they fall very close to each other.
  • the middle column depicts the orientation in Euler angles, which are easier to interpret. Since an angular velocity of 0.1 rad/sec is applied the y-angle advances at this velocity. The solid and broken lines are so close that they cannot be distinguished.
  • the dynamics of the error can better be seen on the x- and z-angles where some error accumulates when y-angle nears 90 degrees.
  • 180 degrees and minus 180 degrees refer to same angle and are not an error.
  • the accumulation of error when the y-rotation nears 90 degrees is not accidental, and results from a numerical effect.
  • the translation of quaternion to Euler angles uses inverse trigonometric functions and is very sensitive near 90 degrees.
  • FIG. 3 repeats a test similar to the one shown in FIG. 2, but the measurements are fed into the EKF without calibration.
  • the ultrasound scan depends on holding the scanner such that some pressure is exerted on the skin. When the pressure drops, the scanner produces a flat image.
  • the processor analyzes the image and, upon concluding that the picture is flat (or using similar criteria, such as measuring the variance of the brightness over some region of interest instead of over the entire picture, and finding that the variance is smaller than a threshold value), issues an instruction to the operator to increase pressure.
  • this instruction may include, as an example, the appearance of a down-pointing arrow on the display screen with vocal instruction to increase pressure on the skin.
  • FIG. 7A is a screenshot showing good coupling between the ultrasound probe head and the patient's body, and FIG. 7B and FIG. 7C show examples of insufficient or partial coupling. This process can be carried out in the mobile device processor, in the controller of the AFE, in a component of the device containing the ultrasound transducer, or in external software.
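The flat-image criterion described above can be sketched as a brightness-variance test over a region of interest; the `is_coupled` helper, the ROI, and the threshold value are illustrative assumptions, not the actual implementation:

```python
import random

random.seed(2)

def is_coupled(image, roi, threshold=50.0):
    """Return True if the brightness variance inside the ROI exceeds the
    threshold, i.e. the image is not 'flat'. roi = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = roi
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return var > threshold

# A 'flat' frame (poor coupling): nearly uniform low brightness.
flat = [[5 + random.random() for _ in range(64)] for _ in range(64)]
# A structured frame (good coupling): strong brightness variation.
good = [[random.randint(0, 255) for _ in range(64)] for _ in range(64)]

roi = (16, 16, 48, 48)
# is_coupled(flat, roi) is False; is_coupled(good, roi) is True
```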
  • the speed of the scan can be calculated from the angular velocity.
  • the processor assumes motion perpendicular to the surface of the body.
  • the radius can be better approximated based on the patient's BMI and stage of the pregnancy.
  • the speed can be approximated as:
  • ω̂_k^b is the angular velocity in body coordinates, estimated by the filter;
  • R_k^b is computed as R_0·x̂;
  • x̂ is the unit vector pointing down from the scanner.
  • the angular velocity is dominantly along the scanner y-axis, i.e., the scanner moves right to left or left to right along a sphere, and the speed is approximately R_0·ω̂_y.
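The speed approximation can be sketched as follows; the radius `R0` and the angular velocity value are illustrative assumptions (in the system the radius would be refined from the patient's BMI and stage of the pregnancy):

```python
# Effective radius of the scan arc across the abdomen (illustrative value).
R0 = 0.15                     # meters

def scan_speed(omega_y):
    """Approximate linear scan speed from the y-axis angular velocity,
    assuming motion along a sphere of radius R0: v = R0 * omega_y."""
    return R0 * omega_y

# Example: the filter estimates 0.2 rad/sec about the scanner y-axis.
v = scan_speed(0.2)           # 0.03 m/s, i.e. 3 cm/s

# Within the permitted range of several centimeters per second?
ok = 0.001 <= v <= 0.05
```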
  • FIG. 4 shows the angular velocity with which the scan is being carried out, and the spatial velocity that is derived from the angular velocity.
  • the third column of the figure shows the three components of the velocity in radial coordinates along the belly of a pregnant patient.
  • the X-axis refers to radial motion from the center of the belly to the outside. This motion is by assumption zero.
  • the y-axis refers to motion across the belly from bottom to top, and z-axis refer to motion from right to left.
  • the other columns show the same information as the three columns in FIG. 2 and are shown for reference.
  • the range of permitted velocities is a characteristic of the scanner, and is typically several centimeters per second.
  • This slow motion produces radial acceleration of as little as one millimeter per second squared, which means that the acceleration of gravity can be used by the EKF as a good approximation of the acceleration in the downward direction.
  • the scan is discarded and an instruction is issued to the patient to go slower.
  • the scanner can ensure that the user is instructed to cover a predetermined range of angles, and to do so within the permitted velocity range. By adding the image-quality assessment produced by the image processing, a proper pressure on the skin is also maintained. Altogether this ensures a good examination.
  • FIG. 8 is a screen shot showing an embodiment of how the results of a blood pressure measurement can be displayed to a physician or other trained healthcare professional both as a written message and as an overlay on the scan.
  • the scanner is a “black box” as far as the operators of the scanner are concerned.
  • the algorithms discussed above are all useful only to the internal working of the system, whose processor is programmed to utilize them in order to generate instructions to the patient to guide them through the process of collecting ultrasound scans that are of sufficient quality to provide useful information.
  • the patients only have to follow the visual or audible instructions that they receive from the components of the system or from a sonographer in the case of Telemedicine. It is also possible to show video instructions by means of animations.
  • a typical set of instructions issued by the system to guide an operator to perform a scan will comprise the following:
  • the output of the scanner may be sent directly to a healthcare professional, e.g. the patient's personal physician, in real time or shortly after they are acquired, and some or all of the instructions to the patient may be sent by the physician, especially if a particular region of the anatomy has to be studied in greater depth than is normally possible from general scans.
  • software in the processor is configured to overlay an image of the scanner on top of the ultrasound scans.
  • the processor is configured to relay the instructions that are sent to the operator during the scan so the physician can understand what instruction was presented and at what time with respect to the images.
  • the following exemplifies a coupling alerting procedure according to one particular embodiment of the invention.
  • the procedure involves the following steps:
  • the process is shown, in flow chart form, in FIG. 10 .
  • the following illustrates a procedure for dealing with a user who is moving the housing too fast to produce a good-quality scan.
  • the first step is aimed at distinguishing between the embryo's movements, and the scanner's movement.
  • the embryo's movements are localized and so they do not change a large portion of the image.
  • a scanner movement will change the entire image at once.
  • a temporal standard deviation is calculated over 6 frames. If significant change is detected in more than 0.5% of the total scan pixels, this signifies that a movement has been made.
  • a change per second in pixel intensity is evaluated across the image.
  • a pixel's temporal standard deviation is used as an estimator for change. For an image I(x, y, n), where n is the frame index and each frame is taken at time t(n), the following calculation is used to evaluate change:
  • Th is a threshold empirically selected to distinguish between noise- and motion-related changes.
  • the mean standard deviation between 10 sequential frames is calculated for 100 scans taken with the ultrasound device of the invention. These scans should be taken while the device is held still, for instance on a pregnant woman's abdomen with no significant fetal movements. The Th value is then calculated as the mean plus three standard deviations of the calculated values.
  • in that case, the frame is considered a moving frame.
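The temporal-standard-deviation test described above can be sketched as follows; the `is_moving` helper and the threshold `th` are illustrative assumptions, while the six-frame buffer and the 0.5% pixel fraction follow the description above:

```python
import random

random.seed(3)

def is_moving(frames, th=4.0, fraction=0.005):
    """Flag scanner movement: compute each pixel's temporal standard
    deviation over the buffered frames and report motion when more than
    `fraction` of the pixels change significantly (std exceeds `th`)."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    changed = 0
    for y in range(h):
        for x in range(w):
            vals = [f[y][x] for f in frames]
            mean = sum(vals) / n
            std = (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
            if std > th:
                changed += 1
    return changed / (h * w) > fraction

# Six nearly identical frames: only sensor noise, no motion detected.
base = [[random.randint(0, 255) for _ in range(32)] for _ in range(32)]
still = [[[p + random.gauss(0, 1) for p in row] for row in base]
         for _ in range(6)]
# Six frames with a global brightness jump: the whole image changes at once.
moving = still[:3] + [[[p + 40 for p in row] for row in base]
                      for _ in range(3)]
```

A fetal movement, being localized, changes only a small patch and stays under the 0.5% fraction, while a scanner movement trips the test.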
  • an optical flow (V_x, V_y) is calculated using the Lucas-Kanade method with pyramids. Corners in the centre of the image, found using the Harris corner detector, are used for the calculation.
  • the optical flow gives the speed per frame. To obtain the speed per unit time, it should be scaled by the frame rate (FPS).
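The per-frame flow and its conversion to speed per unit time can be sketched with a minimal, single-level Lucas-Kanade step; the full method uses pyramids and Harris corners (e.g. via OpenCV), so this pure-NumPy version and its synthetic one-pixel shift are illustrative assumptions only:

```python
import numpy as np

def lk_flow(prev, curr):
    """Minimal single-window Lucas-Kanade step: solve the 2x2 normal
    equations for one displacement (pixels/frame) over the whole patch."""
    Ix = np.gradient(prev, axis=1)
    Iy = np.gradient(prev, axis=0)
    It = curr - prev
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)        # (vx, vy) in pixels per frame

# Synthetic test: a smooth texture shifted by one pixel along x.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
for _ in range(6):                      # smooth so the LK linearization holds
    img = (img + np.roll(img, 1, axis=0) + np.roll(img, 1, axis=1)) / 3.0

prev = img
curr = np.roll(img, 1, axis=1)          # whole frame moved 1 px per frame

vx, vy = lk_flow(prev, curr)            # expected close to (1, 0)

# The flow is per frame; multiplying by the frame rate gives pixels/second,
# which a (hypothetical) mm-per-pixel scale would convert to physical speed.
fps = 20.0
speed_px_per_s = float(np.hypot(vx, vy)) * fps
```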

US17/920,948 2020-05-01 2021-04-25 System for acquiring ultrasound images of an organ of a human body Pending US20230165562A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL274382A IL274382A (en) 2020-05-01 2020-05-01 A system and method for assisting an unskilled patient in performing ultrasound scans himself
IL274382 2020-05-01
PCT/IL2021/050469 WO2021220263A1 (en) 2020-05-01 2021-04-25 A system for acquiring ultrasound images of an organ of a human body

Publications (1)

Publication Number Publication Date
US20230165562A1 true US20230165562A1 (en) 2023-06-01


Family Applications (5)

Application Number Title Priority Date Filing Date
US17/920,951 Pending US20230165565A1 (en) 2020-05-01 2021-04-25 System and a method for allowing a non-skilled user to acquire ultrasound images of internal organs of a human body
US17/920,948 Pending US20230165562A1 (en) 2020-05-01 2021-04-25 System for acquiring ultrasound images of an organ of a human body
US17/920,957 Pending US20230165568A1 (en) 2020-05-01 2021-04-25 System for acquiring ultrasound images of internal body organs
US17/920,963 Pending US20230165569A1 (en) 2020-05-01 2021-04-26 System for acquiring ultrasound images
US17/920,967 Pending US20230181157A1 (en) 2020-05-01 2021-04-26 System for acquiring ultrasound images of internal organs of a human body


Country Status (8)

Country Link
US (5) US20230165565A1 (zh)
EP (5) EP4142603A4 (zh)
JP (5) JP2023523317A (zh)
CN (5) CN115666397A (zh)
AU (5) AU2021265350A1 (zh)
CA (5) CA3180180A1 (zh)
IL (6) IL274382A (zh)
WO (5) WO2021220265A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220199229A1 (en) * 2020-12-03 2022-06-23 Wavebase Inc. Method and system for enhancing medical ultrasound imaging devices with computer vision, computer aided diagnostics, report generation and network communication in real-time and near real-time
US20230389898A1 (en) * 2022-06-01 2023-12-07 Fujifilm Sonosite, Inc. Ultrasound system for a virtual sonography team

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220133270A1 (en) * 2020-11-04 2022-05-05 Luiz Maracaja Accessory technologies for ultrasound systems
CN115670502A (zh) * 2021-07-23 2023-02-03 Bard Access Systems, Inc. Stabilization systems for medical devices
IL293427A (en) * 2022-05-29 2023-12-01 Pulsenmore Ltd Apparatus and method for identifying fluid-related medical conditions in a patient
CN116158851B (zh) * 2023-03-01 2024-03-01 Harbin Institute of Technology Scanning target positioning system and method for a medical remote ultrasound automatic scanning robot

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6126608A (en) * 1999-05-18 2000-10-03 Pie Medical Equipment B.V. Portable ultrasound diagnostic system with handsfree display
US10726741B2 (en) * 2004-11-30 2020-07-28 The Regents Of The University Of California System and method for converting handheld diagnostic ultrasound systems into ultrasound training systems
WO2006127142A2 (en) * 2005-03-30 2006-11-30 Worcester Polytechnic Institute Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
NZ568721A (en) * 2005-11-07 2011-01-28 Signostics Ltd Portable ultrasound system with probe and handheld display linked by a cable to locate around a user's neck
US9456800B2 (en) * 2009-12-18 2016-10-04 Massachusetts Institute Of Technology Ultrasound scanning system
US8753278B2 (en) * 2010-09-30 2014-06-17 Siemens Medical Solutions Usa, Inc. Pressure control in medical diagnostic ultrasound imaging
US20120179039A1 (en) * 2011-01-07 2012-07-12 Laurent Pelissier Methods and apparatus for producing video records of use of medical ultrasound imaging systems
US9495018B2 (en) * 2011-11-01 2016-11-15 Qualcomm Incorporated System and method for improving orientation data
US9561019B2 (en) * 2012-03-07 2017-02-07 Ziteo, Inc. Methods and systems for tracking and guiding sensors and instruments
US9021358B2 (en) * 2013-03-15 2015-04-28 eagleyemed, Inc. Multi-site video based computer aided diagnostic and analytical platform
US20160213349A1 (en) * 2013-09-10 2016-07-28 Here Med Ltd. Fetal heart rate monitoring system
CN105813573B (zh) * 2013-12-09 2019-06-04 Koninklijke Philips N.V. Imaging view manipulation using model-based segmentation
US9949715B2 (en) * 2014-02-12 2018-04-24 General Electric Company Systems and methods for ultrasound probe guidance
WO2015142306A1 (en) * 2014-03-20 2015-09-24 Ozyegin Universitesi Method and system related to a portable ultrasonic imaging system
IL236484A (en) * 2014-12-25 2017-11-30 Pulsenmore Ltd Device and system for monitoring internal organs of man or animals
US9877700B1 (en) * 2015-02-24 2018-01-30 Asch-Klaassen Sonics, Llc Ultrasound imaging of anatomy
US9763644B2 (en) * 2015-03-27 2017-09-19 Clarius Mobile Health Corp. System and method for connecting and controlling wireless ultrasound imaging system from electronic device
JP2018520746A (ja) * 2015-06-08 2018-08-02 The Board of Trustees of the Leland Stanford Junior University 3D ultrasound imaging and related methods, apparatus, and systems
JP6389963B2 (ja) * 2015-08-31 2018-09-12 FUJIFILM Corporation Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus
US20170273664A1 (en) * 2016-03-24 2017-09-28 Elwha Llc Wearable ultrasonic fetal imaging device
US20170273663A1 (en) * 2016-03-24 2017-09-28 Elwha Llc Image processing for an ultrasonic fetal imaging device
IL244746B (en) * 2016-03-24 2021-03-25 Pulsenmore Ltd A complete system for linking sensors to smart devices
EP3445249B1 (en) * 2016-04-19 2020-04-15 Koninklijke Philips N.V. Ultrasound imaging probe positioning
US10959702B2 (en) * 2016-06-20 2021-03-30 Butterfly Network, Inc. Automated image acquisition for assisting a user to operate an ultrasound device
US20180344286A1 (en) * 2017-06-01 2018-12-06 General Electric Company System and methods for at-home ultrasound imaging
AU2018323621A1 (en) * 2017-08-31 2020-02-06 Butterfly Network, Inc. Methods and apparatus for collection of ultrasound data
WO2019051007A1 (en) * 2017-09-07 2019-03-14 Butterfly Network, Inc. ULTRASONIC DEVICE ON A CHIP RELATED TO THE WRIST
EP3720358A1 (en) * 2017-12-08 2020-10-14 Neural Analytics, Inc. Systems and methods for gel management
US11660069B2 (en) * 2017-12-19 2023-05-30 Koninklijke Philips N.V. Combining image based and inertial probe tracking
US20190190952A1 (en) * 2017-12-20 2019-06-20 Mercy Health Systems and methods for detecting a cyberattack on a device on a computer network
WO2019126625A1 (en) * 2017-12-22 2019-06-27 Butterfly Network, Inc. Methods and apparatuses for identifying gestures based on ultrasound data
WO2019157486A1 (en) * 2018-02-12 2019-08-15 Massachusetts Institute Of Technology Quantitative design and manufacturing framework for a biomechanical interface contacting a biological body segment
WO2019173152A1 (en) * 2018-03-05 2019-09-12 Exo Imaging, Inc. Thumb-dominant ultrasound imaging system
US11559279B2 (en) * 2018-08-03 2023-01-24 Bfly Operations, Inc. Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data
CA3110077A1 (en) * 2018-08-29 2020-03-05 Butterfly Network, Inc. Methods and apparatuses for collection of ultrasound data
WO2020162989A1 (en) * 2019-02-04 2020-08-13 Google Llc Instrumented ultrasound probes for machine-learning generated real-time sonographer feedback


Also Published As

Publication number Publication date
US20230165565A1 (en) 2023-06-01
CN115666397A (zh) 2023-01-31
JP2023523318A (ja) 2023-06-02
WO2021220263A1 (en) 2021-11-04
JP2023523316A (ja) 2023-06-02
CA3180176A1 (en) 2021-11-04
EP4142603A4 (en) 2024-08-21
IL274382A (en) 2021-12-01
WO2021220269A1 (en) 2021-11-04
IL297156A (en) 2022-12-01
EP4142600A4 (en) 2024-06-05
AU2021262613A1 (en) 2022-11-24
IL297157A (en) 2022-12-01
IL297160A (en) 2022-12-01
US20230181157A1 (en) 2023-06-15
CA3180186A1 (en) 2021-11-04
AU2021265595A1 (en) 2022-11-24
CN115668389A (zh) 2023-01-31
CN115697206A (zh) 2023-02-03
AU2021265597A1 (en) 2022-11-24
JP2023523955A (ja) 2023-06-08
EP4142603A1 (en) 2023-03-08
EP4142601A4 (en) 2024-06-05
CA3180182A1 (en) 2021-11-04
AU2021265350A1 (en) 2022-11-24
US20230165569A1 (en) 2023-06-01
EP4142599A1 (en) 2023-03-08
EP4142602A1 (en) 2023-03-08
IL297158A (en) 2022-12-01
CN115666396A (zh) 2023-01-31
EP4142601A1 (en) 2023-03-08
JP2023523956A (ja) 2023-06-08
EP4142599A4 (en) 2024-06-12
WO2021220264A1 (en) 2021-11-04
CA3180167A1 (en) 2021-11-04
US20230165568A1 (en) 2023-06-01
EP4142602A4 (en) 2024-08-21
IL297159A (en) 2022-12-01
EP4142600A1 (en) 2023-03-08
WO2021220265A1 (en) 2021-11-04
CA3180180A1 (en) 2021-11-04
AU2021262615A1 (en) 2022-11-24
CN115666398A (zh) 2023-01-31
JP2023523317A (ja) 2023-06-02
WO2021220270A1 (en) 2021-11-04

Similar Documents

Publication Publication Date Title
US20230165562A1 (en) System for acquiring ultrasound images of an organ of a human body
EP3288465B1 (en) In-device fusion of optical and inertial positional tracking of ultrasound probes
US10806391B2 (en) Method and system for measuring a volume of an organ of interest
US20200214682A1 (en) Methods and apparatuses for tele-medicine
KR20140062252A (ko) Method for generating three-dimensional ultrasound images using a smartphone
EP3785627A1 (en) A device and a method for dynamic posturography

Legal Events

Date Code Title Description
AS Assignment

Owner name: PULSENMORE LTD, ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SONNENSCHEIN, ELAZAR;ALBECK, YEHUDA;ELIA, PAZ;AND OTHERS;REEL/FRAME:061516/0597

Effective date: 20211005

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED