WO2021115905A1 - Intuitive control interface for a robotic tee probe using a hybrid imaging-elastography controller - Google Patents


Info

Publication number
WO2021115905A1
WO2021115905A1 (PCT/EP2020/084392)
Authority
WO
WIPO (PCT)
Prior art keywords
probe
motion
image
state
position control
Application number
PCT/EP2020/084392
Other languages
French (fr)
Inventor
Paul Thienphrapa
Sean Joseph KYNE
Molly Lara FLEXMAN
Ameet Kumar Jain
Sibo Li
Kunal VAIDYA
Marcin Arkadiusz Balicki
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V.
Publication of WO2021115905A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0883 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
    • A61B 8/0891 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography
    • A61B 8/145 Echo-tomography characterised by scanning multiple planes
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4272 Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue
    • A61B 8/429 Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue characterised by determining or monitoring the contact between the transducer and the tissue
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/445 Details of catheter construction
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/466 Displaying means of special interest adapted to display 3D data
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B 8/48 Diagnostic techniques
    • A61B 8/485 Diagnostic techniques involving measuring strain or elastic properties
    • A61B 8/54 Control of the diagnostic device

Abstract

The following relates generally to systems and methods of transesophageal echocardiography (TEE) automation. Some aspects relate to a TEE probe with ultrasonic transducers on a distal end of the TEE probe. In other aspects, the described techniques are employed in intracardiac echo (ICE) probes, intravascular ultrasound (IVUS) probes, or the like. In some implementations, probe behavior states are identified based on probe gear motion, image motion, and measured tissue elasticity. Identified behavior states are displayed on a user interface, and probe control is performed based on the current behavior state.

Description

INTUITIVE CONTROL INTERFACE FOR A ROBOTIC TEE PROBE USING A HYBRID IMAGING-ELASTOGRAPHY CONTROLLER
FIELD
[0001] The following relates generally to systems and methods of transesophageal echocardiography (TEE).
BACKGROUND
[0002] Transesophageal echocardiography (TEE) is an approach for cardiac ultrasound imaging in which the ultrasound (US) probe includes a flexible tube-like cable with the ultrasound transducer located at its distal tip. The TEE probe is inserted into the esophagus to place it close to the heart. Existing TEE probes typically include mechanical joints that, along with controlled insertion distance and angulation of the TEE probe and electronic beam steering of the ultrasound imaging plane, provide substantial flexibility in positioning the ultrasound transducer and the imaging plane so as to acquire a desired view of the heart. However, concerns include a risk of perforating the esophagus, and difficulty in manipulating the many degrees of control with unintuitive visual feedback to achieve a desired clinical view.
[0003] TEE is often used as a visualization tool for performing catheter-based cardiac interventions. In such tasks, standard views are usually obtained, so that the TEE image has a general pattern that is familiar to the operator, and thus to the interventionist controlling the catheter-based devices. As the cardiac intervention proceeds, the operator often wants to move between different standard views that provide different perspectives on the heart and catheter. Each movement of the TEE probe to a different view takes substantial time, and has the potential to cause injury to the esophagus. Moreover, the closer the actual TEE probe position is to the standard view, the closer the US image will be to the general pattern for that view that the operator expects to see.
[0004] The following discloses certain improvements.
SUMMARY
[0005] In one disclosed aspect, an ultrasound (US) system comprises: a probe including a tube, ultrasonic transducers on a distal end of the tube, and mechanical joints; a controller comprising at least one electronic processor and at least one memory storing computer program code; the at least one memory and the computer program code configured to, with the at least one electronic processor, cause the probe to steer to a target by an iterative process. Each iteration includes: acquiring, via the ultrasonic transducers, a US image; calculating measurement data based on information received from the probe, including the US image; based on the measurement data, identifying at least one current behavior state of the probe; and displaying on a user interface the US image, a current position of the distal end of the probe in an anatomical body, and an indicator of the at least one current behavior state of the probe.
[0006] In some embodiments, the measurement data comprises at least one of: probe position data derived from probe kinematic information and US image data acquired by the ultrasound transducers; and force on the probe derived from probe motor current information and tissue elastography information derived from acquired US image data.
[0007] The indicator of the at least one current behavior state of the probe may be an indicator of one or more of: a free motion state; a backlash state; a hazard state; an error state; a drifting state; and a steady state.
[0008] In some embodiments, the at least one electronic processor is configured to execute the computer-readable instructions to cause the US user interface to display a free motion state indicator upon a determination that: position control elements within the probe are in motion; tissue elastography information indicates tissue stiffness at or below a predetermined stiffness threshold; and the US image moves in concert with the distal end of the probe.
[0009] According to other aspects, the at least one electronic processor is configured to execute the computer-readable instructions to cause the US user interface to display a backlash state indicator upon a determination that: position control elements within the probe are in motion; tissue elastography information indicates no tissue stiffness; and the US image is stationary.
[0010] In other implementations, the at least one electronic processor is configured to execute the computer-readable instructions to cause the US user interface to display a hazard state indicator upon a determination that: position control elements within the probe are in motion; tissue elastography information indicates tissue stiffness above a predetermined stiffness threshold; and the US image moves inconsistently relative to the distal end of the probe.
[0011] In some embodiments, the at least one electronic processor is configured to execute the computer-readable instructions to cause the US user interface to display an error state indicator upon a determination that: position control elements within the probe are in motion; and the probe is motionless.
[0012] In some implementations, the at least one electronic processor is configured to execute the computer-readable instructions to cause the US user interface to display a drifting state indicator upon a determination that: position control elements within the probe are motionless; tissue elastography information indicates tissue stiffness at or below a predetermined stiffness threshold; and the US image is in motion.
[0013] According to some aspects, the at least one electronic processor is configured to execute the computer-readable instructions to cause the US user interface to display a steady state indicator upon a determination that: position control elements within the probe are motionless; tissue elastography information indicates tissue stiffness is unchanging; and the US image is motionless.
[0014] The at least one electronic processor is also configured to execute the computer-readable instructions to cause the user interface to display instructions for controlling motion of the distal end of the probe based on the identified behavior state.
[0015] According to another aspect, a method for controlling an ultrasonic probe, comprises: acquiring an ultrasound image via a probe device comprising ultrasonic transducers; and determining a behavior state of the probe based on a probe position control element motion criterion, an image motion criterion, and a tissue stiffness criterion. The method further comprises displaying on a user interface: the US image; a current position of a distal end of the probe in an anatomical body; an indicator of the at least one current behavior state of the probe; and instructions for controlling the probe based on the displayed behavior state indicator.
[0016] In some embodiments, the indicator of the at least one current behavior state of the probe is an indicator of one or more of: a free motion state; a backlash state; a hazard state; an error state; a drifting state; and a steady state.
[0017] In some implementations, the method comprises displaying the free motion state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image moves in concert with the distal end of the probe.
[0018] In some implementations, the method comprises displaying the backlash state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is stationary.
[0019] In some embodiments, the method comprises displaying the hazard state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness above a predetermined stiffness threshold, and the US image motion criterion indicates the image is moving inconsistently.
[0020] In some aspects, the method comprises displaying the error state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, and the US image motion criterion is indicative of a motionless probe.
[0021] According to some implementations, the method comprises displaying the drifting state indicator when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is in motion.
[0022] In some embodiments, the method comprises displaying the steady state indicator when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness is unchanging, and the US image motion criterion indicates the image is stationary.
[0023] According to another aspect, an ultrasound (US) controller comprises: at least one electronic processor and at least one memory storing computer program code; the at least one memory and the computer program code configured to, with the at least one electronic processor, cause the probe to steer to a target by an iterative process in which each iteration includes: acquiring a US image via a probe device comprising ultrasonic transducers; determining a behavior state of the probe based on a probe position control element motion criterion, an image motion criterion, and a tissue stiffness criterion; displaying on a user interface: the US image; a current position of a distal end of the probe in an anatomical body; an indicator of the at least one current behavior state of the probe; and instructions for controlling the distal end of the probe based on the displayed behavior state indicator.
[0024] The indicator of the at least one current behavior state of the probe may be an indicator of one or more of: a free motion state when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image moves in concert with the distal end of the probe; a backlash state when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is stationary; a hazard state when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness above a predetermined stiffness threshold, and the US image motion criterion indicates the image is moving inconsistently; an error state when the probe position control element motion criterion indicates position control elements within the probe are in motion, and the US image motion criterion is indicative of a motionless probe; a drifting state when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is in motion; and a steady state when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness is unchanging, and the US image motion criterion indicates the image is stationary.
[0025] One advantage resides in safer use of a probe, such as a transesophageal echocardiography (TEE) probe.
[0026] Another advantage resides in a TEE probe system that is easier to operate.
[0027] A given embodiment may provide none, one, two, more, or all of the foregoing advantages, and/or may provide other advantages as will become apparent to one of ordinary skill in the art upon reading and understanding the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0028] The invention may take form in various components and arrangements of components, and in various steps and arrangements of steps. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention.
[0029] FIGURE 1 illustrates an example of a probe inserted into an esophagus, and further illustrates example preset distal end positions.
[0030] FIGURE 2 illustrates TEE views for a transseptal puncture procedure.
[0031] FIGURE 3 illustrates an example of a TEE probe exhibiting backlash in the TEE probe internal mechanisms wherein there is an area within the probe range of motion where the probe distal end does not move even though the knobs and/or mechanical position control elements are turning.
[0032] FIGURES 4 and 5 illustrate backlash shown from different perspectives.
[0033] FIGURE 6 illustrates a schematic view of a robot behavior controller, showing inputs into the control decision as well as the processing of decisions to guide the robotic probe.
DETAILED DESCRIPTION
[0034] Over the past approximately 15-20 years, many interventional procedures on the heart, including aortic valve repair, mitral valve repair or replacement, foramen ovale closure, and atrial septal defect closure, have migrated from a surgical approach to a transcatheter approach. Transfemoral access is a common technique in which a tiny incision is made near the groin of the patient to serve as an instrument portal into the femoral vein, en route to the heart. In transcatheter interventions, the clinician introduces long, flexible tools into the heart through the vasculature.
[0035] Transcatheter approaches have risen in popularity because, compared to surgery, they impose less trauma on patients and require less postoperative recovery time. At the same time, they are technically challenging procedures to perform due to lack of dexterity, visualization, and tactile feedback. Some of these capabilities are restored through technologies such as transesophageal echocardiography (TEE). In particular, TEE imaging restores visualization lost by minimal access approaches, and to a lesser extent replaces tactile feedback with visual feedback of the tool-to-tissue interactions.
[0036] One problem is that conventional ultrasound images do not offer the rich set of visual cues, including color, lighting, shading, perspective, and texture, found in natural vision and optical camera images; they are abstract representations of surgical scenarios. Additionally, ultrasound images often include noise and unnatural artifacts such as acoustic reflections that require cognitive effort to overcome. These two factors combine to make ultrasound images difficult to interpret. Moreover, TEE images originating from the esophagus behind the heart may be angled from a different point of view than that of the clinician, forcing a disjointed hand-eye coordination that makes acquiring specific views difficult. Still furthermore, clinicians must mentally reconstruct spatial information from inside the heart from multiple cross-sectional views. In other words, one ultrasound image does not contain sufficient, actionable information. While conventional TEE does provide 3D volumetric rendering, its visualization does not allow clinicians to delineate the fine details needed in practice. Clinicians must instead steer beam angles in two dimensions to obtain desired higher resolution image slices.
[0037] Consequently, clinicians prefer higher resolution, orthogonal 2D image slices, and steer the plane angle in two dimensions (e.g., up/down, left/right). The task of adjusting both TEE probe position and image plane angle, amidst the backdrop of challenging image interpretation and disjointed hand-eye coordination, is burdensome, cognitively demanding, and ultimately susceptible to inefficiencies and errors. Clinicians must furthermore balance the above task goals while maintaining adequate transducer coupling, minding the physical constraints imposed by the patient esophagus, and monitoring the safety of moving the probe inside the patient. The sum of these requirements can be overwhelming, making conventional TEE probes unintuitive to control and requiring extensive training and experience.
[0038] The foregoing problems are overcome by the herein-described systems and methods.
[0039] Inset A of FIGURE 1 illustrates probe 105 (which, in some embodiments, is TEE probe 105) including a tube 106 having a distal end 110 which includes ultrasonic transducers 112 (e.g., a phased array of side-emitting ultrasound transducers). In a manual design, the operator may control the movement of the distal end 110 by turning the knobs 170, and by manually extending (i.e., pushing) the tube 106 deeper into the esophagus, retracting (i.e., pulling) the tube 106 out of the esophagus, or rotating the tube while it is in the esophagus. Alternatively, servomotors 120 (controlled, e.g., by electronic controller 130) may be used to perform these operations under computer control. A combination of manual and servomotor control is also contemplated (e.g., servomotor-controlled joints and manual extension/retraction). Other types of control are also contemplated; for example, magnets, memory alloys, hydraulics, air, and so forth may also be used. The operator may use ultrasound images generated by the ultrasound transducers 112 at the distal end 110 as a guide in controlling the movement of the TEE probe.
[0040] Disclosed herein is a control system (e.g., the electronic controller 130 of FIGURE 1) to robotically steer a robotic TEE probe (also referred to herein as a "probe" or a "robot") using servomotors 120. The control system is suited for intuitive operation of a TEE probe, bounded by the physical and safety constraints imposed by the human esophagus. The controller combines measurements from the probe, especially images and elastography, and can process contextual information, interpret image content, and calculate viewpoint transformations in an efficient manner, making view acquisition a more deterministic and reliable process.
[0041] In a variant embodiment in which the TEE probe is controlled by manual knobs 170 and manual tube extension/retraction rather than servomotors 120, the control system may provide human-perceptible directions (e.g., "retract probe", "advance probe", "turn probe to the right", et cetera), displayed on a control computer display (such as the display 140 of FIGURE 1, and/or articulated using a speech synthesizer) for assisting in manual control of the TEE probe. The electronic controller 130 suitably comprises at least one electronic processor that reads and executes instructions (i.e., computer program code) stored in at least one memory (i.e., at least one non-transitory storage medium) in order to perform the disclosed TEE probe control methods. The electronic processor may, for example, comprise a computer having one or more digital or analog ports for operatively communicating with the servomotor(s) 120. For example, the communication can be digital communication via a USB port or other standard digital port, or may be via digital and/or analog control signals generated by a dedicated control I/O card installed in the computer, and/or so forth. In some embodiments, the electronic controller 130 may be integral with the ultrasound imaging controller that controls the ultrasound imaging performed by the TEE probe 105, in which case the display 140 also presents ultrasound images acquired by the TEE probe 105.
[0042] FIGURE 1 thus illustrates an example of the probe 105 inserted into an esophagus 115, and illustrates an example ultrasonic FOV 122. For 2D ultrasound imaging, the FOV 122 is typically a two-dimensional wedge. FIGURE 1 further illustrates example preset distal end positions, which may correspond to standard TEE views such as: upper esophageal position 125; mid esophageal position 132; transgastric position 135; and deep transgastric position 142.
[0043] Generally, in some embodiments, there is a control system to robotically steer the probe and/or distal end, and to automatically steer the image plane angles in service of finding the anatomical and device views required to provide visualization for performing structural heart interventions. The control system can digest contextual information, interpret image content, and calculate viewpoint transformations in an efficient manner, making view acquisition a more deterministic and reliable process compared with unassisted manual manipulation of the TEE probe.
[0044] Put another way, some embodiments involve an improvement in which the manual control of the TEE probe 105 is augmented or replaced by robotic control. Some implementations use a set of rules for iterative robotic control, including, for each iteration: adjusting only the electronic beam steering if the target is in the field of view (FOV); adjusting both the beam steering and the mechanical joints if the target is at the edge of the FOV, biasing toward electronic beam steering; and adjusting only the mechanical joints if the target is not in the FOV. To recognize the target, a database of reference ultrasound images at standard views may be used, or a model of the standard ultrasound view may be used (e.g., a view in which all four chambers of the heart are visible can be modeled using the expected four-compartment image format). In another approach, if the clinician is at a particular view then this image may be stored as a reference image, and if the clinician later wishes to return to that view then the stored reference image is retrieved.
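By way of illustration, the rule set above might be implemented as follows. This Python sketch is not taken from the disclosure: the `Target` type, the normalized image coordinates, and the 10% edge margin are assumptions introduced only to make the three rules concrete.

```python
from dataclasses import dataclass

@dataclass
class Target:
    u: float          # normalized horizontal position in the image, 0..1 (assumed convention)
    v: float          # normalized vertical position in the image, 0..1
    visible: bool     # whether the target was recognized in the current FOV

EDGE_MARGIN = 0.1     # fraction of the FOV treated as "edge of the FOV" (assumed value)

def steering_action(target: Target) -> str:
    """One iteration of the rule set: beam-only, hybrid, or joints-only."""
    if not target.visible:
        return "adjust mechanical joints only"
    near_edge = (min(target.u, 1.0 - target.u) < EDGE_MARGIN or
                 min(target.v, 1.0 - target.v) < EDGE_MARGIN)
    if near_edge:
        # adjust both, biasing toward electronic beam steering
        return "adjust beam steering and joints (bias toward beam)"
    return "adjust electronic beam steering only"
```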
[0045] In addition, some embodiments use Intracardiac Echo (ICE) probes (rather than the above-described TEE probes), which are thin probes that are inserted into the heart. The approaches described herein are able to properly position these probes with respect to the Right Atrium, Left Atrium, etc. In other embodiments, Intravascular Ultrasound (IVUS) probes are used, which are thin probes used in blood vessels. In addition, some implementations include endobronchial US, and some implementations include transrectal US and transurethral US for urology. It should be understood that the techniques described herein extend to any in-body ultrasound imaging in general.
[0046] By way of a non-limiting example, FIGURE 2 illustrates TEE views for providing visualization during a transseptal puncture procedure. Long axis 210 and short axis 220 outline orthogonal image planes containing the needle and the target puncture site in the septum. These views may be difficult to find by manipulation of the TEE probe, and furthermore clinicians may need to move the probe to a different view and then attempt to restore these views afterwards.
[0047] Table 1 (below) illustrates various definitions of robotic TEE behavior states in a structural heart intervention scenario. Behavior states govern how the robot moves (or refuses to move), depending on circumstance. These well-defined states can be communicated to the user to make robot control more intuitive via the user interface 140 (FIGURE 1). There is a direct mapping between internal behavior states and the user interface because these modes derive from circumstances found in structural heart interventions.
Table 1:

Behavior state | Position control elements | Ultrasound image | Measured tissue stiffness | Controller response
Free motion | In motion | Moves in concert with probe (or absent for lack of coupling) | None, or below threshold | Kinematic control
Backlash | In motion | Present but stationary | None, or below threshold | Drive actuators until mechanisms re-engage
Hazard | In motion | Changes subtly; moves inconsistently | Above threshold | Refuse motion toward the hazard; permit motion away
Error | In motion | No measurable response (e.g., frozen) | No change | Detect the error; recover to a known state
Drifting | Motionless | In motion | Changing | Re-engage to a known state via image-based control
Steady | Motionless | Stable | Unchanging | Impedance control to maintain safe, steady imaging
The value of each behavior definition is that it derives from the challenging and diverse circumstances encountered in structural heart interventions, so it can be used both to govern robot behavior and to communicate the state of the robot behavior to the user. This is important because control of a probe is inherently challenging and counterintuitive, requiring extensive training and experience. As such, a valuable component of the described systems and methods is the ability to command the TEE robot to perform clinically relevant motions in an easy-to-use way. Another benefit of the insights embodied in Table 1 is that they extend into a variety of other applications, including vascular interventions and bronchoscopy. The robot states and corresponding behaviors reflect conditions encountered during intervention, as described below.
In one embodiment, the states are determined as a function of a probe position control element motion criterion (e.g., whether the position control elements are in motion or not), an image motion criterion (e.g., whether the acquired US image is in motion, changing, or still), and a tissue stiffness criterion (e.g., whether a measured tissue stiffness is detected or above/below a predetermined stiffness threshold). “Position control elements” as used herein can be, for example: gears and/or joints within the probe body; sensors that detect a shape or position of the probe body, gears, joints, and/or probe tip; software that governs control of the probe body and/or gears or joints therein; an embedded inertial measurement unit within the probe that monitors the shape, position, etc. of the probe body and/or probe tip; or any other suitable means for controlling the probe, determining a shape or position thereof, etc. The probe may further comprise one or more sensors that monitor probe position and/or shape.
An indicator of a current behavior state of the probe is selected, based on the values of the above-mentioned criteria, from among a set of behavior states including at least: a free motion state; a backlash state; a hazard state; an error state; a drifting state; and a steady state. A “current” behavior state denotes a real-time actual state of the probe behavior, as opposed to a past or a future or predicted state. In one embodiment, the current state of the probe is determined continuously during an examination procedure. In another embodiment, the current state of the probe is determined periodically on a predetermined interval, which may be on the order of, e.g., seconds or milliseconds, or some other predetermined interval.
[0048] Free motion: the probe can move freely in any direction because it is not in contact (or is in only mild contact) with tissue, and is in the middle of its mechanical range of motion. This state is detected when the robot position control elements move with no resistance, elastography indicates no measured tissue stiffness or a measured stiffness below a predetermined threshold stiffness, and either no ultrasound image is present due to lack of probe-tissue coupling or the image moves in concert with the probe. In this state, the controller can operate on a kinematic model of the robot. Most robotic probes operate under this model. However, free motion in some directions but not others (partial free motion) is also possible, as a combination of different concurrent states.
[0049] Backlash: position control elements within the articulating robot are moving with little resistance, yet the probe head does not respond. This behavior may be confusing to the operator and is caused by loose mechanisms within the probe. This behavior state is detected when position control elements are turning, the ultrasound image is present but stationary, and no stiffness is measured or the measured stiffness is below a predetermined threshold stiffness. In this case, the controller enters a special control mode that runs actuators until mechanisms re-engage, while carefully monitoring for such re-engagement. Backlash behavior is further illustrated in FIGURES 3-5, below.
[0050] Hazard: In this behavior state, the probe is applying excessive force on the esophagus (i.e., an amount of force above a predetermined force threshold). This behavior state is detected when the position control elements are turning, an image is present but changes subtly, and high stiffness is measured (i.e., an amount of stiffness above the predetermined stiffness threshold). In this case, the controller will detect that the robot refuses to move in the direction of the hazard, but easily moves away from it.
[0051] Error: In this behavior state, the robot is trying to move the probe, but there is no measurable response. This state may be caused by system errors such as loose position control elements, frozen ultrasound image, bad measurements, etc. The Error behavior state is detected when position control elements are turning but other measurements do not change in response thereto. In this case, the controller detects the error and recovers into a known state (e.g., free motion, etc.) if possible.
[0052] Drifting: In this behavior state, the probe is moving even though the robot model is stationary, indicating external forces that make controlling the probe difficult. This state is detected when the ultrasound image changes significantly, and/or the stiffness measurement changes, even though robot control is stationary. In this case, the controller re-engages the robot into a known state using image-based control, as the robot may be in backlash.
[0053] Steady: In this behavior state, the robot is maintaining steady contact with tissue. This state is detected when the image is stable and measured stiffness is stable. In this case, the controller enters an impedance control mode that moves the probe as necessary to maintain safe and steady imaging of an anatomical region.
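The six detections above amount to a decision function over the three criteria of Table 1. The following Python sketch is illustrative only; the boolean inputs and the default threshold are assumed abstractions of the measurements described, and the disclosure does not specify how each criterion is computed.

```python
from enum import Enum, auto

class ProbeState(Enum):
    FREE_MOTION = auto()
    BACKLASH = auto()
    HAZARD = auto()
    ERROR = auto()
    DRIFTING = auto()
    STEADY = auto()

def classify_state(elements_moving: bool, image_present: bool,
                   image_moving: bool, image_frozen: bool,
                   stiffness: float, stiffness_changing: bool,
                   stiffness_threshold: float = 1.0) -> ProbeState:
    """Map the position-control-motion, image-motion, and tissue-stiffness
    criteria onto one of the six behavior states of Table 1.
    `stiffness_threshold` is a placeholder for the disclosure's
    'predetermined stiffness threshold'."""
    if elements_moving:
        if image_frozen:
            return ProbeState.ERROR        # commands produce no measurable response
        if stiffness > stiffness_threshold:
            return ProbeState.HAZARD       # probe pressing hard against tissue
        if image_present and not image_moving:
            return ProbeState.BACKLASH     # knobs turn but the probe head does not
        return ProbeState.FREE_MOTION      # image absent or moving in concert
    if image_moving or stiffness_changing:
        return ProbeState.DRIFTING         # external forces move a "stationary" probe
    return ProbeState.STEADY               # stable image and stable stiffness
```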
[0054] FIGURE 3 illustrates an example of a TEE probe 105 exhibiting backlash in the TEE probe internal mechanisms, wherein there is an area within the probe range of motion where the probe distal end 110 does not move even though the knobs and/or mechanical position control elements are turning. The backlash arc region 300 is also shown. When the probe bend is within this arc range, the probe will continue to bend in one direction, but it will not immediately travel in the opposite direction when the direction of the control knobs changes. In other words, within the arc range, the probe head is insensitive to changes in direction and responds to them only after significant latency.
[0055] FIGURES 4 and 5 illustrate backlash shown from different perspectives. In FIGURE 4, the measurement is the electrical current used to drive position control elements (e.g., motors, gears, sensors, etc.) that contribute to moving the probe. In the backlash region 400, current stays constant even though the knob is turning, meaning the knob moves freely without moving the probe itself. In FIGURE 5, the measurement is probe position (in units of image pixels, making this a relative measurement). There are two curves showing how the probe position changes with respect to knob position: a first curve 500 corresponding to knob movement in a first direction, and a second curve 502 corresponding to knob movement in a second direction, which may be opposite the first direction. In the backlash range (roughly the center of the motion range), the flat area means the probe does not move even as the knob turns. In addition, the two paths taken, which depend on the direction of motion, indicate hysteresis. Hysteresis makes flexible robots difficult to control and is partly a byproduct of backlash.
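The flat region and hysteresis loop of FIGURES 4 and 5 are characteristic of mechanical play. A standard play (backlash) model reproduces both effects; this is a generic textbook model, not taken from the disclosure, and the `play` width is an assumed parameter.

```python
def backlash_play(knob_positions, play=10.0):
    """Classic mechanical-play model: the probe output follows the knob only
    once the knob has crossed the free play; reversing direction first
    re-traverses the dead zone, producing the flat region and the two
    hysteresis branches seen in FIGURES 4-5."""
    probe, out, half = 0.0, [], play / 2
    for k in knob_positions:
        if k - probe > half:        # knob leads the probe beyond the play
            probe = k - half
        elif probe - k > half:      # knob lags the probe beyond the play
            probe = k + half
        # otherwise the knob moves inside the dead zone: probe holds still
        out.append(probe)
    return out

# Sweeping the knob forward then backward traces the two branches of FIGURE 5:
sweep = [float(k) for k in list(range(0, 41)) + list(range(40, -1, -1))]
trace = backlash_play(sweep)
```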
[0056] FIGURE 6 illustrates a schematic view 600 of a robot behavior controller, showing inputs 601 into the control decision as well as the processing of decisions to guide the robotic probe 105. In one embodiment, the multi-behavioral TEE probe controller follows the strategy 600 depicted in FIGURE 6. The controller 130 of FIGURE 1 is triggered by setting a desired target view, which may be specified either directly as a set of robotic joint positions and plane angles, or visually as desired target images that are then internally converted to corresponding joint positions. The robot's mechanical joints then move from their initial configuration to converge to the target view. The controller nudges the probe in the necessary directions via mechanical joint adjustment and repeatedly checks whether the target view has been reached. During each iterative check, the controller decides how the robot (i.e., the mechanical joints of the robot) should behave in the next motion increment based on its internal state, which is in turn based on measurements including the state of the ultrasound image and the stiffness measured via elastography or other means. This process repeats until the target views are reached. If targets are not attainable, the controller halts, indicates an error, and if possible suggests resetting the probe position to a known state.
[0057] In each incremental iteration of robot control, the driven behavior, labeled "robot control" 602 in FIGURE 6, is a hybrid response that is a combination of one or more different control schemes, shown in the stack of four boxes on the left-hand side of FIGURE 6. These control schemes are described as follows. Image-based control 604, whereby the robotic probe is driven based on detected image features derived from real-time images captured by the probe; this mode is used to put a desired anatomical or device target in the center of the view. Kinematic control 606, whereby the robotic probe is driven by the kinematics of its intrinsic structure, regardless of external environment interactions. Predictive control 608, whereby probe position is adjusted proactively based on cardiac measurements to better follow the anatomy and to maintain imaging and strain targets. Impedance control 610, whereby the robotic probe maintains contact with tissue with a constant force, regardless of the position it has to be in to keep the contact force steady.
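A minimal sketch of the iterative loop of FIGURE 6 follows. The `robot` object and all of its methods are hypothetical stand-ins for the measurement and actuation interfaces described above, not an API defined by the disclosure.

```python
def steer_to_target(robot, target_view, max_iters=500):
    """Nudge the probe toward a target view, re-deciding the behavior at
    each increment from the current measurements (FIGURE 6 strategy)."""
    for _ in range(max_iters):
        image = robot.acquire_image()
        stiffness = robot.measure_stiffness()           # elastography or other means
        state = robot.classify_state(image, stiffness)  # e.g., the Table 1 logic
        robot.display_state(state)                      # keep the user interface current
        if robot.view_reached(image, target_view):
            return True
        # hybrid response: combine the four control schemes according to state
        increment = (robot.image_based_term(image, target_view) +
                     robot.kinematic_term(target_view) +
                     robot.predictive_term() +
                     robot.impedance_term(stiffness))
        robot.move_joints(increment)
    robot.report_error("target view not attainable; consider resetting the probe")
    return False
```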
[0058] The advantage that the foregoing control strategy has with robotic TEE is the ability to redundantly measure position using kinematics and images, and also to measure force using motor current and elastography, in order to manage the various control modes required in structural heart interventions. Force can be determined from ultrasound images using elastography techniques. For example, a material with known stiffness can be employed in front of the transducer, and its displacement can be measured during loading. Another technique involves tracking the strain of the tissues in ultrasound images and estimating the stress required to generate such strain, which is integrated over the whole image/volume. Baseline tissue stiffness can be measured using shear wave elastography, in which the ultrasound transducer generates a traveling shear wave stimulus that compresses tissue, and the tissue behavior is imaged to assess the strain. Strain is related to stress (force per unit area) using Hooke's law.
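As a worked illustration of the Hooke's-law relationship mentioned above (the numbers are assumed example values; the disclosure integrates strain over the whole image or volume rather than using a single scalar):

```python
def contact_force_from_strain(strain: float,
                              youngs_modulus_pa: float,
                              contact_area_m2: float) -> float:
    """Hooke's law: stress = E * strain; force = stress * area."""
    return youngs_modulus_pa * strain * contact_area_m2

# Example: 2% strain in tissue with an assumed modulus of 30 kPa, over a
# 1 cm^2 transducer footprint, implies roughly 0.06 N of contact force.
force_n = contact_force_from_strain(strain=0.02,
                                    youngs_modulus_pa=30e3,
                                    contact_area_m2=1e-4)
```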
[0059] In some embodiments, the system provides user interfaces 140 for communicating robot behavior. A TEE probe can be difficult to control whether manually or robotically, because hand-eye coordination is disjointed and because knob-probe connection is disjointed. The former is a result of ultrasound images that are noisy, low resolution, low quality, have a small field of view, or are cross-sectional or disoriented. The latter is a result of imperfections in the mechanisms and backlash. While clinicians are ultimately able to operate TEE probes, they typically only succeed after much intraoperative trial and error of turning the control knobs and finding views. This reduced level of efficiency can be addressed with intuitive interfaces for operation of the probe, such as are described herein. These interfaces can also allow clinicians to understand any limitations in probe motion and other behaviors.
[0060] According to one embodiment, an intuitive interface for operating and visualizing a TEE probe is provided through 3D augmented reality technology and/or a hologram. The 3D spatial rendering makes it intuitive for the clinician to understand the state and position of the probe relative to the heart, making the probe easier to control. The spectrum of rendering sophistication can range from realistic down to abstract. A hybrid of realism and abstraction of the situation can be most intuitive as it reflects reality while emphasizing that it is a virtual representation. As an alternative to a 3D holographic rendering, the models can be shown through traditional 2D displays as well.
[0061] Augmented reality facilitates the ability to overlay visual information about the probe state and behavior. For example, when the probe force against the esophagus is excessive, both the probe and the heart models can be shaded red (or some other suitable color), allowing the clinician to understand easily that the probe cannot move further into tissue. Another example is when the probe is maintaining constant contact with tissue, which can be indicated via a light that pulsates, for instance, with a representation of the ultrasound image emanating from the probe.
[0062] Other more explicit overlays are contemplated, such as signs, arrows, text, and/or symbols that indicate the motion of the probe, its behavior, its state, and what states it can take on next. Such overlays can further suggest how to move the robotic probe to achieve next desired states or to avoid undesirable states.
[0063] In some embodiments, the user interface facilitates updating robot behavior. For instance, to improve versatility in accommodating different patients and unexpected situations that may arise during a procedure, the system can provide a means of altering robot behavior at any given point of time. The system considers user input as well as current robot states in order to interpret how the behavior will change. For example, if the probe is in steady mode and is applying light pressure to maintain coupling with the tissue, the user will then be able to command the robot to increase the pressure in that mode. This input can be in the form of, e.g., a context dependent button with corresponding label, a manual gesture that is recorded via a camera, a voice command, etc. If instead the user commands the probe to move away from the tissue, the control system can interpret this as changing the state to free motion.
[0064] Described another way, maintaining the state of the robot as well as tracking state transitions (described below) allows the system to understand commands more easily, since such commands depend on context, which includes the state the robot is in as well as what it is doing within the patient's body. This maintenance of context or state machine simplifies the user interface for the clinician as well, which is valuable in a mission critical setting and for complex procedures to reduce cognitive strain.
[0065] With regard to state transitions, it is important for the controller to govern robot behavior by keeping track of its state, which is a combination of intrinsic (e.g., robot kinematics) and extrinsic (e.g., clinically relevant actions) states. A critical aspect of a state machine is to detect state transitions. Transitions from one state to another, for example from a state of backlash to free motion, can be detected, measured, or inferred by changes in various measurements, or some combination thereof. In the backlash state, the control system commands the knobs to turn, expecting no change in the position of the probe. This manifests as no change in measured stiffness/strain/force and no change in any imaging as well. As the probe leaves the backlash state, some or all of these measures do start to change: any detected images will show displacement, or the forces the probe exerts will increase. This indicates to the control system that the probe has left the backlash state and entered free motion. When force then increases, the system knows that the probe is leaving free motion and entering a contact state. Detecting these state transitions allows the controller to govern robotic probe behavior appropriately and, at the same time, indicate the probe behavior mode to the user correctly on the user interface. The changes in state can be highlighted graphically, audibly, etc., so that the user understands how the probe will behave from that point onward.
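The transition rules in this paragraph can be expressed compactly. The function below reuses the hypothetical `ProbeState` enum from the earlier sketch, and the boolean inputs are assumed abstractions of the measurement changes described; mapping "contact state" to the steady state is an interpretation.

```python
def detect_transition(state: "ProbeState",
                      image_displacing: bool,
                      force_increasing: bool) -> "ProbeState":
    """Infer a state transition from changes in the measurements."""
    if state is ProbeState.BACKLASH and (image_displacing or force_increasing):
        return ProbeState.FREE_MOTION   # mechanisms have re-engaged
    if state is ProbeState.FREE_MOTION and force_increasing:
        return ProbeState.STEADY        # probe has come into tissue contact
    return state                        # no transition detected
```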
[0066] With regard to concurrent states, the state table (Table 1) captures a breakdown of behavior states of a robotic TEE probe for use in structural heart interventions. The table in practice need not be limited to these exact states. In an example scenario, the probe can simultaneously exist in two states. When the robotic probe is maintaining contact with tissue, it is in steady state going in one direction, but in free motion in the opposite direction. Moreover, if the probe does indeed change directions in this scenario, it will behaviorally transit the backlash state first, prior to free motion. This can be viewed either as a concurrency of basic states, or it can be described equivalently as a unique state that is not explicitly called out in Table 1.
[0067] As an alternative to analytical, explicit control over the probe behavior, these parameters can be determined empirically based on a database of stored desirable behaviors for data driven control.
[0068] With regard to state prediction, given a sufficient representation of robot behavior, be it from a model, compiled and analyzed data, another source, or some hybrid thereof, the most common state transitions can be identified. As a result, when the robotic probe is in one state, the system can predict how long it will be in that state as well as what the next state is expected to be. For state transitions that have multiple possible paths, the actual path taken can be a signal about what is happening clinically in the procedure as well as what the next state transitions will be. For state transitions that are unexpected or for which there is no precedent in past data, the system may check with the user whether an error has taken place.
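A minimal data-driven sketch of such state prediction follows, using a transition-frequency table as a stand-in for whatever model or compiled data the system actually uses; all names are hypothetical.

```python
from collections import Counter, defaultdict

class StatePredictor:
    """Empirical next-state prediction from logged transitions."""

    def __init__(self):
        self.counts = defaultdict(Counter)  # prev_state -> Counter of next states

    def record(self, prev_state, next_state):
        """Log one observed transition."""
        self.counts[prev_state][next_state] += 1

    def predict(self, state):
        """Return the most frequent next state, or None if there is no
        precedent (a cue to check with the user whether an error occurred)."""
        seen = self.counts[state]
        if not seen:
            return None
        return seen.most_common(1)[0][0]
```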
[0069] Some embodiments also involve a user interface. Specifically, the system can direct the user to perform the required motions explicitly, or the system can execute the required actions automatically. Semi-automation of view finding is applicable as well.
[0070] Some embodiments use an intra-procedural update of views. Specifically, to handle the problem of changing anatomical or interventional conditions, which prevent precise replication of views, desired views can be updated using the closest attainable views.
[0071] It will be further appreciated that the techniques disclosed herein may be embodied by a non-transitory storage medium (i.e., at least one memory) storing instructions readable and executable by an electronic data processing device (e.g., the controller 130 of FIGURE 1) to perform the disclosed techniques. Such a non-transitory storage medium may comprise a hard drive or other magnetic storage medium, an optical disk or other optical storage medium, a cloud-based storage medium such as a RAID disk array, flash memory or other non-volatile electronic storage medium, or so forth.
[0072] The invention has been described with reference to the preferred embodiments. Modifications and alterations may occur to others upon reading and understanding the preceding detailed description. It is intended that the exemplary embodiment be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims

CLAIMS:
1. An ultrasound (US) system, comprising: a probe including a tube (106), ultrasonic transducers (112) on a distal end (110) of the tube (106), and mechanical joints; a controller comprising at least one electronic processor (130) and at least one memory storing computer program code; the at least one memory and the computer program code configured to, with the at least one electronic processor (130), cause the probe to steer to a target by an iterative process in which each iteration includes: acquiring, via the ultrasonic transducers, a US image; calculating measurement data based on information received from the probe, including the US image; based on the measurement data, identifying at least one current behavior state of the probe; and displaying on a user interface (140) the US image, a current position of the distal end of the probe in an anatomical body, and an indicator of the at least one current behavior state of the probe.
2. The US system of claim 1, wherein the measurement data comprises at least one of: probe position data derived from probe kinematic information and US image data acquired by the ultrasound transducers; and force on the probe derived from at least one of probe motor current information and tissue elastography information derived from acquired US image data.
3. The US system of any of claims 1-2, wherein the indicator of the at least one current behavior state of the probe is an indicator of one or more of: a free motion state; a backlash state; a hazard state; an error state; a drifting state; and a steady state.
4. The US system of claim 3, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the US user interface (120) to display a free motion state indicator upon a determination that: position control elements within the probe are in motion; tissue elastography information indicates tissue stiffness at or below a predetermined stiffness threshold; and the US image moves in concert with the distal end of the probe.
5. The US system of any of claims 3-4, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the US user interface (120) to display a backlash state indicator upon a determination that: position control elements within the probe are in motion; tissue elastography information indicates no tissue stiffness; and the US image is stationary.
6. The US system of any of claims 3-5, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the US user interface (120) to display a hazard state indicator upon a determination that: position control elements within the probe are in motion; tissue elastography information indicates tissue stiffness above a predetermined stiffness threshold; and the US image moves inconsistently relative to the distal end of the probe.
7. The US system of any of claims 3-6, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the US user interface (120) to display an error state indicator upon a determination that: position control elements within the probe are in motion; and the probe is motionless.
8. The US system of any of claims 3-7, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the US user interface (120) to display a drifting state indicator upon a determination that: position control elements within the probe are motionless; tissue elastography information indicates tissue stiffness at or below a predetermined stiffness threshold; and the US image is in motion.
9. The US system of any of claims 3-8, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the US user interface (120) to display a steady state indicator upon a determination that: position control elements within the probe are motionless; tissue elastography information indicates tissue stiffness is unchanging; and the US image is motionless.
10. The US system of any of claims 1-9, wherein the at least one electronic processor (130) is configured to execute the computer-readable instructions to cause the user interface (140) to display instructions for controlling motion of the distal end (110) of the probe (105) based on the identified behavior state.
11. A method for controlling an ultrasonic probe, comprising: acquiring an ultrasound image via a probe device comprising ultrasonic transducers
(112); determining a behavior state of the probe based on a probe position control element motion criterion, an image motion criterion, and a tissue stiffness criterion; displaying on a user interface (140): the US image; a current position of a distal end (110) of the probe (105) in an anatomical body; an indicator of the at least one current behavior state of the probe; and instructions for controlling the probe based on the displayed behavior state indicator.
12. The method of claim 11 , wherein the indicator of the at least one current behavior state of the probe is an indicator of one or more of: a free motion state; a backlash state; a hazard state; an error state; a drifting state; and a steady state.
13. The method of claim 12, further comprising displaying the free motion state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image moves in concert with the distal end of the probe.
14. The method of any one of claims 12-13, further comprising displaying the backlash state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is stationary.
15. The method of any one of claims 12-14, further comprising displaying the hazard state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness above a predetermined stiffness threshold, and the US image motion criterion indicates the image is moving inconsistently.
16. The method of any one of claims 12-15, further comprising displaying the error state indicator when the probe position control element motion criterion indicates position control elements within the probe are in motion, and the US image motion criterion is indicative of a motionless probe.
17. The method of any one of claims 12-16, further comprising displaying the drifting state indicator when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is in motion.
18. The method of any one of claims 12-17, further comprising displaying the steady state indicator when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness is unchanging, and the US image motion criterion indicates the image is stationary.
19. An ultrasound (US) controller, comprising: at least one electronic processor (130) and at least one memory storing computer program code; the at least one memory and the computer program code configured to, with the at least one electronic processor (130), cause the probe to steer to a target by an iterative process in which each iteration includes: acquiring a US image via a probe device (105) comprising ultrasonic transducers (112); determining a behavior state of the probe based on a probe position control element motion criterion, an image motion criterion, and a tissue stiffness criterion; displaying on a user interface (140): the US image; a current position of a distal end (110) of the probe (105) in an anatomical body; an indicator of the at least one current behavior state of the probe; and instructions for controlling the distal end of the probe based on the displayed behavior state indicator.
20. The controller of claim 19, wherein the indicator of the at least one current behavior state of the probe is an indicator of one or more of: a free motion state when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image moves in concert with the distal end of the probe; a backlash state when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is stationary; a hazard state when the probe position control element motion criterion indicates position control elements within the probe are in motion, the tissue stiffness criterion indicates tissue stiffness above a predetermined stiffness threshold, and the US image motion criterion indicates the image is moving inconsistently; an error state when the probe position control element motion criterion indicates position control elements within the probe are in motion, and the US image motion criterion is indicative of a motionless probe; a drifting state when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness at or below a predetermined stiffness threshold, and the US image motion criterion indicates the image is in motion; and a steady state when the probe position control element motion criterion indicates position control elements within the probe are not in motion, the tissue stiffness criterion indicates tissue stiffness is unchanging, and the US image motion criterion indicates the image is stationary.
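
Taken together, claims 12-18 (and consolidated claim 20) specify a small decision table: three inputs (whether the position control elements are moving, whether tissue stiffness is at or below the threshold or is changing, and how the US image is moving) map to one of six behavior states. The following is a minimal Python sketch of that table, offered only as an illustration: every identifier is hypothetical, the threshold value is arbitrary, and combinations the claims do not recite are flagged rather than guessed.

    from enum import Enum, auto

    class BehaviorState(Enum):
        FREE_MOTION = auto()
        BACKLASH = auto()
        HAZARD = auto()
        ERROR = auto()
        DRIFTING = auto()
        STEADY = auto()
        UNCLASSIFIED = auto()  # catch-all for this sketch; not a claimed state

    class ImageMotion(Enum):
        STATIONARY = auto()    # US image is motionless
        WITH_PROBE = auto()    # image moves in concert with the distal end
        INCONSISTENT = auto()  # image moves inconsistently with the controls

    def classify(controls_moving, stiffness, stiffness_changing, image_motion,
                 stiffness_threshold=1.0):
        """Decision table of claims 12-18/20; the threshold is illustrative."""
        soft = stiffness <= stiffness_threshold
        if controls_moving:
            if image_motion is ImageMotion.STATIONARY:
                # Claims 14 and 16 overlap here; this sketch reads backlash as
                # the soft-tissue case and error as the remainder, which is an
                # interpretation rather than something the claims state.
                return BehaviorState.BACKLASH if soft else BehaviorState.ERROR
            if soft and image_motion is ImageMotion.WITH_PROBE:
                return BehaviorState.FREE_MOTION          # claim 13
            if not soft and image_motion is ImageMotion.INCONSISTENT:
                return BehaviorState.HAZARD               # claim 15
            return BehaviorState.UNCLASSIFIED
        if soft and image_motion is not ImageMotion.STATIONARY:
            return BehaviorState.DRIFTING                 # claim 17
        if not stiffness_changing and image_motion is ImageMotion.STATIONARY:
            return BehaviorState.STEADY                   # claim 18
        return BehaviorState.UNCLASSIFIED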
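
Claims 11 and 19 wrap that classification in an iterative acquire-classify-display process. One possible shape for that loop is sketched below; probe and ui are hypothetical interfaces (none of these method names come from the application), and classify() is the sketch above.

    import time

    def steering_loop(probe, ui, target, stiffness_threshold=1.0, period_s=0.05):
        # Iterate until the distal end reaches the target, per claim 19.
        while not probe.at_target(target):
            image = probe.acquire_image()  # US image via the transducers
            state = classify(
                controls_moving=probe.controls_moving(),
                stiffness=probe.tissue_stiffness(image),
                stiffness_changing=probe.stiffness_changing(image),
                image_motion=probe.image_motion(image),
                stiffness_threshold=stiffness_threshold,
            )
            ui.show(
                image=image,
                position=probe.distal_tip_position(),  # distal end position
                state=state,
                instructions=ui.guidance_for(state),   # per-state guidance
            )
            time.sleep(period_s)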
PCT/EP2020/084392 2019-12-12 2020-12-03 Intuitive control interface for a robotic tee probe using a hybrid imaging-elastography controller WO2021115905A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962947177P 2019-12-12 2019-12-12
US62/947,177 2019-12-12

Publications (1)

Publication Number Publication Date
WO2021115905A1 true WO2021115905A1 (en) 2021-06-17

Family

ID=73726810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/084392 WO2021115905A1 (en) 2019-12-12 2020-12-03 Intuitive control interface for a robotic tee probe using a hybrid imaging-elastography controller

Country Status (1)

Country Link
WO (1) WO2021115905A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120083692A1 (en) * 2010-09-30 2012-04-05 Siemens Medical Solutions Usa, Inc. Pressure control in medical diagnostic ultrasound imaging
US20160228203A1 (en) * 2013-10-24 2016-08-11 Olympus Corporation Medical manipulator and initialization method for medical manipulator
US20170007202A1 (en) * 2014-03-12 2017-01-12 Koninklijke Philips N.V. System and method of haptic feedback for transesophageal echocardiogram ultrasound transducer probe
WO2019154943A1 (en) * 2018-02-08 2019-08-15 Koninklijke Philips N.V. Devices, systems, and methods for transesophageal echocardiography
EP3569154A1 (en) * 2018-05-15 2019-11-20 Koninklijke Philips N.V. Ultrasound processing unit and method, and imaging system

Similar Documents

Publication Publication Date Title
US20220142712A1 (en) Training data collection for machine learning models
US9333044B2 (en) System and method for detection and avoidance of collisions of robotically-controlled medical devices
US20060200026A1 (en) Robotic catheter system
Tully et al. Shape estimation for image-guided surgery with a highly articulated snake robot
EP3414737A1 (en) Autonomic system for determining critical points during laparoscopic surgery
CN114126527A (en) Composite medical imaging system and method
WO2021115905A1 (en) Intuitive control interface for a robotic tee probe using a hybrid imaging-elastography controller
Marahrens et al. Towards autonomous robotic minimally invasive ultrasound scanning and vessel reconstruction on non-planar surfaces
CN111491567A (en) System and method for guiding an ultrasound probe
US20230012353A1 (en) Hybrid robotic-image plane control of a tee probe
US20220409292A1 (en) Systems and methods for guiding an ultrasound probe
US20230010773A1 (en) Systems and methods for guiding an ultrasound probe
US11903656B2 (en) Automatic control and enhancement of 4D ultrasound images
JP7099901B2 (en) Ultrasound image processing equipment and programs
JP7421548B2 (en) Diagnostic support device and diagnostic support system
US20210251602A1 (en) System, device and method for constraining sensor tracking estimates in interventional acoustic imaging
CN114027975A (en) CT three-dimensional visualization system of puncture surgical robot
WO2023192395A1 (en) Registration of medical robot and/or image data for robotic catheters and other uses
CN117355862A (en) Systems, methods, and media including instructions for connecting model structures representing anatomic passageways
Koolwal Localization and mapping for guiding minimally invasive cardiac procedures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20820102

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20820102

Country of ref document: EP

Kind code of ref document: A1