US20220211293A1 - Real-time display of tissue deformation by interactions with an intra-body probe
- Publication number
- US20220211293A1 (US 2022/0211293 A1), application US 17/701,830
- Authority
- US
- United States
- Prior art keywords
- tissue
- optionally
- data
- probe
- geometrical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A61B5/061—Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
- A61B5/064—Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B18/04—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
- A61B18/12—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
- A61B18/14—Probes or electrodes therefor
- A61B18/1492—Probes or electrodes therefor having a flexible, catheter-like structure, e.g. for heart ablation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B18/18—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
- A61B18/1815—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using microwaves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00571—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for achieving a particular surgical effect
- A61B2018/00577—Ablation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00571—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for achieving a particular surgical effect
- A61B2018/00613—Irreversible electroporation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B18/02—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by cooling, e.g. cryogenic techniques
- A61B2018/0212—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by cooling, e.g. cryogenic techniques using an instrument inserted into a body lumen, e.g. catheter
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/104—Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
- A61B2034/252—User interfaces for surgical systems indicating steps of a surgical procedure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N7/00—Ultrasound therapy
- A61N7/02—Localised ultrasound hyperthermia
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- The present invention, in some embodiments thereof, relates to the field of medical procedures using intrabody probes navigable within intrabody spaces, and more particularly, to presentation of procedure data dynamically acquired during the course of a catheter procedure.
- Graphical game engines currently available comprise suites of software-implemented capabilities supporting the dynamic display and updating of simulated three-dimensional scenes.
- Game engines include API calls supporting the creation and modification of a variety of scene objects (chiefly terrain, various types of physical objects, camera viewpoints, and lighting), a visual rendering pipeline, and optionally further services assisting tasks such as coding, animating, and/or debugging.
- User inputs are accepted from various user interface devices (including pointer devices, keyboards, game controllers, motion sensors, touch screens and the like) and converted into events in the simulated environment.
- Well-known game engines include the Unreal® and Unity® graphical game engines (www(dot)unrealengine(dot)com; www(dot)unity3d(dot)com).
- The rendering pipelines of modern game engines typically include facilities for creating realistic-looking visualizations of scene elements, based on properties assigned to instantiations of data objects representing those scene elements.
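The property-driven rendering just described can be sketched minimally in Python. All names here are illustrative assumptions, not any particular engine's API: appearance properties live on the data objects instantiated for scene elements, and the pipeline reads them each frame, so changing a property on a data object changes the next rendered frame.

```python
from dataclasses import dataclass, field

@dataclass
class Material:
    base_color: tuple = (0.8, 0.4, 0.4)   # RGB in [0, 1]
    roughness: float = 0.5
    emissive: float = 0.0                 # could be raised to highlight a lesion

@dataclass
class SceneObject:
    name: str
    vertices: list
    material: Material = field(default_factory=Material)

def render_pass(scene):
    """Stand-in for the rendering pipeline: per-object draw parameters for this frame."""
    return {obj.name: (obj.material.base_color, obj.material.roughness)
            for obj in scene}

scene = [SceneObject("atrial_wall", vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)])]
scene[0].material.roughness = 0.9         # property change is picked up on the next pass
frame = render_pass(scene)
```

The point of the sketch is the indirection: nothing draws pixels directly; the visual result follows from the state of the scene data objects at render time.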
- Intrabody sensing from catheter probes to determine information about, for example, tissue contact and/or lesion assessment has also been described (e.g., International Patent Application No. PCT/IB2016/052690 to Schwartz et al. filed May 11, 2016; and International Patent Application No. PCT/IB2016/052686 to Schwartz et al. filed May 11, 2016).
- A method of visually displaying effects of a medical procedure, comprising: receiving interaction data from an intrabody probe indicating touching contacts between the intrabody probe and a body tissue region, wherein the interaction data at least associate the contacts to contacted positions of the body tissue region; adjusting geometrical rendering data representing a shape of the body tissue region to obtain adjusted geometrical rendering data, wherein the adjusting is based on an indication in the interaction data of a change in the shape of the body tissue region due to the contacting; rendering the adjusted geometrical rendering data to a rendered image; and displaying the rendered image.
- The intrabody probe is a catheter probe.
- The geometrical rendering data are adjusted as a function of time relative to a time of occurrence of at least one of the indicated contacts.
- The receiving, the adjusting, and the displaying are performed iteratively for a sequence of contacts for which interaction data is received.
- The adjusting is at a frame rate of 10 frames per second or more.
- The rendering and the displaying are at a frame rate of 10 frames per second or more.
- The geometrical rendering data include a representation of 3-D surface positions and a representation of surface orientations; the two representations each correspond to a same portion of the shape of the body tissue region; and the adjusting comprises adjusting the surface orientation representation to change a geometrical appearance in the rendering.
- The representation of surface orientation is adjusted separately from the representation of 3-D surface positions.
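A hedged sketch of the dual representation described above: per-vertex surface orientations (normals) are stored alongside, and adjusted independently of, the 3-D surface positions, so shading can suggest a geometry change without moving any vertex, much as normal/bump mapping does. The function name, falloff shape, and distance metric below are assumptions for illustration.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def tilt_normals(normals, center_idx, tilt, radius, distance):
    """Tilt normals near a contact vertex; the position array stays untouched."""
    out = []
    for i, nrm in enumerate(normals):
        d = distance(i, center_idx)
        if d < radius:
            w = 1.0 - d / radius                    # linear falloff with distance
            out.append(normalize((nrm[0] + w * tilt[0],
                                  nrm[1] + w * tilt[1],
                                  nrm[2] + w * tilt[2])))
        else:
            out.append(nrm)
    return out

# Flat strip of 5 vertices, all normals pointing up; tilt around vertex 2.
normals = [(0.0, 0.0, 1.0)] * 5
adjusted = tilt_normals(normals, 2, (0.3, 0.0, 0.0), 1.5,
                        lambda i, j: abs(i - j))
```

Because only the orientation array changes, the mesh silhouette is preserved while the lit appearance at the contact region shifts, which is exactly the separation the claims describe.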
- The extent and degree of the adjusting model a change in a thickness of the body tissue region.
- The interaction data describe an exchange of energy between the intrabody probe and the body tissue region by a mechanism other than contact pressure.
- The adjusting comprises updating the geometrical rendering data based on a history of interaction data describing the exchange of energy.
- The exchange of energy comprises operation of an ablation modality.
- The updating changes an indication of lesion extent in the geometrical rendering data, based on the history of interaction data describing the exchange of energy by operation of the ablation modality.
- The updating comprises adjusting the geometrical rendering data to indicate a change in mechanical tissue properties, based on the history of interaction data describing the exchange of energy.
- The ablation energy exchanged between the intrabody probe and the body tissue region comprises at least one of the group consisting of: radio frequency ablation, cryoablation, microwave ablation, laser ablation, irreversible electroporation, substance injection ablation, and high-intensity focused ultrasound ablation.
- The updating comprises adjusting the geometrical rendering data to indicate a change in tissue thickness, based on the history of interaction data describing the exchange of energy.
- Effects of the history of interaction data describing the exchange of energy are determined from modelling of thermal effects of the exchange of energy on the body tissue region.
- The modelling of thermal effects accounts for local tissue region properties affecting transfer of thermal energy between the intrabody probe and the body tissue region.
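One standard way (not necessarily the model used in this disclosure) to reduce a history of energy exchange to a lesion-extent indication is a CEM43-style cumulative thermal dose: each temperature sample contributes equivalent minutes at 43 °C, and the running total per surface location can drive the rendered lesion. The threshold value below is an illustrative assumption.

```python
def cem43(temperature_history, dt_min):
    """Cumulative equivalent minutes at 43 deg C.

    temperature_history: per-sample tissue temperature in deg C,
    sampled every dt_min minutes.
    """
    dose = 0.0
    for t in temperature_history:
        r = 0.5 if t >= 43.0 else 0.25   # standard CEM43 rate constants
        dose += dt_min * r ** (43.0 - t)
    return dose

def lesion_reached(dose, threshold_cem43=240.0):
    """Illustrative threshold for marking a location as lesioned."""
    return dose >= threshold_cem43

# Four samples at 50 deg C, 0.5 min apart.
dose = cem43([50.0] * 4, dt_min=0.5)
```

Accumulating dose per location, rather than reacting to instantaneous temperature, is what makes the rendered lesion reflect the *history* of the energy exchange, as the claims require.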
- The adjusting is performed as a function of time relative to a time of occurrence of at least one of the indicated contacts, and comprises adjusting the geometrical rendering data to indicate gradual development of a change in geometry of the body tissue region as a result of the contacts.
- The gradually developed change in geometry indicates a developing state of edema.
- The method comprises geometrically distorting the rendering of the geometrical rendering data into a swollen appearance, to an extent based on the indicated development of the state of edema.
- The contacts comprise mechanical contacts, and the gradual development of a change in geometry indicates swelling of the body tissue region in response to tissue irritation by the mechanical contacts.
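The gradual, time-dependent swelling could be sketched as a saturating displacement of surface vertices along their normals. The curve shape, magnitude, and time constant below are illustrative assumptions, not values from the disclosure.

```python
import math

def edema_displacement(t_since_contact, max_mm=2.0, tau_s=300.0):
    """Saturating growth curve for the swollen-appearance offset (assumed form)."""
    if t_since_contact <= 0:
        return 0.0
    return max_mm * (1.0 - math.exp(-t_since_contact / tau_s))

def apply_swelling(positions, normals, t_since_contact):
    """Displace each vertex outward along its normal by the current edema offset."""
    d = edema_displacement(t_since_contact)
    return [(p[0] + d * n[0], p[1] + d * n[1], p[2] + d * n[2])
            for p, n in zip(positions, normals)]

# One vertex at the origin, normal pointing up, 300 s after the contact.
pts = apply_swelling([(0.0, 0.0, 0.0)], [(0.0, 0.0, 1.0)], 300.0)
```

Re-evaluating the curve each frame against the recorded contact time is what produces the "gradual development" appearance without storing any per-frame state beyond the contact history.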
- The contacts comprise an exchange of energy between the intrabody probe and the body tissue region by a mechanism other than contact pressure.
- The interaction data indicate a contact force between the intrabody probe and the body tissue region.
- The interaction data indicate a contact quality between the intrabody probe and the body tissue region.
- The interaction data indicate a geometrical distortion introduced by touching contact between the intrabody probe and the body tissue region.
- The adjusting comprises geometrically distorting the rendering of the geometrical rendering data at a region of touching contact, to an extent based on the interaction data.
- The geometrical distorting includes distorting a portion of the geometrical rendering data which does not geometrically correspond to the portion of the body tissue region from which the interaction data were obtained.
- The interaction data comprise a 2-D image including a cross-sectional view of the body tissue region, and the distorted portion of the geometrical rendering extends out of a plane in the geometrical rendering data corresponding to the plane of the cross-sectional view.
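A minimal sketch of the force-scaled local distortion at the contact region: vertices near the contact point are pushed inward along their normals by a depth proportional to the sensed contact force, with a smooth radial falloff so that surrounding vertices (including ones the probe never directly measured) deform partially too. The compliance constant and falloff radius are invented for illustration.

```python
def indent_surface(positions, normals, contact_pos, force_g,
                   mm_per_gram=0.05, radius_mm=6.0):
    """Indent the surface at contact_pos to a depth scaled by contact force."""
    depth = force_g * mm_per_gram
    out = []
    for p, n in zip(positions, normals):
        d = sum((a - b) ** 2 for a, b in zip(p, contact_pos)) ** 0.5
        if d < radius_mm:
            w = (1.0 - d / radius_mm) ** 2        # quadratic falloff to zero at the rim
            out.append(tuple(pc - w * depth * nc for pc, nc in zip(p, n)))
        else:
            out.append(p)
    return out

# 20 g of contact force at the first vertex; the second vertex is out of range.
dented = indent_surface([(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)],
                        [(0.0, 0.0, 1.0)] * 2,
                        contact_pos=(0.0, 0.0, 0.0), force_g=20.0)
```

The falloff is what lets measured data at one point plausibly distort neighbouring, unmeasured geometry, in the spirit of the claim about portions not geometrically corresponding to the measured portion.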
- The interaction data describe injection of a substance from the intrabody probe to the body tissue region.
- The adjusting comprises changing a thickness of tissue in the body tissue region, corresponding to an effect of the injection of the substance.
- The rendering includes a view of the intrabody probe. In some embodiments, the rendering is rendered from a viewpoint at least partially defined by a measured position of the intrabody probe relative to a surface of the body tissue region.
- The measured position includes a measured orientation of the intrabody probe.
- The intrabody probe contacts a lumenal surface of the body tissue region.
- The intrabody probe contacts an external surface of an organ comprising the body tissue region.
- The body tissue region comprises a tissue of at least one organ of the group consisting of the heart, vasculature, stomach, intestines, liver, and kidney.
- The method further comprises assigning material appearance properties across an extent of the geometrical rendering data, based on the interaction data; and the displaying of the rendered image uses the assigned material appearance properties.
- The rendering comprises a rendering in cross-section of the body tissue region.
- The extent and degree of the adjusting simulate stretching of the body tissue region.
- The geometrical rendering data represent a shape of a body tissue region comprising a heart chamber, and the adjusting comprises adjusting a size of the heart chamber, based on current heart rate data.
- The adjusting of a size of the heart chamber comprises adjusting a size of a lumen of the heart chamber, based on the current heart rate data.
- The adjusting of a size of the heart chamber comprises adjusting a thickness of a wall of the heart chamber, based on the current heart rate data.
- The adjusting of geometrical rendering data comprises adjusting a position of the intrabody probe in the geometrical rendering data relative to a wall of the heart chamber, based on the current heart rate data.
- A system for visually displaying effects of interactions between an intrabody probe and a body tissue region, comprising computer circuitry configured to: receive interaction data indicating the interactions, associated to positions on a surface of the body tissue region; adjust geometrical rendering data representing a shape of the body tissue region to obtain adjusted geometrical rendering data, wherein the adjusting is based on an indication in the interaction data of a change in the shape of the body tissue region; render the adjusted geometrical rendering data to a rendered image; and present the rendered image.
- The rendering is performed using a graphical game engine, and the interaction data include sensed positions of the intrabody probe.
- The interaction data include probe-sensed characteristics of tissue in the vicinity of the intrabody probe.
- The interaction data include operational data describing operation of the intrabody probe to treat tissue.
- A method of visually displaying a medical procedure, comprising: receiving position data indicating the position of an intracardial probe within a heart; receiving heart rate data for the heart; adjusting geometrical rendering data representing a shape of the heart and a shape and position of the intracardial probe to obtain adjusted geometrical rendering data, wherein the adjusting is based on the heart rate data to maintain an accuracy of positioning of the intracardial probe relative to the heart as the average size of the heart changes as a function of heart rate; rendering the adjusted geometrical rendering data to a rendered image; and displaying the rendered image.
- aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.”
- some embodiments of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- Implementation of the method and/or system of some embodiments of the invention can involve performing and/or completing selected tasks manually, automatically, or a combination thereof.
- several selected tasks could be implemented by hardware, by software or by firmware and/or by a combination thereof, e.g., using an operating system.
- optionally, selected tasks are performed by a data processor, such as a computing platform for executing a plurality of instructions.
- the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data.
- a network connection is provided as well.
- a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
- a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium and/or data used thereby may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for some embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- FIG. 1A is a schematic flowchart illustrating the calculation and display of an image of a scene comprising simulated tissue having a geometry and/or geometrical appearance dynamically linked to interactions of the tissue with a catheter probe, according to some embodiments of the present disclosure
- FIG. 1B is a schematic flowchart illustrating the calculation and display of a geometry and/or geometrical appearance dynamically changing over time as a result of prior interaction of the tissue with a catheter probe, according to some embodiments of the present disclosure.
- FIGS. 2A-2E illustrate a 3-D rendered display for indicating lesioning status to a user, according to some exemplary embodiments of the present disclosure
- FIGS. 3A, 3D, 3G, and 3J schematically represent a sequence of rendered views of a catheter probe passing through a tissue wall portion, according to some embodiments of the present disclosure
- FIGS. 3B, 3E, 3H, and 3K schematically represent a graph of position versus time and measured contact versus time for the catheter probe of FIGS. 3A, 3D, 3G, and 3J , according to some embodiments of the present disclosure
- FIGS. 3C, 3F, 3I, and 3L schematically represent an ultrasound image at a cross-section of a heart at the atrial level, and corresponding to the sequence of FIGS. 3A, 3D, 3G, and 3J , according to some embodiments of the present disclosure;
- FIGS. 4A-4D schematically represent aspects of geometrical deformation of a tissue region due to an internal change such as edema, according to some embodiments of the present disclosure
- FIGS. 5A-5B schematically represent global geometrical deformation of a tissue structure, for example, due to hydration state and/or more global edema than the example of FIGS. 4A-4D , according to some embodiments of the present disclosure
- FIG. 6 is a schematic representation of a system configured for display of interactions between a catheter probe and a body tissue region, and/or their effects, according to some embodiments of the present disclosure
- FIG. 7 schematically represents software components and data structures of an interaction analyzer of a system, according to some embodiments of the present disclosure
- FIG. 8 schematically represents components, inputs, and outputs of a graphical game engine operating to manage and render scene elements to images for presentation at motion frame-rate, according to some embodiments of the present disclosure
- FIGS. 9A-9B schematically represent, respectively, different geometrical data representations of flat and indented surfaces, according to some embodiments of the present disclosure
- FIGS. 10A-10B illustrate normal mapping superimposed on a tissue region in order to provide the geometrical appearance of a swelling, according to some embodiments of the present disclosure
- FIGS. 10C-10D schematically represent aspects of geometrical deformation of a tissue region in touching contact with a catheter probe, according to some embodiments of the present disclosure
- FIG. 11A schematically illustrates a rendered image rendered from a camera viewpoint looking at a tissue region along an axis parallel to an intrabody probe, according to some embodiments of the present disclosure.
- FIG. 11B schematically illustrates a field of view projected from camera viewpoint, including indication of axis, according to some embodiments of the present disclosure.
- the present invention, in some embodiments thereof, relates to the field of medical procedures using intrabody probes navigable within intrabody spaces, and more particularly, to presentation of procedure data dynamically acquired during the course of a catheter procedure.
- An aspect of some embodiments of the current invention relates to the motion frame-rate, real-time display of geometrical effects on a simulation scene comprising simulated tissue, wherein the geometrical effects comprise changes to a geometrical representation of one or more elements in the scene, and wherein the changes are made based on ongoing and/or intermittent measurements of interactions between a catheter probe and the actual tissue being simulated.
- Geometrical effects optionally comprise one or both of changes to the 3-D position of simulated elements, and changes to the geometrical appearance of simulated elements.
- Geometrical appearance, as distinct from 3-D position, comprises geometrical features that can give a relatively raised, indented, smoothed, irregular, blurred, focused, closer, further, shaded, and/or unshaded appearance to a portion of a surface, without affecting the 3-D coordinates of the surface itself.
- Geometrical appearance optionally comprises features implemented at least in part by computational methods, for example, normal mapping, depth mapping, and/or shadow mapping.
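By way of illustration, normal mapping can suggest a raised or indented surface purely through shading, leaving the 3-D coordinates untouched. The following minimal sketch (not taken from the disclosure; the height field, scale parameter, and finite-difference scheme are illustrative assumptions) derives per-texel shading normals from a height map by central differences:

```python
import math

def height_to_normals(height, scale=1.0):
    # Convert a 2-D height field into per-texel unit shading normals via
    # central differences; the surface geometry itself is never modified,
    # only the normals that a shader would use for lighting.
    h, w = len(height), len(height[0])
    normals = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Finite differences, clamped at the texture borders.
            dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * scale
            dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * scale
            nx, ny, nz = -dx, -dy, 1.0
            norm = math.sqrt(nx * nx + ny * ny + nz * nz)
            normals[y][x] = (nx / norm, ny / norm, nz / norm)
    return normals

# A single raised texel at the center of a flat patch: the surrounding
# normals tilt away from the "swelling", which shading then picks up.
bump = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
normals = height_to_normals(bump)
```

A renderer consuming these normals would light the flat patch as if it bulged, which is the sense in which appearance changes without any change to surface coordinates.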
- a software environment specialized for interactive visual simulations (for example a 3-D graphical game engine such as the Unreal® and/or Unity® graphical game engines) is used as a basis for implementing a simulation of a scene comprising simulated tissue (herein, such a scene is referred to as a simulation scene).
- geometrical rendering data are optionally supplemented with one or more material appearance properties (preferably a plurality of such properties) that describe how virtual materials such as simulated tissue interact with simulated optical laws and lighting conditions to generate images for display.
- the geometrical rendering data optionally comprises a geometrical representation of a scene including tissue.
- the rendering is implemented by a rendering pipeline of the graphical game engine.
- the term “game engine” should be understood to encompass computer-implemented collections of such typical game engine capabilities as may be used by some embodiments of the present invention (examples of which are described herein), whether or not those capabilities have been packaged into a game engine distribution.
- the term “rendering” refers to the process of generating an image from a 2-D or 3-D model or models by means of one or more computer programs.
- the model may contain object parameter definitions and/or data structures; for example, geometry, viewpoint, texture, lighting, and/or shading information as a description of the virtual model.
- the data contained in the model may be passed to a rendering program to be processed and output to a digital image or raster graphics image file.
- the processing comprises one or more processing stages referred to collectively as a “pipeline”, and carried out by the software and hardware of a rendering device.
- the rendering device includes one or more of a general purpose CPU and graphics hardware specialized for use within a rendering pipeline.
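The staged character of such a pipeline can be shown with a deliberately tiny sketch (an assumed structure for illustration only; a real game-engine pipeline adds shading, lighting, and hardware rasterization): the model flows through ordered stages and ends as a raster image.

```python
def project(model):
    # Perspective-project 3-D vertices (camera at the origin, looking down
    # the +z axis, focal length 1) onto the normalized image plane.
    pts = []
    for x, y, z in model["vertices"]:
        pts.append((x / z, y / z))
    return {"points": pts, "size": model["size"]}

def rasterize(data):
    # Scatter projected points into a square raster grid of 0/1 pixels,
    # mapping the [-1, 1] plane onto the grid.
    n = data["size"]
    image = [[0] * n for _ in range(n)]
    for u, v in data["points"]:
        col = int((u + 1.0) / 2.0 * (n - 1) + 0.5)
        row = int((v + 1.0) / 2.0 * (n - 1) + 0.5)
        if 0 <= row < n and 0 <= col < n:
            image[row][col] = 1
    return image

def render(model, stages=(project, rasterize)):
    # The "pipeline": the model passes through each processing stage in order.
    data = model
    for stage in stages:
        data = stage(data)
    return data

# One vertex straight ahead of the camera lands at the image center.
image = render({"vertices": [(0.0, 0.0, 2.0)], "size": 5})
```

The point of the sketch is only the data flow: each stage consumes the previous stage's output, which is what "pipeline" denotes above.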
- updating of the simulation scene during a procedure is at least partially based on data inputs from one or more data sources supplying data during the procedure (for example, sources of probe-tissue interaction data such as sensing data and/or treatment status data described in relation to FIG. 6 and FIG. 7 ).
- Graphical game engines typically receive inputs from game input devices such as pointer devices, keyboards, game controllers, body motion sensors, and the like.
- inputs optionally come from one or more additional or alternative sources related to the performance of a catheter procedure; for example, catheter probe position data, data tracking the intrabody use of catheter probes (particularly but not exclusively use to deliver treatment, e.g., operation of an intrabody probe to deliver treatment energies), and/or measurement data (for example, measurement data obtained from an intrabody probe).
- herein, a catheter probe is used as an example of an intrabody probe, but it should be understood that another intrabody probe (e.g., a capsule probe) is optionally used in some embodiments.
- the simulated world (also referred to herein as a simulated scene) maintained by a game engine does not directly correspond to any simultaneous objective-world state.
- an object of some embodiments of the current invention is to simulate the reality of a clinical situation sufficiently to allow substantially seamless interaction with that reality via a presentation of the scene simulation. In some embodiments, this comprises maintaining and displaying a simulated scene having a useful level of correlation with the changing reality of the actual tissue environment (as reflected in data available to characterize it).
- usefulness derives from actions taken by an operator on the basis of information in the scene simulation presentation, which reveals to the operator the changing state of the tissue environment.
- the useful level of correlation with the changing reality of the actual tissue environment allows an operator to realize the state of the tissue or a change in that state, optionally without adding to the scene annotations indicative of such state or state change.
- usefulness derives from the presented scene simulation providing fidelity of representation sufficient that actions the operator takes based on the presented scene simulation produce effects as intended in the corresponding real-world environment.
- the useful level of correlation with the changing reality of the actual tissue environment is a level of correlation sufficient to allow the operator to perform actions within the real-world environment based on the presented scene simulation.
- the presented scene simulation may include effects simulating results of the actions taken by the operator.
- a display of a user interface is updated at motion frame rate with rendered images of a simulation scene simulating an intrabody probe (for example, a probe at the end of a catheter) and its tissue environment.
- the updating optionally indicates changes to an actual intrabody probe and tissue environment which occur as an operator manipulates the actual intrabody probe (wherein the updating is based, e.g., on position data describing the position of the intrabody probe), and/or operates the intrabody probe for treatment and/or diagnostic measurement of the actual tissue environment (wherein the updating is based, e.g., on operational data describing operation of the intrabody probe to treat tissue and/or measure properties of the tissue).
- changes are shown in the rendered images as if occurring within the actual material of the tissue environment.
- tissue is deflected and/or an intrabody probe shape is distorted in rendered images of a simulation scene based on interaction data indicating touching contacts.
- a smoothly updating, naturalistic appearance of a rendered view of a simulation scene is achieved even when available inputs indicating changes to the simulation scene are incomplete, slowly updating, irregular, and/or lagging (for example, as described in relation to FIG. 1B ).
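One way such smoothness can be achieved (an assumed approach offered for illustration; the frame rate and time constant are hypothetical) is to relax each displayed value exponentially toward the most recent available sample on every video frame, so that sparse, irregular, or lagging measurements never produce a visible jump:

```python
import math

def smoothed_frames(samples, n_frames, frame_dt=1.0 / 60.0, tau_s=0.15):
    # samples: time-sorted (timestamp_s, value) measurements, possibly
    # sparse and irregular. Returns one displayed value per video frame,
    # relaxing exponentially toward the newest available sample instead of
    # jumping when a new measurement arrives.
    alpha = 1.0 - math.exp(-frame_dt / tau_s)   # per-frame blend factor
    value = samples[0][1]
    idx = 0
    frames = []
    for f in range(n_frames):
        t = f * frame_dt
        while idx + 1 < len(samples) and samples[idx + 1][0] <= t:
            idx += 1                            # newest sample at time t
        value += alpha * (samples[idx][1] - value)
        frames.append(value)
    return frames

# A measured coordinate that steps from 0 to 1 at t = 0.5 s is displayed
# as a smooth approach rather than an instantaneous jump.
frames = smoothed_frames([(0.0, 0.0), (0.5, 1.0)], n_frames=60)
```

The same filter applies equally to probe positions, contact readings, or any other slowly updating input feeding the simulation scene.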
- “naturalistic” scene appearance means that the displayed scene gives an operator the impression of substantial materials (i.e., volume-occupying, as opposed to merely shell-defining materials) and/or reactive materials existing in a fluidly navigable environment. The reactions of the materials in turn become a significant part of the information which an operator relies on to act within the actual environment that the scene simulates.
- a material moreover may be simulated as occupying volume per se (for example, as a wall having thickness), rather than merely as a boundary extending in space (for example, as a structure defining a surface, but having no well-defined thickness).
- appearances in rendered views of simulation scene objects are moreover “realistic” in some aspects.
- tissues in some embodiments, are provided with material appearances that mimic their appearance in life, and to this extent are “realistic”.
- geometrical deformation of tissue in a simulation scene is directly based on deformation measurements, for example, ultrasound images of septal wall deflection during transseptal puncture are optionally converted into movements in three dimensions of a simulated septal wall's deflection.
- non-realistic material appearances and even objects are optionally or additionally provided to a naturalistic scene.
- Degree of tissue compression is optionally used as a visual proxy for probe-tissue contact force (force of touching contact), whether or not the real tissue is indeed compressed.
- motion due to normal heart pulsations is indicated in the simulation by pulses with corresponding timing; this potentially helps an operator understand the difference between a probe in intermittent wall-touching contact and continuous wall-touching contact.
- the amplitude of the simulated pulses is reduced from the real state, to stabilize the visual environment an operator uses for navigation.
- some geometrical states such as degree of vasodilation and/or vasoconstriction are optionally exaggerated for clarity.
- the size of one or more heart chambers is adjusted based on current heart rate, and/or the size and/or movements of a probe relative to the heart chamber are scaled based on current heart rate. It has been observed that as heart rate increases, the maximum size of the heart between contractions correspondingly decreases. This decrease can also be observed in the sizes adopted by heart chambers at other phases of the heartbeat cycle. For example, in some embodiments, the average rendered size of the heart over the course of a heartbeat cycle is decreased as a function of measured heart rate increase. The average size change is optionally applied to either a beating or non-beating rendered representation of the heart. Optionally, heart wall thickness correspondingly increases with decreasing chamber size.
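The rate-dependent resizing described above can be sketched as a simple scaling rule. The linear model and every constant below are hypothetical illustrations, not values from the disclosure:

```python
def chamber_scale(heart_rate_bpm, baseline_bpm=60.0,
                  shrink_per_bpm=0.002, min_scale=0.7):
    # Scale factor applied to the rendered chamber lumen: average chamber
    # size is modeled as decreasing linearly as heart rate rises above a
    # baseline, clamped so the rendering never collapses entirely.
    scale = 1.0 - shrink_per_bpm * max(0.0, heart_rate_bpm - baseline_bpm)
    return max(min_scale, scale)

def wall_thickness_mm(base_thickness_mm, lumen_scale):
    # Companion rule: rendered wall thickness increases as the lumen
    # shrinks, roughly conserving wall volume.
    return base_thickness_mm / lumen_scale
```

In use, the current heart rate reading would drive `chamber_scale`, and both the chamber mesh and the probe's relative position would be scaled by the result on each update.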
- visual rendering of blood is preferably suppressed, making visualization possible from within a vascular or cardiac lumen.
- tissue properties such as temperature are encoded by visual conventions; appearing as, for example in the case of temperature: ice, flame, smoke, and/or steam.
- guiding marks related to planning and/or procedure progress are optionally provided as part of the simulation scene's naturalistic rendering to images.
- graphical game engines typically also provide motion physics simulators, e.g., for modeling collisions, accelerations, elastic deformations, object destruction, and the like.
- one or more of these motion physics simulators is used to increase the naturalistic impression and/or realistic fidelity of a rendered simulation scene.
- geometrical deformations are used to indicate aspects of a procedure where a probe contacts tissue. As for the case of material appearances, the geometrical deformations may be, but are not necessarily realistic.
- a general potential benefit of naturalistic (optionally also realistic) presentation of a scene comprising simulated tissue is to reduce cognitive load on a catheter operator and/or team of operators working with an intra-body probe.
- Such procedures typically have multiple interacting factors and requirements affecting procedure outcome. These factors and requirements preferably are tracked simultaneously and/or may need to be accounted for with little time for consideration. Examples of these factors and requirements in a standard operating environment optionally include any one or more of the following:
- immediate visual presentation of material appearance helps to control the complexity these factors can create.
- a naturalistic display of information is more immediately understood by the clinical personnel, and/or intuitively draws attention to clinically relevant state updates. For example, instead of the operator team having to consider and/or calculate whether a previously lesioned tissue region was lesioned long enough ago to have converted to edematous tissue, in some embodiments the edema is directly displayed as edematous tissue. Where a continuous lesion is planned, likely gaps in lesion extent can be directly seen in their overall context in the scene simulation, helping to guide the decision as to whether and/or how the procedure should be adapted to compensate.
- a naturalistic presentation of catheter procedure information also contrasts, for example, with the presentation of this information using graphs and/or symbols. Familiarization with more abstract symbols, measures and graphs potentially requires prolonged training. An extra level of symbolic abstraction also potentially slows recognition by the physician of important changes in the state of the catheter interface or the tissue.
- a substantially continuous stream of input data describing a tissue region and/or probe interactions with it is used as a basis for correspondingly continuous updating of a scene simulating the tissue region.
- the input data comprise only partial and/or indirect description of the tissue region.
- spatially partial input data (such as from a cross-sectional image) is used in some embodiments to infer spatial changes over a larger region (such as a three-dimensional space extending outside the cross-sectional image).
- sensed pressure data from a catheter probe is optionally converted into corresponding movements in three-dimensional space of pressed-against tissue in the simulation scene.
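For instance, a sensed contact force can be turned into a local indentation of the simulated surface. In the following minimal sketch, the displacement direction, stiffness, and Gaussian falloff are all hypothetical simplifications chosen for illustration:

```python
import math

def indent_surface(vertices, contact_point, force_n,
                   stiffness_n_per_mm=0.5, falloff_mm=5.0):
    # Push vertices near the contact point inward (simplified here to the
    # -z direction) by a depth proportional to the sensed force, with a
    # Gaussian falloff so the indentation blends smoothly into the mesh.
    depth_mm = force_n / stiffness_n_per_mm
    deformed = []
    for x, y, z in vertices:
        d2 = ((x - contact_point[0]) ** 2 +
              (y - contact_point[1]) ** 2 +
              (z - contact_point[2]) ** 2)
        weight = math.exp(-d2 / (2.0 * falloff_mm ** 2))
        deformed.append((x, y, z - depth_mm * weight))
    return deformed

# 1 N of contact at the origin indents nearby vertices while leaving
# distant ones essentially unmoved.
patch = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (30.0, 0.0, 0.0)]
dented = indent_surface(patch, (0.0, 0.0, 0.0), force_n=1.0)
```

This is also one way the document's "tissue compression as a visual proxy for contact force" could be realized, whether or not the real tissue compresses by that amount.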
- effects on tissue by energy delivered from a lesioning probe are optionally simulated in a scene based on a model of energy dispersion in the tissue (e.g., thermal modeling, optionally thermal modeling incorporating information from anatomical data), and knowing a few parameters about how the energy was delivered (e.g., how long, with what energy, where, and/or with what efficacy).
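Such thermal modeling can be as simple as 1-D heat diffusion from the electrode into the tissue, with lesion extent read off a lethal isotherm. The sketch below (explicit finite differences; the diffusivity, temperatures, and 50 °C threshold are assumed illustrative values, not parameters from the disclosure) estimates heated depth from delivery duration:

```python
def heated_depth_mm(duration_s, dx_mm=0.5, dt_s=0.01,
                    alpha_mm2_per_s=0.14, electrode_temp_c=70.0,
                    body_temp_c=37.0, lethal_temp_c=50.0, n_cells=40):
    # Explicit 1-D finite-difference heat diffusion into a tissue slab.
    # Cell 0 is held at the electrode temperature; the far boundary stays
    # at body temperature. Returns the depth reached by the lethal isotherm.
    r = alpha_mm2_per_s * dt_s / dx_mm ** 2      # stability requires r <= 0.5
    temp = [body_temp_c] * n_cells
    for _ in range(int(duration_s / dt_s)):
        temp[0] = electrode_temp_c
        nxt = temp[:]
        for i in range(1, n_cells - 1):
            nxt[i] = temp[i] + r * (temp[i + 1] - 2.0 * temp[i] + temp[i - 1])
        temp = nxt
    return sum(1 for t in temp if t >= lethal_temp_c) * dx_mm

shallow = heated_depth_mm(duration_s=5.0)
deeper = heated_depth_mm(duration_s=60.0)
```

A scene update could then grow the rendered lesion mark to the returned depth, using only "how long, with what energy, where" rather than direct imaging of the lesion.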
- sensed input data is used as a basis for updating the state of the scene-representation of the probe itself.
- sensed input data is used to adjust the position of the probe's scene representation, and/or to control the parameters of a viewpoint used in creating a rendered image of the simulation scene, wherein the viewpoint is defined by a position of the probe.
- sensed input data (e.g., indicating tissue contact force and/or quality) is optionally used as a basis for adjusting the shape of the probe's scene representation. The shape may be adjusted based, for example, on a mechanical model of the actual probe and/or of a catheter or other device that carries the probe (e.g., a mechanical model which models the flexibility and geometry of the actual probe and/or associated carrying device).
- some probes such as lasso electrode probes comprise a flexible portion that can be bent in response to the forces of touching contact.
- an otherwise stiff probe may be carried on a flexible member such as a catheter used to manipulate the probe.
- sensed input data indicates forces applied to the actual probe, and the simulated probe is modified in response to the indicated forces according to the parameters of the mechanical model. The modification may also take into account other data, for example, a position of the probe itself, geometry of the chamber in which the probe is positioned, and/or a position of an aperture via which a probe is passed into a heart chamber or other body lumen. Potentially, the modeling allows a changing simulated probe shape to indicate changes to the actual intrabody probe in use, without requiring direct measurement of the actual intrabody probe's shape (e.g., by imaging).
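As one concrete (and much simplified) possibility for such a mechanical model, a flexible probe segment can be treated as a cantilever beam clamped at the catheter end: a sensed lateral tip force bends the rendered centerline according to the classical deflection formula y(x) = F x² (3L − x) / (6 EI). The length, force, and flexural rigidity below are hypothetical:

```python
def bent_probe_centerline(length_mm, tip_force_n,
                          flexural_rigidity_n_mm2=2000.0, n_points=10):
    # Cantilever-beam model of a flexible probe segment: returns sample
    # points (x along the probe, lateral deflection y) of the rendered
    # centerline under a lateral force applied at the tip.
    points = []
    for i in range(n_points + 1):
        x = length_mm * i / n_points
        y = (tip_force_n * x * x * (3.0 * length_mm - x)
             / (6.0 * flexural_rigidity_n_mm2))
        points.append((x, y))
    return points

straight = bent_probe_centerline(50.0, tip_force_n=0.0)
bent = bent_probe_centerline(50.0, tip_force_n=0.2)
```

Feeding the sensed force into such a model lets the simulated probe shape change plausibly without any direct imaging of the actual probe, as the paragraph above notes.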
- correlation between a simulation scene and the actual tissue region it represents is maintained at least in part by treating occasional inputs as describing events that (in the real world) trigger and/or entail certain predictable consequences to follow.
- the input optionally acts as a trigger for software routines that simulate those consequences.
- longer-term effects of lesioning are optionally simulated by a physiological simulation. For example, a simulation converts estimated lesion damage into parameters for a script describing the gradual onset of tissue edema as it appears in rendered views of the simulation scene.
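Such a script can be as simple as a delayed exponential onset curve driving the rendered swelling. The onset delay and time constant below are invented placeholders, not physiological values from the disclosure:

```python
import math

def edema_fraction(minutes_since_lesion, onset_delay_min=5.0, tau_min=20.0):
    # Fraction (0..1) of full edematous swelling reached at a given time
    # after lesioning: nothing during an onset delay, then an exponential
    # approach toward fully developed edema.
    if minutes_since_lesion <= onset_delay_min:
        return 0.0
    return 1.0 - math.exp(-(minutes_since_lesion - onset_delay_min) / tau_min)

def swollen_thickness_mm(base_mm, max_extra_mm, minutes_since_lesion):
    # Rendered wall thickness grows in proportion to the edema fraction.
    return base_mm + max_extra_mm * edema_fraction(minutes_since_lesion)
```

A lesioning event received as interaction data would start the clock; every subsequent frame then reads the current fraction and thickens the rendered wall accordingly, with no further input required.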
- partial and/or occasional inputs optionally guide calibration of the simulation scene maintained by the game engine so that it better-corresponds to the state of the actual tissue region.
- sensing of tissue state or position directly using the probe as a sensing modality is optionally used to update a model state, potentially restoring and/or improving a degree of synchronization between the actual tissue region and the simulation scene.
- FIG. 1A is a schematic flowchart illustrating the calculation and display of an image of a simulation scene, the simulation scene comprising simulated tissue having a geometry and/or geometrical appearance dynamically linked to interactions of the tissue with a catheter probe 11 (shown, for example, in FIGS. 3A, and 6 ), according to some embodiments of the present disclosure.
- a cycle of activities of the method includes, in some embodiments:
- FIG. 6 is a schematic representation of a system 1 configured to present interactions between a catheter probe 11 and a body tissue region 7 , and/or effects of these interactions.
- System 1 is optionally configured to present the interactions and/or their effects at user interface 55 .
- FIG. 7 schematically represents software components and data structures of an interaction analyzer 21 of system 1 , according to some embodiments of the present disclosure.
- a system 1 configured for display of interactions between a catheter probe 11 and a body tissue region 7 and/or results of such interactions receives interaction data.
- the interaction data may include, for example, data acquired by a sensing modality, and/or operation data of a treatment modality.
- the interaction data comprise data indicating and/or numerically describing characteristics of interactions between probe 11 and tissue region 7 ; including, for example, positions of the probe and/or of contacts between the probe and the tissue region, contact characteristics characterizing a contact between the probe and the tissue region, measurements taken by the probe (for example, measurements of the physiological state and/or dielectric properties of the tissue region), and/or actions of the probe (e.g., operations comprising delivery of treatment).
- interaction data comprise imaging data obtained during probe-tissue interactions.
- System 1 of FIG. 6 indicates examples of sources of interaction data that are optionally provided in some embodiments of the present disclosure.
- Interaction data is optionally received in raw form, or in any suitable stage of intermediate processing to indicate a parameter and/or status of more direct applicability.
- with continued reference to FIG. 6, details for certain types of interaction data available in some embodiments of the invention (e.g., one type, all types, or any other combination of types) are now described: position data, imaging data, dielectric tissue property sensing, general sensing (for example, of temperature and/or contact force), and treatment interactions.
- Position data: In some embodiments, position data is sensed by use of an electromagnetic field navigation subsystem, comprising body surface electrodes 5, field generator/measurer 10, position analyzer 20, and sensing electrodes 3 (for example, sensing electrodes 3 located on catheter probe 11).
- the electromagnetic field navigation subsystem operates by inducing at least one time-varying electromagnetic (EM) field 4 (for example, three crossing EM fields, each of a different frequency) across a region of body 2 including a body tissue region 7 that is targeted to be navigated by catheter 9 and catheter probe 11 .
- the time varying EM field is induced with a total inter-electrode voltage of one volt or less, at a frequency of between about 10 kHz and about 1 MHz.
- Voltages sensed at different positions by sensing electrodes 3 are characteristic of corresponding intrabody positions, allowing conversion by position analyzer 20 , for example of voltage measurements to position information (for example, after exploration of an intrabody region 7 using the probe 11 , and/or initially based on EM fields simulated with respect to a particular configuration of electrodes and anatomical data 31 ).
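One simple way such a voltage-to-position conversion can work (an illustrative scheme, not necessarily the disclosed algorithm) is inverse-distance-weighted interpolation in voltage space over calibration samples collected while exploring the region:

```python
def voltages_to_position(measured_v, calibration):
    # calibration: list of (voltage_tuple, (x, y, z)) pairs recorded while
    # exploring the intrabody region. The estimate is an inverse-squared-
    # distance weighted average, in voltage space, of calibration positions.
    weighted = [0.0, 0.0, 0.0]
    total = 0.0
    for volts, pos in calibration:
        d2 = sum((m - v) ** 2 for m, v in zip(measured_v, volts))
        if d2 == 0.0:
            return pos          # exact match with a calibration sample
        w = 1.0 / d2
        total += w
        for k in range(3):
            weighted[k] += w * pos[k]
    return tuple(c / total for c in weighted)

# Two hypothetical calibration samples 10 mm apart along x, separated
# only by the first field's voltage.
calib = [((0.10, 0.20, 0.30), (0.0, 0.0, 0.0)),
         ((0.20, 0.20, 0.30), (10.0, 0.0, 0.0))]
```

A measured voltage triple between the two samples then maps to an intermediate position, which is the behavior the position analyzer needs for continuous tracking.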
- position sensing at least partially comprises sensing of the relative position of a catheter probe 11 and a surface of tissue region 7 ; for example, by sensing of the dielectric environment of a sensing electrode 3 of catheter probe 11 .
- Imaging data: In some embodiments, there is provided an imaging modality 6, which may include, for example, an ultrasound modality and/or a fluoroscopy modality.
- Imaging modality 6 is configured to monitor body tissue region 7 during use of the catheter probe. Characteristics monitored by imaging modality 6 optionally comprise position information of the probe and/or of tissue affected by operation of the probe.
- the imaging modality is in continuous, real-time (e.g., 5, 10, 15, 20, 30, 60 or more images per second) use during at least some phase of a procedure.
- system 1 continuously processes changes in images produced by imaging modality 6 for immediate display (with low latency; for example, within 250 milliseconds) at user interface 55.
- imaging modality 6 operates less frequently (for example, once every minute to every five minutes, or at another interval).
- An infrequently updating imaging modality 6 is optionally used for providing periodic “key frames” used to synchronize and/or verify display of simulated states of tissue region 7 and/or catheter 9 .
- imaging information provides indirect information about elements in the scene simulation—for example, displacement of an organ boundary imaged with relatively high contrast optionally provides information about the displacement of a less clearly visualized organ in communication with the organ boundary.
- data imaged in a tissue cross-section optionally provides information which can be extrapolated to regions outside of the cross-section.
- an imaging modality is used only briefly during a procedure, for example, during a particular phase of a procedure such as a septal crossing.
- Dielectric tissue property sensing: In some embodiments, dielectric property measurements (e.g., of impedance behavior of the electrical fields) providing indications of tissue state, and/or of tissue-probe contacts, are made by dielectric property analyzer 22 . The measurements, in some embodiments, use sensing electrodes 3 (or a subset thereof) to determine impedance behavior of electromagnetic fields generated in conjunction with field generator/measurer 10 , and optionally body surface electrodes 5 . Dielectric distance sensing has already been mentioned in connection with the discussion of position data.
- dielectric property sensing is used to distinguish, for example, the state of tissue as healthy, fibrotic, edematous, charred or charring, and/or electrophysiologically active (or capable of being so, e.g., retaining cellular integrity after attempted ablation).
- dielectric property sensing identifies and/or verifies tissue type(s) in a sensed region. Dielectric property sensing for such properties is described, for example, in International Patent Application Nos. PCT/IB2016/052690 and PCT/IB2016/052686, the contents of which are incorporated by reference herein in their entirety.
- a force sensor may provide information on contact between a catheter probe 11 and its environment. The information may include an indication that contact has occurred, and optionally with what degree of force.
- contact quality and/or contact force information is provided from sensing electrodes 3 , based on impedance measurements and/or sensing of dielectric properties.
- dielectric sensing optionally is used to provide an indication of contact quality (optionally as related to a corresponding contact force), for example as described in International Patent Application No. PCT/IB2016/052686, the contents of which are included by reference herein in their entirety.
- Contact quality may include dielectric and/or impedance sensing of the tissue environment of one or more electrodes, based on which force, pressure, area, and/or angle of contact between electrodes and the tissue environment is inferred, relatively and/or absolutely.
- other sensor(s) 14 comprise a temperature sensor, flow sensor, and/or another sensor configured to provide information about the environment of the catheter probe 11 .
- a treatment element 8 is provided on catheter probe 11 .
- the interaction data (for example, treatment status data 1102 of FIG. 7 ) optionally comprises information about the operation of the treatment element and/or components controlling its effect (for example, power levels, activation events, timing settings, and/or substance amounts administered).
- Treatment element 8 is optionally a probe for ablation treatment using an ablation modality; for example, one or more of the following ablation modalities: radio frequency ablation, cryoablation, microwave ablation, laser ablation, irreversible electroporation, substance injection ablation, and/or high-intensity focused ultrasound ablation.
- treatment element 8 is also used as a sensing electrode 3 (for example, in RF ablation, a treatment delivery electrode may also be used to sense the effect of local dielectric properties on measured electrical field impedance).
- treatment element 8 is operated in conjunction with a treatment controller 13 , configured to provide treatment element 8 with functions such as power, control (e.g., of signal frequency, phase, and/or timing), and/or monitoring.
- the treatment element 8 is configured to deliver a treatment other than ablation (for example, temporary activation or inactivation of tissue activity) using heat, cold, electrical current, sound radiation and/or light radiation.
- treatment element 8 comprises an injection apparatus, used to inject a treatment substance, and/or a substance used in diagnosis such as an imaging tracer.
- the injected substance comprises ethyl alcohol, Botox, living cells, and/or growth factor.
- the injected substance comprises a radiolabeled substance, an immunosubstance, and/or a radiopaque trace substance.
- treatment element 8 comprises a tool for manipulating tissue (e.g., grasping, holding, sampling, cutting, attaching, and/or suturing).
- Treatment status data 1102: Data indicating operations of treatment element 8 (and/or the rest of a treatment delivery system, for example, including a treatment controller 13 ) are optionally available within system 1 , and in particular available to modules of interaction analyzer 21 , as treatment status data 1102 ( FIG. 7 ). It should be understood that treatment status data 1102 are not limited strictly to data about operations targeted to disease treatments as such, but optionally also include administration of substances and/or energy affecting a tissue region for a diagnostic purpose.
- Interaction data relating to the interactions of a treatment element 8 with a target tissue region 7 include, for example, duration of operation, time of operation, nature and/or concentration of substances delivered, quantities of substances delivered, and/or power and/or frequencies of an exchange of energy between the treatment element 8 and tissue region 7 by a mechanism other than contact pressure (e.g., energy delivered for heating, energy removed for cooling, and/or energy delivered for disruption of structure).
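The interaction-data fields listed above could be carried in a simple record like the following. This is an illustrative sketch only; the field names and units are assumptions for exposition, not the patent's actual data layout.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative record of treatment-interaction data; all field names
# and units are assumptions for this sketch.
@dataclass
class TreatmentInteraction:
    start_time_s: float               # time of operation
    duration_s: float                 # duration of operation
    power_w: float                    # delivered power (e.g., RF ablation)
    frequency_hz: float               # operating frequency
    substance: Optional[str] = None   # substance delivered, if any
    substance_amount_ml: float = 0.0  # quantity delivered
```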
- operational settings are combined with information about the position and/or environment of treatment element 8 in order to derive interaction data. In some embodiments, such combination is performed by one or more of simulators 1110 of FIG. 7 .
- sensing data 1101 optionally includes data from one or a plurality of sensing modalities; for example, sensor electrodes 3 , other sensors 14 , and/or imaging modality 6 , described in relation to FIG. 6 .
- computation-performing and/or control operation-performing modules are optionally implemented by any suitable combination of shared and/or dedicated processing units and/or controllers.
- implementations of treatment controller 13 , position analyzer 20 , and/or interaction analyzer 21 optionally comprise one shared processing unit, or any other suitable number of shared and/or dedicated processing units.
- certain types of interaction data branch additionally or alternatively to FIG. 1B (dotted line branch indicates optional branching).
- geometrical effects which modify the apparent position of geometrical features in a rendered view of a simulation scene are optionally calculated for locations defined by a 3-D data structure representing geometry of the targeted body tissue region 7 .
- the operations of block 112 are carried out, in some embodiments, by interaction analyzer 21 (detailed for some embodiments in FIG. 7 ).
- the geometrical effects of block 112 are calculated based on discrete events in the interaction data; for example, a single event such as a high-pressure contact triggering a tissue response like edema.
- the geometrical effects of block 112 are calculated based on a history of interaction data; for example, a history of the delivery of ablation energy to a tissue region is used to estimate properties (for example, lesion extent) of an ablation lesion produced.
- the lesion properties are optionally estimated using a model of a thermal profile of the target tissue region and an estimate of temperatures/times at temperatures above which ablation occurs.
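A minimal version of such a time-at-temperature estimate can be sketched as below: a tissue point counts as ablated once its cumulative time above a threshold temperature exceeds a dose, and the lesion radius is taken as the farthest ablated point. The threshold and dose values are illustrative placeholders, not clinical parameters from the patent.

```python
def lesion_radius_mm(temp_histories, dt_s, t_ablate_c=50.0, dose_s=10.0):
    """Estimate lesion extent from per-point thermal histories.

    temp_histories maps radial distance (mm) -> list of temperatures (deg C)
    sampled every dt_s seconds. A point is counted as ablated when its
    cumulative time above t_ablate_c exceeds dose_s seconds.
    """
    ablated = [r for r, hist in temp_histories.items()
               if sum(dt_s for t in hist if t > t_ablate_c) >= dose_s]
    return max(ablated, default=0.0)
```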
- FIGS. 9A-9B schematically represent, respectively, different geometrical data representations of flat and indented surfaces, according to some embodiments of the present disclosure.
- the grids shown in the two figures to indicate geometrical point positions are illustrative; alternatively or additionally, these could be, for example: any set of geometrical points defined in a 3-D space by mesh data; by polygon definitions; and/or by one or more parametrically defined shapes such as polyhedra, ellipsoids, cylinders, planar-shape extrusions, and/or parametric curves.
- 3-D flat geometry 901 and indented geometry 903 represent the use of 3-D positions of geometrical points to visually convey surface shapes.
- the indentation 905 for example, is represented by displacing geometrically defined points falling within it by an appropriate distance out of the plane defined by other points of 3-D indented geometry 903 .
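The displacement just described can be sketched as follows: points of a flat 3-D grid that fall within the indentation footprint are pushed out of the plane along z, here with a smooth cosine profile (full depth at the centre, zero at the rim). The profile shape is an assumption for illustration.

```python
import math

def indent_points(points, center_xy, radius, depth):
    """Displace (x, y, z) points within `radius` of center_xy out of the
    plane, mimicking an indentation such as indentation 905."""
    out = []
    for x, y, z in points:
        d = math.hypot(x - center_xy[0], y - center_xy[1])
        if d < radius:
            # cosine profile: full `depth` at the centre, zero at the rim
            z -= depth * 0.5 * (1.0 + math.cos(math.pi * d / radius))
        out.append((x, y, z))
    return out
```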
- geometrical appearance is changed (e.g., from a flat appearance to an indented appearance) by assigning to the surface of each rendered region within indentation 905 a suitable orientation (for purposes of rendering), chosen to optically mimic the angle the surface would have if the 3-D flat geometry 901 comprised a geometrically indented region like that of 3-D indented geometry 903 ; but without necessarily changing the 3-D geometry to which it maps.
- normal maps 902 , 904 indicate by shading a changing elevation angle of a normal to the surface throughout region 906 (white is 90° elevation of the normal, while successively darker values represent successively decreased elevation values).
- normal maps 902 , 904 preferably include representation of azimuth, e.g., azimuth mapped from 0°-360° around concentric circumferences of indentation 905 .
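Decoding the (elevation, azimuth) encoding described above into a unit normal vector can be sketched as below: 90° elevation (the "white" value) points straight out of the surface, and azimuth rotates a tilted normal around that axis. The spherical-coordinate convention is an assumption for illustration.

```python
import math

def normal_from_angles(elevation_deg, azimuth_deg):
    """Convert an (elevation, azimuth) pair to a unit surface normal.
    90 degrees elevation points straight out of the surface."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))
```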
- Surface orientation as represented by a normal map does not necessarily follow the geometrical surface orientation (for example, FIG. 9A shows a flat geometry 901 paired to a normal map 902 that represents an indentation).
- FIGS. 10A-10B do provide an example of how a geometrical appearance can be changed (in that case to appear like a raised bump) by use of shading, without necessarily changing underlying geometrical positions.
- a rendering pipeline typically takes into account at least the relative angle of each surface normal and a light source in order to determine how much light is received at the camera. Then, for example (and other things being equal): when the relative angle is low, the surface is brighter; when the relative angle is high, the surface is darker.
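The angle-dependent brightness rule just described is the standard diffuse ("Lambertian") term, which can be sketched minimally: brightness falls with the cosine of the angle between the surface normal and the direction toward the light, clamped at zero for back-facing light.

```python
import math

def lambert_brightness(normal, light_dir):
    """Diffuse brightness: cosine of the angle between the surface normal
    and the direction to the light, clamped to zero when facing away."""
    nx, ny, nz = normal
    lx, ly, lz = light_dir
    nn = math.sqrt(nx * nx + ny * ny + nz * nz)
    ll = math.sqrt(lx * lx + ly * ly + lz * lz)
    return max(0.0, (nx * lx + ny * ly + nz * lz) / (nn * ll))
```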
- the normal mapping algorithm also takes into account camera position and/or viewing angle-dependent surface reflection/scattering properties of the surface.
- Normal mapping uses include, for example: to create the appearance of surface irregularities where the 3-D geometrical data has none, to exaggerate the 3-D appearance of shapes in the 3-D geometrical data, and/or to smooth transitions between polygons where the 3-D geometrical data describes abrupt changes (for example, between polygons in a mesh).
- normal mapping (and a normal map, supplied as part of the geometrical rendering data 1121 ) has particular application for the showing of tissue deformations such as swelling (e.g., to indicate tissue damage) and indentation (e.g., to indicate probe-tissue contact).
- 3-D structure rendered in a scene is geometrically represented by geometrical rendering data 1121 .
- 3-D positions are one part of the geometrical rendering data.
- Data used to affect geometrical appearance such as by use of normal maps (apart from use to define fine-grain texture) are considered to comprise a second part of the geometrical rendering data 1121 .
- the geometrical rendering data 1121 comprise mesh data; for example as commonly used in defining structures for computerized visual rendering of 3-D structures.
- Geometrical rendering data 1121 specify positions (and usually also connections among positions, and/or positions joined by the extent of a common surface and/or material volume), corresponding to positions of surfaces of a target body tissue region to be visually rendered for presentation.
- the geometry of positions interior to the surface is also defined and/or represented.
- presentation optionally includes the use of transparency and/or cross-sectional views, whereby an interior portion of a tissue region is made visible.
- geometrical rendering data 1121 are derived from anatomical data 31 ; for example, appropriately segmented 3-D medical image data.
- anatomical data 31 include specification of tissue region thicknesses, for example, thicknesses of heart walls.
- Heart wall thickness is optionally obtained from, for example: atlas information (optionally for a population corresponding to the current patient), modified atlas information (for example, scaled according to anatomical landmark correspondence, heart rate, and/or point observations), and/or imaging of the patient (for example, one or more of CT, MRI, and/or nuclear imaging techniques).
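The "modified atlas" option above, scaled according to anatomical landmark correspondence, can be sketched as a simple proportional rule: an atlas wall thickness is scaled by the ratio of a patient landmark distance to the corresponding atlas landmark distance. The linear scaling rule and the function name are assumptions for illustration.

```python
def scaled_wall_thickness_mm(atlas_thickness_mm,
                             atlas_landmark_mm,
                             patient_landmark_mm):
    """Scale an atlas heart-wall thickness by the ratio of a patient
    landmark distance to the corresponding atlas landmark distance."""
    return atlas_thickness_mm * (patient_landmark_mm / atlas_landmark_mm)
```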
- the appearance of the raw geometrical rendering data 1121 that is finally presented by a user interface 55 is also determined in part by the assignment to the geometry of material appearance properties (MAPs); that is, properties affecting the appearance of materials represented in the rendered image.
- MAPs comprise any properties associated to positions (typically positions of a “virtual material”, as next described) in a virtual environment for visual rendering according to simulated optical laws, and which affect how a surface and/or its enclosed volume are visualized within a 3-D rendered space.
- MAPs may define color, texture, transparency, translucency, scattering, reflectance properties, and the like.
- MAPs are usually but not only assigned to surface positions defined by the geometrical rendering data.
- MAPs are optionally assigned to volumes defined by surfaces specified by the geometrical rendering data 1121 .
- MAPs can also be assigned to the virtual environment (e.g., as lighting parameters) in such a way that they selectively affect material appearance at different positions.
- MAPs are used in part to define surface textures, for example by use of bump mapping (a type of normal mapping technique).
- In some embodiments, creating the visual rendering may include surfaces and/or volumes comprising a “virtual material”; for example, a virtual material having a visual appearance of myocardial tissue, and used in the representation of a heart wall defined by two surfaces.
- a virtual material in some embodiments, is subject to simulated optical rules approximating processes such as reflection, scattering, transparency, shading, and lighting.
- Not every optical rule used in visual rendering is a copy of a real-world physical process; the art of computer rendering includes numerous techniques (for achieving both realistic and deliberately unrealistic results) that apply simulated optical rules that have no direct physical equivalent. Normal mapping has already been mentioned as a technique which can be applied to change a texture and/or geometrical appearance.
- Another example of a simulated optical rule is ambient occlusion.
- Ambient occlusion is an efficiently calculable method of simulating the effect of ambient lighting, but the occlusion is defined as a mapped property of an object's surface, rather than as an effect of light emitted from positions in the environment.
- a virtual material optionally also defines material properties that are not directly either geometrical or “of appearance”, for example, density, viscosity, thermal properties, and/or elastic properties. Insofar as these properties do in turn (in a given embodiment) affect the definition of MAPs (for example, via calculations of one or more simulators 1110 ), they are optionally treated as parts of material appearance properties data 1122 , without actually comprising MAPs in themselves. Additionally or alternatively, non-appearance properties, particularly those that affect how geometry changes (such as thickness, density, velocity, viscosity, and/or elasticity), are optionally considered part of the geometrical rendering data 1121 insofar as they affect geometrically apparent behaviors of the material (e.g., how the material changes in shape).
- geometrical effects of tissue-probe interactions on a simulated tissue region are assigned based on the output of one or more simulators 1110 ( FIG. 7 ).
- sensing data 1101 and/or treatment status data 1102 are used directly or indirectly as input to one or more simulators 1110 (e.g., simulators 1111 , 1112 , 1113 , and/or 1114 ) that make adjustments to a modeled appearance state 1120 of the tissue based on inputs received, and one or more simulated aspects of tissue physiology, geometry, and/or mechanics.
- the modeled appearance state 1120 includes the geometrical rendering data 1121 and material appearance properties data 1122 in a form suitable for being operated on by the simulators 1110 ; it may also be or comprise a renderable model state 1103 suitable for rendering for presentation, or else be convertible to a renderable model state 1103 .
- modeled appearance state also includes data indicating the probe state 1123 .
- Simulators 1110 also optionally receive as starting input anatomical data 31 and/or tissue state data 1104 .
- simulators 1110 optionally maintain their own internal or mutually shared simulation states.
- simulators 1110 use motion simulation services exposed by a graphical game engine that can produce geometrical changes to a scene based, for example, on simulated collisions among scene elements, gravity effects, velocity, momentum, and/or elasticity.
- the inputs comprise direct and/or transformed use of one or more of the interaction data types described in relation to block 110 .
- Direct sensing input: In some embodiments, adjustment of the simulation scene is implemented based directly on sensing data 1101 . For example, a pressure reading from a pressure sensor 14 is optionally mapped directly to a geometrical displacement according to the measured pressure.
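The direct pressure-to-displacement mapping just described can be sketched as a linear rule with saturation. The gain and maximum depth below are illustrative values chosen for the sketch, not parameters from the patent.

```python
def indentation_depth_mm(pressure_g, gain_mm_per_g=0.3, max_depth_mm=4.0):
    """Map a measured contact pressure (grams-force) to a rendered
    indentation depth (mm), linearly, saturating at max_depth_mm."""
    return min(max_depth_mm, gain_mm_per_g * pressure_g)
```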
- a more involved simulation is performed; wherein probe interaction with a virtual material representing tissue is, in at least one aspect, physically and/or physiologically simulated in order to produce a new modeled appearance state.
- In some embodiments, use of sensing data 1101 is indirect, following interpretation by one or more physiology trackers 1106 .
- Physiology tracker 1106 is a module which accepts sensing data 1101 and generates an assessment of current physiological state based on the sensing data 1101 .
- sensing data 1101 comprises dielectric measurements that physiology tracker 1106 is configured to convert into assessment of tissue state, for example fibrotic, healthy, or edematous; for example as described in International Patent Application No. PCT/IB2016/052690, the contents of which are included by reference herein in their entirety.
- electrical activity originating in tissue indicating a functional state (e.g., general capacity to support electrical activity, and/or a feature of the activity itself) is measured and used as sensing input.
- the output of the physiology tracker 1106 from one or more of these inputs is optionally in terms of one or more states such as tissue thickness (e.g., heart wall thickness), lesion depth, lesion volume, degree of lesion transmurality, characterization of tissue edema, characterization of functional activity and/or inactivation, a classification as to a potential for tissue charring, and/or a classification as to a potential for or occurrence of steam pop.
- steam pop is a phenomenon occurring during ablation in which tissue overheating produces steam that may escape suddenly, potentially with an audible pop.
- a physiology simulator 1114 and/or an ablation physics simulator 1112 configured to convert such states into MAPs, other virtual material properties, and/or geometrical effects that indicate the tissue state(s) calculated from the measurements.
- the tissue state interpreted from the sensing input also affects mechanical properties used, for example, by a contact physics simulator 1111 and/or an injection simulator 1113 .
- physiological tracker 1106 is optionally implemented as part of one or more simulators 1110 producing changes to a modeled appearance state 1120 .
- the module configuration is more like that of direct sensing input, with the simulation of appearance integrated with physiological interpretation of the sensing data.
- Probe position tracker 1107 is a module that accepts appropriate sensing data 1101 (e.g., electromagnetic field navigation data, acoustic tracking data, and/or imaging data) and converts it to a measurement of the position (e.g., a measurement of the location and/or a measurement of the orientation) of a probe such as catheter probe 11 , for example as described in International Patent Application No. PCT/IB2016/052687. It optionally comprises position analyzer 20 .
- position tracker 1107 implements processing to massage outputs of position analyzer 20 in view of the current state of the scene simulation—for example, to recalibrate sensed position data to positions compatible with the scene simulation.
- position tracker 1107 integrates position data from a plurality of position inputs.
- Optionally position determination includes determination of tissue contact force and/or quality, using a force sensor on the probe, and/or for example as described in International Patent Application No. PCT/IB2016/052686, the contents of which are included by reference herein in their entirety. Additionally or alternatively, on-line imaging data (e.g., ultrasound and/or angiographic images) are used, intermittently and/or continuously, to determine and/or verify probe position.
- Probe position determinations are optionally used as inputs to any of simulators 1110 ; for example in order to assign particular positions to measurements of other tissue states/properties, and/or to help characterize changes induced by probe interactions with tissue (e.g. geometrical distortions of tissue introduced by touching contact with the probe, and/or simulated effects of treatment procedures). It is a potential advantage to implement probe position tracker 1107 as a distinct module that can be treated as a computational “service” to any appropriate simulator 1110 . However, it should be understood that probe position tracker 1107 is optionally implemented as part of one or more simulators 1110 producing changes to a modeled appearance state 1120 maintained by interaction analyzer 21 .
- Treatment status input In some embodiments, simulation is implemented based on treatment status data 1102 .
- Treatment status data 1102 include data indicating the operation and/or status of a treatment modality—for example, power, control parameters (e.g., of signal frequency, phase, and/or timing), and/or monitoring data.
- treatment status data are applied directly to modeled appearance state 1120 ; for example, as an indentation or other deformation at a position of treatment modality activation.
- at least one aspect of the tissue and/or tissue/probe interaction is physically and/or physiologically simulated in order to produce a new modeled appearance state 1120 , based on the treatment status data.
- a physiology simulator 1114 receives input indicating that a probe-delivered treatment operation has occurred at some particular position (optionally along with parameters of the treatment operation).
- Physiology simulator 1114 is optionally configured to model the reaction of tissue to the treatment, instantaneously (for example, due directly to energy delivered by an ablation treatment), and/or over time (for example, as an edematous reaction develops in the minutes following an ablation treatment).
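The over-time reaction just described can be sketched as a swelling amplitude that rises toward a maximum with an exponential time constant after the ablation event. The maximum amplitude and time constant are illustrative assumptions, not physiological constants from the patent.

```python
import math

def edema_amplitude_mm(t_since_ablation_s, max_mm=2.0, tau_s=300.0):
    """Edema swelling amplitude developing in the minutes after an
    ablation event: exponential approach to max_mm with time constant tau_s."""
    return max_mm * (1.0 - math.exp(-t_since_ablation_s / tau_s))
```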
- an injection simulator 1113 receives treatment status data indicating that a material injection is occurring.
- Injection simulator 1113 is optionally configured to model an appropriate reaction of tissue to the injected substance (e.g., swelling to indicate the injected volume, and/or to indicate injury response to the injection).
- the reaction is optionally immediate, and/or includes a slow-developing component as the material diffuses from the injection site.
- changes in geometry due to the addition of material volume to the tissue are also modeled.
- a rendering of the modeled appearance state is created for presentation.
- FIGS. 2A-2E, 3A-3L, 4A-4D, and 5A-5B Operations of some exemplary simulators 1111 , 1112 , 1113 , and/or 1114 are described herein in the context of the examples of FIGS. 2A-2E, 3A-3L, 4A-4D, and 5A-5B .
- a modeled appearance state 1120 is converted to a renderable model state 1103 and provided to a display module 1130 that converts (renders) the renderable model state into at least one image comprising a visually rendered representation of the intrabody region 7 .
- modeled appearance state 1120 is directly represented as a renderable model state 1103 (this is a potential advantage for tighter integration of the simulation with a game engine driving its rendering and presentation).
- the at least one image is displayed by one or more graphical displays of a user interface 55 .
- User interface 55 in some embodiments, comprises one or more displays, for example a computer monitor, virtual reality goggles, and/or 2-D or 3-D projection device.
- user interface 55 also comprises one or more user input devices that can be used for tasks such as selecting operating modes, preferences, and/or display views. It is noted that insofar as catheter probe position sensing affects simulation and/or display, catheter probe manipulation also acts as a special form of user input device; but for purposes of the descriptions herein such catheter probe sensing inputs should be considered distinct from inputs provided through user interface 55 .
- the display module 1130 renders from one, two, three, or more viewpoints simultaneously.
- rendering is performed (and the resulting images are displayed) at a frame rate sufficient to produce perceived motion (herein, such a frame rate is termed a motion frame rate)—for example, at least 10-15 frames per second; and optionally at least, for example, 15, 20, 30, 50, 60, or 100 frames per second (fps), or another greater or intermediate value.
- Lower frame rates (e.g., 10-20 fps) are optionally used; however, more fluid motion is potentially less fatiguing and/or more precise for guiding actions based on events in the simulation scene.
- Trans-flicker fusion frequency frame rates are optionally preferred for immersive, virtual reality (VR) user interface implementations; higher frame rates potentially help mitigate VR motion sickness.
- display module 1130 includes a computer-implemented software module comprising the rendering pipeline 1230 of a 3-D graphics engine 1200 (software environment), such as is provided with the Unreal® or Unity® graphical game engine, or another game engine. Some general aspects of 3-D graphical game engines are discussed in relation to FIG. 8 , herein.
- the conversion of a modeled appearance state 1120 into a renderable model state 1103 comprises the creation and/or instantiation of computer data and/or code structures that are directly used by the rendering pipeline of the 3-D graphics engine 1200 .
- components of interaction analyzer 21 (for example, any of simulators 1110 ) are provided as functions (e.g., classes, hook implementations, etc.) making use of the application programming interface (API) of such a 3-D graphics engine 1200 .
- flow optionally returns to block 110 to receive more interaction data, or else (if adaptive visual rendering is to be suspended), the flowchart ends.
- FIG. 8 schematically represents components, inputs, and outputs of a graphical game engine 1200 operating to manage and render scene elements 1220 to motion frame-rate images 1240 , according to some embodiments of the present disclosure.
- a graphical game engine 1200 is used not only to render images (for example as described in relation to block 114 of FIG. 1A ), but also to provide more generally the data structure and code framework of the “scene” and how it changes in response to time and/or input.
- a graphical game engine 1200 comprises a collection of computer software components exposing one or more application programming interfaces (APIs) for use in describing, instantiating (initializing and maintaining), continuously updating, rendering, and/or displaying of scene elements 1220 .
- Examples of graphical game engines include the Unreal® and Unity® graphical game engines.
- the scene elements 1220 provided for the operations of graphical game engine 1200 optionally include, for example, descriptions of terrain 1221 , objects 1224 , cameras 1223 , and/or elements for lighting 1222 .
- definitions of scene elements 1220 are derived from geometrical rendering data 1121 and/or MAPs data 1122 .
- Definitions are optionally expressed in terms of geometrical-type scene data 1225 (e.g. model assets, shapes, and/or meshes), and/or appearance-type scene data 1226 (e.g., image assets, materials, shaders, and/or textures).
- geometrical rendering data 1121 and MAPs data 1122 are initially produced already in a format that is directly used by graphical game engine 1200 .
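The split between geometrical-type scene data 1225 and appearance-type scene data 1226 can be sketched with a minimal data structure. The names SceneElement and apply_map below are illustrative assumptions for this sketch, not part of any engine's API:

```python
from dataclasses import dataclass, field

@dataclass
class SceneElement:
    """Minimal stand-in for a scene element 1220.

    mesh holds geometrical-type data (shapes/meshes, data 1225);
    material holds appearance-type data such as MAPs (data 1226).
    """
    name: str
    mesh: list                                    # e.g., vertex positions
    material: dict = field(default_factory=dict)  # e.g., texture/shader params

def apply_map(element: SceneElement, prop: str, value) -> SceneElement:
    """Set a material appearance property (MAP) without touching geometry."""
    element.material[prop] = value
    return element
```

Keeping the two kinds of data separate mirrors how an engine can re-render the same mesh with different materials, e.g., re-coloring tissue as it is lesioned without regenerating its geometry.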
- scene elements 1220 are provided with simulated dynamic behaviors by an iterated series of calculated scene adjustments 1210 .
- Scene adjustments 1210 are optionally implemented by a variety of software components, for example, motion physics services 1212 , collision detection service 1213 , and/or scripts 1211 . These are examples; graphical game engines 1200 optionally implement additional services, e.g., "destructibility".
- Scripts 1211 can be provided to simulate, for example, autonomous behaviors and/or the effects of triggered events. Scripts 1211 are optionally written in a general-purpose computer language taking advantage of APIs of the graphical gaming engine 1200 , and/or in a scripting language particular to an environment provided by the core graphical gaming engine 1200 .
- Graphical gaming engines optionally also accept integration with plugin software modules (plugins, not shown) that allow extending the functionality of the core graphical game engine 1200 in any of its functional aspects.
- plugins that perform functions related to updating the scene state are also encompassed within the term “script” 1211 .
- all or part of any of simulators 1110 is implemented as a script 1211 .
- scripts 1211 (optionally including plugins) and scene elements 1220 are considered part of the graphical game engine 1200 as a functional unit.
- In some embodiments, only the core graphical game engine is used.
- graphical game engines 1200 accept user input 1214 (optionally including, but not limited to, inputs from user interface 55 devices such as mouse, keyboard, touch screen, game controller, and/or hand motion detector; and for some embodiments of the current invention, optionally including data provided as input that indicate probe positions, treatment modality operation, etc.).
- a typical graphical game engine also includes a rendering pipeline 1230 that may include one or more stages of 3-D rendering, effects application, and/or post-processing, yielding at least one stream of frame-rate images 1240 .
- the stages of the rendering pipeline 1230 include modules that implement simulated optical algorithms—not necessarily directly based on real-world physical laws—generally selected to produce a rendered result that visually gives to elements in the rendered scene the appearance of material substances.
- Table 1 includes some examples of how graphical game engine features and concepts are optionally used in some embodiments of the current invention:
- Probe 11 is optionally represented as a “game” object, and may optionally serve as a viewpoint anchor like avatars and/or tools in certain 3-D games.
- Significant features of the anatomical environment such as scars, lesions, and/or regions of edema, are optionally implemented as appropriately positioned objects, e.g., embedded in an environment of surrounding tissue.
- Guides and markers are optionally implemented as game objects.
- Assets: Tissue, probe, guide, and/or other objects and/or their appearances are optionally instantiated from assets which represent available types of objects, their behaviors, and/or their appearances.
- Cameras 1223: Optionally define flythrough viewpoint(s) of the anatomy traversed by the catheter probe 11, and/or overview viewpoint(s) (showing probe and tissue from a remote viewpoint).
- the position of catheter probe 11 defines one or more camera viewpoints by its position and/or orientation.
- Lighting 1222: In addition to providing general lighting of the tissue being navigated, lighting 1222 is optionally defined to provide highlighting, e.g., of regions pointed at by probe 11, indications of environmental state by choice of light color, light flashing, etc.
- Lighting is optionally used to implement MAPs non-locally (that is, a defined light source optionally is defined to illuminate a view of simulated tissue to selectively change its material appearance, while not being part of the material properties of appearance of the simulated tissue as such).
- Image Assets, Materials, Shaders, and Textures 1126: MAPs that are also material properties of appearance, for example, defining the appearance of tissue as healthy muscle, edematous, fibrotic, heated, cooled, etc.
- Particle Systems: Type of object optionally used for providing effects such as smoke/steam-like indications of ablation heating, spray, transfer of energy, etc.
- Collision Detection 1213 and Motion Physics Service 1212: Optionally used for interactions between probe and the geometry of the anatomical environment; optionally including deformation of the probe and/or the anatomy.
- In graphical game engines, the term "physics" is generally limited to physics affecting movement/deformation of game objects such as collision, gravity, or destruction.
- simulators 1110 include simulation of other “physics”, such as temperature, physiological change, etc.
- Scripts 1211: Optionally used for animating and/or showing changes in dynamic features of the environment (lighting, terrain), view (camera position) and/or game objects, optionally gradually over a period of time: for example, development of lesions, development of edema, heating/cooling effects, and/or injection effects.
- scripts are used to implement dynamic appearance, even though the underlying state representation is constant (e.g., coruscating and/or pulsing effects).
- User Input 1214: Optionally comprises inputs reflecting changes in probe position (e.g., output of probe position tracker 1107) for guiding navigation through the scene, and/or determining camera position.
- Some treatment status data 1102 are optionally interpreted as inputs reflecting operator interaction with the scene.
- Multiplayer: During a procedure, there is optionally a plurality of different operators working simultaneously with a system according to some embodiments of the current invention. For example, while a primary physician manipulates the intra-body probe, one or more additional workers are optionally reviewing the simulated environment to locate next target sites for the probe, evaluate effects of previous ablations, etc.
- there is more than one probe in use at a time each of which is optionally treated as a different “player” with its own associated camera views and/or interaction capabilities.
- FIG. 1B is a schematic flowchart illustrating the calculation and display of a rendered image of a simulation scene comprising a view of simulated tissue having a geometry and/or geometrical appearance dynamically changing as a function of time to represent changes developing subsequent to a triggering interaction between the tissue and a catheter probe, according to some embodiments of the present disclosure.
- simulation of probe-tissue interactions includes simulation of tissue effects (e.g., injury response) developing substantially independently of continuing inputs from probe-tissue interaction data.
- the flowchart of FIG. 1B branches off from certain input cases of the flowchart of FIG. 1A , wherein geometrical effects develop at least partially concurrently with (and optionally unsynchronized to) geometrical effects which immediately track changes in inputs.
- initial interaction data is received (optionally entering the flowchart from block 110 of FIG. 1A ).
- the simulated geometry evolves according to the results of pre-set rules which operate substantially independently of further input for a time.
- a potential advantage of this approach is to allow continuously updated visualization of tissue changes, even when no new sensing data has been obtained to confirm them.
- the flowchart optionally begins after a triggering probe-tissue interaction has occurred which is to be modeled as provoking changes to the scene which continue after the trigger time t0. For example, an input indicating that ablation energy has been delivered triggers the operations of the flowchart.
- operations of the flowchart of FIG. 1B are implemented by a script 1211 . Additionally or alternatively, operations of the flowchart are implemented by a simulator 1110 , for example, physiology simulator 1114 .
- one or more geometries and/or geometrical appearances are set to an initial state (an existing state is optionally used as the initial state) and a simulation function is selected and assigned to change the geometries and/or geometrical appearances as a function of time according to parameters set from inputs describing the probe-tissue interaction. These inputs may be included in the interaction data received at block 110 .
- the simulation function is configured to evolve according to the state of a timer.
- a physiology simulator 1114 is configured to emulate effects of edema developing post-ablation, based on parameters such as the position, amount of energy delivery, and/or duration of energy delivery causing the ablation.
- Edema is optionally modeled to develop over the course of several minutes (for example, 2, 5, 10, 15, 20 or another number of minutes).
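One way a physiology simulator 1114 might parameterize such a post-ablation time course is a saturating function of time whose amplitude scales with delivered energy. The function name and all constants below are purely illustrative assumptions for this sketch, not values from the disclosure:

```python
import math

def edema_fraction(t_sec, energy_joules,
                   full_course_sec=600.0, energy_gain=0.002):
    """Fraction of fully developed edema at t seconds after ablation.

    Amplitude grows with delivered energy and saturates at 1.0; the
    time course approaches steady state over roughly full_course_sec
    (e.g., 10 minutes). Illustrative model only.
    """
    amplitude = min(1.0, energy_gain * energy_joules)
    return amplitude * (1.0 - math.exp(-3.0 * t_sec / full_course_sec))
```

The returned fraction can then drive both geometrical swelling of the simulated tissue region and coordinated MAP changes (e.g., increasing "redness").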
- modeled changes in geometry and/or geometrical appearance simulate changes in muscle tone, e.g., vasodilation or vasoconstriction.
- the geometry and/or geometrical appearance is optionally modeled to show thickening and/or thinning, increase and/or decrease in surface height variation over a surface area, and/or another deformation, for example: dimpling, puckering, “goose-pimpling”, stretching, collapsing, expanding, distending, and/or shrinking.
- Lumenal structures optionally show change in cross-sectional shape (e.g., radius).
- one or more MAPs are changed in coordination with change in geometry and/or geometrical appearance.
- Adjusted MAPs optionally include, for example, those that can be modified to show increasing “redness” of the tissue with time to indicate swelling, “whiteness” or “greyness” to indicate loss of perfusion, color change to indicate change in temperature, etc.
- geometrical effects are applied to indicate contractile state (for example, of cardiac muscle, or gastrointestinal tract motion).
- simulations of contraction are triggered by measurements of heartbeat and/or pulse phase, and/or of autonomic nervous system activity.
- the geometrical effects are preferably simulated to be in synchrony with what is expected to be actually occurring in the tissue that the simulation describes.
- the simulation is optionally different from reality in one or more respects; for example, amplitude is optionally adjusted.
- dynamic adjustment of heart size in a rendered view of a simulated scene is based on heart rate.
- this is implemented by dynamic adjustment of the geometrical rendering data representing the heart shape.
- the adjusting comprises adjusting a static size of one or more heart chambers (e.g., a lumenal volume of the heart chambers, and/or a lumenal dimension of the heart chambers).
- the adjusting comprises selecting a range of heart chamber sizes simulated cyclically over the course of each heartbeat cycle, e.g., between changing minimum and/or maximum sizes.
- the adjustment of heart chamber size to larger or smaller sizes is accompanied by corresponding inverse adjustment of heart wall sizes to smaller or greater thicknesses.
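A sketch of how a rate-dependent chamber scale could be computed is given below; the function name and all constants are illustrative assumptions, not values from the disclosure:

```python
import math

def chamber_scale(t_sec, heart_rate_bpm,
                  base_scale=1.0, beat_amplitude=0.08,
                  rate_ref_bpm=60.0, rate_gain=0.1):
    """Scale factor applied to rendered heart-chamber geometry.

    The static term shrinks the chamber at rates above rate_ref_bpm
    (reflecting reduced filling time); the cyclic term tracks the beat
    phase so rendered wall positions stay in step with the heartbeat.
    """
    static = base_scale * (1.0 - rate_gain *
                           (heart_rate_bpm - rate_ref_bpm) / rate_ref_bpm)
    phase = 2.0 * math.pi * (heart_rate_bpm / 60.0) * t_sec
    return static + beat_amplitude * math.sin(phase)
```

A corresponding inverse adjustment can then thin or thicken the rendered wall as the lumen scales, consistent with the inverse wall-size adjustment described above.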
- a potential advantage of these adjustments is to increase an accuracy and/or precision with which an intrabody probe (and in particular, an intracardial catheter probe) can be positioned, and/or with which the position of such a probe can be determined.
- positioning precision/accuracy with respect to one or more particular regions of heart wall tissue is potentially improved; for example, a nearest and/or a pointed-at region of heart wall tissue.
- a pointed at location is located along a longitudinal axis extending through the probe tip.
- Adjustment of a display to maintain an accuracy of positioning of the intracardial probe relative to the heart is implemented, in some embodiments, using one or more of the following methods.
- positioning changes of a probe relative to a heart wall due to heart size changes are at least partially represented to an operator by simulating relative movements and/or scaling of a rendered representation of an intrabody probe in a display, while suppressing at least part of the size changes undergone by the actual heart chamber represented in the display. For example, if heart chamber beats are at least partially suppressed, then changing actual probe position relative to the beating heart chamber walls is optionally displayed by movements of the probe itself.
- In some embodiments, inter-pulse heart chamber size changes (e.g., due to heartbeat rate changes) are at least partially suppressed in the display.
- scaling of detected intracardial probe movements is adjusted in a display so that relative positions of heart wall and probe remain synchronized between the actual tissue and probe pair, and a display of a simulated tissue and probe pair.
- the wave pattern to be simulated is determined at least in part from direct measurements of impulse wave propagation.
- the wave pattern is simulated from a generic heart tissue or other tissue model.
- the wave pattern is adapted according to knowledge about tissue state, for example, to indicate regions of weak and/or slow propagation attributed to states of fibrosis, perfusion state, and/or denervation.
- the degree of impulse transmission is itself modulated in simulations managed by physiology simulator 1114 ; for example, to reflect transmission effects of treatment activities such as lesioning, tissue cooling, injections, etc.
- the current state of the geometry and/or geometrical appearance (optionally including changes to MAPs) is rendered to a visual representation of the tissue with which the interaction occurred.
- the rendering makes use of a 3-D graphics engine, for example as described in relation to display module 1130 , and/or in relation to FIG. 8 and/or Table 1.
- the timer is incremented.
- Time-evolving geometry and/or geometrical appearance optionally evolve, for example, cyclically (for example, repeating a movement pattern), transiently (disappearing at the end of a generation cycle, for example, in a simulation of cooling from a heated condition or re-warming from a cooled condition), and/or to a new steady-state appearance (for example, edema that develops to a fully developed state during a period after ablation, and then persists beyond the period during which the tissue is simulated).
- sensing feedback is optionally integrated with the flowchart of FIG. 1B to create semi-open/semi-closed loop simulation: periods of open loop simulation producing results (e.g., geometrical effects) that are periodically verified, guided, and/or corrected according to sensed data.
- simulation of developing edema optionally proceeds independently as long as no further sensing data characterizing the edema state is available. However, if edema state is measured at some midpoint of the simulated edema time-course (for example, by use of dielectric measurements), then the simulation is optionally adjusted mid-course to reflect the sensed data. Adjustment is optionally immediate, and/or includes a period of interpolated adjustment (which potentially helps maintain the sense of presence in rendered views of the simulation scene).
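The "period of interpolated adjustment" can be sketched as a linear blend from the open-loop simulated value toward the sensed value over a fixed number of display frames. The function and parameter names here are hypothetical:

```python
def blend_to_measurement(sim_value, measured_value,
                         frames_since_fix, blend_frames):
    """Interpolate from an open-loop simulated value toward a sensed value.

    frames_since_fix counts display frames since the measurement arrived;
    after blend_frames frames the display tracks the measurement exactly,
    avoiding a visible jump that could break the sense of presence.
    """
    weight = min(1.0, frames_since_fix / float(blend_frames))
    return (1.0 - weight) * sim_value + weight * measured_value
```

For example, a simulated edema fraction of 0.8 corrected by a dielectric measurement of 0.5 would pass smoothly through intermediate values over the blend period rather than snapping to 0.5.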
- FIGS. 2A-2E illustrate a 3-D rendered display for indicating lesioning status to an operator, according to some exemplary embodiments of the present disclosure.
- FIGS. 2A-2E show a sequence of visual renderings of a single lesion over the course of the operation of an RF ablation probe to create it. This provides an example of how adjusted geometry and/or geometrical appearance can be used (optionally together with adjustment of MAPs) to convey to an operator a direct understanding of how use of an ablation probe is affecting target tissue.
- FIGS. 2A-2E comprise images (rendered in some embodiments in the rendering pipeline 1230 of a 3-D graphical game engine 1200 ) of an RF ablation probe 202 (corresponding, in some embodiments, to catheter probe 11 , wherein treatment element 8 is an ablation electrode, and treatment controller 13 operates to supply ablation energy to the RF ablation probe 202 ) and its position relative to tissue 205 targeted for ablation (e.g., part of body tissue region 7 ).
- the rendering is in color, and/or otherwise using applied MAPs conveying the vital appearance (e.g., properties of roughness, specular reflection, etc.) of the tissue (black and white is shown herein for purposes of illustration).
- RF ablation probe 202 is implemented as an object 1224 belonging to scene elements 1220 ( FIG. 8 ).
- Tissue 205 is optionally implemented as terrain 1221 or an object 1224 belonging to scene elements 1220 .
- FIG. 2A shows the moment of initial contact between probe 202 and tissue 205 .
- this view is triggered when contact is sensed by a sensor on the probe, such as a force sensor (an example of an “other sensor” 14 ) and/or dielectric sensing of contact (e.g., via dielectric property analyzer 22 ).
- the triggering mediated in some embodiments by interaction analyzer 21 (and optionally taking advantage of a collision detection service 1213 of a game engine 1200 ), is optionally visually implemented as a jump from a wider angle view with the probe out of contact to a close-up of the probe contacting tissue.
- transition from no-contact to contact is shown by a short bridging animation.
- continuous sensing of probe position and/or probe distance to the tissue wall allows any jump in a sensed transition between contact and non-contact to be smoothed out using actual position data.
- FIG. 2B in some embodiments, includes a visual indication of increased contact pressure between the tissue 205 and probe 202 comprising an indented region 204 .
- the deeper indented region 204 shows that pressure has been increased still further.
- the geometry and/or geometrical appearance modifications indicate sensed and/or calculated contact pressure; the appropriate transformation being calculated, for example, by contact physics simulator 1111 (which may in turn take advantage of motion physics services 1212 and/or collision detection service 1213 of game engine 1200 ).
- distances of the indentation deformation need not be exactly corresponding to deflection distances in the real tissue.
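For example, a saturating mapping from sensed contact force to displayed indentation depth keeps the rendering bounded and visually responsive without claiming quantitative fidelity; the constants below are illustrative assumptions:

```python
def indentation_depth_mm(contact_force_g,
                         max_depth_mm=3.0, half_saturation_g=20.0):
    """Displayed indentation depth for a sensed contact force (grams-force).

    Depth rises steeply at low forces and saturates toward max_depth_mm,
    so increasing pressure always reads visually as deeper indentation
    while the rendered deformation never grows without bound.
    """
    force = max(0.0, contact_force_g)
    return max_depth_mm * force / (force + half_saturation_g)
```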
- tissue 205 is shown in cross-section.
- the cross-sectional view also displays information about achieved lesion parameters such as lesion depth and/or lesion transmurality.
- transformation of geometrical position data is preferably used to show indentation changes.
- Geometrical appearance changes are optionally used as well; but preferably not used alone, since the edge-on view of a cross-section highlights the spatial position of surface contours.
- transparency effects are applied to allow seeing into a targeted volume of tissue. For example, before ablation begins, a local region of tissue selected by the position of probe 202 is shown with increased transparency. Optionally, as portions of the tissue become lesioned, they are represented in simulated display as more opaque; creating an ablation “island” that directly shows the progress of lesioning.
- a potential advantage of the transparency approach is to allow representation of lesioning progress from any arbitrary 3-D point of view including the targeted tissue region.
- FIG. 2C in some embodiments, there has been a slight increase in sensed contact (shown by increased indentation of indented region 204 ), and ablation by delivery of RF energy to the tissue from probe 202 has begun.
- a superficial lesioned portion 208 of tissue 205 is now shown, for example, in a lighter shade (in color, lesioned portion 208 is optionally colored a light grey compared to darker red vital tissue).
- lesioned portion 208 gradually increases in extent and/or degree of MAP change from the pre-lesioned state.
- FIG. 2D also indicates an increased pressure of contact by an indented region 204 in the tissue, while FIG. 2E shows pressure reduced.
- the geometrical deformation changes as tissue ablation proceeds (even for a fixed pressure), for example to indicate changes in tissue elasticity and/or volume.
- this progression is based on inputs describing the operation of the treatment modality (ablation, in the illustrated example). For example, inputs describing power, duration, and/or contact quality are factored into a simulation (e.g., by an ablation physics simulator 1112 ) linked to how the tissue is displayed in its geometrical and/or material appearances.
- operation of an ablation physics simulator 1112 includes thermal modeling (thermal simulation), based on local tissue region properties, for example, of local tissue type, thickness, thermal conductivity, and/or thermal exchange (e.g., between tissue and flowing blood).
- at least part of the information providing local tissue type and/or thickness is obtained based on dielectric properties calculated from measurements of an alternating electromagnetic field obtained from a sensing electrode 3 at or near the position of the lesion 209 .
- calculated dielectric properties are used as indications of lesion state (e.g., size, transmurality, completeness and/or irreversibility), for example as described in International Patent Application No. PCT/IB2016/052690, the contents of which are incorporated by reference herein in their entirety.
- accuracy of transmurality has been found to be about ±1 mm.
- 100% sensitivity and specificity in predicting lesion transmurality was found, while in humans, at least 90% specificity and sensitivity was found. Specificity is the percentage of actually well-ablated areas that were dielectrically identified as well-ablated; sensitivity is the percentage of actually partially ablated areas that were dielectrically identified as partially ablated.
- the progression during lesioning is based on inputs describing sensed data reflecting one or more treatment effects, for example, measured temperature and/or changes in dielectric properties as tissue begins to break down.
- probe-based temperature sensing where available, is limited in resolution and/or depth, so that completely sensing-based adjustment may be difficult or impossible to obtain.
- sensed data may nevertheless be used as input to an ablation physics simulator 1112 that extrapolates lesion state through a 3-D block of tissue.
- the extrapolated state is used as a corrective and/or calibrating input to an ablation physics simulator 1112 .
- one or more additional indications of how lesioning is proceeding are provided as part of the rendered image.
- “steam” 207 is shown arising from the lesion point, for example, when a simulated tissue temperature exceeds a threshold.
- the threshold may be, for example, a threshold at which lesioning occurs, a threshold above which a danger of effects such as steam pop or charring occurs, or another threshold.
- Different characteristics of the “steam” could be used, for example, conversion to black (or increasingly black) “smoke” in case of increased danger of excessive heating.
- such steam- and/or smoke-like effects are implemented using a particle system facility provided by a graphical game engine.
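Selecting the particle effect from a simulated temperature can be as simple as a pair of thresholds; the threshold values and effect names here are illustrative placeholders, not clinical figures:

```python
def steam_effect(simulated_temp_c,
                 lesioning_threshold_c=50.0, danger_threshold_c=90.0):
    """Choose a particle-system effect for the rendered lesion site.

    Below the lesioning threshold no effect is shown; above it, white
    "steam" indicates active lesioning; above the danger threshold the
    effect darkens to "smoke" to warn of steam pop / charring risk.
    """
    if simulated_temp_c >= danger_threshold_c:
        return "black_smoke"
    if simulated_temp_c >= lesioning_threshold_c:
        return "white_steam"
    return None
```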
- FIGS. 3A, 3D, 3G, and 3J schematically represent a sequence of rendered views of a rendered catheter probe 11 A (representing a catheter probe 11 ) passing through a rendered tissue wall region 50 , according to some embodiments of the present disclosure.
- FIGS. 3B, 3E, 3H, and 3K each of which schematically represents a graph of position versus time and measured contact versus time for the catheter probe 11 rendered as rendered catheter probe 11 A of FIGS. 3A, 3D, 3G, and 3J , according to some embodiments of the present disclosure.
- FIGS. 3C, 3F, 3I, and 3L schematically represent an ultrasound image at a cross-section of a heart at the atrial level, corresponding to the sequence of FIGS. 3A, 3D, 3G, and 3J , according to some embodiments of the present disclosure.
- the geometry of a three-dimensional simulation of a tissue wall region 50 is updated for displaying at a motion frame rate.
- the frame updating may be based on information received from one or more sensing modalities.
- the information may be received as catheter probe 11 interacts with a tissue wall.
- FIGS. 3B, 3E, 3H, and 3K and FIGS. 3C, 3F, 3I, and 3L represent different examples of sensed inputs related to tissue-catheter probe interactions, based on which (in any suitable combination) the tissue deformations of FIGS. 3A, 3D, 3G, and 3J are simulated.
- the sensing modalities optionally comprise modalities that are non-imaging in nature (e.g., catheter probe position tracking data, and/or probe-sensed parameter time-course data), and/or comprise images giving incomplete view coverage of the simulated tissue region (for example, cross-sectional images).
- New sensing data is optionally acquired faster, slower, or at the same rate as the simulation appearance is updated.
- Simulation and visualization updating is optionally in correspondence with states indicated by recently sensed data. For example, when sampling is slow and/or intermittent, the current simulation state is optionally extrapolated from recent data according to one or more trends therein.
- simulation updating is delayed from the acquisition of real-time data (for example, delayed to a buffer of at least two recent samples, and/or for example, by up to about 250 msec), which optionally allows smoothing interpolation between actually measured sensing data points in exchange for a certain amount of lag.
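Such a lag buffer can be sketched as linear interpolation over recently received (time, value) samples, with the display clock trailing the newest sample; the function name is a hypothetical one for this sketch:

```python
def interpolate_sample(samples, display_time):
    """Linearly interpolate sensed data at a (lagged) display time.

    samples: list of (timestamp, value) pairs sorted by timestamp.
    display_time normally trails the newest sample (e.g., by ~250 msec),
    so the display moves smoothly between real measurements instead of
    jumping each time a new sample arrives.
    """
    if display_time <= samples[0][0]:
        return samples[0][1]
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= display_time <= t1:
            w = (display_time - t0) / (t1 - t0)
            return v0 + w * (v1 - v0)
    return samples[-1][1]  # past the newest sample: hold the last value
```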
- the X-axes of graphs 310 of FIGS. 3B, 3E, 3H, and 3K represent relative time.
- the Y-axes overlappingly represent sensed catheter probe position advance above a baseline position 311 (dashed lines including points 312 , 314 , 316 , and 318 ), and a measure of sensed catheter probe-tissue contact (solid lines including points 313 , 315 , 317 , and 319 ).
- the measure of sensed catheter probe-tissue contact may include, for example, force and/or dielectrically measured contact quality.
- the position of contacted region 302 of the actual tissue wall portion represented by rendered tissue wall region 50 relative to catheter tip 301 is represented in the graphs by dotted line 309 .
- probe-tissue contacts causing and/or represented by geometrical tissue deformations within the body are measured using one or more sensing modalities (for example, sensing by a force sensor, by sensing of impedance properties, or another sensing modality) that are only partially indicative of the overall geometrical effects of the contact.
- the one or more sensing modalities provide information as to the variation over time of a limited number of parameters communicated in the interaction data; for example, one, two, three, or more parameters.
- sensing information that encodes position of probe 11 is available.
- the position of probe 11 may be indicated by the interactive information absolutely and/or relative to the tissue portion represented by rendered tissue region 50 .
- the sensing information may be indicative of contact quality and/or contact force measured to exist between probe 11 and the tissue portion represented by rendered tissue region 50 .
- these measurements are used to guide changes made to simulated tissue region 50 and rendered probe 11 A, and the model rendered in turn to a sequence of images that visually simulate geometrical effects associated with the sensed information.
- the simulated model comprises a mechanical model of a tissue wall, including, for example, properties of tissue wall thickness, elasticity, density, velocity, and/or viscosity suitable to the tissue being simulated.
- Simulation of deformations optionally comprises applying a force commensurate with sensed forces and/or positions.
- simulated geometrical effects are generated to faithfully visualize those effects that are actually occurring.
- a mechanical model of the tissue wall is preferably provided with parameter values yielding realistic-looking behavior in reaction to applied simulated force and/or displacement.
- Graphical game engines commonly expose services for the simulation of physical interactions of scene elements, providing a potential advantage for ease of implementation.
- simulated geometrical effects may convey to an operator information about the contact, even though actual geometrical distortions (e.g., geometrical distortions introduced by touching contact with a probe, which may comprise pressing on tissue by the probe) are potentially different than the simulation shows: e.g., smaller in size, and/or modeled to simply indicate stages in deformation, without quantitative fidelity.
- a simulated mechanical model is optionally implemented with parameters giving model behaviors that are potentially different from the actual case.
- the model is implemented more simply; for example, as a mapping of a range of geometrically distorted wall shapes to one or more corresponding ranges of sensed input values.
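Such a mapping can be sketched as choosing, from a precomputed family of geometrically distorted wall shapes, the shape whose position in the family corresponds to the current sensed value; the function name and ranges are illustrative:

```python
def shape_index(sensed_value, input_min, input_max, n_shapes):
    """Map a sensed input onto an index into n_shapes precomputed
    geometrically distorted wall shapes (0 = resting shape,
    n_shapes - 1 = most deformed). Values outside
    [input_min, input_max] clamp to the nearest end of the range.
    """
    if input_max <= input_min or n_shapes < 1:
        raise ValueError("invalid mapping parameters")
    w = (sensed_value - input_min) / (input_max - input_min)
    w = min(1.0, max(0.0, w))
    return round(w * (n_shapes - 1))
```

A renderer can then display (or blend between) the selected shapes as the sensed contact value rises and falls, without running a full mechanical simulation.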
- image information at least partially describing geometrical changes is available to the operator.
- the image information may be spatially incomplete: for example, an ultrasound cross-section that illustrates deformation in a planar cross-section of the tissue wall portion that an intrabody probe is penetrating.
- an imaging modality other than ultrasound is used, for example, X-ray fluoroscopy.
- the imaging modality provides images at a rate sufficient to guide manipulation of the catheter probe 11 , but this can optionally be a rate below motion frame rate; for example, at least 2-5 Hz.
- FIGS. 3C, 3F, 3I, and 3L represent a time sequence of ultrasound images measured from an ultrasound probe located in the lumen of a left atrium 321 (approximately at the apex of ultrasound images 320 ), as a probe 11 crosses into the left atrium 321 from a right atrium 322 .
- rendered tissue wall region 50 and/or imaged tissue wall portion 50 B represent a tissue wall portion comprising an interatrial septum which is to be crossed by a catheter probe 11 at a contact region corresponding to contacted region 302 , for example the foramen ovale (which may be a weak spot in the interatrial septum, or even a residual opening between the two atria).
- the ultrasound images 320 do not simultaneously show in imaged tissue wall portion 50 B the whole three dimensional structure of the tissue wall portion represented by rendered tissue wall region 50 , they potentially do reveal partial information about how the wall is deforming.
- the partial information is used in a simulation of tissue-wall interaction dynamics to show a live-updated 3-D view of the tissue wall.
- a curve extending through the image plane along the visualized extent of the interatrial septum is optionally used as a guide, to which a simulated tissue wall geometrical distortion in that plane is fit; and moreover, may be used as a boundary condition to which out-of-plane tissue wall geometrical distortions are also constrained.
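The use of the imaged in-plane curve as a boundary condition for out-of-plane distortion can be sketched as follows. This is a sketch under assumed details (the falloff model and distances are hypothetical): the deflection measured in the ultrasound plane (out-of-plane offset y = 0) is reproduced exactly, and out-of-plane deflection is extrapolated with a smooth falloff:

```python
# Illustrative sketch (assumed falloff model): extrapolating a deflection
# measured along the imaged septum curve (in the ultrasound image plane,
# where the out-of-plane offset y = 0) to points outside the image plane,
# so that the 3-D distortion agrees with the image at y = 0.
import math

def deflection_3d(in_plane_deflection: float, y_offset_mm: float,
                  falloff_mm: float = 5.0) -> float:
    """Extrapolate an imaged in-plane deflection to out-of-plane points.

    At y_offset_mm == 0 (the image plane) the measured deflection is
    reproduced exactly, satisfying the boundary condition.
    """
    return in_plane_deflection * math.exp(-(y_offset_mm / falloff_mm) ** 2)
```

At y = 0 the simulated wall exactly fits the imaged curve; away from the plane, the distortion decays smoothly, as a stand-in for the constrained out-of-plane behavior described above.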
- FIG. 3A represents a rendered view showing the tip 301 of rendered catheter probe 11 A approaching the contacted region 302 of rendered tissue wall region 50 .
- Rendered tissue wall region 50 is shown in cross section; however, it should be understood that in other examples (not drawn) it may be shown from any other appropriate view angle.
- rendered tissue wall region 50 is shown opaque, transparent, or in any suitable combination of the two.
- the rendered tissue wall region 50 is shown in what is optionally its default and/or resting state geometry: for example, a geometry determined from a segmentation of an earlier MRI and/or CT scan (it should be understood that contact-independent behaviors such as periodic heart contractions are optionally superimposed on a default geometry).
- a simulator is configured to recognize that this non-interacting geometry default should be shown.
- a contact sensing parameter value 313 optionally indicates that there is no contact force exerted.
- the distance between catheter probe position 312 and the expected (optionally, sensed) wall position trace at dotted line 309 indicates that there is not yet any contact.
- the ultrasound image of FIG. 3C shows no deformation of rendered wall region 50 in the vicinity of target contacted region 302 , and/or shows a separation between rendered wall region 50 and rendered catheter probe 11 A.
- Use of 3-D rendering to augment ultrasound imaging of tissue wall deformation has the potential advantage of converting a relatively abstract-appearing (cross-sectional, black and white, visually noisy) display of ultrasound-imaged anatomical structures into a solid looking indication of how forces from a catheter are interacting with a heart wall, on the basis of which the penetration operation can be guided.
- in FIGS. 3D-3F, wall contact has begun, as shown (FIG. 3D) by the deformation of the rendered tissue wall region 50 in contact with catheter probe tip 301.
- this simulation is generated to track the rising value of sensed contact (e.g., at point 315 ).
- the simulation is generated to track the forward movement of the probe tip 301 to point 314 ; optionally, the simulation scene is generated to track the forward movement with respect to expected or measured wall position trace at dotted line 309 .
- deformation of the imaged tissue wall portion 50 B in an ultrasound image ( FIG. 3F ) is used as a constraint to guide how the rendered tissue wall region 50 is geometrically distorted in 3-D.
- contact between imaged tissue wall portion 50 B and catheter probe 11 is determined and/or verified from the ultrasound image as well.
- in FIGS. 3G-3I, deformation has reached a maximum before catheter probe 11 breaks through the rendered tissue wall region 50 at contacted region 302 (the foramen ovale).
- in FIGS. 3J-3L, rendered catheter probe 11A is shown having broken through the rendered tissue wall region 50.
- the breakthrough is optionally inferred by the sudden drop in sensed contact, optionally in concert with the continued advance of the catheter probe 11 . Additionally or alternatively, the breakthrough is inferred from the sudden increase in distance between the catheter probe 11 and the actual tissue wall (inferred, for example, from a sudden change in the dielectric environment of an electrode associated with probe tip 301 ).
- the breakthrough is optionally inferred from a relaxation of the geometrical distortion of imaged tissue wall portion 50 B, and/or by the observation of a portion of catheter probe 11 extending on the other side of the imaged tissue wall portion 50 B.
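The breakthrough inference from a sudden drop in sensed contact combined with continued probe advance can be sketched as follows. This is an illustrative sketch; the drop fraction and advance threshold are assumed values, not part of the disclosure:

```python
# Illustrative sketch (assumed thresholds): inferring septal breakthrough
# from a sudden drop in sensed contact together with continued forward
# advance of the probe, as described above.
def breakthrough_detected(force_history, advance_history,
                          drop_fraction=0.7, min_advance_mm=1.0) -> bool:
    """Return True if contact force dropped sharply while the probe advanced.

    force_history, advance_history: recent samples, oldest first.
    """
    if len(force_history) < 2 or len(advance_history) < 2:
        return False
    peak = max(force_history[:-1])
    current = force_history[-1]
    advanced = advance_history[-1] - advance_history[0]
    return peak > 0 and current < (1.0 - drop_fraction) * peak \
        and advanced >= min_advance_mm
```

In a fuller implementation, this detector could be corroborated by the dielectric-environment change or imaged-wall relaxation cues mentioned above before the rendered scene switches to the broken-through state.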
- FIGS. 10C-10D schematically represent aspects of geometrical deformation of a rendered tissue region 50 in contact with a rendered catheter probe 11 A, according to some embodiments of the present disclosure.
- displayed interactions of a rendered catheter probe 11 A with a rendered tissue wall region 50 include geometrical effects which look like deformations of the tissue wall that visually convey the forces of their interaction.
- in contrast to FIGS. 10C-10D, full geometrical deformation, including mesh deformation, is described herein in relation to the examples of FIGS. 2A-2E and 3A-3L.
- in FIGS. 10C-10D, a different mode of indentation is shown, wherein relatively limited (and potentially computationally less expensive) geometrical deformation is simulated by the use of one or more rendering techniques such as normal mapping, depth mapping, shadow mapping, depth of field simulation, and the like.
- rendered catheter probe 11 A is shown in a sequence of positions relative to the rendered surface 1010 of a rendered tissue region 50 (optionally, rendered surface 1010 is rendered with the use of any suitable MAPs to provide it with a tissue-like visual appearance).
- each position 1011, 1012, 1013 is also vertically displaced with respect to the tissue surface.
- the only visual indication that positions 1012, 1013 actually contact the surface is a slight successive truncation of the catheter probe tip 301.
- in FIG. 10D, all the elements of FIG. 10C and their relative positions remain the same, but additionally shown are the effects of manipulation of the surface normal map in region 1021 and indentation region 1022, assuming a light source that is to the left and somewhat behind the plane of the drawing (normal mapping is described in relation to FIGS. 9A-9B).
- the normal map manipulations have been chosen to give the appearance of geometrical changes—specifically, to indicate indentations in rendered surface 1010 .
- this geometrical appearance change is optionally triggered by any suitable input related to probe-tissue contact, for example, contact force measurements, dielectric contact quality measurements, and/or relative position measurements of tissue and probe.
- the normal map is also adjusted to reflect contact angle, for example, stretched along a dimension of elongated contact. Since no change in the underlying 3-D object geometry is required in order to produce this effect, there is a potential advantage for computational efficiency and/or reduced complexity of implementation compared to manipulation of the full 3-D geometry.
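The normal-map manipulation described above — producing the appearance of an indentation without touching the underlying 3-D mesh — can be sketched per texel as follows. This is an illustrative sketch; the dent radius, depth, and shading convention are assumptions:

```python
# Illustrative sketch (hypothetical parameters): faking an indentation by
# perturbing only the surface normal map in a disk around the contact
# point, leaving the underlying 3-D mesh geometry unchanged.
import math

def indentation_normal(px, py, cx, cy, radius=8.0, depth=0.35):
    """Return a unit normal (nx, ny, nz) for texel (px, py).

    Inside `radius` of contact point (cx, cy), normals are tilted to shade
    like a dish-shaped dent; outside, the flat normal (0, 0, 1) is
    returned. nz = 1 points out of the surface.
    """
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    if r >= radius or r == 0.0:
        return (0.0, 0.0, 1.0)
    # Tilt magnitude peaks mid-slope of the dent, vanishing at center/rim.
    tilt = depth * math.sin(math.pi * r / radius)
    nx, ny = tilt * dx / r, tilt * dy / r
    nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    return (nx, ny, nz)
```

Because only a texture is rewritten, the cost is independent of mesh resolution, which is the computational advantage over full 3-D geometry manipulation noted above.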
- the normal-mapped mode of representing geometrical deformation is of potential use to an operator for helping to gauge contact quality before lesioning, particularly in views having a substantial elevation angle above the contacted surface.
- views using normal mapping-type indentation are presented alongside views where 3-D geometrical distortion is used (for example, in cross-section, as discussed in relation to FIGS. 2A-2E ).
- normal mapping is used to exaggerate 3-D geometrical deformation, for example, to potentially increase emphasis and/or clarity.
- FIGS. 4A-4D schematically represent aspects of geometrical deformation of a rendered tissue region 50 due to an internal change such as edema, according to some embodiments of the present disclosure.
- FIGS. 10A-10B illustrate normal mapping superimposed on a rendered tissue region 50 in order to provide the geometrical appearance of a swelling, according to some embodiments of the present disclosure.
- FIGS. 5A-5B schematically represent global geometrical deformation of a tissue structure, for example, due to hydration state and/or more global edema than the example of FIGS. 4A-4D , according to some embodiments of the present disclosure.
- lesion 401 represents a recently formed lesion, for example, an RF ablation lesion. Over the course of a few minutes after RF ablation, tissue potentially reacts with a swelling response.
- the swelling response is simulated (for example, as a function of time according to the method of FIG. 1B , and/or based on measurements such as dielectric measurements that provide edema data) by one or both of increasing thickness in a region 403 surrounding lesion 401 (thickness changes can also be seen in the changing thickness of region 411 between FIGS. 4B-4D ; comparison also can be made to the baseline surface boundary 50 A), and a change in color and/or texture in region 402 (represented by the partial rings in the drawing).
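The time-based simulation of the swelling response can be sketched as a simple saturating time course driving the rendered thickness. This is an illustrative sketch; the time constant and maximum thickness increase are assumed values, not measured physiology:

```python
# Illustrative sketch (assumed time constants): a simple time course for
# post-ablation swelling, used to drive the rendered thickness increase
# in the region surrounding a lesion over the minutes following ablation.
import math

def swelling_factor(t_seconds: float, tau: float = 120.0,
                    max_increase: float = 0.4) -> float:
    """Fractional thickness increase at time t after ablation.

    Rises smoothly from 0 toward `max_increase` (e.g., +40% thickness)
    with time constant `tau` seconds.
    """
    if t_seconds <= 0.0:
        return 0.0
    return max_increase * (1.0 - math.exp(-t_seconds / tau))

def rendered_thickness(baseline_mm: float, t_seconds: float) -> float:
    """Thickness to render, given the baseline wall thickness."""
    return baseline_mm * (1.0 + swelling_factor(t_seconds))
```

In a system also receiving edema data (e.g., dielectric measurements), the time-driven estimate could be corrected toward measured values rather than run open-loop.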
- FIGS. 10A-10B illustrate how normal mapping can be used to potentially enhance the appearance of changes in a tissue, for example as a result of treatment and/or injury.
- Lesion 401 again indicates a recently formed lesion.
- a surface is rendered as combination image 1000 by combining baseline surface texture 1006 with an injury response overlay 1002.
- the method of combination is partial transparency overlaying (optionally, another method of combining within a rendering pipeline 1230 is chosen); the injury response is detectable, but not clearly delineated.
- FIG. 10B adds to this an overlay 1003 generated from a normal map (assuming a light source to the left of the page) that describes a swelling in the region of the injury response.
- FIGS. 4A-4D are optionally combined with the normal mapping of FIGS. 10A-10B .
- tissue thickening is represented by the change in tissue dimension between baseline thickness 420 A and swollen thickness 420 B.
- the thickening is optionally derived from measurements and/or extrapolation, for example, according to one or more of the methods of FIGS. 1A-1B .
- other changes are also made to represent tissue changes.
- the 3-D geometry of rendered tissue region 50 is optionally smoothed out with increasing swelling.
- normal mapping across the extent of surfaces 421 A, 421 B is adjusted as a function of swelling: for example, simulated wrinkles used to texture surface 421 A are optionally smoothed and/or stretched, for example to indicate a tauter appearance as at texture surface 421 B.
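The adjustment of the normal map toward a tauter appearance as a function of swelling can be sketched as a per-texel blend toward the flat normal. This is an illustrative sketch; the linear blend is an assumption, not the disclosed method:

```python
# Illustrative sketch: blending a wrinkle normal-map texel toward the flat
# normal as swelling increases, giving the tauter, smoothed appearance
# described above, without modifying the wrinkle texture asset itself.
import math

def smoothed_normal(wrinkle_normal, swelling: float):
    """Interpolate a wrinkle texel normal toward flat (0, 0, 1).

    swelling in [0, 1]: 0 keeps the full wrinkle texture, 1 is fully
    taut. The result is renormalized to unit length.
    """
    s = min(max(swelling, 0.0), 1.0)
    nx = (1.0 - s) * wrinkle_normal[0]
    ny = (1.0 - s) * wrinkle_normal[1]
    nz = (1.0 - s) * wrinkle_normal[2] + s * 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)
```

Driving `swelling` from a value such as the thickness increase between baseline and swollen states ties the texture change to the same underlying tissue-state estimate as the geometry change.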
- FIG. 11A schematically illustrates a rendered image 1150 rendered from a camera viewpoint 1154 looking at rendered tissue region 50 along an axis 1156 parallel to a rendered catheter probe 11 A, according to some embodiments of the present disclosure.
- FIG. 11B schematically illustrates a field of view 1152 projected from camera viewpoint 1154 , including indication of axis 1156 , according to some embodiments of the present disclosure.
- Indentation region 1022 indicates a region of touching contact between probe 11 and rendered tissue region 50 .
- FIG. 11A and FIG. 11B comprise views looking onto the same simulation scene.
- a camera viewpoint 1154 is defined (e.g., as part of the definition of a camera 1223 , FIG. 8 ) to be positioned on or near the body of a catheter probe 11 , and looking along an axis 1156 which is substantially parallel to the rendered catheter probe 11 A (termed a “probe-mounted” view herein). Insofar as the system tracks (using measured position) the location and orientation of the actual catheter probe 11 which the rendered orientation of rendered catheter probe 11 A simulates, camera viewpoint 1154 also tracks (by adjustment to match the orientation of the rendered catheter probe 11 A) the orientation of the actual catheter probe 11 .
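The probe-mounted camera definition above can be sketched as a pose derived from the tracked probe tip. This is an illustrative sketch; the setback distance and coordinate conventions are assumptions:

```python
# Illustrative sketch (assumed conventions): deriving a "probe-mounted"
# camera pose from the tracked probe tip position and a unit direction
# vector along the probe, placing the viewpoint slightly behind the tip
# and looking along the probe axis.
def probe_mounted_camera(tip_pos, probe_dir, setback_mm=15.0):
    """Return (camera_position, look_at_point) for a probe-mounted view.

    tip_pos: (x, y, z) tracked tip position in mm.
    probe_dir: unit vector pointing from probe body toward the tip.
    The camera sits `setback_mm` behind the tip and looks through it, so
    the rendered probe appears on screen as in a first-person view.
    """
    cam = tuple(p - setback_mm * d for p, d in zip(tip_pos, probe_dir))
    look_at = tuple(p + setback_mm * d for p, d in zip(tip_pos, probe_dir))
    return cam, look_at
```

Re-evaluating this each frame from the measured probe position makes the viewpoint track the actual catheter probe's orientation automatically, as described above.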
- rendered catheter probe 11A appears in rendered image 1150 in a position similar to the position of hand-held tools seen in some "first-person" games, wherein a tool is shown on the screen as if held before an otherwise unseen avatar whose eyes define the camera position.
- this viewpoint configuration provides a potential advantage for obtaining a clear view of the field of operation of the probe, e.g., when it contacts tissue.
- registration between the probe and the viewpoint may comprise any other suitable combination of position and orientation.
- looking back along a catheter is potentially useful for obtaining a sense of what freedom exists in how the catheter probe can be presently positioned.
- Looking at the catheter itself from a more distant position potentially provides an improved sense of how the catheter relates to its overall surroundings.
- viewpoint optionally shifts (automatically and/or under manual control) depending on what action is being performed; for example, a probe-mounted view like that of FIG. 11A is optionally used for selection of where a probe should be advanced to contact tissue, while a vantage point more distant from the probe may be selected to show details of how probe and tissue interact once contact is made (for example, as shown in the sequence of FIGS.
- the angular size of the field of view (area subtended within the frame of the rendered image) is selected to be larger or smaller.
- a larger angular size provides a potential relative advantage in helping an operator orient within a simulated environment, while a smaller angular size is optionally used to magnify details and/or reduce simulated optical distortion in the rendered view.
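The trade-off between angular field-of-view size and magnification can be made concrete with a small sketch of perspective geometry (an illustrative sketch only; the function and values are not part of the disclosure):

```python
# Illustrative sketch: relation between the angular field of view and the
# apparent magnification of an on-axis object in the rendered view; a
# narrower field of view enlarges details, a wider one aids orientation.
import math

def screen_fraction(object_size_mm, distance_mm, fov_degrees):
    """Fraction of the image width subtended by an on-axis object."""
    half_fov = math.radians(fov_degrees) / 2.0
    visible_width = 2.0 * distance_mm * math.tan(half_fov)
    return object_size_mm / visible_width
```

For example, a 10 mm target at 50 mm fills 10% of the frame at a 90° field of view, but substantially more at 30°, which is why a smaller angular size is optionally used to magnify details.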
- catheter probe is intended to include all such new technologies a priori.
- compositions, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
- a compound or “at least one compound” may include a plurality of compounds, including mixtures thereof.
- example and exemplary are used herein to mean “serving as an example, instance or illustration”. Any embodiment described as an “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
- method refers to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the chemical, pharmacological, biological, biochemical and medical arts.
- treating includes abrogating, substantially inhibiting, slowing or reversing the progression of a condition, substantially ameliorating clinical or aesthetical symptoms of a condition or substantially preventing the appearance of clinical or aesthetical symptoms of a condition.
- range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as “from 1 to 6” should be considered to have specifically disclosed subranges such as “from 1 to 3”, “from 1 to 4”, “from 1 to 5”, “from 2 to 4”, “from 2 to 6”, “from 3 to 6”, etc.; as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Abstract
In some embodiments, data sensed and/or operational parameters used during a catheterization procedure are used in the motion frame-rate updating and visual rendering of a simulated organ geometry. In some embodiments, measurements of and/or effects on tissue by sensed and/or commanded probe-tissue interactions are converted into adjustments to the simulated organ geometry, allowing dynamic visual simulation of intra-body states and/or events based on optionally partial and/or non-visual input data. Adjustments to geometry are optionally to 3-D positions of simulated data and/or to simulated surface properties affecting geometrical appearances (e.g., normal mapping). Optionally, the organ geometry is rendered as a virtual material using a software environment (preferably a graphical game engine) which applies simulated optical laws to material appearance parameters affecting the virtual material's visual appearance. Optionally, physiology, motion physics, and/or other physical processes are simulated based on live inputs, as part of assigning geometrical adjustments to the simulated tissue.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/349,646 filed on May 14, 2019, which is a National Phase of PCT Patent Application No. PCT/IB2017/057175 having International Filing Date of Nov. 16, 2017, which claims the benefit of priority under 35 USC § 119(e) of U.S. Provisional Patent Application Nos. 62/422,705, 62/422,708 and 62/422,713, all filed on Nov. 16, 2016. The contents of the above applications are all incorporated by reference as if fully set forth herein in their entirety.
- The present invention, in some embodiments thereof, relates to the field of medical procedures using intrabody probes navigable within intrabody spaces, and more particularly, to presentation of procedure data dynamically acquired during the course of a catheter procedure.
- Graphical game engines currently available comprise suites of software-implemented capabilities supporting the dynamic display and updating of simulated three-dimensional scenes. Typically, game engines include API calls supporting the creation and modification of a variety of scene objects (chiefly terrain, various types of physical objects, camera viewpoints, and lighting), a visual rendering pipeline, and optionally further services assisting tasks such as coding, animating, and/or debugging. User inputs are accepted from various user interface devices (including pointer devices, keyboards, game controllers, motion sensors, touch screens and the like) and converted into events in the simulated environment. Well-known game engines include the Unreal® and Unity® graphical game engines (www(dot)unrealengine(dot)com; www(dot)unity3d(dot)com). The rendering pipelines of modern game engines typically include facilities for creating realistic-looking visualizations of scene elements, based on properties assigned to instantiations of data objects representing those scene elements.
- Several medical procedures in cardiology and other medical fields comprise the use of catheters to reach tissue targeted for diagnosis and/or treatment while minimizing procedure invasiveness. Early imaging-based techniques (such as fluoroscopy) for navigation of the catheter and monitoring of treatments continue to be refined, and are now joined by techniques such as electromagnetic field-guided position sensing systems. Refinements to techniques for registration of previously imaged (for example, by CT and/or MRI) anatomical features of a patient to electromagnetic field-sensed catheter position are a subject of ongoing research and development, for example as described in International Patent Application No. IB2016/052687 to Schwartz et al. filed May 11, 2016; and International Patent Application No. IB2016/052692 to Schwartz et al. filed May 11, 2016. Intrabody sensing from catheter probes to determine information about, for example, tissue contact and/or lesion assessment, has also been described (e.g., International Patent Application No. PCT/IB2016/052690 to Schwartz et al. filed May 11, 2016; and International Patent Application No. IB2016/052686 to Schwartz et al. filed May 11, 2016).
- There is provided, in accordance with some embodiments of the present disclosure, a method of visually displaying effects of a medical procedure, comprising: receiving interaction data from an intrabody probe indicating touching contacts between the intrabody probe and a body tissue region, wherein the interaction data at least associate the contacts to contacted positions of the body tissue region; adjusting geometrical rendering data representing a shape of the body tissue region to obtain adjusted geometrical rendering data, wherein the adjusting is based on an indication in the interaction data of a change in the shape of the body tissue region due to the contacting; rendering the adjusted geometrical rendering data to a rendered image; and displaying the rendered image.
- In some embodiments, the intrabody probe is a catheter probe.
- In some embodiments, the geometrical rendering data are adjusted as a function of time relative to a time of occurrence of at least one of the indicated contacts.
- In some embodiments, the receiving, the adjusting, and the displaying are performed iteratively for a sequence of contacts for which interaction data is received.
- In some embodiments, the adjusting is at a frame rate of 10 frames per second or more.
- In some embodiments, the rendering and the displaying are at a frame rate of 10 frames per second or more.
- In some embodiments, the geometrical rendering data include a representation of 3-D surface positions and a representation of surface orientations; wherein the two representations each correspond to a same portion of the shape of the body tissue region; and wherein the adjusting comprises adjusting the surface orientation representation to change a geometrical appearance in the rendering.
- In some embodiments, the representation of surface orientation is adjusted separately from the representation of 3-D surface positions.
- In some embodiments, the extent and degree of the adjusting model a change in a thickness of the body tissue region.
- In some embodiments, the interaction data describe an exchange of energy between the intrabody probe and the body tissue region by a mechanism other than contact pressure.
- In some embodiments, the adjusting comprises updating the geometrical rendering data based on a history of interaction data describing the exchange of energy.
- In some embodiments, the exchange of energy comprises operation of an ablation modality.
- In some embodiments, the updating changes an indication of lesion extent in the geometrical rendering data based on the history of interaction data describing the exchange of energy by operation of the ablation modality.
- In some embodiments, the updating comprises adjusting the geometrical rendering data to indicate a change in mechanical tissue properties, based on the history of interaction data describing the exchange of energy.
- In some embodiments, the ablation energy exchanged between the intrabody probe and the body tissue region comprises at least one of the group consisting of: radio frequency ablation, cryoablation, microwave ablation, laser ablation, irreversible electroporation, substance injection ablation, and high-intensity focused ultrasound ablation.
- In some embodiments, the updating comprises adjusting the geometrical rendering data to indicate a change in tissue thickness, based on the history of interaction data describing the exchange of energy.
- In some embodiments, effects of the history of interaction data describing the exchange of energy are determined from modelling of thermal effects of the exchange of energy on the body tissue region.
- In some embodiments, the modelling of thermal effects accounts for local tissue region properties affecting transfer of thermal energy between the intrabody probe and the body tissue region.
- In some embodiments, the adjusting is as a function of time relative to a time of occurrence of at least one of the indicated contacts, and comprises adjusting the geometrical rendering data to indicate gradual development of a change in geometry of the body tissue region as a result of the contacts.
- In some embodiments, the gradually developed change in geometry indicates a developing state of edema.
- In some embodiments, the method comprises geometrically distorting the rendering of the geometrical rendering data into a swollen appearance, to an extent based on the indicated development of the state of edema.
- In some embodiments, the contacts comprise mechanical contacts, and the gradual development of a change in geometry indicates swelling of the body tissue region in response to tissue irritation by the mechanical contacts.
- In some embodiments, the contacts comprise an exchange of energy between the intrabody probe and the body tissue region by a mechanism other than contact pressure.
- In some embodiments, the interaction data indicate a contact force between the intrabody probe and the body tissue region.
- In some embodiments, the interaction data indicate a contact quality between the intrabody probe and the body tissue region.
- In some embodiments, the interaction data indicate a geometrical distortion introduced by touching contact between the intrabody probe and the body tissue region.
- In some embodiments, the adjusting comprises geometrically distorting the rendering of the geometrical rendering data at a region of touching contact to an extent based on the interaction data.
- In some embodiments, the geometrically distorting the rendering of the geometrical rendering data includes geometrically distorting a portion of the geometrical rendering data which is not geometrically corresponding to the portion of the body tissue region from which the interaction data were obtained.
- In some embodiments, the interaction data comprises a 2-D image including a cross-sectional view of the body tissue region, and the distorted portion of the geometrical rendering extends out of a plane in the geometrical rendering data corresponding to the plane of the cross-sectional view.
- In some embodiments, the interaction data describes injection of a substance from the intrabody probe to the body tissue region, and the adjusting comprises changing a thickness of tissue in the body tissue region, corresponding to an effect of the injection of the substance.
- In some embodiments, the rendering includes a view of the intrabody probe. In some embodiments, the rendering is rendered from a viewpoint at least partially defined by a measured position of the intrabody probe relative to a surface of the body tissue region.
- In some embodiments, the measured position includes a measured orientation of the intrabody probe.
- In some embodiments, the intrabody probe contacts a lumenal surface of the body tissue region.
- In some embodiments, the intrabody probe contacts an external surface of an organ comprising the body tissue region.
- In some embodiments, the body tissue region comprises a tissue of at least one organ of the group consisting of the heart, vasculature, stomach, intestines, liver and kidney.
- In some embodiments, the method further comprises assigning material appearance properties across an extent of the geometrical rendering data, based on the interaction data; and wherein the displaying of the rendered image uses the assigned material appearance properties.
- In some embodiments, the rendering comprises a rendering in cross-section of the body tissue region.
- In some embodiments, the extent and degree of the adjusting simulate stretching of the body tissue region.
- In some embodiments, the geometrical rendering data represent a shape of a body tissue region comprising a heart chamber; and wherein the adjusting comprises adjusting a size of the heart chamber, based on the current heart rate data.
- In some embodiments, the adjusting a size of the heart chamber comprises adjusting a size of a lumen of the heart chamber, based on the current heart rate data.
- In some embodiments, the adjusting a size of the heart chamber comprises adjusting a thickness of a wall of the heart chamber, based on the current heart rate data.
- In some embodiments, the adjusting geometrical rendering data comprises adjusting a position of the intrabody probe in the geometrical rendering data relative to a wall of the heart chamber, based on the current heart rate data.
- There is provided, in accordance with some embodiments of the present disclosure, a system for visually displaying effects of interactions between an intrabody probe and a body tissue region, the system comprising computer circuitry configured to: receive interaction data indicating the interactions, and associated to positions on a surface of the body tissue region; adjust geometrical rendering data representing a shape of the body tissue region to obtain adjusted geometric rendering data, wherein the adjusting is based on an indication in the interaction data of a change in the shape of the body tissue region; render the adjusted geometrical rendering data to a rendered image; and present the rendered image.
- In some embodiments, the rendering is performed using a graphical game engine, and the interaction data include sensed positions of the intrabody probe.
- In some embodiments, the interaction data include probe-sensed characteristics of tissue in the vicinity of the intrabody probe.
- In some embodiments, the interaction data includes operational data describing operation of the intrabody probe to treat tissue.
- There is provided, in accordance with some embodiments of the present disclosure, a method of visually displaying a medical procedure, comprising: receiving position data indicating the position of an intracardial probe within a heart; receiving heart rate data for the heart; adjusting geometrical rendering data representing a shape of the heart and a shape and position of the intracardial probe to obtain adjusted geometric rendering data; wherein the adjusting is based on the heart rate data to maintain an accuracy of positioning of the intracardial probe relative to the heart as average size of the heart changes as a function of a heart rate; rendering the adjusted geometrical rendering data to a rendered image; and displaying the rendered image.
- Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
- As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.”
- Furthermore, some embodiments of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. Implementation of the method and/or system of some embodiments of the invention can involve performing and/or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of some embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware and/or by a combination thereof, e.g., using an operating system.
- For example, hardware for performing selected tasks according to some embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to some embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to some exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.
- Any combination of one or more computer readable medium(s) may be utilized for some embodiments of the invention. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium and/or data used thereby may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for some embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- Some embodiments of the present invention may be described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example, and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
- In the drawings:
FIG. 1A is a schematic flowchart illustrating the calculation and display of an image of a scene comprising simulated tissue having a geometry and/or geometrical appearance dynamically linked to interactions of the tissue with a catheter probe, according to some embodiments of the present disclosure; -
FIG. 1B is a schematic flowchart illustrating the calculation and display of a geometry and/or geometrical appearance dynamically changing over time as a result of prior interaction of the tissue with a catheter probe, according to some embodiments of the present disclosure. -
FIGS. 2A-2E illustrate a 3-D rendered display for indicating lesioning status to a user, according to some exemplary embodiments of the present disclosure; -
FIGS. 3A, 3D, 3G, and 3J schematically represent a sequence of rendered views of a catheter probe passing through a tissue wall portion, according to some embodiments of the present disclosure; -
FIGS. 3B, 3E, 3H, and 3K schematically represent a graph of position versus time and measured contact versus time for the catheter probe of FIGS. 3A, 3D, 3G, and 3J, according to some embodiments of the present disclosure; -
FIGS. 3C, 3F, 3I, and 3L schematically represent an ultrasound image at a cross-section of a heart at the atrial level, and corresponding to the sequence of FIGS. 3A, 3D, 3G, and 3J, according to some embodiments of the present disclosure; -
FIGS. 4A-4D schematically represent aspects of geometrical deformation of a tissue region due to an internal change such as edema, according to some embodiments of the present disclosure; -
FIGS. 5A-5B schematically represent global geometrical deformation of a tissue structure, for example, due to hydration state and/or more global edema than the example of FIGS. 4A-4D, according to some embodiments of the present disclosure; -
FIG. 6 is a schematic representation of a system configured for display of interactions between a catheter probe and a body tissue region, and/or their effects, according to some embodiments of the present disclosure; -
FIG. 7 schematically represents software components and data structures of an interaction analyzer of a system, according to some embodiments of the present disclosure; -
FIG. 8 schematically represents components, inputs, and outputs of a graphical game engine operating to manage and render scene elements to images for presentation at motion frame-rate, according to some embodiments of the present disclosure; -
FIGS. 9A-9B schematically represent, respectively, different geometrical data representations of flat and indented surfaces, according to some embodiments of the present disclosure; -
FIGS. 10A-10B illustrate normal mapping superimposed on a tissue region in order to provide the geometrical appearance of a swelling, according to some embodiments of the present disclosure; -
FIGS. 10C-10D schematically represent aspects of geometrical deformation of a tissue region in touching contact with a catheter probe, according to some embodiments of the present disclosure; -
FIG. 11A schematically illustrates a rendered image rendered from a camera viewpoint looking at a tissue region along an axis parallel to an intrabody probe, according to some embodiments of the present disclosure; and -
FIG. 11B schematically illustrates a field of view projected from the camera viewpoint, including an indication of the axis, according to some embodiments of the present disclosure.
- The present invention, in some embodiments thereof, relates to the field of medical procedures using intrabody probes navigable within intrabody spaces, and more particularly, to presentation of procedure data dynamically acquired during the course of a catheter procedure.
- Overview
- An aspect of some embodiments of the current invention relates to the motion frame-rate, real-time display of geometrical effects on a simulation scene comprising simulated tissue, wherein the geometrical effects comprise changes to a geometrical representation of one or more elements in the scene, and wherein the changes are made based on ongoing and/or intermittent measurements of interactions between a catheter probe and the actual tissue being simulated.
- Herein, “geometrical effects” optionally comprise one or both of changes to the 3-D position of simulated elements, and changes to the geometrical appearance of simulated elements. Geometrical appearance, as distinct from 3-D position, comprises geometrical features that can give a relatively raised, indented, smoothed, irregular, blurred, focused, closer, further, shaded, and/or unshaded appearance to a portion of a surface, without affecting 3-D coordinates of the surface itself. Geometrical appearance optionally comprises features implemented at least in part by computational methods, for example, normal mapping, depth mapping, and/or shadow mapping.
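- By way of non-limiting illustration, the distinction between geometrical appearance and 3-D position can be sketched in a few lines: a normal-mapping-style perturbation changes only the normal used for shading, so a surface patch appears raised or indented while its coordinates stay fixed. The function below is a hypothetical sketch (simple Lambertian shading only), not code from any particular game engine:

```python
import math

def shade_with_perturbed_normal(base_normal, perturbation, light_dir):
    """Lambertian shading with a normal-map-style perturbation.

    Only the shading normal changes; the surface's 3-D coordinates are
    untouched, which is the essence of a 'geometrical appearance' effect.
    """
    n = [b + p for b, p in zip(base_normal, perturbation)]
    length = math.sqrt(sum(c * c for c in n))
    n = [c / length for c in n]
    # Diffuse intensity: clamped dot product with the light direction.
    return max(0.0, sum(nc * lc for nc, lc in zip(n, light_dir)))

# A flat patch facing +z, lit from directly above, is fully lit:
flat = shade_with_perturbed_normal((0, 0, 1), (0, 0, 0), (0, 0, 1))
# Tilting only the shading normal darkens the patch, so it *looks* indented:
tilted = shade_with_perturbed_normal((0, 0, 1), (0.5, 0, 0), (0, 0, 1))
```

Depth mapping and shadow mapping operate on the same principle: they change what the renderer computes per pixel, not the mesh itself.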
- In some embodiments, a software environment specialized for interactive visual simulations (for example, a 3-D graphical game engine such as the Unreal® and/or Unity® graphical game engines) is used as a basis for implementing a simulation of a scene comprising simulated tissue (herein, such a scene is referred to as a simulation scene). For rendering images by the game engine's graphics pipeline, geometrical rendering data are optionally supplemented with one or more material appearance properties (preferably a plurality of such properties) that describe how virtual materials such as simulated tissue interact with simulated optical laws and lighting conditions to generate images for display. The geometrical rendering data optionally comprises a geometrical representation of a scene including tissue. In some embodiments, the rendering is implemented by a rendering pipeline of the graphical game engine.
- It should be understood that one or more capabilities used by some embodiments of the present invention and described as implemented by a game engine are optionally provided by alternative implementations not packaged in a game engine distribution, including: use of customized software, firmware and/or hardware; and/or use of separately distributed software libraries. The term “game engine” as used herein should be understood to encompass computer-implemented collections of such typical game engine capabilities as may be used by some embodiments of the present invention (examples of which are described herein), whether or not they have been packaged into a game engine distribution.
- As used herein, the term “rendering” refers to the process of generating an image from a 2-D or 3-D model or models by means of one or more computer programs. The model may contain object parameter definitions and/or data structures; for example, geometry, viewpoint, texture, lighting, and/or shading information as a description of the virtual model. The data contained in the model may be passed to a rendering program to be processed and output to a digital image or raster graphics image file. The processing comprises one or more processing stages referred to collectively as a “pipeline”, and carried out by the software and hardware of a rendering device. In some embodiments, the rendering device includes one or more of a general purpose CPU and graphics hardware specialized for use within a rendering pipeline.
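- As a minimal, hypothetical sketch of the geometry stage of such a pipeline (the pinhole camera model and focal length here are illustrative assumptions, not details from the disclosure), a perspective projection maps each 3-D model vertex to 2-D image coordinates that later stages rasterize and shade:

```python
def project_vertex(vertex, focal_length=1.0):
    """Pinhole-camera perspective projection: camera at the origin,
    looking down +z. Later pipeline stages (rasterization, shading)
    consume the 2-D coordinates this stage produces."""
    x, y, z = vertex
    if z <= 0:
        raise ValueError("vertex is behind the camera")
    # Similar triangles: image offsets shrink in proportion to distance.
    return (focal_length * x / z, focal_length * y / z)

# A vertex twice as far from the camera lands at half the image offset:
near = project_vertex((1.0, 1.0, 1.0))   # (1.0, 1.0)
far = project_vertex((1.0, 1.0, 2.0))    # (0.5, 0.5)
```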
- In some embodiments, updating of the simulation scene during a procedure is at least partially based on data inputs from one or more data sources supplying data during the procedure (for example, sources of probe-tissue interaction data such as sensing data and/or treatment status data described in relation to
FIG. 6 and FIG. 7). Graphical game engines typically receive inputs from game input devices such as pointer devices, keyboards, game controllers, body motion sensors, and the like. In some embodiments of the present invention, inputs are optionally received from one or more additional or alternative sources related to the performance of a catheter procedure, for example, catheter probe position data, data tracking the intrabody use of catheter probes (particularly but not exclusively use to deliver treatment, e.g., by delivering treatment energies), and/or measurement data, for example, measurement data obtained from an intrabody probe (herein a catheter probe is used as an example of an intrabody probe, but it should be understood that another intrabody probe is optionally used in some embodiments; e.g., a capsule probe).
- In typical applications of game engines, the simulated world (also referred to herein as a simulated scene) maintained by a game engine does not directly correspond to any simultaneous objective-world state. However, an object of some embodiments of the current invention is to simulate the reality of a clinical situation sufficiently to allow substantially seamless interaction with that reality via a presentation of the scene simulation. In some embodiments, this comprises maintaining and displaying a simulated scene having a useful level of correlation with the changing reality of the actual tissue environment (as reflected in data available to characterize it).
- Optionally, usefulness derives from actions which are taken by an operator on the basis of information in the scene simulation presentation which reveals to a catheter operator the changing state of the tissue environment. Potentially, the useful level of correlation with the changing reality of the actual tissue environment allows an operator to realize the state of the tissue or a change in that state, optionally without adding to the scene annotations indicative of such state or state change. Optionally, usefulness derives from the presented scene simulation providing fidelity of representation sufficient that actions the operator takes based on the presented scene simulation produce effects as intended in the corresponding real-world environment. Optionally, the useful level of correlation with the changing reality of the actual tissue environment is a level of correlation sufficient to allow the operator to perform actions within the real-world environment based on the presented scene simulation. The presented scene simulation may include effects simulating results of the actions taken by the operator.
- In some embodiments of the invention, a display of a user interface is updated at motion frame rate with rendered images of a simulation scene simulating an intrabody probe (for example, a probe at the end of a catheter) and its tissue environment. The updating optionally indicates changes to an actual intrabody probe and tissue environment which occur as an operator manipulates the actual intrabody probe (wherein the updating is based, e.g., on position data describing the position of the intrabody probe), and/or operates the intrabody probe for treatment and/or diagnostic measurement of the actual tissue environment (wherein the updating is based, e.g., on operational data describing operation of the intrabody probe to treat tissue and/or measure properties of the tissue). In some embodiments, changes are shown in the rendered images as if occurring within the actual material of the tissue environment.
- For example, immediate and/or developing effects of ablation are shown by simulating appearance and/or geometrical changes in ablated tissue (in contrast, for example, to marks, icons, and/or symbols indicating ablation events). In some embodiments, tissue is deflected and/or an intrabody probe shape is distorted in rendered images of a simulation scene based on interaction data indicating touching contacts. These and other simulation scene changes (for example, other simulation scene changes as described herein) potentially provide an operator with a sense of presence in the actual tissue region accessed by an intrabody probe, and/or intuitive indications of changing status during a procedure underway.
- In some embodiments, a smoothly updating, naturalistic appearance of a rendered view of a simulation scene is achieved even when available inputs indicating changes to the simulation scene are incomplete, slowly updating, irregular, and/or lagging (for example, as described in relation to
FIG. 1B). Herein, “naturalistic” scene appearance means that the displayed scene gives an operator the impression of substantial materials (i.e., volume-occupying, as opposed to merely shell-defining materials) and/or reactive materials existing in a fluidly navigable environment. The reactions of the materials in turn become a significant part of the information which an operator relies on to act within the actual environment that the scene simulates. A material, moreover, may be simulated as occupying volume per se (for example, as a wall having thickness), rather than merely as a boundary extending in space (for example, as a structure defining a surface, but having no well-defined thickness).
- Optionally, appearances in rendered views of simulation scene objects are moreover “realistic” in some aspects. For example, tissues, in some embodiments, are provided with material appearances that mimic their appearance in life, and to this extent are “realistic”. In some embodiments of the invention, for example, geometrical deformation of tissue in a simulation scene is directly based on deformation measurements; for example, ultrasound images of septal wall deflection during transseptal puncture are optionally converted into three-dimensional movements of a simulated septal wall.
- However, non-realistic material appearances, and even non-realistic objects, are optionally or additionally provided in a naturalistic scene. Degree of tissue compression, for example, is optionally used as a visual proxy for probe-tissue contact force (force of touching contact), whether or not the real tissue is indeed compressed.
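- A visual proxy of this kind can be as simple as a saturating map from measured contact force to rendered indentation depth. The sketch below is illustrative only; the function name and both numerical constants are assumptions for the example, not values from the disclosure:

```python
def compression_depth_mm(contact_force_g, max_depth_mm=3.0, half_force_g=20.0):
    """Map measured probe-tissue contact force (grams-force) to a rendered
    indentation depth. The saturating form keeps the displayed deformation
    bounded at high forces; both constants are illustrative assumptions."""
    return max_depth_mm * contact_force_g / (contact_force_g + half_force_g)
```

At zero force there is no rendered indentation; at the half-force constant the depth is half the maximum, regardless of whether the real tissue actually compresses in this way.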
- In some embodiments of the invention, motion due to normal heart pulsations is indicated in the simulation by pulses with corresponding timing; this potentially helps an operator understand the difference between a probe in intermittent wall-touching contact and continuous wall-touching contact. Optionally, however, the amplitude of the simulated pulses is reduced from the real state, to stabilize the visual environment an operator uses for navigation. Additionally or alternatively, some geometrical states (such as degree of vasodilation and/or vasoconstriction) are optionally exaggerated for clarity.
- In some embodiments, the size of one or more heart chambers is adjusted based on current heart rate, and/or the size and/or movements of a probe relative to the heart chamber are scaled based on current heart rate. It has been observed that as heart rate increases, the maximum size of the heart between contractions correspondingly decreases. This decrease can also be observed in the sizes adopted by the heart chambers at other phases of the heartbeat cycle. For example, in some embodiments, the average rendered size of the heart over the course of a heartbeat cycle is decreased as a function of measured heart rate increase. The average size change is optionally applied to either a beating or non-beating rendered representation of the heart. Optionally, heart wall thickness correspondingly increases with decreasing chamber size. It is a potential advantage to incorporate these dynamic changes in anatomy into a display used by an operator to guide an intrabody probe, and/or to improve the accuracy and/or precision with which actions by and/or through the probe (e.g., contacts and/or treatment administration) are associated with positions on the heart wall.
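- One hypothetical way to express such an adjustment is a clamped linear scale factor applied to the rendered chamber geometry as a function of measured heart rate. The coefficients below are illustrative assumptions chosen for the sketch, not measured physiological values:

```python
def chamber_scale(heart_rate_bpm, baseline_bpm=60.0,
                  shrink_per_bpm=0.002, min_scale=0.8):
    """Scale factor for rendered heart-chamber size: as measured heart
    rate rises above baseline, the average rendered size shrinks, down
    to a floor. All constants are illustrative assumptions."""
    scale = 1.0 - shrink_per_bpm * (heart_rate_bpm - baseline_bpm)
    return max(min_scale, min(1.0, scale))
```

A corresponding inverse factor could thicken the rendered wall as chamber size decreases.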
- In another example, visual rendering of blood is preferably suppressed, making visualization possible from within a vascular or cardiac lumen. Optionally, one or more normally invisible tissue properties such as temperature are encoded by visual conventions; appearing as, for example in the case of temperature: ice, flame, smoke, and/or steam. In some embodiments, guiding marks related to planning and/or procedure progress are optionally provided as part of the simulation scene's naturalistic rendering to images.
- Among the services provided by some prominent graphical game engines are motion physics simulators (e.g., for modeling collisions, accelerations, elastic deformations, object destruction, and the like). In some embodiments, one or more of these motion physics simulators is used to increase the naturalistic impression and/or realistic fidelity of a rendered simulation scene. Additionally or alternatively, geometrical deformations are used to indicate aspects of a procedure where a probe contacts tissue. As for the case of material appearances, the geometrical deformations may be, but are not necessarily, realistic.
- A general potential benefit of naturalistic (optionally also realistic) presentation of a scene comprising simulated tissue is to reduce cognitive load on a catheter operator and/or team of operators working with an intra-body probe. Such procedures typically have multiple interacting factors and requirements affecting procedure outcome. These factors and requirements preferably are tracked simultaneously and/or may need to be accounted for with little time for consideration. Examples of these factors and requirements in a standard operating environment optionally include any one or more of the following:
- Positions of one or more probes are selected and verified with respect to a procedure plan.
- Results of procedure actions are verified.
- If planned actions and actual procedure actions begin to diverge, adjustments may be made on the fly.
- Similarly, actual procedure results may not match planned results.
- Some parts of the procedure optionally rely on discovering tissue states and locations, for example, based on sensing from the catheter probe.
- Such discovery steps are preferably performed quickly and without undue repetition of catheter motions.
- Particularly after plan and procedure diverge, relative timing of past procedure steps can be critical for deciding what current and/or following steps are optimal. For example, edema that gradually develops following lesioning (as in certain ablation procedures) can interfere with further lesioning, potentially leading to a need to adjust parameters and/or positions away from those first planned if there is a delay or error in an earlier phase of the procedure.
- Similarly, the interpretation of sensing data is optionally dependent on the timing and/or results of previous actions. For example, a detected current impulse block in heart tissue may be correlated with the recent history of lesioning in an area to determine if the impulse block is more likely to be permanent (e.g., pre-existing, or in a well-lesioned area) or temporary (e.g., in a region where inactivation, for example, due to use of a lesioning modality, is potentially reversible).
- In some embodiments of the current invention, immediate visual presentation of material appearance helps to control the complexity these factors can create. Potentially, a naturalistic display of information is more immediately understood by the clinical personnel, and/or intuitively draws attention to clinically relevant state updates. For example, instead of the operator team having to consider and/or calculate whether a previously lesioned tissue region was lesioned long enough ago to have converted to edematous tissue: in some embodiments, the edema is directly displayed as edematous tissue. Where a continuous lesion is planned, likely gaps in lesion extent can be directly seen in their overall context in the scene simulation, helping to guide the decision as to whether and/or how the procedure should be adapted to compensate.
- A naturalistic presentation of catheter procedure information also contrasts, for example, with the presentation of this information using graphs and/or symbols. Familiarization with more abstract symbols, measures and graphs potentially requires prolonged training. An extra level of symbolic abstraction also potentially slows recognition by the physician of important changes in the state of the catheter interface or the tissue.
- In some embodiments of the invention, a substantially continuous stream of input data describing a tissue region and/or probe interactions with it is used as a basis for correspondingly continuous updating of a scene simulating the tissue region. Optionally, the input data comprise only a partial and/or indirect description of the tissue region. For example, spatially partial input data (such as from a cross-sectional image) is used in some embodiments to infer spatial changes over a larger region (such as a three-dimensional space extending outside the cross-sectional image). In another example, sensed pressure data from a catheter probe is optionally converted into corresponding movements in three-dimensional space of pressed-against tissue in the simulation scene. In some embodiments, effects on tissue by energy delivered from a lesioning probe are optionally simulated in a scene based on a model of energy dispersion in the tissue (e.g., thermal modeling, optionally incorporating information from anatomical data), and on a few known parameters of how the energy was delivered (e.g., for how long, with what energy, where, and/or with what efficacy).
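- For instance, a deliberately crude stand-in for such a model (a sketch, not the thermal model of the disclosure) can estimate lesion radius from delivered power and duration alone, treating lesion volume as roughly proportional to delivered energy; the calibration constant is an illustrative assumption:

```python
def estimated_lesion_radius_mm(power_w, duration_s, k_mm=0.35):
    """Toy energy-based lesion-size estimate: volume ~ energy, so radius
    scales with the cube root of delivered energy. The calibration
    constant k_mm is an illustrative assumption, not a measured value."""
    energy_j = power_w * duration_s
    return k_mm * energy_j ** (1.0 / 3.0)
```

A scene update can then grow the rendered lesion mark to this radius, refining it later if sensing data (e.g., dielectric measurements) becomes available.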
- In some embodiments, sensed input data is used as a basis for updating the state of the scene-representation of the probe itself. For example, sensed input data is used to adjust the position of the probe's scene representation, and/or to control the parameters of a viewpoint used in creating a rendered image of the simulation scene, wherein the viewpoint is defined by a position of the probe. In some embodiments, sensed input data (e.g., indicating tissue contact force and/or quality) is used as a basis for changing the shape of a simulated probe. The shape may be adjusted based, for example, on a mechanical model of the actual probe and/or a catheter or other device that carries the probe (e.g., a mechanical model which models the flexibility and geometry of the actual probe and/or associated carrying device). For example, some probes such as lasso electrode probes comprise a flexible portion that can be bent in response to the forces of touching contact. In another example, an otherwise stiff probe may be carried on a flexible member such as a catheter used to manipulate the probe. In some embodiments, sensed input data indicates forces applied to the actual probe, and the simulated probe is modified in response to the indicated forces according to the parameters of the mechanical model. The modification may also take into account other data, for example, a position of the probe itself, geometry of the chamber in which the probe is positioned, and/or a position of an aperture via which a probe is passed into a heart chamber or other body lumen. Potentially, the modeling allows a changing simulated probe shape to indicate changes to the actual intrabody probe in use, without requiring direct measurement of the actual intrabody probe's shape (e.g., by imaging).
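- A minimal mechanical model of this kind might treat the catheter's free length as a cantilever, so that a sensed contact force maps to a tip deflection used to bend the simulated probe. This is a sketch under stated assumptions; the flexural rigidity value is illustrative, not a property of any actual device:

```python
def tip_deflection_m(force_n, free_length_m, flexural_rigidity=5e-4):
    """Euler-Bernoulli cantilever tip deflection: delta = F * L**3 / (3 * EI).
    Maps a sensed contact force to a bend applied to the simulated probe
    shape; flexural_rigidity (EI, in N*m^2) is an illustrative assumption."""
    return force_n * free_length_m ** 3 / (3.0 * flexural_rigidity)
```

Doubling the sensed force doubles the simulated deflection, and a longer free length bends far more, mirroring the qualitative behavior the text describes without requiring direct imaging of the probe.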
- Additionally or alternatively, in some embodiments, correlation between a simulation scene and the actual tissue region it represents is maintained at least in part by treating occasional inputs as describing events that (in the real world) trigger and/or entail certain predictable consequences to follow. In the simulation scene, the input optionally acts as a trigger for software routines that simulate those consequences. In some embodiments, longer-term effects of lesioning are optionally simulated by a physiological simulation. For example, a simulation converts estimated lesion damage into parameters for a script describing the gradual onset of tissue edema as it appears in rendered views of the simulation scene.
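- Such a script can be as simple as a logistic time course evaluated at each displayed frame, triggered once by the lesion event. The onset time and rate below are illustrative assumptions for the sketch, not physiological constants from the disclosure:

```python
import math

def edema_fraction(minutes_since_lesion, onset_min=10.0, rate_per_min=0.3):
    """Logistic time course for gradual edema onset after a lesion event:
    near 0 shortly after lesioning, rising toward 1 as edema develops.
    Both timing constants are illustrative assumptions."""
    x = -rate_per_min * (minutes_since_lesion - onset_min)
    return 1.0 / (1.0 + math.exp(x))
```

The returned fraction can drive both the geometrical swelling and the material-appearance change of the affected tissue region in rendered views.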
- In some embodiments, moreover, partial and/or occasional inputs optionally guide calibration of the simulation scene maintained by the game engine so that it better corresponds to the state of the actual tissue region. For example, sensing of tissue state or position directly using the probe as a sensing modality (additionally or alternatively by another sensing modality, such as ECG, monitoring of patient hydration, or an intermittently acquired image) is optionally used to update a model state, potentially restoring and/or improving a degree of synchronization between the actual tissue region and the simulation scene.
- Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways.
- Methods and Systems for Visual Modeling of Probe-Tissue Interactions and Their Effects
- Reference is now made to
FIG. 1A, which is a schematic flowchart illustrating the calculation and display of an image of a simulation scene, the simulation scene comprising simulated tissue having a geometry and/or geometrical appearance dynamically linked to interactions of the tissue with a catheter probe 11 (shown, for example, in FIGS. 3A and 6), according to some embodiments of the present disclosure. In overview, a cycle of activities of the method includes, in some embodiments:
- Receiving interaction data between probe 11 and tissue (at block 110).
- Calculating geometrical effects altering a scene, the geometrical effects being indicated by the interaction data (at block 112).
- Rendering the altered scene for visual presentation (at block 114).
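The three blocks above can be sketched as a loop. Every name and the event-handling rules here are hypothetical stand-ins for blocks 110, 112, and 114, chosen only to show the cycle's structure.

```python
def run_display_cycle(interaction_events, scene):
    """Sketch of the block 110 -> 112 -> 114 cycle. The contact-to-indentation
    rule and the ablation marker are illustrative assumptions."""
    frames = []
    for event in interaction_events:                    # block 110: receive data
        if event["kind"] == "contact":                  # block 112: alter geometry
            scene["indent_depth_mm"] = min(2.0, event["force_g"] / 20.0)
        elif event["kind"] == "ablation":               # block 112: alter appearance
            scene["lesion_marks"] = scene.get("lesion_marks", 0) + 1
        frames.append(dict(scene))                      # block 114: render (snapshot stub)
    return frames

frames = run_display_cycle(
    [{"kind": "contact", "force_g": 10.0}, {"kind": "ablation"}],
    {"indent_depth_mm": 0.0},
)
```

In a real implementation the "render" step would hand the altered scene to a graphics pipeline each frame; here a snapshot of the scene state stands in for the rendered image.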
- Illustrating examples of systems configured for carrying out this method, further reference is made to
FIG. 6, which is a schematic representation of a system 1 configured to present interactions between a catheter probe 11 and a body tissue region 7, and/or effects of these interactions. System 1 is optionally configured to present the interactions and/or their effects at user interface 55. Reference is also made to FIG. 7, which schematically represents software components and data structures of an interaction analyzer 21 of system 1, according to some embodiments of the present disclosure. - Receipt of Interaction Data
- The flowchart of
FIG. 1A begins; and at block 110, in some embodiments, a system 1 (for example, the system 1 of FIG. 6) configured for display of interactions between a catheter probe 11 and a body tissue region 7 and/or results of such interactions receives interaction data. The interaction data may include, for example, data acquired by a sensing modality, and/or operation data of a treatment modality. - The interaction data, in some embodiments, comprise data indicating and/or numerically describing characteristics of interactions between
probe 11 and tissue region 7; including, for example, positions of the probe and/or of contacts between the probe and the tissue region, contact characteristics characterizing a contact between the probe and the tissue region, measurements taken by the probe (for example, measurements of the physiological state and/or dielectric properties of the tissue region), and/or actions of the probe (e.g., operations comprising delivery of treatment). Optionally, interaction data comprise imaging data obtained during probe-tissue interactions. -
System 1 of FIG. 6 indicates examples of sources of interaction data that are optionally provided in some embodiments of the present disclosure. Interaction data is optionally received in raw form, or in any suitable stage of intermediate processing to indicate a parameter and/or status of more direct applicability. With respect to FIG. 6, details for certain types of interaction data available in some embodiments of the invention (e.g., one type, all types, or any other combination of types) are now described for: position data, imaging data, dielectric tissue property sensing, general sensing (for example, of temperature and/or contact force), and treatment interactions. - Position data: In some embodiments (optionally), position data is sensed by use of an electromagnetic field navigation subsystem, comprising
body surface electrodes 5, field generator/measurer 10, position analyzer 20, and sensing electrodes 3 (for example, sensing electrodes 3 located on catheter probe 11). The electromagnetic field navigation subsystem operates by inducing at least one time-varying electromagnetic (EM) field 4 (for example, three crossing EM fields, each of a different frequency) across a region of body 2 including a body tissue region 7 that is targeted to be navigated by catheter 9 and catheter probe 11. Typically, the time-varying EM field is induced with a total inter-electrode voltage of one volt or less, at a frequency of between about 10 kHz and about 1 MHz. Voltages sensed at different positions by sensing electrodes 3 are characteristic of corresponding intrabody positions, allowing conversion by position analyzer 20, for example, of voltage measurements to position information (for example, after exploration of an intrabody region 7 using the probe 11, and/or initially based on EM fields simulated with respect to a particular configuration of electrodes and anatomical data 31). - In some embodiments of the invention, position sensing at least partially comprises sensing of the relative position of a
catheter probe 11 and a surface of tissue region 7; for example, by sensing of the dielectric environment of a sensing electrode 3 of catheter probe 11. - Imaging data: Additionally or alternatively, in some embodiments, there is provided an imaging modality 6, which may include, for example, an ultrasound modality and/or a fluoroscopy modality. Imaging modality 6 is configured to monitor
body tissue region 7 during use of the catheter probe. Characteristics monitored by imaging modality 6 optionally comprise position information of the probe and/or of tissue affected by operation of the probe. In some embodiments, the imaging modality is in continuous, real-time (e.g., 5, 10, 15, 20, 30, 60 or more images per second) use during at least some phase of a procedure. Optionally, system 1 continuously processes changes in images produced by imaging modality 6 for immediate display (within a few milliseconds, for example, within 250 milliseconds) at user interface 55. - Additionally or alternatively, in some embodiments, imaging modality 6 operates less frequently (for example, once every minute to every five minutes, or at another interval). An infrequently updating imaging modality 6 is optionally used for providing periodic "key frames" used to synchronize and/or verify display of simulated states of
tissue region 7 and/or catheter 9. Optionally, imaging information provides indirect information about elements in the scene simulation; for example, displacement of an organ boundary imaged with relatively high contrast optionally provides information about the displacement of a less clearly visualized organ in communication with the organ boundary. Also for example, data imaged in a tissue cross-section optionally provides information which can be extrapolated to regions outside of the cross-section. Optionally, an imaging modality is used only briefly during a procedure, for example, during a particular phase of a procedure such as a septal crossing. - Dielectric tissue property sensing: In some embodiments, dielectric property measurements (e.g., of impedance behavior of the electrical fields) providing indications of tissue state, and/or of tissue-probe contacts, are made by
dielectric property analyzer 22. The measurements, in some embodiments, use sensing electrodes 3 (or a subset thereof) to determine impedance behavior of electromagnetic fields generated in conjunction with field generator/measurer 10, and optionally body surface electrodes 5. Dielectric distance sensing has already been mentioned in connection with the discussion of position data. Additionally or alternatively, in some embodiments, dielectric property sensing is used to distinguish, for example, the state of tissue as healthy, fibrotic, edematous, charred or charring, and/or electrophysiologically active (or capable of being so, e.g., retaining cellular integrity after attempted ablation). In some embodiments, dielectric property sensing identifies and/or verifies tissue type(s) in a sensed region. Dielectric property sensing for such properties is described, for example, in International Patent Application Nos. PCT/IB2016/052690 and PCT/IB2016/052686, the contents of which are incorporated by reference herein in their entirety. - General sensing: In some embodiments, other sensor information (sensed by optional other sensor(s) 14 on catheter probe 11) is used as interaction data. For example, a force sensor may provide information on contact between a
catheter probe 11 and its environment. The information may include an indication that contact has happened, and optionally with what degree of force. - Additionally or alternatively, contact quality and/or contact force information is provided from
sensing electrodes 3, based on impedance measurements and/or sensing of dielectric properties. For example, where a surface of tissue region 7 and an electrode 3 of a catheter probe 11 are in contact, dielectric sensing is optionally used to provide an indication of contact quality (optionally as related to a corresponding contact force), for example as described in International Patent Application No. PCT/IB2016/052686, the contents of which are incorporated by reference herein in their entirety. Contact quality may include dielectric and/or impedance sensing of the tissue environment of one or more electrodes, based on which force, pressure, area, and/or angle of contact between electrodes and the tissue environment is inferred, relatively and/or absolutely. - In some embodiments, other sensor(s) 14 comprise a temperature sensor, flow sensor, and/or another sensor configured to provide information about the environment of the
catheter probe 11. - Treatment interactions: In some embodiments, a
treatment element 8 is provided on catheter probe 11. The interaction data (for example, treatment status data 1102 of FIG. 7) optionally comprise information about the operation of the treatment element and/or components controlling its effect (for example, power levels, activation events, timing settings, and/or substance amounts administered). -
Treatment element 8 is optionally a probe for ablation treatment using an ablation modality; for example, one or more of the following ablation modalities: radio frequency ablation, cryoablation, microwave ablation, laser ablation, irreversible electroporation, substance injection ablation, and/or high-intensity focused ultrasound ablation. In some embodiments, treatment element 8 is also used as a sensing electrode 3 (for example, in RF ablation, a treatment delivery electrode may also be used to sense the effect of local dielectric properties on measured electrical field impedance). Optionally, treatment element 8 is operated in conjunction with a treatment controller 13, configured to provide treatment element 8 with functions such as power, control (e.g., of signal frequency, phase, and/or timing), and/or monitoring. In some embodiments, the treatment element 8 is configured to deliver a treatment other than ablation (for example, temporary activation or inactivation of tissue activity) using heat, cold, electrical current, sound radiation and/or light radiation. - Optionally,
treatment element 8 comprises an injection apparatus, used to inject a treatment substance, and/or a substance used in diagnosis, such as an imaging tracer. In some embodiments, the injected substance comprises ethyl alcohol, Botox, living cells, and/or growth factor. Optionally, the injected substance comprises a radiolabeled substance, an immunosubstance, and/or a radiopaque trace substance. Optionally, treatment element 8 comprises a tool for manipulating tissue (e.g., grasping, holding, sampling, cutting, attaching, and/or suturing). Data indicating operations of treatment element 8 (and/or the rest of a treatment delivery system, for example, including a treatment controller 13) are optionally available within system 1, and in particular available to modules of interaction analyzer 21, as treatment status data 1102 (FIG. 7). It should be understood that treatment status data 1102 are not limited strictly to data about operations targeted to disease treatments as such, but optionally also include administration of substances and/or energy affecting a tissue region for a diagnostic purpose. - Interaction data relating to the interactions of a
treatment element 8 with a target tissue region 7 include, for example, duration of operation, time of operation, nature and/or concentration of substances delivered, quantities of substances delivered, and/or power and/or frequencies of an exchange of energy between the treatment element 8 and tissue region 7 by a mechanism other than contact pressure (e.g., energy delivered for heating, energy removed for cooling, and/or energy delivered for disruption of structure). Optionally, operational settings are combined with information about the position and/or environment of treatment element 8 in order to derive interaction data. In some embodiments, such combination is performed by one or more of simulators 1110 of FIG. 7. - It should be understood that not every source of interaction data described in relation to
FIG. 6 is necessarily implemented in every embodiment of the invention. Preferably, there is provided in embodiments of the invention at least a position sensing modality (e.g., comprising position analyzer 20), and a treatment modality which is monitored through treatment status data (e.g., comprising treatment controller 13). In FIG. 7, data from sensing indicated as sensing data 1101 optionally includes data from one or a plurality of sensing modalities; for example, sensor electrodes 3, other sensors 14, and/or imaging modality 6, described in relation to FIG. 6. - Moreover, it should be understood that computation-performing and/or control operation-performing modules are optionally implemented by any suitable combination of shared and/or dedicated processing units and/or controllers. For example, implementations of
treatment controller 13, position analyzer 20, and/or interaction analyzer 21 optionally comprise one shared processing unit, or any other suitable number of shared and/or dedicated processing units. - Optionally, the flowchart continues with
block 112. In some embodiments, certain types of interaction data (such as inputs indicating onset of ablation treatment) branch additionally or alternatively to FIG. 1B (dotted line branch indicates optional branching). - Geometrical Effects and Rendering of Virtual Materials
- At
block 112 of FIG. 1A, in some embodiments, geometrical effects which modify the apparent position of geometrical features in a rendered view of a simulation scene are optionally calculated for locations defined by a 3-D data structure representing geometry of the targeted body tissue region 7. The operations of block 112 are carried out, in some embodiments, by interaction analyzer 21 (detailed for some embodiments in FIG. 7). Optionally, the geometrical effects of block 112 are calculated based on discrete events in the interaction data; for example, a single event such as a high-pressure contact triggering a tissue response like edema. Optionally, the geometrical effects of block 112 are calculated based on a history of interaction data; for example, a history of the delivery of ablation energy to a tissue region is used to estimate properties (for example, lesion extent) of an ablation lesion produced. The lesion properties are optionally estimated using a model of a thermal profile of the target tissue region and an estimate of temperatures/times at temperatures above which ablation occurs. - In further explanation of the distinction between adjustment of geometric points as such, and geometrical effects which affect the apparent position of geometrical points in a rendering, reference is now made to
FIGS. 9A-9B, which schematically represent, respectively, different geometrical data representations of flat and indented surfaces, according to some embodiments of the present disclosure. The grids shown in the two figures to indicate geometrical point positions are illustrative; alternatively or additionally, these could be, for example: any set of geometrical points defined in a 3-D space by mesh data; by polygon definitions; and/or by one or more parametrically defined shapes such as polyhedra, ellipsoids, cylinders, planar-shape extrusions, and/or parametric curves. 3-D flat geometry 901 and indented geometry 903 (indented at indentation 905) represent the use of 3-D positions of geometrical points to visually convey surface shapes. The indentation 905, for example, is represented by displacing geometrically defined points falling within it by an appropriate distance out of the plane defined by other points of 3-D indented geometry 903. - Additionally or alternatively, geometrical appearance is changed (e.g., from a flat appearance to an indented appearance) by assigning to the surface of each rendered region within indentation 905 a suitable orientation (for purposes of rendering), chosen to optically mimic the angle the surface would have if the 3-D
flat geometry 901 comprised a geometrically indented region like that of 3-D indented geometry 903; but without necessarily changing the 3-D geometry to which it maps. By convention, the surface orientation is represented by the orientation of a vector normal to (sticking straight out of) the surface. - For example,
normal maps optionally encode surface orientations used in rendering, including orientations representing features such as indentation 905. Surface orientation as represented by a normal map does not necessarily follow the geometrical surface orientation (for example, FIG. 9A shows a flat geometry 901 paired to a normal map 902 that represents an indentation). Though the resulting appearance change is not shown in FIGS. 9A-9B, FIGS. 10A-10B do provide an example of how a geometrical appearance can be changed (in that case to appear like a raised bump) by use of shading, without necessarily changing underlying geometrical positions. - To render the effects of a normal map, a rendering pipeline typically takes into account at least the relative angle of each surface normal and a light source in order to determine how much light is received at the camera. Then, for example (and other things being equal): when the relative angle is low, the surface is brighter; when the relative angle is high, the surface is darker. Optionally, the normal mapping algorithm also takes into account camera position and/or viewing angle-dependent surface reflection/scattering properties of the surface.
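The brightness rule just described (brighter at a low normal-to-light angle, darker at a high one) is essentially Lambertian diffuse shading. A minimal sketch, with the surface normal supplied the way a normal map would supply it rather than taken from the mesh geometry:

```python
import math

def _normalize(v):
    """Scale a vector to unit length."""
    mag = math.sqrt(sum(c * c for c in v))
    return [c / mag for c in v]

def lambert_brightness(normal, light_dir):
    """Diffuse brightness from the angle between the surface normal and the
    light direction: the dot product of the unit vectors, clamped at zero
    for surfaces facing away from the light."""
    n, l = _normalize(normal), _normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

# A flat mesh shaded with a tilted normal (as a normal map for an indentation
# might supply) appears darker than one whose normal faces the light directly.
facing = lambert_brightness((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))  # full brightness
tilted = lambert_brightness((0.6, 0.0, 0.8), (0.0, 0.0, 1.0))  # dimmer, near 0.8
```

This is why perturbing only the stored normals can convey an indentation without moving any geometrical points: the shading, not the geometry, carries the shape cue.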
- Normal mapping uses include, for example: to create the appearance of surface irregularities where the 3-D geometrical data has none, to exaggerate the 3-D appearance of shapes in the 3-D geometrical data, and/or to smooth transitions between polygons where the 3-D geometrical data describes abrupt changes (for example, between polygons in a mesh). In connection with some embodiments of the present invention, normal mapping (and a normal map, supplied as part of the geometrical rendering data 1121) has particular application for showing tissue deformations such as swelling (e.g., to indicate tissue damage) and indentation (e.g., to indicate probe-tissue contact). Embodiments optionally implemented with the use of normal mapping are described, for example, in relation to
FIGS. 10A-10B, 10C-10D, 4A-4D, and 5A-5B. A distinction is drawn between the use of normal mapping techniques to define and/or highlight surface features having functional significance to an ongoing catheterization procedure, and the use of normal mapping techniques to provide general texture (such as bump mapping), and/or to mask display artifacts (such as masking of geometrical mesh artifacts using Gouraud shading or Phong shading). - Herein, 3-D structure rendered in a scene (in particular, 3-D data defining organ structure) is geometrically represented by
geometrical rendering data 1121. 3-D positions are one part of the geometrical rendering data. Data used to affect geometrical appearance such as by use of normal maps (apart from use to define fine-grain texture) are considered to comprise a second part of the geometrical rendering data 1121. - In some embodiments, the
geometrical rendering data 1121 comprise mesh data; for example, as commonly used in defining structures for computerized visual rendering of 3-D structures. Geometrical rendering data 1121 specify positions (and usually also connections among positions, and/or positions joined by the extent of a common surface and/or material volume), corresponding to positions of surfaces of a target body tissue region to be visually rendered for presentation. Optionally, the geometry of positions interior to the surface is also defined and/or represented. For example, presentation optionally includes the use of transparency and/or cross-sectional views, whereby an interior portion of a tissue region is made visible. - Surfaces represented are optionally external (e.g., organ surfaces; not necessarily surfaces visible externally to the body) and/or internal (e.g., lumenal) surfaces of the target body tissue region. In some embodiments,
geometrical rendering data 1121 are derived from anatomical data 31; for example, appropriately segmented 3-D medical image data. In some embodiments, anatomical data 31 include specification of tissue region thicknesses, for example, thicknesses of heart walls. Heart wall thickness is optionally obtained from, for example: atlas information (optionally for a population corresponding to the current patient), modified atlas information (for example, scaled according to anatomical landmark correspondence, heart rate, and/or point observations), and/or imaging of the patient (for example, one or more of CT, MRI, and/or nuclear imaging techniques). - Moreover, in some embodiments, the appearance of the raw
geometrical rendering data 1121 that is finally presented by a user interface 55 is also determined in part by the assignment to the geometry of material appearance properties (MAPs); that is, properties affecting the appearance of materials represented in the rendered image. As the term is used herein, MAPs comprise any properties associated to positions (typically positions of a "virtual material", as next described) in a virtual environment for visual rendering according to simulated optical laws, and which affect how a surface and/or its enclosed volume are visualized within a 3-D rendered space. For example, MAPs may define color, texture, transparency, translucency, scattering, reflectance properties, and the like. MAPs are usually but not only assigned to surface positions defined by the geometrical rendering data. MAPs are optionally assigned to volumes defined by surfaces specified by the geometrical rendering data 1121. MAPs can also be assigned to the virtual environment (e.g., as lighting parameters) in such a way that they selectively affect material appearance at different positions. In some embodiments of the current invention, MAPs are used in part to define surface textures, for example by use of bump mapping (a type of normal mapping technique). - Creating the visual rendering in some embodiments may include surfaces and/or volumes comprising "virtual material"; for example, a virtual material having a visual appearance of myocardial tissue, and used in the representation of a heart wall defined by two surfaces. A virtual material, in some embodiments, is subject to simulated optical rules approximating processes such as reflection, scattering, transparency, shading, and lighting.
Not every optical rule used in visual rendering is a copy of a real-world physical process; the art of computer rendering includes numerous techniques (for achieving both realistic and deliberately unrealistic results) that apply simulated optical rules that have no direct physical equivalent. Normal mapping has already been mentioned as a technique which can be applied to change a texture and/or geometrical appearance. Another example of a simulated optical rule is ambient occlusion. Ambient occlusion is an efficiently calculable method of simulating the effect of ambient lighting, but the occlusion is defined as a mapped property of an object's surface, rather than as an effect of light emitted from positions in the environment.
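Ambient occlusion as described here can be read concretely as a per-position surface property, with no light emitter simulated at all. The following sketch (names and factors are hypothetical) shows occlusion stored per vertex and simply scaling ambient brightness at that position:

```python
def shade_with_ao(ambient_brightness, ao_map, vertex_id):
    """Ambient occlusion as a mapped surface property: the stored factor
    scales ambient light at a position; nothing in the environment emits
    light. Unmapped positions default to fully unoccluded (factor 1.0)."""
    return ambient_brightness * ao_map.get(vertex_id, 1.0)

# Crevices are pre-marked as occluded; open surface positions are not.
ao = {"groove_vertex": 0.4}
dim = shade_with_ao(0.8, ao, "groove_vertex")
open_surface = shade_with_ao(0.8, ao, "flat_vertex")
```

This is the sense in which the rule has no direct physical equivalent: the darkening comes from a baked map attached to the object, not from any computation over light paths in the scene.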
- A virtual material optionally also defines material properties that are not directly either geometrical or “of appearance”, for example, density, viscosity, thermal properties, and/or elastic properties. Insofar as these properties do in turn (in a given embodiment) affect the definition of MAPs (for example, via calculations of one or more simulators 1110), they are optionally treated as parts of material
appearance properties data 1122, without actually comprising MAPs in themselves. Additionally or alternatively, non-appearance properties, particularly those that affect how geometry changes (such as thickness, density, velocity, viscosity, and/or elasticity), are optionally considered part of the geometrical rendering data 1121 insofar as they affect geometrically apparent behaviors of the material (e.g., how the material changes in shape). - Calculation of Geometrical Effects from Interaction Data
- In some embodiments of the invention, geometrical effects of tissue-probe interactions on a simulated tissue region are assigned based on the output of one or more simulators 1110 (
FIG. 7 ). - In some embodiments,
sensing data 1101 and/or treatment status data 1102 (i.e., data describing the operation of a treatment modality) are used directly or indirectly as input to one or more simulators 1110 (e.g., contact physics simulator 1111, ablation physics simulator 1112, injection simulator 1113, and/or physiology simulator 1114), which adjust a modeled appearance state 1120 of the tissue based on inputs received, and one or more simulated aspects of tissue physiology, geometry, and/or mechanics. The modeled appearance state 1120 includes the geometrical rendering data 1121 and material appearance properties data 1122 in a form suitable for being operated on by the simulators 1110; it may also be or comprise a renderable model state 1103 suitable for rendering for presentation, or else be convertible to a renderable model state 1103. In some embodiments, the modeled appearance state also includes data indicating the probe state 1123. -
Simulators 1110 also optionally receive as starting input anatomical data 31 and/or tissue state data 1104. In addition to adjusting the modeled appearance state 1120, simulators 1110 optionally maintain their own internal or mutually shared simulation states. In some embodiments, simulators 1110 use motion simulation services exposed by a graphical game engine that can produce geometrical changes to a scene based, for example, on simulated collisions among scene elements, gravity effects, velocity, momentum, and/or elasticity. - Operations of some
exemplary simulators 1110 are described, for example, in relation to FIGS. 2A-2E, 3A-3L, 4A-4D, 5A-5B, 10A-10B, and 10C-10D. - In relation to
FIG. 7, different input types providing probe-tissue interaction data as input to simulators 1110 are now described, including direct sensing input, physiologically interpreted sensing input, positionally interpreted sensing input, and treatment status input. In some embodiments, the inputs comprise direct and/or transformed use of one or more of the interaction data types described in relation to block 110. - Direct sensing input: In some embodiments, adjustment of the simulation scene is implemented based directly on
sensing data 1101. For example, a pressure reading from apressure sensor 14 is optionally mapped directly to a geometrical displacement according to the measured pressure. - Additionally or alternatively, in some embodiments, a more involved simulation is performed; wherein probe interaction with a virtual material representing tissue is, in at least one aspect, physically and/or physiologically simulated in order to produce a new modeled appearance state.
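A minimal sketch of such a direct mapping follows. The linear rule, the clamping, and the constants are illustrative assumptions; the source does not specify a particular mapping.

```python
def indentation_depth_mm(pressure_g, full_scale_g=40.0, max_depth_mm=2.0):
    """Map a contact-pressure reading directly to a rendered surface
    indentation depth: linear up to an assumed full-scale reading,
    clamped so readings outside the range stay displayable."""
    clamped = max(0.0, min(pressure_g, full_scale_g))
    return max_depth_mm * clamped / full_scale_g

# Half-scale pressure indents halfway; out-of-range readings are clamped.
half = indentation_depth_mm(20.0)
capped = indentation_depth_mm(100.0)
```

Because no physical simulation intervenes, the rendered indentation responds to the sensor with essentially no added latency, at the cost of ignoring tissue mechanics.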
- Physiologically interpreted sensing input: In some embodiments, the use of
sensing data 1101 by a simulator is indirect after interpretation by one or more physiology trackers 1106. Physiology tracker 1106, in some embodiments, is a module which accepts sensing data 1101 and generates an assessment of current physiological state based on the sensing data 1101. For example, in some embodiments, sensing data 1101 comprises dielectric measurements that physiology tracker 1106 is configured to convert into assessment of tissue state, for example fibrotic, healthy, or edematous; for example as described in International Patent Application No. PCT/IB2016/052690, the contents of which are incorporated by reference herein in their entirety. Optionally or alternatively, electrical activity originating in tissue indicating a functional state (e.g., general capacity to support electrical activity, and/or feature of the activity itself) is measured and used as sensing input. - The output of the
physiology tracker 1106 from one or more of these inputs is optionally in terms of one or more states such as tissue thickness (e.g., heart wall thickness), lesion depth, lesion volume, degree of lesion transmurality, characterization of tissue edema, characterization of functional activity and/or inactivation, a classification as to a potential for tissue charring, and/or a classification as to a potential for or occurrence of steam pop. “Steam pop” is a phenomenon occurring during ablation with an audible popping noise and/or spike in impedance, which is apparently due to sudden release of steam after excessive heating, associated with risk of perforation. - These outputs are optionally provided to a
physiology simulator 1114 and/or an ablation physics simulator 1112, configured to convert such states into MAPs, other virtual material properties, and/or geometrical effects that indicate the tissue state(s) calculated from the measurements. Optionally, the tissue state interpreted from the sensing input also affects mechanical properties used, for example, by a contact physics simulator 1111 and/or an injection simulator 1113. It is a potential advantage to implement a physiology tracker 1106 as a distinct module that can be treated as a computational "service" to any appropriate simulator 1110. However, it should be understood that physiology tracker 1106 is optionally implemented as part of one or more simulators 1110 producing changes to a modeled appearance state 1120. In this case, the module configuration is more like that of direct sensing input, with the simulation of appearance integrated with physiological interpretation of the sensing data. - Positionally interpreted sensing input: In some embodiments, the use of
sensing data 1101 by a simulator is indirect after interpretation by a probe position tracker 1107. Probe position tracker 1107, in some embodiments, is a module that accepts appropriate sensing data 1101 (e.g., electromagnetic field navigation data, acoustic tracking data, and/or imaging data) and converts it to a measurement of the position (e.g., a measurement of the location and/or a measurement of the orientation) of a probe such as catheter probe 11, for example as described in International Patent Application No. PCT/IB2016/052687. It optionally comprises position analyzer 20. Optionally, position tracker 1107 implements processing to adjust outputs of position analyzer 20 in view of the current state of the scene simulation; for example, to recalibrate sensed position data to positions compatible with the scene simulation. Optionally, position tracker 1107 integrates position data from a plurality of position inputs. - Optionally, position determination includes determination of tissue contact force and/or quality, using a force sensor on the probe, and/or for example as described in International Patent Application No. PCT/IB2016/052686, the contents of which are incorporated by reference herein in their entirety. Additionally or alternatively, on-line imaging data (e.g., ultrasound and/or angiographic images) are used, intermittently and/or continuously, to determine and/or verify probe position.
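One conventional way to integrate a plurality of position inputs is inverse-variance weighting, where more reliable inputs pull the fused estimate harder. This is a sketch under that assumption only; the source does not state how position tracker 1107 combines its inputs.

```python
def fuse_positions(estimates):
    """Fuse position estimates from several inputs (e.g., EM navigation,
    imaging) as an inverse-variance weighted average. Each estimate is a
    ((x, y, z), variance) pair; smaller variance means a heavier weight."""
    weights = [1.0 / variance for _, variance in estimates]
    total = sum(weights)
    dims = len(estimates[0][0])
    return tuple(
        sum(w * pos[d] for (pos, _), w in zip(estimates, weights)) / total
        for d in range(dims)
    )

# An imaging-derived fix (variance 0.5) outweighs a noisier EM fix (variance 1.0).
fused = fuse_positions([((0.0, 0.0, 0.0), 1.0), ((3.0, 3.0, 3.0), 0.5)])
```

A full tracker would also handle timing skew and outliers, but the weighting step above is the core of combining redundant position modalities.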
- Probe position determinations are optionally used as inputs to any of
simulators 1110; for example, in order to assign particular positions to measurements of other tissue states/properties, and/or to help characterize changes induced by probe interactions with tissue (e.g., geometrical distortions of tissue introduced by touching contact with the probe, and/or simulated effects of treatment procedures). It is a potential advantage to implement probe position tracker 1107 as a distinct module that can be treated as a computational "service" to any appropriate simulator 1110. However, it should be understood that probe position tracker 1107 is optionally implemented as part of one or more simulators 1110 producing changes to a modeled appearance state 1120 maintained by interaction analyzer 21. - Treatment status input: In some embodiments, simulation is implemented based on
treatment status data 1102. Treatment status data 1102 include data indicating the operation and/or status of a treatment modality; for example, power, control parameters (e.g., of signal frequency, phase, and/or timing), and/or monitoring data. Optionally, treatment status data are applied directly to modeled appearance state 1120; for example, as an indentation or other deformation at a position of treatment modality activation. Additionally or alternatively, in some embodiments, at least one aspect of the tissue and/or tissue/probe interaction is physically and/or physiologically simulated in order to produce a new modeled appearance state 1120, based on the treatment status data. - For example, in some embodiments, a
physiology simulator 1114 receives input indicating that a probe-delivered treatment operation has occurred at some particular position (optionally along with parameters of the treatment operation). Physiology simulator 1114 is optionally configured to model the reaction of tissue to the treatment, instantaneously (for example, due directly to energy delivered by an ablation treatment), and/or over time (for example, as an edematous reaction develops in the minutes following an ablation treatment). In another example, an injection simulator 1113 receives treatment status data indicating that a material injection is occurring. Injection simulator 1113 is optionally configured to model an appropriate reaction of tissue to the injected substance (e.g., swelling to indicate the injected volume, and/or to indicate injury response to the injection). The reaction is optionally immediate, and/or includes a slow-developing component as the material diffuses from the injection site. Optionally, changes in geometry due to the addition of material volume to the tissue are also modeled. - Presentation of Visual Rendering
- At
block 114, in some embodiments, a rendering of the modeled appearance state is created for presentation. - In some embodiments of the invention, geometrical effects on a simulated tissue region are assigned based on the output of one or more simulators 1110 (
FIG. 7 ). - In some embodiments,
sensing data 1101 and/or treatment status data 1102 are used directly or indirectly as input to one or more simulators 1110 (e.g., any of the simulators described herein) that adjust a modeled appearance state 1120 of the tissue based on inputs received, and one or more simulated aspects of tissue physiology, geometry, and/or mechanics. Simulators 1110 also optionally receive as starting input anatomical data 31 and/or tissue state data 1104. In addition to adjusting the modeled appearance state 1120, simulators 1110 optionally maintain their own internal or mutually shared simulation states. In some embodiments, simulators 1110 use motion simulation services exposed by a graphical game engine that can produce geometrical changes to a scene based, for example, on simulated collisions among scene elements, gravity effects, velocity, momentum, and/or elasticity. - Operations of some
exemplary simulators are described, for example, in relation to FIGS. 2A-2E, 3A-3L, 4A-4D, and 5A-5B. - In some embodiments of the invention, a modeled
appearance state 1120 is converted to a renderable model state 1103 and provided to a display module 1130 that converts (renders) the renderable model state into at least one image comprising a visually rendered representation of the intrabody region 7. Optionally, modeled appearance state 1120 is directly represented as a renderable model state 1103 (this is a potential advantage for tighter integration of the simulation with a game engine driving its rendering and presentation). The at least one image is displayed by one or more graphical displays of a user interface 55. User interface 55, in some embodiments, comprises one or more displays, for example a computer monitor, virtual reality goggles, and/or a 2-D or 3-D projection device. Preferably, user interface 55 also comprises one or more user input devices that can be used for tasks such as selecting operating modes, preferences, and/or display views. It is noted that insofar as catheter probe position sensing affects simulation and/or display, catheter probe manipulation also acts as a special form of user input device; but for purposes of the descriptions herein such catheter probe sensing inputs should be considered distinct from inputs provided through user interface 55. - In some embodiments, the
display module 1130 renders from one, two, three, or more viewpoints simultaneously. In some embodiments, rendering is performed (and the resulting images are displayed) at a frame rate sufficient to produce perceived motion (herein, such a frame rate is termed a motion frame rate)—for example, at least 10-15 frames per second; and optionally at least, for example, 15, 20, 30, 50, 60, or 100 frames per second (fps), or another greater or intermediate value. Within this range, lower frame rates (e.g. 10-20 fps) tend to give the appearance of “choppy” motion, with apparent motion growing increasingly fluid with rates up to at least 30-60 fps. More fluid motion is potentially less fatiguing and/or more precise for guiding actions based on events in the simulation scene. Still higher frame rates (above the nominal frequency of visual flicker fusion) add the potential advantage of increasingly convincing presentation of very rapid motion (e.g., reducing visual appearance of discrete-position motion “trails”). Trans-flicker fusion frequency frame rates are optionally preferred for immersive, virtual reality (VR) user interface implementations; higher frame rates potentially help mitigate VR motion sickness. - In some embodiments of the invention,
display module 1130 includes a computer-implemented software module comprising the rendering pipeline 1230 of a 3-D graphics engine 1200 (software environment) such as is provided with graphical game engines such as the Unreal® or Unity® graphical game engine, or another game engine. Some general aspects of 3-D graphical game engines are discussed in relation to FIG. 8, herein. Optionally, the conversion of a modeled appearance state 1120 into a renderable model state 1103 comprises the creation and/or instantiation of computer data and/or code structures that are directly used by the rendering pipeline of the 3-D graphics engine 1200. - Optionally, some functions of interaction analyzer 21 (for example, any of simulators 1110) are provided as functions (e.g., classes, hook implementations, etc.) making use of the application programming interface (API) of such a 3-D graphics engine 1200. - Ending the presentation of
FIG. 1A: at block 116, in some embodiments, flow optionally returns to block 110 to receive more interaction data, or else (if adaptive visual rendering is to be suspended), the flowchart ends. - Use of a Graphical Game Engine in Real-Time Anatomical Navigation
- Continuing reference to
FIG. 7, in some embodiments of the invention, geometrical effects on a simulated tissue region are assigned based on the output of one or more simulators 1110. - In some embodiments,
sensing data 1101 and/or treatment status data 1102 are used directly or indirectly as input to one or more simulators 1110 (e.g., any of the simulators described herein) that adjust a modeled appearance state 1120 of the tissue based on inputs received, and one or more simulated aspects of tissue physiology, geometry, and/or mechanics. Simulators 1110 also optionally receive as starting input anatomical data 31 and/or tissue state data 1104. In addition to adjusting the modeled appearance state 1120, simulators 1110 optionally maintain their own internal or mutually shared simulation states. In some embodiments, simulators 1110 use motion simulation services exposed by a graphical game engine that can produce geometrical changes to a scene based, for example, on simulated collisions among scene elements, gravity effects, velocity, momentum, and/or elasticity. - Operations of some
exemplary simulators are described, for example, in relation to FIGS. 2A-2E. - Reference is now made to
FIG. 8, which schematically represents components, inputs, and outputs of a graphical game engine 1200 operating to manage and render scene elements 1220 to motion frame-rate images 1240, according to some embodiments of the present disclosure. - In some embodiments of the invention, a
graphical game engine 1200 is used not only to render images (for example as described in relation to block 114 of FIG. 1A), but also to provide more generally the data structure and code framework of the "scene" and how it changes in response to time and/or input. - In broad outline, a
graphical game engine 1200 comprises a collection of computer software components exposing one or more application programming interfaces (APIs) for use in describing, instantiating (initializing and maintaining), continuously updating, rendering, and/or displaying of scene elements 1220. Examples of graphical game engines include the Unreal® and Unity® graphical game engines. - The
scene elements 1220 provided for the operations of graphical game engine 1200 optionally include, for example, descriptions of terrain 1221, objects 1224, cameras 1223, and/or elements for lighting 1222. In some embodiments of the present disclosure, definitions of scene elements 1220 are derived from geometrical rendering data 1121 and/or MAPs data 1122. Definitions are optionally expressed in terms of geometrical-type scene data 1225 (e.g., model assets, shapes, and/or meshes), and/or appearance-type scene data 1226 (e.g., image assets, materials, shaders, and/or textures). In some embodiments, geometrical rendering data 1121 and MAPs data 1122 are initially produced already in a format that is directly used by graphical game engine 1200. - In some embodiments,
scene elements 1220 are provided with simulated dynamic behaviors by an iterated series of calculated scene adjustments 1210. Scene adjustments 1210 are optionally implemented by a variety of software components, e.g., motion physics services 1212, collision detection service 1213, and/or scripts 1211. These are examples; graphical game engines 1200 optionally implement additional services, e.g., "destructibility". Scripts 1211 can be provided to simulate, for example, autonomous behaviors and/or the effects of triggered events. Scripts 1211 are optionally written in a general-purpose computer language taking advantage of APIs of the graphical gaming engine 1200, and/or in a scripting language particular to an environment provided by the core graphical gaming engine 1200. Graphical gaming engines optionally also accept integration with plugin software modules (plugins, not shown) that allow extending the functionality of the core graphical game engine 1200 in any of its functional aspects. For purposes of the descriptions provided herein, plugins that perform functions related to updating the scene state are also encompassed within the term "script" 1211. In some embodiments, all or part of any of simulators 1110 is implemented as a script 1211. - For purposes of descriptions herein, scripts 1211 (optionally including plugins) and
scene elements 1220 are considered part of the graphical game engine 1200 as a functional unit. Optionally, for example where reference is made particularly to the off-the-shelf graphical game engine apart from specialized adaptations for uses described herein, the term "core graphical game engine" is used. - For interactivity,
graphical game engines 1200 accept user input 1214 (optionally including, but not limited to, inputs from user interface 55 devices such as mouse, keyboard, touch screen, game controller, and/or hand motion detector; and for some embodiments of the current invention, optionally including data provided as input that indicate probe positions, treatment modality operation, etc.). - A typical graphical game engine also includes a
rendering pipeline 1230 that may include one or more stages of 3-D rendering, effects application, and/or post-processing, yielding at least one stream of frame-rate images 1240. In some embodiments, the stages of the rendering pipeline 1230 include modules that implement simulated optical algorithms—not necessarily directly based on real-world physical laws—generally selected to produce a rendered result that visually gives to elements in the rendered scene the appearance of material substances. - Table 1 includes some examples of how graphical game engine features and concepts are optionally used in some embodiments of the current invention:
-
TABLE 1: Examples of Graphical Engine Feature/Concept Usage
- Scene: Overall visually renderable model of environment and objects within it. Optionally equivalent to a renderable model state 1103 and/or scene elements 1220.
- Terrain 1221: Optionally used to represent geometry of the anatomical environment; e.g., geometrical rendering data 1121. For example, the heart wall might be implemented as terrain 1221 (alternatively, anatomical features are implemented as objects 1224; e.g., as mesh geometry objects, and/or combinations of primitive objects such as cylinders, boxes, and/or ellipsoids).
- Objects 1224: Probe 11 is optionally represented as a "game" object, and may optionally serve as a viewpoint anchor like avatars and/or tools in certain 3-D games. Significant features of the anatomical environment such as scars, lesions, and/or regions of edema are optionally implemented as appropriately positioned objects, e.g., embedded in an environment of surrounding tissue. Guides and markers are optionally implemented as game objects.
- Assets: Tissue, probe, guide, and/or other objects and/or their appearances are optionally instantiated from assets which represent available types of objects, their behaviors, and/or their appearances. Optionally includes geometrical-type scene data 1225 (e.g., model assets, shapes, and/or meshes), and/or appearance-type scene data 1226 (e.g., image assets, materials, shaders, and/or textures).
- Cameras 1223: Cameras optionally define flythrough viewpoint(s) of the anatomy traversed by the catheter probe 11, and/or overview viewpoint(s) (showing probe and tissue from a remote viewpoint). Optionally, the position of catheter probe 11 defines one or more camera viewpoints by its position and/or orientation.
- Lighting 1222: In addition to providing general lighting of the tissue being navigated, lighting 1222 is optionally defined to provide highlighting, e.g., of regions pointed at by probe 11, and indications of environmental state by choice of light color, light flashing, etc. Lighting is optionally used to implement MAPs non-locally (that is, a defined light source optionally is defined to illuminate a view of simulated tissue to selectively change its material appearance, while not being part of the material properties of appearance of the simulated tissue as such).
- Image Assets, Materials, Shaders, and Textures 1126: MAPs that are also material properties of appearance, for example, defining the appearance of tissue as healthy muscle, edematous, fibrotic, heated, cooled, etc.
- Particle Systems: Type of object optionally used for providing effects such as smoke/steam-like indications of ablation heating, spray, transfer of energy, etc.
- Collision Detection 1213 and Motion Physics Service 1212: Optionally used for interactions between probe and the geometry of the anatomical environment; optionally including deformation of the probe and/or the anatomy. As implemented by core graphical game engines, the term "physics" generally is limited to physics affecting movement/deformation of game objects such as collision, gravity, or destruction. In some embodiments, simulators 1110 include simulation of other "physics", such as temperature, physiological change, etc.
- Scripts 1211: Optionally used for animating and/or showing changes in dynamic features of the environment (lighting, terrain), view (camera position) and/or game objects, optionally gradually over a period of time: for example, development of lesions, development of edema, heating/cooling effects, and/or injection effects. Optionally, scripts are used to implement dynamic appearance, even though the underlying state representation is constant (e.g., coruscating and/or pulsing effects).
- User Input 1214: Optionally comprises inputs reflecting changes in probe position (e.g., output of probe position tracker 1107) for guiding navigation through the scene, and/or determining camera position. Some treatment status data 1102 are optionally interpreted as inputs reflecting operator interaction with the scene.
- Multiplayer: During a procedure, there is optionally a plurality of different operators working simultaneously with a system according to some embodiments of the current invention. For example, while a primary physician manipulates the intra-body probe, one or more additional workers are optionally reviewing the simulated environment to locate next target sites for the probe, evaluate effects of previous ablations, etc. Optionally, there is more than one probe in use at a time, each of which is optionally treated as a different "player" with its own associated camera views and/or interaction capabilities. - Independently Time-Evolving Probe-Tissue Interactions
- Reference is now made to
FIG. 1B, which is a schematic flowchart illustrating the calculation and display of a rendered image of a simulation scene comprising a view of simulated tissue having a geometry and/or geometrical appearance dynamically changing as a function of time to represent changes developing subsequent to a triggering interaction between the tissue and a catheter probe, according to some embodiments of the present disclosure. - In some embodiments of the invention, simulation of probe-tissue interactions includes simulation of tissue effects (e.g., injury response) developing substantially independently of continuing inputs from probe-tissue interaction data. In some embodiments, the flowchart of
FIG. 1B branches off from certain input cases of the flowchart of FIG. 1A, wherein geometrical effects develop at least partially concurrently with (and optionally unsynchronized to) geometrical effects which immediately track changes in inputs. In FIG. 1B, initial interaction data is received (optionally entering the flowchart from block 110 of FIG. 1A). After this, the simulated geometry evolves according to the results of pre-set rules which operate substantially independently of further input for a time. A potential advantage of this approach is to allow continuously updated visualization of tissue changes, even when no new sensing data has been obtained to confirm them.
- Optionally, operations of the flowchart of
FIG. 1B are implemented by a script 1211. Additionally or alternatively, operations of the flowchart are implemented by a simulator 1110, for example, physiology simulator 1114. - At
block 120, in some embodiments, one or more geometries and/or geometrical appearances are set to an initial state (an existing state is optionally used as the initial state) and a simulation function is selected and assigned to change the geometries and/or geometrical appearances as a function of time according to parameters set from inputs describing the probe-tissue interaction. These inputs may be included in the interaction data received at block 110. In some embodiments, the simulation function is configured to evolve according to the state of a timer. - For example, in some embodiments, a
physiology simulator 1114 is configured to emulate effects of edema developing post-ablation, based on parameters such as the position, amount of energy delivery, and/or duration of energy delivery causing the ablation. Edema is optionally modeled to develop over the course of several minutes (for example, 2, 5, 10, 15, 20 or another number of minutes). Optionally, modeled changes in geometry and/or geometrical appearance simulate changes in muscle tone, e.g., vasodilation or vasoconstriction. The geometry and/or geometrical appearance is optionally modeled to show thickening and/or thinning, increase and/or decrease in surface height variation over a surface area, and/or another deformation, for example: dimpling, puckering, “goose-pimpling”, stretching, collapsing, expanding, distending, and/or shrinking. Lumenal structures optionally show change in cross-sectional shape (e.g., radius). - Optionally, one or more MAPs are changed in coordination with change in geometry and/or geometrical appearance. Adjusted MAPs optionally include, for example, those that can be modified to show increasing “redness” of the tissue with time to indicate swelling, “whiteness” or “greyness” to indicate loss of perfusion, color change to indicate change in temperature, etc.
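The slowly developing edema described above can be sketched as a saturating time course whose plateau scales with the delivered ablation energy. The function below is a hedged illustration only: the exponential form, the function and parameter names, and all constants are assumptions for this sketch, not the patent's model.

```python
import math

def edema_swelling(t_minutes: float, energy_j: float,
                   tau_minutes: float = 5.0, max_thickening_mm: float = 2.0,
                   saturation_j: float = 30.0) -> float:
    """Illustrative simulated added wall thickness (mm) t minutes after ablation.

    The plateau grows with delivered energy up to a saturation level, and the
    approach to the plateau is exponential with time constant tau_minutes,
    matching the several-minutes development described in the text.
    """
    plateau = max_thickening_mm * min(1.0, energy_j / saturation_j)
    return plateau * (1.0 - math.exp(-t_minutes / tau_minutes))
```

A simulator evaluating this function each frame would feed the result into the modeled geometry (e.g., as local thickening of the heart wall near the ablation site).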
- As another example: in some embodiments, geometrical effects are applied to indicate contractile state (for example, of cardiac muscle, or gastrointestinal tract motion). Optionally, simulations of contraction are triggered by measurements of heartbeat and/or pulse phase, and/or of autonomic nervous system activity. The geometrical effects are preferably simulated to be in synchrony with what is expected to be actually occurring in the tissue that the simulation describes. However, the simulation is optionally different from reality in one or more respects; for example, amplitude is optionally adjusted. Larger-adjusted amplitude potentially emphasizes activity (e.g., vasoconstriction is exaggerated for clarity); smaller-adjusted amplitude potentially reduces distracting effects of activity (e.g., heart contraction is shown with reduced amplitude).
- In some embodiments of the invention, dynamic adjustment of heart size in a rendered view of a simulated scene is based on heart rate. Optionally, this is implemented by dynamic adjustment of the geometrical rendering data representing the heart shape. In some embodiments, the adjusting comprises adjusting a static size of one or more heart chambers (e.g., a lumenal volume of the heart chambers, and/or a lumenal dimension of the heart chambers). In some embodiments, the adjusting comprises selecting a range of heart chamber sizes simulated cyclically over the course of each heartbeat cycle, e.g., between changing minimum and/or maximum sizes.
- In some embodiments of the invention, the adjustment of heart chamber size to larger or smaller sizes is accompanied by corresponding inverse adjustment of heart wall sizes to smaller or greater thicknesses.
- A potential advantage of these adjustments is to increase an accuracy and/or precision with which an intrabody probe (and in particular, an intracardial catheter probe) can be positioned, and/or with which the position of such a probe can be determined. In particular, positioning precision/accuracy with respect to one or more particular regions of heart wall tissue is potentially improved; for example, a nearest and/or a pointed-at region of heart wall tissue. A pointed at location is located along a longitudinal axis extending through the probe tip.
- This in turn potentially increases certainty of achieving targeted effects of treatment administration (e.g., ablation), and/or of evaluating those treatment effects. Adjustment of a display to maintain an accuracy of positioning of the intracardial probe relative to the heart is implemented, in some embodiments, using one or more of the following methods. Optionally, positioning changes of a probe relative to a heart wall due to heart size changes are at least partially represented to an operator by simulating relative movements and/or scaling of a rendered representation of an intrabody probe in a display, while suppressing at least part of the size changes undergone by the actual heart chamber represented in the display. For example, if heart chamber beats are at least partially suppressed, then changing actual probe position relative to the beating heart chamber walls is optionally displayed by movements of the probe itself. Optionally, for example, if inter-pulse heart chamber size changes (e.g., due to heartbeat rate changes) are at least partially suppressed: scaling of detected intracardial probe movements is adjusted in a display so that relative positions of heart wall and probe remain synchronized between the actual tissue and probe pair, and a display of a simulated tissue and probe pair.
- In some embodiments, the wave pattern to be simulated is determined at least in part from direct measurements of impulse wave propagation. In some embodiments, the wave pattern is simulated from a generic heart tissue or other tissue model. Optionally, the wave pattern is adapted according to knowledge about tissue state, for example, to indicate regions of weak and/or slow propagation attributed to states of fibrosis, perfusion state, and/or denervation. Optionally, moreover, the degree of impulse transmission is itself modulated in simulations managed by
physiology simulator 1114; for example, to reflect transmission effects of treatment activities such as lesioning, tissue cooling, injections, etc. - At
block 122, in some embodiments, the current state of the geometry and/or geometrical appearance (optionally including changes to MAPs) is rendered to a visual representation of the tissue with which the interaction occurred. In some embodiments, the rendering makes use of a 3-D graphics engine, for example as described in relation to display module 1130, and/or in relation to FIG. 8 and/or Table 1. - At
block 124, in some embodiments, the timer is incremented. - At
block 126, in some embodiments, a decision is made as to whether the loop is to continue (returning to block 120), or is terminated (stopping the flowchart). Time-evolving geometry and/or geometrical appearance optionally evolve, for example, cyclically (for example, repeating a movement pattern), transiently (disappearing at the end of a generation cycle, for example, in a simulation of cooling from a heated condition or re-warming from a cooled condition), and/or to a new steady-state appearance (for example, edema that develops to a fully developed state during a period after ablation, and then persists beyond the period during which the tissue is simulated). - It should be understood that sensing feedback is optionally integrated with the flowchart of
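The loop of blocks 120-126 can be sketched as a timer-driven update that repeatedly renders the current state and advances the timer until the simulation function reaches a steady state. This is a hedged illustration: the rendering call is a stub, and the steady-state test, names, and defaults are assumptions of this sketch.

```python
def run_time_evolving_effect(simulate, render, dt=1.0, steady_eps=1e-3,
                             max_steps=10000):
    """Blocks 120-126 as a loop: set the initial state from the simulation
    function (block 120), render the current state (block 122), increment
    the timer (block 124), and decide whether to continue (block 126) by
    checking whether the simulated value has stopped changing."""
    t = 0.0
    state = simulate(t)              # block 120: initial state
    for _ in range(max_steps):
        render(state)                # block 122: render geometry/MAPs
        t += dt                      # block 124: increment timer
        new_state = simulate(t)
        if abs(new_state - state) < steady_eps:   # block 126: terminate?
            return new_state
        state = new_state
    return state
```

For a transient or cyclic effect, the termination test would instead check for the end of a generation cycle or run until externally stopped.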
FIG. 1B to create semi-open/semi-closed loop simulation: periods of open loop simulation producing results (e.g., geometrical effects) that are periodically verified, guided, and/or corrected according to sensed data. In some embodiments, for example, simulation of developing edema optionally proceeds independently as long as no further sensing data characterizing the edema state is available. However, if edema state is measured at some midpoint of the simulated edema time-course (for example, by use of dielectric measurements), then the simulation is optionally adjusted mid-course to reflect the sensed data. Adjustment is optionally immediate, and/or includes a period of interpolated adjustment (which potentially helps maintain the sense of presence in rendered views of the simulation scene). - Modes of Simulating Geometrical Effects
- Cross-Sectional Perspective Views of Single-Lesion Progress
- Reference is now made to
FIGS. 2A-2E, which illustrate a 3-D rendered display for indicating lesioning status to an operator, according to some exemplary embodiments of the present disclosure. FIGS. 2A-2E show a sequence of visual renderings of a single lesion over the course of the operation of an RF ablation probe to create it. This provides an example of how adjusted geometry and/or geometrical appearance can be used (optionally together with adjustment of MAPs) to convey to an operator a direct understanding of how use of an ablation probe is affecting target tissue. - In appearance,
FIGS. 2A-2E comprise images (rendered in some embodiments in the rendering pipeline 1230 of a 3-D graphical game engine 1200) of an RF ablation probe 202 (corresponding, in some embodiments, to catheter probe 11, wherein treatment element 8 is an ablation electrode, and treatment controller 13 operates to supply ablation energy to the RF ablation probe 202) and its position relative to tissue 205 targeted for ablation (e.g., part of body tissue region 7). Optionally, the rendering is in color, and/or otherwise uses applied MAPs conveying the vital appearance (e.g., properties of roughness, specular reflection, etc.) of the tissue (black and white is shown herein for purposes of illustration). In some embodiments, RF ablation probe 202 is implemented as an object 1224 belonging to scene elements 1220 (FIG. 8). Tissue 205 is optionally implemented as terrain 1221 or an object 1224 belonging to scene elements 1220. -
FIG. 2A, in some embodiments, shows the moment of initial contact between probe 202 and tissue 205. Optionally, this view is triggered when contact is sensed by a sensor on the probe, such as a force sensor (an example of an "other sensor" 14) and/or dielectric sensing of contact (e.g., via dielectric property analyzer 22). The triggering, mediated in some embodiments by interaction analyzer 21 (and optionally taking advantage of a collision detection service 1213 of a game engine 1200), is optionally visually implemented as a jump from a wider-angle view with the probe out of contact to a close-up of the probe contacting tissue. Optionally, transition from no-contact to contact (or vice versa) is shown by a short bridging animation. In some embodiments, continuous sensing of probe position and/or probe distance to the tissue wall (for example, by a position sensing subsystem comprising sensing electrodes 3, body surface electrodes 5, field generator/measurer 10, and/or position analyzer 20 and/or position tracker 1107) allows any jump in a sensed transition between contact and non-contact to be smoothed out using actual position data. -
FIG. 2B, in some embodiments, includes a visual indication of increased contact pressure between the tissue 205 and probe 202 comprising an indented region 204. In FIG. 2C and then FIG. 2D, the deeper indented region 204 shows that pressure has been increased still further. Optionally, the geometry and/or geometrical appearance modifications indicate sensed and/or calculated contact pressure; the appropriate transformation being calculated, for example, by contact physics simulator 1111 (which may in turn take advantage of motion physics services 1212 and/or collision detection service 1213 of game engine 1200). Although preferably modeled based on sensed contact quality and/or force data, distances of the indentation deformation need not correspond exactly to deflection distances in the real tissue. Rather, the visual degree of indentation shown is optionally considered as a proxy indicator for when the probe is out of contact, in poor contact, in a good position to ablate, and/or exerting excessive force on the tissue. Optionally, tissue 205 is shown in cross-section.
- Additionally or alternatively, in some embodiments of the invention, transparency effects are applied to allow seeing into a targeted volume of tissue. For example, before ablation begins, a local region of tissue selected by the position of
probe 202 is shown with increased transparency. Optionally, as portions of the tissue become lesioned, they are represented in simulated display as more opaque; creating an ablation “island” that directly shows the progress of lesioning. A potential advantage of the transparency approach is to allow representation of lesioning progress from any arbitrary 3-D point of view including the targeted tissue region. - In
FIG. 2C, in some embodiments, there has been a slight increase in sensed contact (shown by increased indentation of indented region 204), and ablation by delivery of RF energy to the tissue from probe 202 has begun. A superficial lesioned portion 208 of tissue 205 is now shown, for example, in a lighter shade (in color, lesioned portion 208 is optionally colored a light grey compared to darker red vital tissue). As lesioning proceeds (for example, to the intermediate state indicated in FIG. 2D, and finally to the completed lesion 209 in FIG. 2E), lesioned portion 208 gradually increases in extent and/or degree of MAP change from the pre-lesioned state. FIG. 2D also indicates an increased pressure of contact by an indented region 204 in the tissue, while FIG. 2E shows pressure reduced. Optionally, the geometrical deformation changes as tissue ablation proceeds (even for a fixed pressure), for example to indicate changes in tissue elasticity and/or volume. - In some embodiments, this progression is based on inputs describing the operation of the treatment modality (ablation, in the illustrated example). For example, inputs describing power, duration, and/or contact quality are factored into a simulation (e.g., by an ablation physics simulator 1112) linked to how the tissue is displayed in its geometrical and/or material appearances. Optionally, operation of an
ablation physics simulator 1112 includes thermal modeling (thermal simulation), based on local tissue region properties, for example, of local tissue type, thickness, thermal conductivity, and/or thermal exchange (e.g., between tissue and flowing blood). In some embodiments, at least part of the information providing local tissue type and/or thickness is obtained based on dielectric properties calculated from measurements of an alternating electromagnetic field obtained from a sensing electrode 3 at or near the position of the lesion 209. - In some embodiments, calculated dielectric properties are used as indications of lesion state (e.g., size, transmurality, completeness and/or irreversibility), for example as described in International Patent Application No. PCT/IB2016/052690, the contents of which are incorporated by reference herein in their entirety. In in vitro studies, accuracy of transmurality has been found to be about ±1 mm. In prospective in vivo studies, 100% sensitivity and specificity in predicting lesion transmurality was found, while in humans, at least 90% specificity and sensitivity was found. Specificity is the percentage of actually well-ablated areas that were dielectrically identified as well-ablated; sensitivity is the percentage of actually partially ablated areas that were dielectrically identified as partially ablated.
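- The thermal-modeling component described above can be illustrated by a one-dimensional finite-difference heat-conduction step through the tissue wall. This is a minimal sketch, not the simulator of the disclosure; the boundary treatment (probe-side node held at a fixed temperature, blood-side node clamped to body temperature as a stand-in for convective cooling) and all parameter values are assumptions:

```python
def thermal_step(temps, alpha, dt, dx, probe_temp, blood_temp):
    """One explicit finite-difference step of 1-D heat conduction through a
    tissue wall.  temps: temperatures (deg C) at equally spaced depths;
    alpha: thermal diffusivity (m^2/s); dt, dx: time and space steps.
    The first node is held at the probe contact temperature, and the last
    node is clamped to blood temperature (simplified convective cooling)."""
    r = alpha * dt / (dx * dx)          # explicit scheme: stable for r <= 0.5
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + r * (temps[i + 1] - 2 * temps[i] + temps[i - 1])
    new[0] = probe_temp
    new[-1] = blood_temp
    return new
```

Stepping this repeatedly yields a depth-temperature profile that could drive the displayed lesion extent; a production simulator would instead use measured tissue properties and a 3-D bioheat formulation.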
- Additionally or alternatively, the progression during lesioning is based on inputs describing sensed data reflecting one or more treatment effects, for example, measured temperature and/or changes in dielectric properties as tissue begins to break down. In general, probe-based temperature sensing, where available, is limited in resolution and/or depth, so that completely sensing-based adjustment may be difficult or impossible to obtain. However, sensed data may nevertheless be used as input to an
ablation physics simulator 1112 that extrapolates lesion state through a 3-D block of tissue. Optionally, the extrapolated state is used as a corrective and/or calibrating input to an ablation physics simulator 1112. - In some embodiments, one or more additional indications of how lesioning is proceeding are provided as part of the rendered image. For example, in
FIG. 2D , “steam” 207 is shown arising from the lesion point. Optionally, this is an indication that temperature has reached (and/or is maintained at) a certain threshold. The threshold may be, for example, a threshold at which lesioning occurs, a threshold above which a danger of effects such as steam pop or charring occurs, or another threshold. Different characteristics of the “steam” could be used, for example, conversion to black (or increasingly black) “smoke” in case of increased danger of excessive heating. In some embodiments of the invention, such steam- and/or smoke-like effects are implemented using a particle system facility provided by a graphical game engine. - Simulation of Tissue “Tenting”
- Reference is now made to
FIGS. 3A, 3D, 3G, and 3J, which schematically represent a sequence of rendered views of a rendered catheter probe 11A (representing a catheter probe 11) passing through a rendered tissue wall region 50, according to some embodiments of the present disclosure. Reference is also made to FIGS. 3B, 3E, 3H, and 3K, each of which schematically represents a graph of position versus time and measured contact versus time for the catheter probe 11 rendered as rendered catheter probe 11A of FIGS. 3A, 3D, 3G, and 3J, according to some embodiments of the present disclosure. Additionally, reference is made to FIGS. 3C, 3F, 3I, and 3L, which schematically represent an ultrasound image at a cross-section of a heart at the atrial level, and corresponding to the sequence of FIGS. 3A, 3D, 3G, and 3J, according to some embodiments of the present disclosure. - In some embodiments of the invention, the geometry of a three-dimensional simulation of a
tissue wall region 50 is updated for displaying at a motion frame rate. The frame updating may be based on information received from one or more sensing modalities. The information may be received as catheter probe 11 interacts with a tissue wall. The two figure series of FIGS. 3B, 3E, 3H, and 3K and FIGS. 3C, 3F, 3I, and 3L represent different examples of sensed inputs related to tissue-catheter probe interactions, based on which (in any suitable combination) the tissue deformations of FIGS. 3A, 3D, 3G, and 3J are simulated. - The sensing modalities optionally comprise modalities that are non-imaging in nature (e.g., catheter probe position tracking data, and/or probe-sensed parameter time-course data), and/or comprise images giving incomplete view coverage of the simulated tissue region (for example, cross-sectional images). New sensing data is optionally acquired faster, slower, or at the same rate as the simulation appearance is updated.
- Simulation and visualization updating is optionally in correspondence with states indicated by recently sensed data. For example, when sampling is slow and/or intermittent, the current simulation state is optionally extrapolated from recent data according to one or more trends therein. Optionally, simulation updating is delayed from the acquisition of real-time data (for example, delayed to a buffer of at least two recent samples, and/or for example, by up to about 250 msec), which optionally allows smoothing interpolation between actually measured sensing data points in exchange for a certain amount of lag.
- The X-axes of
graphs 310 of FIGS. 3B, 3E, 3H, and 3K represent relative time. The Y-axes overlappingly represent sensed catheter probe position advance above a baseline position 311 (dashed lines including points 312 and 314) and sensed contact (dashed lines including points 313 and 315). The position of contacted region 302 of the actual tissue wall portion represented by rendered tissue wall region 50 relative to catheter tip 301 is represented in the graphs by dotted line 309. - In some embodiments of the invention, probe-tissue contacts causing and/or represented by geometrical tissue deformations within the body are measured using one or more sensing modalities (for example, sensing by a force sensor, by sensing of impedance properties, or another sensing modality) that are only partially indicative of the overall geometrical effects of the contact. In some embodiments, the one or more sensing modalities provide information as to the variation over time of a limited number of parameters communicated in the interaction data; for example, one, two, three, or more parameters.
- For example, in some embodiments, sensing information that encodes position of
probe 11 is available. The position of probe 11 may be indicated by the interaction data absolutely and/or relative to the tissue portion represented by rendered tissue region 50. In some embodiments, the sensing information may be indicative of contact quality and/or contact force measured to exist between probe 11 and the tissue portion represented by rendered tissue region 50. In some embodiments, these measurements are used to guide changes made to simulated tissue region 50 and rendered probe 11A, and the model is rendered in turn to a sequence of images that visually simulate geometrical effects associated with the sensed information. - In some embodiments, the simulated model comprises a mechanical model of a tissue wall, including, for example, properties of tissue wall thickness, elasticity, density, velocity, and/or viscosity suitable to the tissue being simulated. Simulation of deformations optionally comprises applying a force commensurate with sensed forces and/or positions. Preferably, simulated geometrical effects are generated to faithfully visualize those effects that are actually occurring. In such embodiments, a mechanical model of the tissue wall is preferably provided with parameter values yielding realistic-looking behavior in reaction to applied simulated force and/or displacement. Graphical game engines commonly expose services for the simulation of physical interactions of scene elements, providing a potential advantage for ease of implementation.
- Optionally or additionally, simulated geometrical effects may convey to an operator information about the contact, even though actual geometrical distortions (e.g., geometrical distortions introduced by touching contact with a probe, which may comprise pressing on tissue by the probe) are potentially different than the simulation shows: e.g., smaller in size, and/or modeled to simply indicate stages in deformation, without quantitative fidelity. In such embodiments, a simulated mechanical model is optionally implemented with parameters giving model behaviors that are potentially different from the actual case. Optionally, the model is implemented more simply; for example, as a mapping of a range of geometrically distorted wall shapes to one or more corresponding ranges of sensed input values.
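- The simpler mapping model just described can be sketched as a lookup from ranges of a sensed input value to a small set of pre-built geometrically distorted wall shapes. The stage names and force breakpoints below are illustrative assumptions, not values from the disclosure:

```python
def wall_shape_for_contact(contact_force_g,
                           stages=((0.0, "resting"),
                                   (5.0, "light_dimple"),
                                   (15.0, "indented"),
                                   (30.0, "deep_tent"))):
    """Map a sensed contact force (grams) to one of a small set of
    pre-built geometrically distorted wall shapes.  This is deliberately
    non-quantitative: it indicates stages in deformation rather than
    reproducing the exact geometry of the real contact."""
    chosen = stages[0][1]
    for threshold, shape in stages:   # stages are sorted by threshold
        if contact_force_g >= threshold:
            chosen = shape
    return chosen
```

Each named shape would correspond to a stored deformed mesh (or blend-shape weight), so the display steps through recognizable deformation stages as the sensed value crosses each range boundary.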
- Additionally or alternatively, in some embodiments, image information at least partially describing geometrical changes is available to the operator. The image information may be spatially incomplete: for example, an ultrasound cross-section that illustrates deformation in a planar cross-section of the tissue wall portion that an intrabody probe is penetrating. In some embodiments, an imaging modality other than ultrasound is used, for example, X-ray fluoroscopy. Preferably, the imaging modality provides images at a rate sufficient to guide manipulation of the
catheter probe 11, but this can optionally be a rate below motion frame rate; for example, at least 2-5 Hz. FIGS. 3C, 3F, 3I, and 3L represent a time sequence of ultrasound images measured from an ultrasound probe located in the lumen of a left atrium 321 (about at the apex of ultrasound images 320), as a probe 11 crosses into the left atrium 321 from a right atrium 322. In the case illustrated, rendered tissue wall region 50 and/or imaged tissue wall portion 50B represent a tissue wall portion comprising an interatrial septum which is to be crossed by a catheter probe 11 at a contact region corresponding to contacted region 302, for example the foramen ovale (which may be a weak spot in the interatrial septum, or even a residual opening between the two atria). Although the ultrasound images 320 do not simultaneously show in imaged tissue wall portion 50B the whole three-dimensional structure of the tissue wall portion represented by rendered tissue wall region 50, they potentially do reveal partial information about how the wall is deforming. In some embodiments, the partial information is used in a simulation of tissue-wall interaction dynamics to show a live-updated 3-D view of the tissue wall. For example, a curve extending through the image plane along the visualized extent of the interatrial septum is optionally used as a guide, to which a simulated tissue wall geometrical distortion in that plane is fit; and moreover, may be used as a boundary condition to which out-of-plane tissue wall geometrical distortions are also constrained. - Turning now to the images in sequence,
FIG. 3A represents a rendered view showing the tip 301 of rendered catheter probe 11A approaching the contacted region 302 of rendered tissue wall region 50. Rendered tissue wall region 50 is shown in cross section; however, it should be understood that in other examples (not drawn) it may be shown from any other appropriate view angle. Optionally or additionally, rendered tissue wall region 50 is shown opaque, transparent, or in any suitable combination of the two. - In
FIG. 3A, the rendered tissue wall region 50 is shown in what is optionally its default and/or resting state geometry: for example, a geometry determined from a segmentation of an earlier MRI and/or CT scan (it should be understood that contact-independent behaviors such as periodic heart contractions are optionally superimposed on a default geometry). In some embodiments, based on the data of FIG. 3B, a simulator is configured to recognize that this non-interacting geometry default should be shown. For example, a contact sensing parameter value 313 optionally indicates that there is no contact force exerted. Additionally or alternatively, the distance between catheter probe position 312 and the expected (optionally, sensed) wall position trace at dotted line 309 indicates that there is not yet any contact. - Additionally or alternatively, the ultrasound image of
FIG. 3C shows no deformation of rendered wall region 50 in the vicinity of target contacted region 302, and/or shows a separation between rendered wall region 50 and rendered catheter probe 11A. Use of 3-D rendering to augment ultrasound imaging of tissue wall deformation (for example, as shown in FIG. 3C) has the potential advantage of converting a relatively abstract-appearing (cross-sectional, black and white, visually noisy) display of ultrasound-imaged anatomical structures into a solid-looking indication of how forces from a catheter are interacting with a heart wall, on the basis of which the penetration operation can be guided. - In the second set in the sequence (
FIGS. 3D-3F), wall contact has begun, as shown (FIG. 3D) by the deformation of the rendered tissue wall region 50 in contact with catheter probe tip 301. Optionally (FIG. 3E), this simulation is generated to track the rising value of sensed contact (e.g., at point 315). Additionally or alternatively, the simulation is generated to track the forward movement of the probe tip 301 to point 314; optionally, the simulation scene is generated to track the forward movement with respect to the expected or measured wall position trace at dotted line 309. Additionally or alternatively, deformation of the imaged tissue wall portion 50B in an ultrasound image (FIG. 3F) is used as a constraint to guide how the rendered tissue wall region 50 is geometrically distorted in 3-D. Optionally, contact between imaged tissue wall portion 50B and catheter probe 11 is determined and/or verified from the ultrasound image as well. - In the third set in the sequence (
FIGS. 3G-3I), deformation has reached a maximum before catheter probe 11 breaks through the rendered tissue wall region 50 at contacted region 302 (foramen ovale). In the fourth set in the sequence (FIGS. 3J-3L), rendered catheter probe 11A is shown having broken through the rendered tissue wall region 50. From the sensing data of FIG. 3K, the breakthrough is optionally inferred by the sudden drop in sensed contact, optionally in concert with the continued advance of the catheter probe 11. Additionally or alternatively, the breakthrough is inferred from the sudden increase in distance between the catheter probe 11 and the actual tissue wall (inferred, for example, from a sudden change in the dielectric environment of an electrode associated with probe tip 301). In the ultrasound image of FIG. 3L, the breakthrough is optionally inferred from a relaxation of the geometrical distortion of imaged tissue wall portion 50B, and/or by the observation of a portion of catheter probe 11 extending on the other side of the imaged tissue wall portion 50B. - Contact Simulation—Example of Simulation
- Reference is now made to
FIGS. 10C-10D, which schematically represent aspects of geometrical deformation of a rendered tissue region 50 in contact with a rendered catheter probe 11A, according to some embodiments of the present disclosure. In some embodiments of the invention, displayed interactions of a rendered catheter probe 11A with a rendered tissue wall region 50 include geometrical effects which look like deformations of the tissue wall that visually convey the forces of their interaction. - Full geometrical deformation, including mesh deformation, is described herein in relation to the examples of
FIGS. 2A-2E and 3A-3L. In FIGS. 10C-10D, a different mode of indentation is shown, wherein relatively limited (and, potentially, computationally less expensive) geometrical deformation is simulated by the use of one or more rendering techniques such as normal mapping, depth mapping, shadow mapping, depth of field simulation, and the like. - In
FIG. 10C, rendered catheter probe 11A is shown in a sequence of positions relative to the rendered surface 1010 of a rendered tissue region 50 (optionally, rendered surface 1010 is rendered with the use of any suitable MAPs to provide it with a tissue-like visual appearance). Apart from the obvious lateral displacement, each position is rendered identically in the region of catheter probe tip 301. - In
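the spirit of the normal-map manipulation described next, an indentation appearance can be produced by tilting per-pixel surface normals over a flat patch while leaving the 3-D geometry untouched. The following is an illustrative sketch only (the dent profile and all parameters are assumptions, not the implementation of the disclosure):

```python
import math

def dent_normal(x, y, cx, cy, radius, depth):
    """Unit surface normal for a flat patch carrying a radial indentation
    in its normal map only; the underlying geometry stays flat.  Within
    `radius` of the contact point (cx, cy), the normal is tilted toward
    the dent center so shading suggests a depression; elsewhere the flat
    normal (0, 0, 1) is returned."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if r >= radius or r == 0.0:
        return (0.0, 0.0, 1.0)
    # slope of a smooth bowl-shaped profile, steepest near mid-radius
    tilt = depth * math.sin(math.pi * r / radius)
    nx, ny, nz = -tilt * dx / r, -tilt * dy / r, 1.0
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / norm, ny / norm, nz / norm)
```

Because only the shading normals change, a lighting pass produces the visual impression of an indentation at negligible cost compared with re-meshing the tissue geometry.
- In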
FIG. 10D, all the elements of FIG. 10C and their relative positions remain the same, but there is shown in addition the effects of manipulation of the surface normal map in region 1021 and indentation region 1022, assuming a light source that is to the left and somewhat behind the plane of the drawing (normal mapping is described in relation to FIGS. 9A-9B). The normal map manipulations have been chosen to give the appearance of geometrical changes—specifically, to indicate indentations in rendered surface 1010. In some embodiments of the invention, this geometrical appearance change is optionally triggered by any suitable input related to probe-tissue contact, for example, contact force measurements, dielectric contact quality measurements, and/or relative position measurements of tissue and probe. Optionally, the normal map is also adjusted to reflect contact angle, for example, stretched along a dimension of elongated contact. Since no change in the underlying 3-D object geometry is required in order to produce this effect, there is a potential advantage for computational efficiency and/or reduced complexity of implementation compared to manipulation of the full 3-D geometry. - The normal-mapped mode of representing geometrical deformation is of potential use to an operator for helping to gauge contact quality before lesioning, particularly in views having a substantial elevation angle above the contacted surface. Optionally, views using normal mapping-type indentation are presented alongside views where 3-D geometrical distortion is used (for example, in cross-section, as discussed in relation to
FIGS. 2A-2E ). Optionally, normal mapping is used to exaggerate 3-D geometrical deformation, for example, to potentially increase emphasis and/or clarity. - Physiological Simulation—Example of Simulation
- Reference is now made to
FIGS. 4A-4D, which schematically represent aspects of geometrical deformation of a rendered tissue region 50 due to an internal change such as edema, according to some embodiments of the present disclosure. Reference is also made to FIGS. 10A-10B, which illustrate normal mapping superimposed on a rendered tissue region 50 in order to provide the geometrical appearance of a swelling, according to some embodiments of the present disclosure. Further reference is made to FIGS. 5A-5B, which schematically represent global geometrical deformation of a tissue structure, for example, due to hydration state and/or more global edema than the example of FIGS. 4A-4D, according to some embodiments of the present disclosure. - In
FIG. 4A, lesion 401 represents a recently formed lesion, for example, an RF ablation lesion. Over the course of a few minutes after RF ablation, tissue potentially reacts with a swelling response. In some embodiments of the invention, the swelling response is simulated (for example, as a function of time according to the method of FIG. 1B, and/or based on measurements such as dielectric measurements that provide edema data) by one or both of increasing thickness in a region 403 surrounding lesion 401 (thickness changes can also be seen in the changing thickness of region 411 between FIGS. 4B-4D; comparison also can be made to the baseline surface boundary 50A), and a change in color and/or texture in region 402 (represented by the partial rings in the drawing). -
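The time course of the swelling response described above can be sketched as a saturating function of minutes elapsed since ablation, used to drive the rendered thickness of the region around the lesion. The curve shape and parameter values are assumptions; the text says only that swelling develops over a few minutes:

```python
import math

def rendered_thickness_mm(baseline_mm, minutes_since_ablation,
                          max_swell_mm=1.5, tau_minutes=3.0):
    """Rendered wall thickness near a fresh lesion: the baseline thickness
    plus a swelling term that rises toward max_swell_mm with time constant
    tau_minutes (a saturating exponential, chosen for illustration)."""
    swell = max_swell_mm * (1.0 - math.exp(-minutes_since_ablation / tau_minutes))
    return baseline_mm + swell
```

Re-evaluating this each frame (or whenever edema-indicating measurements arrive) lets the rendered swelling develop gradually rather than jumping between discrete states.
-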
FIGS. 10A-10B illustrate how normal mapping can be used to potentially enhance the appearance of changes in a tissue, for example as a result of treatment and/or injury. Lesion 401 again indicates a recently formed lesion. In FIG. 10A, a surface is rendered as combination image 1000 by combining baseline surface texture 1006 with an injury response overlay 1002. In the combination image 1000 (in the example shown, the method of combination is partial transparency overlaying; optionally, another method of combining within a rendering pipeline 1230 is chosen) the injury response is detectable, but not clearly delineated. FIG. 10B adds to this an overlay 1003 generated from a normal map (assuming a light source to the left of the page) that describes a swelling in the region of the injury response. By changing the geometrical appearance of the tissue (though not necessarily the 3-D tissue geometry data itself), the injured region is potentially emphasized in the resulting view. It is to be understood that the 3-D geometry swelling of FIGS. 4A-4D is optionally combined with the normal mapping of FIGS. 10A-10B. - In
FIGS. 5A-5B, generalized tissue thickening is represented by the change in tissue dimension between baseline thickness 420A and swollen thickness 420B. The thickening is optionally derived from measurements and/or extrapolation, for example, according to one or more of the methods of FIGS. 1A-1B. Optionally, other changes are also made to represent tissue changes. As can be seen from the cross-sectional borders of rendered tissue region 50, the 3-D geometry of rendered tissue region 50 is optionally smoothed out with increasing swelling. Additionally or alternatively, normal mapping across the extent of the rendered surfaces is optionally adjusted: features of texture surface 421A are optionally smoothed and/or stretched, for example to indicate a tauter appearance as at texture surface 421B. - Example of Probe-Determined Camera Perspective
- Reference is now made to
FIG. 11A, which schematically illustrates a rendered image 1150 rendered from a camera viewpoint 1154 looking at rendered tissue region 50 along an axis 1156 parallel to a rendered catheter probe 11A, according to some embodiments of the present disclosure. Reference is also made to FIG. 11B, which schematically illustrates a field of view 1152 projected from camera viewpoint 1154, including indication of axis 1156, according to some embodiments of the present disclosure. Indentation region 1022 indicates a region of touching contact between probe 11 and rendered tissue region 50. FIG. 11A and FIG. 11B comprise views looking onto the same simulation scene. - In some embodiments, a
camera viewpoint 1154 is defined (e.g., as part of the definition of a camera 1223, FIG. 8) to be positioned on or near the body of a catheter probe 11, and looking along an axis 1156 which is substantially parallel to the rendered catheter probe 11A (termed a “probe-mounted” view herein). Insofar as the system tracks (using measured position) the location and orientation of the actual catheter probe 11 which the rendered orientation of rendered catheter probe 11A simulates, camera viewpoint 1154 also tracks (by adjustment to match the orientation of the rendered catheter probe 11A) the orientation of the actual catheter probe 11. - It may be noted that rendered
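views of this kind are driven entirely by the tracked probe pose. As an illustrative sketch (the function name and setback distance are assumptions, not part of the disclosure), a probe-mounted camera can be placed a small distance behind the tip along the normalized probe axis, looking forward along that same axis:

```python
import math

def probe_mounted_camera(tip_pos, probe_axis, setback=5.0):
    """Camera pose for a probe-mounted view: the eye sits `setback` scene
    units behind the probe tip along the probe's (normalized) axis, and
    the look-at target lies ahead of the tip on that same axis, so the
    camera follows the tracked position and orientation of the probe."""
    mag = math.sqrt(sum(c * c for c in probe_axis))
    d = tuple(c / mag for c in probe_axis)                  # unit view axis
    eye = tuple(p - setback * c for p, c in zip(tip_pos, d))
    look_at = tuple(p + c for p, c in zip(tip_pos, d))      # one unit ahead
    return eye, look_at
```

Re-deriving this pose every frame from the position-tracking data keeps the simulated camera locked to the probe as the probe moves and rotates.
- It may be noted that rendered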
catheter probe 11A appears in rendered image 1150 in a position similar to the position of hand-held tools seen in some “first-person” games, wherein a tool is shown on the screen in a position as if held before an otherwise unseen avatar whose eyes define the camera position. In some embodiments of the present invention, this viewpoint configuration provides a potential advantage for obtaining a clear view of the field of operation of the probe, e.g., when it contacts tissue. - Optionally, registration between the probe and the viewpoint may comprise any other suitable combination of position and orientation. For example, looking back along a catheter is potentially useful for obtaining a sense of what freedom exists in how the catheter probe can be presently positioned. Looking at the catheter itself from a more distant position potentially provides an improved sense of how the catheter relates to its overall surroundings. In some embodiments, viewpoint optionally shifts (automatically and/or under manual control) depending on what action is being performed; for example, a probe-mounted view like that of
FIG. 11A is optionally used for selection of where a probe should be advanced to contact tissue, while a vantage point more distant from the probe may be selected to show details of how probe and tissue interact once contact is made (for example, as shown in the sequence of FIGS. 3A, 3D, 3G, and 3J). In some embodiments, the angular size of the field of view (area subtended within the frame of the rendered image) is selected to be larger or smaller. A larger angular size provides a potential relative advantage in helping an operator orient within a simulated environment, while a smaller angular size is optionally used to magnify details and/or reduce simulated optical distortion in the rendered view. - General
- It is expected that during the life of a patent maturing from this application many relevant catheter probes will be developed; the scope of the term catheter probe is intended to include all such new technologies a priori.
- As used herein with reference to quantity or value, the term “about” means “within ±10% of”.
- The terms “comprises”, “comprising”, “includes”, “including”, “having” and their conjugates mean: “including but not limited to”.
- The term “consisting of” means: “including and limited to”.
- The term “consisting essentially of” means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
- As used herein, the singular form “a”, “an” and “the” include plural references unless the context clearly dictates otherwise. For example, the term “a compound” or “at least one compound” may include a plurality of compounds, including mixtures thereof.
- The words “example” and “exemplary” are used herein to mean “serving as an example, instance or illustration”. Any embodiment described as an “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
- The word “optionally” is used herein to mean “is provided in some embodiments and not provided in other embodiments”. Any particular embodiment of the invention may include a plurality of “optional” features except insofar as such features conflict.
- As used herein the term “method” refers to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the chemical, pharmacological, biological, biochemical and medical arts.
- As used herein, the term “treating” includes abrogating, substantially inhibiting, slowing or reversing the progression of a condition, substantially ameliorating clinical or aesthetical symptoms of a condition or substantially preventing the appearance of clinical or aesthetical symptoms of a condition.
- Throughout this application, embodiments of this invention may be presented with reference to a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as “from 1 to 6” should be considered to have specifically disclosed subranges such as “from 1 to 3”, “from 1 to 4”, “from 1 to 5”, “from 2 to 4”, “from 2 to 6”, “from 3 to 6”, etc.; as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
- Whenever a numerical range is indicated herein (for example “10-15”, “10 to 15”, or any pair of numbers linked by another such range indication), it is meant to include any number (fractional or integral) within the indicated range limits, including the range limits, unless the context clearly dictates otherwise. The phrases “range/ranging/ranges between” a first indicated number and a second indicated number and “range/ranging/ranges from” a first indicated number “to”, “up to”, “until” or “through” (or another such range-indicating term) a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numbers therebetween.
- Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
- It is the intent of the Applicant(s) that all publications, patents and patent applications referred to in this specification are to be incorporated in their entirety by reference into the specification, as if each individual publication, patent or patent application was specifically and individually noted when referenced that it is to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting. In addition, any priority document(s) of this application is/are hereby incorporated herein by reference in its/their entirety.
- It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Claims (20)
1. A method of visually displaying effects of a medical procedure, comprising:
receiving interaction data from an intrabody probe indicating touching contacts between the intrabody probe and a body tissue region, wherein the interaction data at least associate the contacts to contacted positions of the body tissue region;
adjusting geometrical rendering data representing a shape of the body tissue region to obtain adjusted geometrical rendering data, wherein the adjusting is based on an indication in the interaction data of a change in the shape of the body tissue region due to the contacting;
rendering the adjusted geometrical rendering data to a rendered image; and
displaying the rendered image;
wherein the geometrical rendering data are adjusted as a function of time since occurrence of an indicated contact.
2. The method of claim 1, wherein the receiving, the adjusting, and the displaying are performed iteratively for a sequence of contacts for which interaction data is received.
3. The method of claim 1, wherein the adjusting is as a function of time relative to a time of occurrence of at least one of the indicated contacts, and comprises adjusting the geometrical rendering data to indicate gradual development of a change in geometry of the body tissue region as a result of the contacts.
4. The method of claim 3, wherein the gradually developed change in geometry indicates a developing state of edema.
5. The method of claim 4, comprising geometrically distorting the rendering of the geometrical rendering data into a swollen appearance, to an extent based on the indicated development of the state of edema.
6. The method of claim 3, wherein the contacts comprise mechanical contacts, and the gradual development of a change in geometry indicates swelling of the body tissue region in response to tissue irritation by the mechanical contacts.
7. The method of claim 3, wherein the contacts comprise an exchange of energy between the intrabody probe and the body tissue region by a mechanism other than contact pressure.
8. The method of claim 1, wherein the extent and degree of the adjusting model a change in a thickness of the body tissue region.
9. The method of claim 1, wherein the interaction data describe an exchange of energy between the intrabody probe and the body tissue region by a mechanism other than contact pressure.
10. The method of claim 9, wherein the adjusting comprises updating the geometrical rendering data based on a history of interaction data describing the exchange of energy.
11. The method of claim 10, wherein the exchange of energy comprises operation of an ablation modality.
12. The method of claim 11, wherein the updating changes an indication of lesion extent in the geometrical rendering data based on the history of interaction data describing the exchange of energy by operation of the ablation modality.
13. The method of claim 11, wherein the updating comprises adjusting the geometrical rendering data to indicate a change in mechanical tissue properties, based on the history of interaction data describing the exchange of energy.
14. The method of claim 11, wherein the ablation energy exchanged between the intrabody probe and the body tissue region comprises at least one of the group consisting of: radio frequency ablation, cryoablation, microwave ablation, laser ablation, irreversible electroporation, substance injection ablation, and high-intensity focused ultrasound ablation.
15. The method of claim 10, wherein the updating comprises adjusting the geometrical rendering data to indicate a change in tissue thickness, based on the history of interaction data describing the exchange of energy.
16. The method of claim 10, wherein effects of the history of interaction data describing the exchange of energy are determined from modelling of thermal effects of the exchange of energy on the body tissue region.
17. The method of claim 16, wherein the modelling of thermal effects accounts for local tissue region properties affecting transfer of thermal energy between the intrabody probe and the body tissue region.
18. The method of claim 9, wherein the exchange of energy between the intrabody probe and the body tissue region induces edema, and the adjusting comprises adjusting the geometrical rendering data to indicate the edema.
19. The method of claim 1, wherein the body tissue region comprises a tissue of at least one organ of the group consisting of the heart, vasculature, stomach, intestines, liver and kidney.
20. The method of claim 1, further comprising assigning material appearance properties across an extent of the geometrical rendering data, based on the interaction data; and wherein the displaying of the rendered image uses the assigned material appearance properties.
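The time-dependent geometry adjustment recited in the claims (adjusting rendering data as a function of time since a contact, with gradual swelling suggesting developing edema) can be illustrated by a minimal sketch. All names, constants, and the exponential swelling model below are illustrative assumptions, not the patented implementation:

```python
import math

# Assumed, illustrative parameters (not from the patent):
EDEMA_TIME_CONSTANT_S = 120.0   # time scale for swelling to develop
MAX_SWELL_MM = 1.5              # maximum outward vertex displacement
INFLUENCE_RADIUS_MM = 6.0       # spatial falloff around a contact point


def swell_factor(elapsed_s: float) -> float:
    """Fraction of full swelling reached elapsed_s seconds after a contact."""
    return 1.0 - math.exp(-elapsed_s / EDEMA_TIME_CONSTANT_S)


def adjust_geometry(vertices, normals, contacts, now_s):
    """Displace mesh vertices along their normals by accumulated
    contact-induced swelling, to an extent that grows with time
    since each indicated contact.

    vertices, normals: lists of (x, y, z) tuples
    contacts: interaction-data records as (position_xyz, time_s) pairs
    """
    adjusted = []
    for v, n in zip(vertices, normals):
        offset_mm = 0.0
        for c_pos, c_time in contacts:
            d = math.dist(v, c_pos)
            if d < INFLUENCE_RADIUS_MM:
                spatial = 1.0 - d / INFLUENCE_RADIUS_MM  # linear falloff
                offset_mm += MAX_SWELL_MM * spatial * swell_factor(now_s - c_time)
        offset_mm = min(offset_mm, MAX_SWELL_MM)          # clamp total swelling
        adjusted.append(tuple(vi + offset_mm * ni for vi, ni in zip(v, n)))
    return adjusted
```

Re-running `adjust_geometry` on each render frame with the growing set of contact records gives the iterative receive-adjust-display loop of claim 2: immediately after a contact the surface is unchanged, and it swells toward the maximum displacement as time elapses.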
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/701,830 US20220211293A1 (en) | 2016-11-16 | 2022-03-23 | Real-time display of tissue deformation by interactions with an intra-body probe |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662422708P | 2016-11-16 | 2016-11-16 | |
US201662422713P | 2016-11-16 | 2016-11-16 | |
US201662422705P | 2016-11-16 | 2016-11-16 | |
PCT/IB2017/057175 WO2018092062A1 (en) | 2016-11-16 | 2017-11-16 | Real-time display of tissue deformation by interactions with an intra-body probe |
US201916349646A | 2019-05-14 | 2019-05-14 | |
US17/701,830 US20220211293A1 (en) | 2016-11-16 | 2022-03-23 | Real-time display of tissue deformation by interactions with an intra-body probe |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2017/057175 Continuation WO2018092062A1 (en) | 2016-11-16 | 2017-11-16 | Real-time display of tissue deformation by interactions with an intra-body probe |
US16/349,646 Continuation US11284813B2 (en) | 2016-11-16 | 2017-11-16 | Real-time display of tissue deformation by interactions with an intra-body probe |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220211293A1 true US20220211293A1 (en) | 2022-07-07 |
Family
ID=67777010
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/349,646 Active 2038-08-27 US11284813B2 (en) | 2016-11-16 | 2017-11-16 | Real-time display of tissue deformation by interactions with an intra-body probe |
US17/701,830 Abandoned US20220211293A1 (en) | 2016-11-16 | 2022-03-23 | Real-time display of tissue deformation by interactions with an intra-body probe |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/349,646 Active 2038-08-27 US11284813B2 (en) | 2016-11-16 | 2017-11-16 | Real-time display of tissue deformation by interactions with an intra-body probe |
Country Status (2)
Country | Link |
---|---|
US (2) | US11284813B2 (en) |
WO (1) | WO2018092062A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016181320A1 (en) | 2015-05-12 | 2016-11-17 | Navix International Limited | Fiducial marking for image-electromagnetic field registration |
EP3484362A1 (en) | 2016-07-14 | 2019-05-22 | Navix International Limited | Characteristic track catheter navigation |
US11284813B2 (en) | 2016-11-16 | 2022-03-29 | Navix International Limited | Real-time display of tissue deformation by interactions with an intra-body probe |
CN110072449B (en) | 2016-11-16 | 2023-02-24 | 纳维斯国际有限公司 | Esophageal position detection by electrical mapping |
EP3541313B1 (en) | 2016-11-16 | 2023-05-10 | Navix International Limited | Estimators for ablation effectiveness |
US10709507B2 (en) * | 2016-11-16 | 2020-07-14 | Navix International Limited | Real-time display of treatment-related tissue changes using virtual material |
WO2018092059A1 (en) | 2016-11-16 | 2018-05-24 | Navix International Limited | Tissue model dynamic visual rendering |
US11043144B2 (en) * | 2017-08-04 | 2021-06-22 | Clarius Mobile Health Corp. | Systems and methods for providing an interactive demonstration of an ultrasound user interface |
KR102216455B1 (en) * | 2019-02-01 | 2021-02-17 | 연세대학교 산학협력단 | Apparatus and method for measuring left atrium thickness of heart |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120310116A1 (en) * | 2011-06-03 | 2012-12-06 | Doron Moshe Ludwin | Detection of tenting |
Family Cites Families (166)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4917097A (en) | 1987-10-27 | 1990-04-17 | Endosonics Corporation | Apparatus and method for imaging small cavities |
US5662108A (en) | 1992-09-23 | 1997-09-02 | Endocardial Solutions, Inc. | Electrophysiology mapping system |
US5553611A (en) | 1994-01-06 | 1996-09-10 | Endocardial Solutions, Inc. | Endocardial measurement method |
US7189208B1 (en) | 1992-09-23 | 2007-03-13 | Endocardial Solutions, Inc. | Method for measuring heart electrophysiology |
CA2144973C (en) | 1992-09-23 | 2010-02-09 | Graydon Ernest Beatty | Endocardial mapping system |
USRE41334E1 (en) | 1992-09-23 | 2010-05-11 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Endocardial mapping system |
US6947785B1 (en) | 1993-09-23 | 2005-09-20 | Endocardial Solutions, Inc. | Interface system for endocardial mapping catheter |
US5598848A (en) | 1994-03-31 | 1997-02-04 | Ep Technologies, Inc. | Systems and methods for positioning multiple electrode structures in electrical contact with the myocardium |
US6322558B1 (en) | 1995-06-09 | 2001-11-27 | Engineering & Research Associates, Inc. | Apparatus and method for predicting ablation depth |
US5697377A (en) | 1995-11-22 | 1997-12-16 | Medtronic, Inc. | Catheter mapping system and method |
IL119262A0 (en) | 1996-02-15 | 1996-12-05 | Biosense Israel Ltd | Locatable biopsy needle |
SE9602574D0 (en) | 1996-06-28 | 1996-06-28 | Siemens Elema Ab | Method and arrangement for locating a measurement and / or treatment catheter in a vessel or organ of a patient |
JP2000514331A (en) | 1996-07-05 | 2000-10-31 | ザ・カロライナス・ハート・インスティテュート | Electromagnetic Imaging and Therapy (EMIT) System |
ES2231887T3 (en) | 1996-09-17 | 2005-05-16 | Biosense Webster, Inc. | SYSTEM TO CONFIRM THE POSITION WITH LEARNING AND TEST FUNCTIONS. |
US5724978A (en) | 1996-09-20 | 1998-03-10 | Cardiovascular Imaging Systems, Inc. | Enhanced accuracy of three-dimensional intraluminal ultrasound (ILUS) image reconstruction |
US6019725A (en) | 1997-03-07 | 2000-02-01 | Sonometrics Corporation | Three-dimensional tracking and imaging system |
GB2329709B (en) | 1997-09-26 | 2001-12-19 | Roke Manor Research | Catheter localisation system |
US7187973B2 (en) | 1998-06-30 | 2007-03-06 | Endocardial Solutions, Inc. | Congestive heart failure pacing optimization method and device |
US6226542B1 (en) | 1998-07-24 | 2001-05-01 | Biosense, Inc. | Three-dimensional reconstruction of intrabody organs |
US6301496B1 (en) | 1998-07-24 | 2001-10-09 | Biosense, Inc. | Vector mapping of three-dimensionally reconstructed intrabody organs and method of display |
US20030074011A1 (en) | 1998-09-24 | 2003-04-17 | Super Dimension Ltd. | System and method of recording and displaying in context of an image a location of at least one point-of-interest in a body during an intra-body medical procedure |
US6423057B1 (en) | 1999-01-25 | 2002-07-23 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Method and apparatus for monitoring and controlling tissue temperature and lesion formation in radio-frequency ablation procedures |
DE19919907C2 (en) | 1999-04-30 | 2003-10-16 | Siemens Ag | Method and device for catheter navigation in three-dimensional vascular tree images |
US6696844B2 (en) | 1999-06-04 | 2004-02-24 | Engineering & Research Associates, Inc. | Apparatus and method for real time determination of materials' electrical properties |
US6515657B1 (en) | 2000-02-11 | 2003-02-04 | Claudio I. Zanelli | Ultrasonic imager |
US7146210B2 (en) | 2000-02-17 | 2006-12-05 | Standen Ltd. | Apparatus and method for optimizing tumor treatment efficiency by electric fields |
JP2001340336A (en) | 2000-06-01 | 2001-12-11 | Toshiba Medical System Co Ltd | Ultrasonic diagnosing device and ultrasonic diagnosing method |
US20080125775A1 (en) | 2001-02-28 | 2008-05-29 | Morris David L | Hemostasis and/or coagulation of tissue |
US6989010B2 (en) | 2001-04-26 | 2006-01-24 | Medtronic, Inc. | Ablation system and method of use |
JP3996359B2 (en) | 2001-07-12 | 2007-10-24 | 株式会社日立メディコ | Magnetic resonance imaging system |
US7894877B2 (en) | 2002-05-17 | 2011-02-22 | Case Western Reserve University | System and method for adjusting image parameters based on device tracking |
WO2003097125A2 (en) | 2002-05-17 | 2003-11-27 | Case Western Reserve University | Double contrast technique for mri-guided vascular interventions |
US6780182B2 (en) | 2002-05-23 | 2004-08-24 | Adiana, Inc. | Catheter placement detection system and operator interface |
US7001383B2 (en) | 2002-10-21 | 2006-02-21 | Biosense, Inc. | Real-time monitoring and mapping of ablation lesion formation in the heart |
US7306593B2 (en) | 2002-10-21 | 2007-12-11 | Biosense, Inc. | Prediction and assessment of ablation of cardiac tissue |
US7881769B2 (en) | 2002-11-18 | 2011-02-01 | Mediguide Ltd. | Method and system for mounting an MPS sensor on a catheter |
US7697972B2 (en) | 2002-11-19 | 2010-04-13 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
DE60321836D1 (en) | 2002-11-27 | 2008-08-07 | Medical Device Innovations Ltd | Tissue ablation device |
US20040220461A1 (en) | 2003-04-29 | 2004-11-04 | Yitzhack Schwartz | Transseptal facilitation using sheath with electrode arrangement |
US20050054913A1 (en) | 2003-05-05 | 2005-03-10 | Duerk Jeffrey L. | Adaptive tracking and MRI-guided catheter and stent placement |
DE10325003A1 (en) | 2003-06-03 | 2004-12-30 | Siemens Ag | Visualization of 2D / 3D-merged image data for catheter angiography |
JP2007501069A (en) | 2003-08-04 | 2007-01-25 | シーメンス コーポレイト リサーチ インコーポレイテツド | Virtual organ expansion processing method for visualization |
US8150495B2 (en) | 2003-08-11 | 2012-04-03 | Veran Medical Technologies, Inc. | Bodily sealants and methods and apparatus for image-guided delivery of same |
US20050054918A1 (en) | 2003-09-04 | 2005-03-10 | Sra Jasbir S. | Method and system for treatment of atrial fibrillation and other cardiac arrhythmias |
CA2505464C (en) | 2004-04-28 | 2013-12-10 | Sunnybrook And Women's College Health Sciences Centre | Catheter tracking with phase information |
US8446473B2 (en) | 2004-10-05 | 2013-05-21 | Brainlab Ag | Tracking system with scattering effect utilization, in particular with star effect and/or cross effect utilization |
US8423125B2 (en) | 2004-11-09 | 2013-04-16 | Spectrum Dynamics Llc | Radioimaging |
US7684850B2 (en) | 2005-01-07 | 2010-03-23 | Biosense Webster, Inc. | Reference catheter for impedance calibration |
EP1853162B9 (en) | 2005-03-03 | 2011-08-31 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Apparatus for locating the fossa ovalis, creating a virtual fossa ovalis and performing transseptal puncture |
WO2006120982A1 (en) | 2005-05-06 | 2006-11-16 | National University Corporation Nagoya University | Catheter surgery simulation |
US7681579B2 (en) | 2005-08-02 | 2010-03-23 | Biosense Webster, Inc. | Guided procedures for treating atrial fibrillation |
US20070049915A1 (en) | 2005-08-26 | 2007-03-01 | Dieter Haemmerich | Method and Devices for Cardiac Radiofrequency Catheter Ablation |
DE102005042329A1 (en) | 2005-09-06 | 2007-03-08 | Siemens Ag | Electro-physiological catheter application assistance providing method, involves detecting contour of areas relevant for catheter application, and showing areas as simple line in representations of mapping and/or image data |
US8355801B2 (en) | 2005-09-26 | 2013-01-15 | Biosense Webster, Inc. | System and method for measuring esophagus proximity |
US9492226B2 (en) | 2005-12-06 | 2016-11-15 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Graphical user interface for real-time RF lesion depth display |
US8403925B2 (en) | 2006-12-06 | 2013-03-26 | St. Jude Medical, Atrial Fibrillation Division, Inc. | System and method for assessing lesions in tissue |
CA2631940C (en) | 2005-12-06 | 2016-06-21 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Assessment of electrode coupling for tissue ablation |
US8050739B2 (en) | 2005-12-15 | 2011-11-01 | Koninklijke Philips Electronics N.V. | System and method for visualizing heart morphology during electrophysiology mapping and treatment |
US8457712B2 (en) | 2005-12-30 | 2013-06-04 | Wisconsin Alumni Research Foundation | Multi-mode medical device system and methods of manufacturing and using same |
US9782229B2 (en) | 2007-02-16 | 2017-10-10 | Globus Medical, Inc. | Surgical robot platform |
US8556888B2 (en) | 2006-08-04 | 2013-10-15 | INTIO, Inc. | Methods and apparatuses for performing and monitoring thermal ablation |
US7996060B2 (en) | 2006-10-09 | 2011-08-09 | Biosense Webster, Inc. | Apparatus, method, and computer software product for registration of images of an organ using anatomical features outside the organ |
EP2086384A2 (en) | 2006-10-10 | 2009-08-12 | Biosense Webster, Inc. | Esophageal mapping catheter |
WO2008048780A1 (en) | 2006-10-16 | 2008-04-24 | Massachusetts Institute Of Technology | Method and apparatus for localizing an object in the body |
AU2007350982A1 (en) | 2006-11-10 | 2008-10-23 | Dorian Averbuch | Adaptive navigation technique for navigating a catheter through a body channel or cavity |
US8532742B2 (en) | 2006-11-15 | 2013-09-10 | Wisconsin Alumni Research Foundation | System and method for simultaneous 3DPR device tracking and imaging under MR-guidance for therapeutic endovascular interventions |
US8473030B2 (en) | 2007-01-12 | 2013-06-25 | Medtronic Vascular, Inc. | Vessel position and configuration imaging apparatus and methods |
US20080183070A1 (en) | 2007-01-29 | 2008-07-31 | Wisconsin Alumni Research Foundation | Multi-mode medical device system with thermal ablation capability and methods of using same |
US20080190438A1 (en) | 2007-02-08 | 2008-08-14 | Doron Harlev | Impedance registration and catheter tracking |
US20100063387A1 (en) | 2007-02-26 | 2010-03-11 | Koninklijke Philips Electronics N.V. | Pointing device for medical imaging |
US20080208039A1 (en) | 2007-02-28 | 2008-08-28 | Wisconsin Alumni Research Foundation | System and method of performing therapeutic endovascular interventions |
US10433929B2 (en) | 2007-03-09 | 2019-10-08 | St. Jude Medical, Atrial Fibrillation Division, Inc. | System and method for local deformable registration of a catheter navigation system to image data or a model |
US20080275440A1 (en) | 2007-05-03 | 2008-11-06 | Medtronic, Inc. | Post-ablation verification of lesion size |
US8160690B2 (en) | 2007-06-14 | 2012-04-17 | Hansen Medical, Inc. | System and method for determining electrode-tissue contact based on amplitude modulation of sensed signal |
JP5523681B2 (en) | 2007-07-05 | 2014-06-18 | 株式会社東芝 | Medical image processing device |
US8562602B2 (en) | 2007-09-14 | 2013-10-22 | Lazure Technologies, Llc | Multi-layer electrode ablation probe and related methods |
WO2009065140A1 (en) | 2007-11-16 | 2009-05-22 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Device and method for real-time lesion estimation during ablation |
US8320711B2 (en) | 2007-12-05 | 2012-11-27 | Biosense Webster, Inc. | Anatomical modeling from a 3-D image and a surface mapping |
WO2009079602A1 (en) | 2007-12-17 | 2009-06-25 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Systems and methods for modeling both unobstructed and obstructed portions of a catheter |
US8103327B2 (en) | 2007-12-28 | 2012-01-24 | Rhythmia Medical, Inc. | Cardiac mapping catheter |
US20090306643A1 (en) | 2008-02-25 | 2009-12-10 | Carlo Pappone | Method and apparatus for delivery and detection of transmural cardiac ablation lesions |
US20090221908A1 (en) | 2008-03-01 | 2009-09-03 | Neil David Glossop | System and Method for Alignment of Instrumentation in Image-Guided Intervention |
US8300047B2 (en) | 2008-03-10 | 2012-10-30 | Siemens Aktiengesellschaft | System and method for colon unfolding via skeletal subspace deformation |
US8532734B2 (en) | 2008-04-18 | 2013-09-10 | Regents Of The University Of Minnesota | Method and apparatus for mapping a structure |
US9198733B2 (en) | 2008-04-29 | 2015-12-01 | Virginia Tech Intellectual Properties, Inc. | Treatment planning for electroporation-based therapies |
US10238447B2 (en) | 2008-04-29 | 2019-03-26 | Virginia Tech Intellectual Properties, Inc. | System and method for ablating a tissue site by electroporation with real-time monitoring of treatment progress |
US9283051B2 (en) | 2008-04-29 | 2016-03-15 | Virginia Tech Intellectual Properties, Inc. | System and method for estimating a treatment volume for administering electrical-energy based therapies |
US20090275828A1 (en) | 2008-05-01 | 2009-11-05 | Magnetecs, Inc. | Method and apparatus for creating a high resolution map of the electrical and mechanical properties of the heart |
US20100063400A1 (en) | 2008-09-05 | 2010-03-11 | Anne Lindsay Hall | Method and apparatus for catheter guidance using a combination of ultrasound and x-ray imaging |
US8137343B2 (en) | 2008-10-27 | 2012-03-20 | Rhythmia Medical, Inc. | Tracking system using field mapping |
WO2010065786A1 (en) | 2008-12-03 | 2010-06-10 | St. Jude Medical, Atrial Fibrillation Division, Inc. | System and method for determining the position of the tip of a medical catheter within the body of a patient |
GB0904194D0 (en) | 2009-03-11 | 2009-04-22 | Southern Health And Social Care Trust | Apparatus for carrying out intravascular procedures and/or diagnosis |
JP5786108B2 (en) | 2009-05-08 | 2015-09-30 | セント・ジュード・メディカル・ルクセンブルク・ホールディング・エスエーアールエル | Method and apparatus for controlling lesion size in catheter ablation therapy |
EP2427106B1 (en) | 2009-05-08 | 2017-04-26 | Rhythmia Medical, Inc. | Impedance based anatomy generation |
EP2440130A4 (en) | 2009-06-08 | 2015-06-03 | Mri Interventions Inc | Mri-guided surgical systems with proximity alerts |
US8311791B1 (en) | 2009-10-19 | 2012-11-13 | Surgical Theater LLC | Method and system for simulating surgical procedures |
US8454589B2 (en) | 2009-11-20 | 2013-06-04 | St. Jude Medical, Atrial Fibrillation Division, Inc. | System and method for assessing effective delivery of ablation therapy |
US9131869B2 (en) | 2010-05-11 | 2015-09-15 | Rhythmia Medical, Inc. | Tracking using field mapping |
CN102939051A (en) | 2010-06-13 | 2013-02-20 | 安吉奥梅特里克斯公司 | Methods and systems for determining vascular bodily lumen information and guiding medical devices |
WO2012047563A1 (en) | 2010-09-27 | 2012-04-12 | Bailin Steven J | Method for determining the location of regions in tissue relevant to electrical propagation |
US9254090B2 (en) | 2010-10-22 | 2016-02-09 | Intuitive Surgical Operations, Inc. | Tissue contrast imaging systems |
US9039687B2 (en) | 2010-10-28 | 2015-05-26 | Medtronic Ablation Frontiers Llc | Reactance changes to identify and evaluate cryo ablation lesions |
US8532738B2 (en) | 2010-11-04 | 2013-09-10 | Biosense Webster (Israel), Ltd. | Visualization of catheter-tissue contact by map distortion |
US9999399B2 (en) | 2010-11-16 | 2018-06-19 | Siemens Healthcare Gmbh | Method and system for pigtail catheter motion prediction |
EP2656307B1 (en) | 2010-12-20 | 2017-07-05 | Koninklijke Philips N.V. | System and method for automatic generation of initial radiation treatment plans |
EP3482708B1 (en) | 2010-12-27 | 2021-03-10 | St. Jude Medical International Holding S.à r.l. | Prediction of atrial wall electrical reconnection based on contact force measured during RF ablation |
US8708902B2 (en) | 2010-12-30 | 2014-04-29 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Catheter configuration interface and related system |
JP5956463B2 (en) | 2010-12-30 | 2016-07-27 | セント・ジュード・メディカル・エイトリアル・フィブリレーション・ディヴィジョン・インコーポレーテッド | System for analyzing and mapping electrophysiological data from body tissue, method of operating system for analyzing electrophysiological data, and catheter system for analyzing data measured from heart tissue |
US20120172724A1 (en) | 2010-12-31 | 2012-07-05 | Hill Anthony D | Automatic identification of intracardiac devices and structures in an intracardiac echo catheter image |
US9265557B2 (en) | 2011-01-31 | 2016-02-23 | Medtronic Ablation Frontiers Llc | Multi frequency and multi polarity complex impedance measurements to assess ablation lesions |
US10765336B2 (en) | 2011-02-11 | 2020-09-08 | The Johns Hopkins University | System and method for planning a patient-specific cardiac procedure |
US9014423B2 (en) | 2011-03-14 | 2015-04-21 | Siemens Aktiengesellschaft | Method and system for catheter tracking in fluoroscopic images using adaptive discriminant learning and measurement fusion |
WO2013052590A1 (en) | 2011-10-04 | 2013-04-11 | Vessix Vascular, Inc. | Apparatus and method for treatment of in-stent restenosis |
US9101333B2 (en) | 2011-11-14 | 2015-08-11 | Biosense Webster (Israel) Ltd. | Integrative atrial fibrillation ablation |
EP2785252B1 (en) | 2011-11-28 | 2018-05-16 | Acist Medical Systems, Inc. | Catheters for imaging and ablating tissue |
AU2012358224B2 (en) | 2011-12-23 | 2017-08-10 | Boston Scientific Scimed, Inc. | Tissue remodeling systems and a method for delivering energy to maintain predetermined target temperature |
US8876817B2 (en) | 2012-01-10 | 2014-11-04 | Boston Scientific Scimed Inc. | Electrophysiology system and methods |
BR112014024900A2 (en) | 2012-04-05 | 2018-07-24 | Bard Access Systems Inc | devices and systems for navigating and positioning a central venous catheter in a patient. |
US20130310673A1 (en) | 2012-05-17 | 2013-11-21 | Assaf Govari | Guide wire with position sensing electrodes |
US8923959B2 (en) | 2012-08-27 | 2014-12-30 | Birinder Robert Boveja | Methods and system for real-time cardiac mapping |
CA2881457C (en) | 2012-08-31 | 2021-10-26 | Acutus Medical, Inc. | Catheter system and methods of medical uses of same, including diagnostic and treatment uses for the heart |
US9895079B2 (en) | 2012-09-26 | 2018-02-20 | Biosense Webster (Israel) Ltd. | Electropotential mapping |
US20140188440A1 (en) | 2012-12-31 | 2014-07-03 | Intuitive Surgical Operations, Inc. | Systems And Methods For Interventional Procedure Planning |
AU2014208382A1 (en) | 2013-01-24 | 2015-07-23 | Tylerton International Holdings Inc. | Body structure imaging |
GB2510452A (en) | 2013-01-31 | 2014-08-06 | Naviconix Ltd | Method of mapping the heart with a trackable electrode catheter |
US9993287B2 (en) | 2013-03-13 | 2018-06-12 | Covidien Lp | System configured to provide controlled depth of hemostasis |
US9693820B2 (en) | 2013-03-15 | 2017-07-04 | St. Jude Medical, Atrial Fibrillation Division, Inc. | System for detecting catheter electrodes entering into and exiting from an introducer |
US9980653B2 (en) | 2013-05-03 | 2018-05-29 | Biosense Webster (Israel), Ltd. | Valve view map |
EP3733060B1 (en) | 2013-05-07 | 2021-06-16 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Utilization of electrode spatial arrangements for characterizing cardiac conduction conditions |
US20150147382A1 (en) | 2013-09-23 | 2015-05-28 | Exir Nano Sina Company | Topical liposomal compositions for delivering hydrophobic drugs and methods preparing same |
US20150099942A1 (en) | 2013-10-04 | 2015-04-09 | Volcano Corporation | Vascular securement catheter with imaging |
US20160270683A1 (en) | 2013-11-01 | 2016-09-22 | Koninklijke Philips N.V. | System for determining electrical characteristics on a surface of a heart |
US20150141978A1 (en) | 2013-11-20 | 2015-05-21 | Boston Scientific Scimed, Inc. | Ablation medical devices and methods for making and using ablation medical devices |
DE102014000775A1 (en) | 2014-01-24 | 2015-07-30 | Man Diesel & Turbo Se | Tilting-pad bearing segment for a shaft bearing device, and shaft bearing device |
EP3116408B1 (en) | 2014-03-12 | 2018-12-19 | Cibiem, Inc. | Ultrasound ablation catheter |
US10776961B2 (en) | 2014-07-30 | 2020-09-15 | Navix International Limited | Registering nuclear medicine data |
CN117323001A (en) | 2014-09-08 | 2024-01-02 | 皇家飞利浦有限公司 | Optical fiber shape sensing system |
US20170360498A1 (en) | 2014-12-03 | 2017-12-21 | Baylis Medical Company Inc. | Devices and Methods for Electrosurgical Navigation |
CN106999080B (en) | 2014-12-18 | 2020-08-18 | 波士顿科学医学有限公司 | Real-time morphological analysis for lesion assessment |
US20160242667A1 (en) | 2015-02-20 | 2016-08-25 | Boston Scientific Scimed Inc. | Tissue contact sensing using a medical device |
WO2016135584A2 (en) | 2015-02-27 | 2016-09-01 | Koninklijke Philips N.V. | System and method for adaptive ablation and therapy based on elastography monitoring |
CN107847289A (en) | 2015-03-01 | 2018-03-27 | 阿里斯医疗诊断公司 | Augmented-reality morphological surgery |
US9636164B2 (en) | 2015-03-25 | 2017-05-02 | Advanced Cardiac Therapeutics, Inc. | Contact sensing systems and methods |
US10881455B2 (en) | 2015-05-12 | 2021-01-05 | Navix International Limited | Lesion assessment by dielectric property analysis |
RU2017140233A (en) | 2015-05-12 | 2019-06-13 | Навикс Интернэшнл Лимитед | Contact quality assessment through dielectric analysis |
WO2016181320A1 (en) | 2015-05-12 | 2016-11-17 | Navix International Limited | Fiducial marking for image-electromagnetic field registration |
WO2016181316A1 (en) | 2015-05-12 | 2016-11-17 | Navix International Limited | Systems and methods for tracking an intrabody catheter |
US10517670B2 (en) | 2015-07-16 | 2019-12-31 | Biosense Webster (Israel) Ltd. | Estimation of lesion size |
US10792097B2 (en) | 2015-12-03 | 2020-10-06 | Biosense Webster (Israel) Ltd. | Ablation line contiguity index |
EP3484362A1 (en) | 2016-07-14 | 2019-05-22 | Navix International Limited | Characteristic track catheter navigation |
US11266467B2 (en) | 2016-10-25 | 2022-03-08 | Navix International Limited | Systems and methods for registration of intra-body electrical readings with a pre-acquired three dimensional image |
US10709507B2 (en) | 2016-11-16 | 2020-07-14 | Navix International Limited | Real-time display of treatment-related tissue changes using virtual material |
WO2018130976A1 (en) | 2017-01-12 | 2018-07-19 | Navix International Limited | Estimation of effectiveness of ablation adjacency |
US11284813B2 (en) | 2016-11-16 | 2022-03-29 | Navix International Limited | Real-time display of tissue deformation by interactions with an intra-body probe |
EP3541313B1 (en) | 2016-11-16 | 2023-05-10 | Navix International Limited | Estimators for ablation effectiveness |
CN110072449B (en) | 2016-11-16 | 2023-02-24 | 纳维斯国际有限公司 | Esophageal position detection by electrical mapping |
WO2018092059A1 (en) | 2016-11-16 | 2018-05-24 | Navix International Limited | Tissue model dynamic visual rendering |
WO2019034944A1 (en) | 2017-08-17 | 2019-02-21 | Navix International Limited | Reconstruction of an anatomical structure from intrabody measurements |
EP3568837A1 (en) | 2017-01-12 | 2019-11-20 | Navix International Limited | Flattened view for intra-lumenal navigation |
US11311204B2 (en) | 2017-01-12 | 2022-04-26 | Navix International Limited | Systems and methods for reconstruction of intrabody electrical readings to anatomical structure |
WO2018134747A1 (en) | 2017-01-22 | 2018-07-26 | Navix International Limited | Coronary sinus-based electromagnetic mapping |
CN110461227B (en) | 2017-02-09 | 2022-07-19 | 纳维斯国际有限公司 | Intracorporeal probe navigation by electrical self-sensing |
US11806126B2 (en) | 2017-05-10 | 2023-11-07 | Navix International Limited | Property- and position-based catheter probe target identification |
CN111050641B (en) | 2017-08-17 | 2023-06-09 | 纳维斯国际有限公司 | Remote imaging based on field gradients |
WO2019111180A1 (en) | 2017-12-05 | 2019-06-13 | Navix International Limited | Electrophysiology procedure without ionizing radiation imaging |
CN112367907A (en) | 2018-05-07 | 2021-02-12 | 纳维斯国际有限公司 | Multifunctional imaging |
- 2017
  - 2017-11-16 US US16/349,646 patent/US11284813B2/en active Active
  - 2017-11-16 WO PCT/IB2017/057175 patent/WO2018092062A1/en active Application Filing
- 2022
  - 2022-03-23 US US17/701,830 patent/US20220211293A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
WO2018092062A1 (en) | 2018-05-24 |
US11284813B2 (en) | 2022-03-29 |
US20190328275A1 (en) | 2019-10-31 |
Similar Documents
Publication | Title |
---|---|
US11793571B2 (en) | Real-time display of treatment-related tissue changes using virtual material |
US20220211293A1 (en) | Real-time display of tissue deformation by interactions with an intra-body probe |
US11631226B2 (en) | Tissue model dynamic visual rendering |
Chheang et al. | A collaborative virtual reality environment for liver surgery planning |
US8149236B2 (en) | Information processing apparatus and program |
US11183296B1 (en) | Method and apparatus for simulated contrast for CT and MRI examinations |
Ren et al. | Dynamic 3-D virtual fixtures for minimally invasive beating heart procedures |
KR101206340B1 (en) | Method and System for Providing Rehearsal of Image Guided Surgery and Computer-readable Recording Medium for the same |
US20190340838A1 (en) | Flattened view for intra-lumenal navigation |
KR101700847B1 (en) | Method for Providing Training of Image Guided Surgery and Computer-readable Recording Medium for the same |
JP7164345B2 (en) | Medical image processing apparatus, medical image processing method, and medical image processing program |
US20240058070A1 (en) | Virtual reality surgical training systems |
JP2007135843A (en) | Image processor, image processing program and image processing method |
CN103632595B (en) | Multiple intracavitary therapy endoscopic surgery medical teaching training system |
EP3966667A1 (en) | Virtual reality surgical training systems |
KR20200042853A (en) | Mapping of activation wavefronts |
US20090290769A1 (en) | Medical image processing method |
US11771508B2 (en) | Robotically-assisted surgical device, robotically-assisted surgery method, and system |
RU2735068C1 (en) | Body cavity map |
JPH10201755A (en) | Method for measuring three-dimensional size in pseudo-three-dimensional image and its system |
Hettig et al. | Visual Navigation Support for Liver Applicator Placement using Interactive Map Displays |
JP2002336242A (en) | Three-dimensional image display device |
Lee et al. | Interactive manipulation and visualization of a deformable 3D organ model for medical diagnostic support |
JP2021045341A (en) | Arthroscopic surgery support device, arthroscopic surgery support method, and program |
WO2019067548A1 (en) | Systems and methods for ablation visualization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: NAVIX INTERNATIONAL LIMITED, VIRGIN ISLANDS, BRITISH. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHMAYAHU, YIZHAQ;SCHWARTZ, YITZHACK;SIGNING DATES FROM 20180215 TO 20180411;REEL/FRAME:061222/0099 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |