US20160320930A1 - User interface devices for electrophysiology lab diagnostic and therapeutic equipment - Google Patents

User interface devices for electrophysiology lab diagnostic and therapeutic equipment

Info

Publication number
US20160320930A1
Authority
US
United States
Prior art keywords
user
user interface
display panel
logic
buttons
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/144,135
Inventor
Charles Bryan Byrd
Eric Betzler
Sandeep Dani
Israel A. Byrd
Eric S. Olson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
St Jude Medical Atrial Fibrillation Division Inc
Original Assignee
St Jude Medical Atrial Fibrillation Division Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by St Jude Medical Atrial Fibrillation Division Inc
Priority to US15/144,135
Assigned to ST. JUDE MEDICAL, ATRIAL FIBRILLATION DIVISION, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BETZLER, Eric; BYRD, Charles B.; DANI, Sandeep; BYRD, Israel A.; OLSON, Eric S.
Publication of US20160320930A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/04Constructional details of apparatus
    • A61B2560/0487Special user inputs or interfaces
    • A61B2560/0493Special user inputs or interfaces controlled by voice
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/41Medical

Definitions

  • the instant disclosure relates generally to electrophysiology lab integration, and more particularly to user interfaces and devices therefor for electrophysiology lab diagnostic and therapeutic equipment.
  • Such a lab may have use of a wide variety of diagnostic and therapeutic equipment useful in rendering medical service to a patient, such as imaging systems (e.g., fluoroscopy, intracardiac echocardiography, etc.), an electro-anatomic visualization, mapping and navigation system, ablation energy sources (e.g., radio frequency (RF) ablation generator), a recording system (e.g., for ECG, cardiac signals, etc.), a cardiac stimulator and the like.
  • a procedure room 10 (i.e., a sterile environment) may have an associated control area or room 12 , which is commonly outfitted with one or more control stations 14 1 , 14 2 , . . . 14 n that are operated by one or more control technicians.
  • Each control station may include a respective display monitor, keyboard and mouse for use by the technician.
  • the control station(s) may be across the room, or outside of the procedure room 10 completely, perhaps configured with a common window to allow the technician(s) to observe the procedure room through the window.
  • These control station(s) allow access to and may be used to control the diagnostic and therapeutic equipment mentioned above.
  • an electrophysiology (EP) physician 16 is scrubbed into a sterile procedure and typically manipulates one or more catheters (not shown) in a sterile drape covered body of the patient 18 .
  • the physician's sterile gloved hands are typically engaged with the catheter handle and shaft next to the patient, and he or she is therefore unable to directly make changes to any of the EP systems.
  • the procedure room 10 typically includes one or more monitors (e.g., an integrated multi-display monitor 20 is shown) arranged so that the physician 16 can see the monitor 20 on which is displayed various patient information being produced by the diagnostic and therapeutic equipment mentioned above.
  • multiple applications, for example, an electro-anatomic mapping application (e.g., EnSite VelocityTM) and an EP signal acquisition and recording application, direct a visual output to a respective display area of monitor 20 .
  • the physician 16 verbalizes such commands to the control technicians in the control area/room 12 who are working at the various control stations 14 1 , 14 2 , . . . 14 n .
  • the multiple technicians at multiple control stations use multiple keyboard/mouse sets to control the multiple applications.
  • the verbal commands between the physician and the technician occur throughout the procedure.
  • the EP physician 16 can verbally communicate (i.e., to the control technician—a mapping system operator) the desired view of the map to be displayed, when to collect points, when to separate anatomic locations, and other details of creating and viewing an anatomic map.
  • the EP physician 16 can also communicate which signal traces to show, the desired amplitude, when to drop a lesion marker, and when to record a segment, to name a few. Where the technician is in a separate room, communication can be facilitated using radio.
  • One advantage of the methods and apparatuses described, depicted and claimed herein is that they provide an EP physician with the capability of directly controlling an EP diagnostic or therapeutic system, such as an electro-anatomic mapping system. This capability eliminates the need for the physician to first communicate his/her wishes to a control technician, who in turn must hear, interpret and act on the physician's command.
  • the improved control paradigm results in reduced times for medical procedures.
  • a device for allowing a user to control an electro-anatomic mapping system includes an electronic control unit (ECU) and input means, using the ECU, for acquiring a user input with respect to a view of an anatomical model of at least a portion of a body of a patient.
  • the user input is selected from the group comprising a user touch, a user multi-touch, a user gesture, a verbal command, a motion pattern of a user-controlled object, a user motion pattern and a user electroencephalogram.
  • the ECU is configured to communicate the acquired input to the mapping system for further processing.
  • the acquired user input can correspond to any of a variety of mapping system commands, for example only at least one of: (1) creating a map with respect to the view; (2) collecting points with respect to the view; (3) segmenting regions by anatomy with respect to the view; (4) rotating the view; (5) enlarging or reducing a portion of the view; (6) panning the view; (7) selecting one of a plurality of maps for the view; (8) selecting a signal trace; (9) adjusting a signal amplitude; (10) adjusting a sweep speed; (11) recording a segment; (12) placing an event marker; (13) placing a lesion marker with respect to the view; (14) activating a replay feature of a stored, temporally varying physiologic parameter and (15) activating a replay of a stored video clip.
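As a rough illustration of how an acquired input could be packaged and handed to the mapping system, the Python sketch below enumerates the fifteen command types listed above and wraps one of them in a simple message. The MapCommand names, the build_command helper, and the dictionary message format are illustrative assumptions, not part of the disclosure or of any St. Jude Medical product interface.

```python
# Hypothetical sketch only: enumerating the mapping-system commands listed above
# and packaging one acquired user input as a message for the mapping system.
from enum import Enum, auto


class MapCommand(Enum):
    CREATE_MAP = auto()           # (1) creating a map with respect to the view
    COLLECT_POINTS = auto()       # (2) collecting points
    SEGMENT_BY_ANATOMY = auto()   # (3) segmenting regions by anatomy
    ROTATE_VIEW = auto()          # (4) rotating the view
    ZOOM_VIEW = auto()            # (5) enlarging or reducing a portion of the view
    PAN_VIEW = auto()             # (6) panning the view
    SELECT_MAP = auto()           # (7) selecting one of a plurality of maps
    SELECT_SIGNAL_TRACE = auto()  # (8) selecting a signal trace
    ADJUST_AMPLITUDE = auto()     # (9) adjusting a signal amplitude
    ADJUST_SWEEP_SPEED = auto()   # (10) adjusting a sweep speed
    RECORD_SEGMENT = auto()       # (11) recording a segment
    PLACE_EVENT_MARKER = auto()   # (12) placing an event marker
    PLACE_LESION_MARKER = auto()  # (13) placing a lesion marker
    REPLAY_PARAMETER = auto()     # (14) replaying a stored physiologic parameter
    REPLAY_VIDEO_CLIP = auto()    # (15) replaying a stored video clip


def build_command(command: MapCommand, **parameters) -> dict:
    """Package an acquired user input (touch, gesture, voice, etc.) as a message."""
    return {"command": command.name, "parameters": parameters}


# Example: a drag gesture acquired by the ECU becomes a rotate request.
print(build_command(MapCommand.ROTATE_VIEW, axis="y", degrees=15.0))
```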
  • the input means includes a touch-responsive display panel coupled to the ECU.
  • the input means also includes user interface logic (executed by the ECU) configured to display a user interface on the touch-responsive display panel.
  • the user interface logic is further configured to allow a user to interact with the touch-responsive panel for acquiring the above-mentioned user input with respect to the anatomical model.
  • the user interface in combination with the touch-panel allows the user to provide input by way of touch, multi-touch, and gesture.
  • the device further includes voice recognition logic configured to recognize a set of predefined verbal commands spoken by the user (e.g., the physician).
  • the device includes wireless communications functionality, improving portability of the device within a procedure room or the control room.
  • the user interface logic is configured to present a plurality of application-specific user interfaces (e.g., such as that for an electro-anatomic mapping system, an EP recording system, an ultrasound imaging system, a cardiac stimulator, etc.) associated with a plurality of different diagnostic or therapeutic systems.
  • the input means includes a remote control having a handle configured to be grasped by the user.
  • the remote control includes logic configured to acquire the above-mentioned user input.
  • the user input may include user-controlled motion patterns of the remote control, as well as user key-presses on the remote control.
  • the device is also configured to communicate the acquired user input to the mapping system.
  • the input means includes a motion capture apparatus configured to acquire imaging of movements of the user.
  • the device includes logic configured to identify a motion pattern using the acquired imaging from the motion capture apparatus.
  • the logic is further configured to produce a command, based on the identified motion pattern, and communicate the command to the electro-anatomic mapping system for further processing.
  • the motion capture apparatus provides the capability of receiving input by way of physician gestures (e.g., hand, arm, leg, trunk, facial, etc.).
  • the device further includes voice recognition logic configured to identify verbal commands spoken by the user.
  • FIG. 1 is a block diagram view of a conventional electrophysiology lab having a sterile procedure room and an associated control room.
  • FIG. 2 is a block diagram view of an embodiment of an electrophysiology lab having a bedside interface device for controlling diagnostic and therapeutic equipment.
  • FIG. 3A is a plan view of a first embodiment of a bedside interface device comprising a touch panel computer, suitable for use in the EP lab of FIG. 2 , and showing a first application-specific user interface.
  • FIG. 3B is an isometric view of a sterile drape configured to isolate the touch panel computer of FIG. 3A .
  • FIG. 4A is a view of a monitor shown in FIG. 2 , showing multiple inset displays associated with a plurality of diagnostic and/or therapeutic systems.
  • FIG. 4B is a view of the monitor of FIG. 4A , showing a zoomed-in window of the display associated with an electro-anatomic mapping system.
  • FIG. 5 is a plan view of the touch panel computer of FIG. 3A showing a second application-specific user interface.
  • FIG. 6 is a plan view of the touch panel computer of FIG. 3A showing a third application-specific user interface.
  • FIG. 7A is a diagrammatic and block diagram view of a second embodiment of the bedside interface device comprising an electronic wand system.
  • FIG. 7B is a diagrammatic view of a third embodiment of the bedside interface device wherein a catheter is integrated with the remote control portion of FIG. 7A .
  • FIG. 8 is a diagrammatic and block diagram view of a fourth embodiment of the bedside interface device comprising a motion capture apparatus.
  • FIGS. 9-10 are diagrammatic views of fifth and sixth embodiments of the bedside interface device comprising touch responsive surface devices that can be covered in a sterile bag.
  • FIG. 11 is a diagrammatic view of a seventh embodiment of the bedside interface device comprising a customized joystick that can be covered in a sterile bag.
  • FIGS. 12-13 are diagrammatic views of eighth and ninth embodiments of the bedside interface device comprising holographic mouse and keyboard input devices, respectively.
  • FIG. 2 is a diagrammatic overview of an electrophysiology (EP) laboratory in which embodiments of the present invention may be used.
  • FIG. 2 shows a sterile procedure room 10 where an EP physician 16 is set to perform one or more diagnostic and/or therapeutic procedures. It should be understood that the separate control area/room 12 of FIG. 1 (not shown in FIG. 2 ) may continue to be used in conjunction with the bedside interface device to be described below.
  • FIG. 2 also shows multi-display monitor 20 as well as a procedure table or bed 22 .
  • monitor 20 may be a multi-display monitor configured to display a plurality of different input channels in respective display areas on the monitor.
  • the monitor 20 may be a commercially available product sold under the trade designation VantageViewTM from St. Jude Medical, Inc. of St. Paul, Minn., USA, which can have a 3840×2160 Quad-HD screen resolution with the flexibility to accept up to sixteen (16) digital or analog image inputs while displaying up to eight (8) images on one screen at one time.
  • the procedure table 22 which may be of conventional construction, is configured to receive a patient (not shown) on whom diagnostic and/or therapeutic procedure(s) are to be performed.
  • FIG. 2 further shows means or apparatus 24 for facilitating physician interaction with one or more diagnostic and/or therapeutic systems.
  • Means or apparatus 24 includes a bedside interface device 26 and optionally one or more base interfaces 28 .
  • Means or apparatus 24 provides the mechanism for the EP physician 16 to directly interact with such systems without the need for the intermediate step of verbalizing commands to a control technician, as described in connection with FIG. 1 .
  • bedside interface device 26 is configured to present a user interface or other input logic with which the user (e.g., the EP physician 16 ) can directly interact or from which an input can be acquired.
  • bedside interface device 26 can be configured to communicate with one or more of the diagnostic/therapeutic systems either wirelessly (as shown) or via a wired connection (not shown).
  • the base interface 28 is configured to interpret and/or facilitate directing the input acquired by the bedside interface device 26 to the appropriate one or more diagnostic and/or therapeutic systems (e.g., an electro-anatomic mapping system).
  • base interface 28 is centralized (as shown), wherein all communications with bedside device 26 occur through base interface 28 .
  • base interface 28 may be functionally distributed, wherein interface functions are located within each diagnostic or therapeutic system.
  • communications between bedside interface 26 and certain ones of the diagnostic/therapeutic systems can be centralized, while communications with other ones of the diagnostic/therapeutic systems can occur directly (i.e., separately).
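To make the centralized-versus-distributed routing discussion concrete, here is a minimal Python sketch of the routing role a centralized base interface could play: each target system registers a delivery callback, and the bedside device tags each command with its destination. The class name, the register/dispatch methods, and the callback transport are assumptions for illustration only.

```python
# Hypothetical sketch of a centralized base interface routing bedside commands
# to registered diagnostic/therapeutic systems (e.g., the mapping system).
from typing import Callable, Dict


class BaseInterface:
    def __init__(self) -> None:
        self._targets: Dict[str, Callable[[dict], None]] = {}

    def register(self, system_name: str, send: Callable[[dict], None]) -> None:
        # Each system provides a delivery callback (a wired or wireless link in practice).
        self._targets[system_name] = send

    def dispatch(self, system_name: str, command: dict) -> None:
        # Direct the acquired input to the appropriate system.
        if system_name not in self._targets:
            raise KeyError(f"No registered target system: {system_name}")
        self._targets[system_name](command)


bus = BaseInterface()
bus.register("mapping", lambda cmd: print("mapping system <-", cmd))
bus.dispatch("mapping", {"command": "ROTATE_VIEW", "degrees": 15.0})
```

In a distributed variant, the same register/dispatch responsibility would simply live inside each diagnostic or therapeutic system rather than in one shared component.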
  • the means or apparatus 24 addresses a number of the shortcomings of the conventional practice as described in the Background.
  • means or apparatus 24 allows the EP physician 16 to directly input levels of degree, for example, how much to rotate a view, as opposed to trying to verbally communicate “how much” to a control technician.
  • the use of means or apparatus 24 avoids the potential confusion that can sometimes occur between the EP physician and the control technician as to convention (i.e., does “rotate right” mean rotate the view or the model?).
  • the use of means or apparatus 24 reduces or eliminates the inherent time delay between the time when the EP physician verbally issues a command and the time when the command is understood and acted upon by the technician.
  • the physician 16 will typically have access to a plurality of diagnostic and/or therapeutic systems in order to perform one or more medical procedures.
  • the physician 16 may have access to a first imaging system, such as a fluoroscopic imaging system 30 , a second imaging system, such as an intracardiac ultrasound or echocardiography (ICE) imaging system 32 , an electro-anatomic positioning, mapping, and visualization system 34 , a further positioning system, such as a medical positioning system (magnetic-field based) 36 , a patient data (electrophysiological (EP) data) monitoring and recording system 38 , a cardiac stimulator 40 , an EP data editing/monitoring system 42 and an ablation system 44 .
  • FIG. 2 schematically shows a communication mechanism 46 which facilitates communication between and among the various systems described above. It should be understood, however, that the communications mechanism 46 may not necessarily function to enable communications between each and every system shown.
  • the fluoroscopic imaging system 30 may comprise conventional apparatus known in the art, for example, single plane or bi-plane configurations.
  • a display area 48 that is shown on monitor 20 corresponds to the display output of fluoroscopic imaging system 30 .
  • the intracardiac ultrasound and/or intracardiac echocardiography (ICE) imaging system 32 may also comprise conventional apparatus known in the art.
  • the system 32 may comprise a commercial system available under the trade designation ViewMateTM Z intracardiac ultrasound system compatible with a ViewFlexTM PLUS intracardiac echocardiography (ICE) catheter, from St. Jude Medical, Inc. of St. Paul, Minn., USA.
  • the system 32 is configured to provide real-time image guidance and visualization, for example, of the cardiac anatomy. Such high fidelity images can be used to help direct diagnosis or therapy during complex electrophysiology procedures.
  • a display area 50 that is shown on monitor 20 corresponds to the display output of the ultrasound imaging system 32 .
  • the system 34 is configured to provide many advanced features, such as visualization, mapping, navigation support and positioning (i.e., determine a position and orientation (P&O) of a sensor-equipped medical device, for example, a P&O of a distal tip portion of a catheter).
  • Such functionality can be provided as part of a larger visualization, mapping and navigation system, for example, an ENSITE VELOCITYTM cardiac electro-anatomic mapping system running a version of EnSite NavXTM navigation and visualization technology software commercially available from St. Jude Medical, Inc., of St. Paul, Minn. and as also seen generally by reference to U.S. Pat. No.
  • System 34 can be configured to perform further advanced functions, such as motion compensation and adjustment functions.
  • Motion compensation may include, for example, compensation for respiration-induced patient body movement, as described in copending U.S. patent application Ser. No.
  • System 34 can be used in connection with or for various medical procedures, for example, EP studies or cardiac ablation procedures.
  • System 34 is further configured to generate and display three dimensional (3D) cardiac chamber geometries or models, display activation timing and voltage data to identify arrhythmias, and to generally facilitate guidance of catheter movement in the body of the patient.
  • a display area 52 that is shown on monitor 20 corresponds to the display output of system 34 and can be viewed by physician 16 during a procedure, visually communicating information of interest or need to the physician.
  • the display area 52 in FIG. 2 shows a 3D cardiac model, which, as will be described below in greater detail, may be modified (i.e., rotated, zoomed, etc.) pursuant to commands given directly by physician 16 via the bedside interface device 26 .
  • System 36 is configured to provide positioning information with respect to suitably configured medical devices (i.e., those including a positioning sensor).
  • System 36 may use, at least in part, a magnetic field based localization technology, comprising conventional apparatus known in the art, for example, as seen by reference to U.S. Pat. No. 7,386,339 entitled “MEDICAL IMAGING AND NAVIGATION SYSTEM”, U.S. Pat. No. 6,233,476 entitled “MEDICAL POSITIONING SYSTEM”, and U.S. Pat. No. 7,197,354 entitled “SYSTEM FOR DETERMINING THE POSITION AND ORIENTATION OF A CATHETER”, all of which are hereby incorporated by reference in their entirety as though fully set forth herein.
  • System 36 may comprise a gMPSTM medical positioning system commercially offered by MediGuide Ltd. of Haifa, Israel and now owned by St. Jude Medical, Inc. of St. Paul, Minn., USA.
  • System 36 may alternatively comprise variants, which employ magnetic field generator operation, at least in part, such as a combination magnetic field and current field-based system such as the CARTOTM 3 System available from Biosense Webster, and as generally shown with reference to one or more of U.S. Pat. No. 6,498,944 entitled “Intrabody Measurement,” U.S. Pat. No. 6,788,967 entitled “Medical Diagnosis, Treatment and Imaging Systems,” and U.S. Pat. No. 6,690,963 entitled “System and Method for Determining the Location and Orientation of an Invasive Medical Instrument,” the entire disclosures of which are incorporated herein by reference as though fully set forth herein.
  • EP monitoring and recording system 38 is configured to receive, digitize, display and store electrocardiograms, invasive blood pressure waveforms, marker channels, and ablation data.
  • System 38 may comprise conventional apparatus known in the art.
  • system 38 may comprise a commercially available product sold under the trade designation EP-WorkMateTM from St. Jude Medical, Inc. of St. Paul, Minn., USA.
  • the system 38 can be configured to record a large number of intracardiac channels, may be further configured with an integrated cardiac stimulator (shown in FIG. 2 as stimulator 40 ), as well as offering storage and retrieval capabilities of an extensive database of patient information.
  • Display areas 54 , 56 shown on monitor 20 correspond to the display output of EP monitoring and recording system 38 .
  • Cardiac stimulator 40 is configured to provide electrical stimulation of the heart during EP studies.
  • Stimulator 40 can be provided in either a stand-alone configuration, or can be integrated with EP monitoring and recording system 38 , as shown in FIG. 2 .
  • Stimulator 40 is configured to allow the user to initiate or terminate tachy-arrhythmias manually or automatically using preprogrammed modes of operation.
  • Stimulator 40 may comprise conventional apparatus known in the art.
  • stimulator 40 can comprise a commercially available cardiac stimulator sold under the trade designation EP-4TM available from St. Jude Medical, Inc. of St. Paul, Minn., USA.
  • the display area 58 shown on monitor 20 corresponds to the display output of the cardiac stimulator 40 .
  • EP data editing/monitoring system 42 is configured to allow editing and monitoring of patient data (EP data), as well as charting, analysis, and other functions.
  • System 42 can be configured for connection to EP data recording system 38 for real-time patient charting, physiological monitoring, and data analysis during EP studies/procedures.
  • System 42 may comprise conventional apparatus known in the art.
  • system 42 may comprise a commercially available product sold under the trade designation EP-NurseMateTM available from St. Jude Medical, Inc. of St. Paul, Minn., USA.
  • ablation system 44 can be provided.
  • the ablation system 44 may be configured with various types of ablation energy sources that can be used in or by a catheter, such as radio-frequency (RF), ultrasound (e.g. acoustic/ultrasound or HIFU), laser, microwave, cryogenic, chemical, photo-chemical or other energy used (or combinations and/or hybrids thereof) for performing ablative procedures.
  • RF ablation embodiments may and typically will include other structure(s) not shown, such as body surface electrodes (skin patches), an RF dispersive indifferent electrode/patch for application onto the body of a patient, an irrigation fluid source (gravity feed or pump), and an RF ablation generator (e.g., such as a commercially available unit sold under the model number IBI-1500T RF Cardiac Ablation Generator, available from St. Jude Medical, Inc.).
  • FIG. 3A is a plan view of a first embodiment of a bedside interface device comprising a computer 26 a , suitable for use in the EP lab of FIG. 2 , and showing a first application-specific user interface.
  • the computer 26 a includes a touch-responsive display panel and thus may be referred to hereinafter sometimes as a touch panel computer.
  • the touch panel computer 26 a includes an electronic control unit (ECU) having a processor 60 and a computer-readable memory 62 , user interface (UI) logic 64 stored in the memory 62 and configured to be executed by processor 60 , a microphone 66 and voice recognition logic 68 .
  • voice recognition logic 68 is also stored in memory 62 and is configured to be executed by processor 60 .
  • the touch panel computer 26 a is configured for wireless communication to base interface 28 (best shown in FIG. 2 ).
  • the touch panel computer 26 a is configured to draw operating power at least from a battery-based power source—eliminating the need for a power cable. The resulting portability (i.e., no cables needed for either communications or power) allows touch panel computer 26 a to be carried around by the EP physician 16 or other lab staff to provide control over the linked systems (described below) while moving throughout the procedure room 10 or even the control room 12 .
  • touch panel computer 26 a can be wired for one or both of communications and power, and can also be fixed to the bedrail or in the sterile field.
  • the UI logic 64 is configured to present a plurality of application-specific user interfaces, each configured to allow a user (e.g., the EP physician 16 ) to interact with a respective one of a plurality of diagnostic and/or therapeutic systems (and their unique interface or control applications). As shown in FIG. 3A , the UI logic 64 is configured to present on the touch panel surface of computer 26 a a plurality of touch-sensitive objects (i.e., "buttons", "flattened joystick", etc.), to be described below. In the illustrative embodiment, the UI logic 64 produces a first, application-selection group of buttons, designated as group 70 , which are located near the top of the touch panel.
  • buttons in group 70 are associated with a respective diagnostic and/or therapeutic system (and control or interface application therefore).
  • the six buttons labeled “EnSite”, “WorkMate”, “EP4”, “NurseMate”, “MediGuide”, “ViewMate” correspond to electro-anatomic mapping system 34 (for mapping control), EP recording system 38 (for patient data recording control), stimulator 40 (for stimulator control), EP data editing and monitoring system 42 (for charting) and ultrasound imaging system 32 (for ultrasound control), respectively.
  • the UI logic 64 configures the screen display of computer 26 a with an application-specific user interface tailored for the control of and interface with the particular EP system selected by the user.
  • the “EnSite” system is selected, so the UI logic 64 alters the visual appearance of the “EnSite” button so that it is visually distinguishable from the other, non-selected buttons in group 70 .
  • the “EnSite” button may appear depressed or otherwise shaded differently than the other, non-selected buttons in group 70 . This always lets the user know what system is selected.
  • the UI logic 64 in an embodiment, also maintains the application-selection buttons in group 70 at the top of the screen regardless of the particular application selected by the user. This arrangement allows the user to move from system (application) to system (application) quickly and control each one independently.
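A minimal sketch of the screen-switching behavior described above: the application-selection row (group 70) persists at the top of every screen, and the body is swapped for the selected system's controls. The button labels come from the paragraphs above; the render_screen function and the returned dictionary layout are assumptions for illustration.

```python
# Hypothetical sketch of UI logic keeping the application-selection group at the
# top of every screen while swapping in an application-specific body.
APPLICATIONS = ["EnSite", "WorkMate", "EP4", "NurseMate", "MediGuide", "ViewMate"]


def render_screen(selected: str) -> dict:
    if selected not in APPLICATIONS:
        raise ValueError(f"Unknown application: {selected}")
    return {
        # The selection row is always present, with the active system highlighted
        # (e.g., shown depressed or shaded differently than the others).
        "application_row": [
            {"label": app, "highlighted": app == selected} for app in APPLICATIONS
        ],
        # The body is tailored to the selected system (placeholder content here).
        "body": f"{selected} application-specific controls",
    }


screen = render_screen("EnSite")
print([b["label"] for b in screen["application_row"] if b["highlighted"]])  # ['EnSite']
```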
  • UI logic 64 presents an application-specific user interface tailored and optimized for control of and interaction with system 34 .
  • This user interface includes a second, common-task group of selectable buttons, designated group 72 , a third, view-mode group of selectable buttons, designated group 74 , a fourth, view-select group of selectable buttons, designated group 76 , a flattened joystick 78 configured to receive view-manipulation input from the user, a voice recognition control button 80 , and a settings button 82 .
  • Each group will be addressed in turn.
  • the second group 72 of buttons includes a listing of common tasks performed by an EP physician when interacting with system 34 .
  • Each of the buttons in group 72 are associated with a respective task (and resulting action).
  • the five buttons in group 72 are labeled “Zoom In”, “Zoom Out”, “Add Lesion”, “Freeze Point”, and “Save Point”.
  • the “Zoom In” and “Zoom Out” buttons allow the user to adjust the apparent size of the 3D model displayed on monitor 20 (i.e., enlarging or reducing the 3D model on the monitor).
  • FIG. 4A is a view of the monitor 20 of FIG. 2 , showing multiple inset displays for different applications, where the display area (window) 52 1 shows the EnSiteTM display output of a 3D electro-anatomic model at a first magnification level.
  • FIG. 4B is a further view of monitor 20 , showing a zoomed-in view of the same display area (window), now designated 52 2 , which has an increased magnification level and thus apparent size. This change of course allows the physician to see details in window 52 2 that may not be easy to see in window 52 1 .
  • the “Add Lesion” button is configured to add a lesion marker to the 3D model.
  • Other commands can be also be executed using the “Freeze Point” and “Save Point” buttons. It should be understood that variations are possible.
  • buttons in group 74 are associated with a respective display mode, which alters the display output of system 34 to suit the wishes of the physician.
  • the three selectable buttons labeled “Dual View”, “Right View”, and “Map View” re-configure the display output of system 34 , as will appear on monitor 20 .
  • buttons in group 76 are associated with a respective viewpoint from which the 3D electro-anatomic model is “viewed” (i.e., as shown in window 52 on monitor 20 ).
  • Three of the five selectable buttons namely those labeled “LAO”, “AP”, and “RAO”, allow the user to reconfigure the view point from which the 3D electro-anatomic model is viewed (i.e., left anterior oblique, anterior-posterior, right anterior oblique, respectively).
  • the remaining two buttons namely those labeled “Center at Surface” and “Center at Electrode” allow the user to invoke, respectively, the following functions: (1) center the anatomy shape in the middle of the viewing area; and (2) center the current mapping electrode or electrodes in the middle of the viewing area.
  • the flattened joystick 78 is a screen object that allows the user to rotate the 3D model displayed in the window 52 .
  • as the point of contact (i.e., the physician's finger) with the joystick object 78 moves from the center or neutral position, for example at point 83 , towards the outer perimeter (e.g., through point 84 to point 86 ), the magnitude of the input action increases; for example, the acceleration of rotation of the model or cursor will increase.
  • FIG. 3A shows the joystick object 78 as having three (3) gradations or concentric bands, it should be appreciated that this is for clarity only and not limiting in number.
  • a relatively larger number of gradations or bands such as ten (10) may be provided so as to effectively provide for a substantially continuous increase in sensitivity (or magnitude) as the point of contact moves toward the outer radius.
  • a single gradient may be continuous from the center position, point 83 , to the outer edge of the joystick object 78 , with the centermost portion of the gradient being the brightest in intensity or color and the outermost portion of the gradient being the darkest in intensity or color, for example.
  • a single gradient may be continuous from the center position, point 83 , to the outer edge of the joystick object 78 , with the centermost portion of the gradient being the darkest in intensity or color and the outermost portion of the gradient being brightest in intensity or color, for example.
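As a concrete reading of the gradations described above, the sketch below converts a touch point on the flattened joystick into a rotation request whose rate grows as the contact moves from the center toward the outer perimeter. The function name, the linear scaling, and the maximum rotation rate are illustrative assumptions; the disclosure only requires that magnitude increase with radial distance.

```python
# Hypothetical sketch: touch position on joystick object 78 -> rotation magnitude.
import math


def joystick_to_rotation(x: float, y: float, cx: float, cy: float,
                         radius: float, max_deg_per_s: float = 90.0) -> dict:
    """(cx, cy) is the joystick center (point 83); radius is its outer edge."""
    dx, dy = x - cx, y - cy
    deflection = min(math.hypot(dx, dy) / radius, 1.0)  # 0 at center, 1 at the rim
    direction = math.degrees(math.atan2(dy, dx))        # which way to rotate
    # Larger deflection (e.g., point 86 vs. point 84) -> faster rotation.
    return {"direction_deg": direction, "rate_deg_per_s": deflection * max_deg_per_s}


print(joystick_to_rotation(180.0, 100.0, cx=100.0, cy=100.0, radius=100.0))
```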
  • UI logic 64 can be further configured to present an additional button labeled “Follow Me” (not shown), which, when selected by the user, configures the electro-anatomic mapping system 34 for “follow me” control.
  • This style of control is not currently available using a conventional keyboard and mouse interface.
  • UI logic 64 is configured to receive a rotation input from the user via the touch panel (e.g., joystick 78 ); however, the received input is interpreted by system 34 as a request to rotate the endocardial surface rendering (the “map”) while maintaining the mapping catheter still or stationary on the display.
  • the physician can set the position and orientation of the mapping catheter, where it will remain stationary after the “Follow Me” button is selected.
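One way to picture the "Follow Me" behavior is that the rotation input is applied to the map geometry about the (stationary) catheter position, so the catheter stays put on the display while the surrounding anatomy turns. The 2D rotation below is a simplified stand-in for whatever transform the mapping system actually applies; the function and data layout are assumptions.

```python
# Hypothetical sketch of "Follow Me": rotate map vertices around the catheter tip
# so the tip remains stationary on the display.
import math
from typing import List, Tuple

Point = Tuple[float, float]


def rotate_map_about_catheter(map_points: List[Point], catheter_tip: Point,
                              angle_deg: float) -> List[Point]:
    theta = math.radians(angle_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    cx, cy = catheter_tip
    out = []
    for x, y in map_points:
        dx, dy = x - cx, y - cy  # work in coordinates centered on the catheter tip
        out.append((cx + dx * cos_t - dy * sin_t, cy + dx * sin_t + dy * cos_t))
    return out


# The catheter tip itself is the pivot, so it does not move on screen.
print(rotate_map_about_catheter([(1.0, 0.0), (0.0, 1.0)], catheter_tip=(0.0, 0.0),
                                angle_deg=90.0))
```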
  • computer 26 a includes microphone 66 for capturing speech (audio) and voice recognition logic 68 for analyzing the captured speech to extract or identify spoken commands.
  • the voice recognition feature can be used in combination with the touch panel functionality of computer 26 a .
  • the microphone 66 may comprise conventional apparatus known in the art, and can be a voice recognition optimized microphone particularly adapted for use in speech recognition applications (e.g., an echo-cancelling microphone).
  • Voice recognition logic 68 may comprise conventional apparatus known in the art.
  • voice recognition logic 68 may be a commercially available component, such as software available under the trade designation DRAGON DICTATIONTM speech recognition software.
  • computer 26 a is configured to recognize a defined set of words or phrases adapted to control various functions of the multiple applications that are accessible or controllable by computer 26 a .
  • the voice recognition feature can itself be configured to recognize unique words or phrases to selectively enable or disable the voice recognition feature.
  • a button such as button 80 in FIG. 3A , can be used to enable or disable the voice recognition feature.
  • the enable/disable button can be either a touch-sensitive button (i.e., screen object), or can be a hardware button.
  • Voice recognition logic 68 is configured to interact with the physician or other user to “train” the logic (e.g., having the user speak known words) so as to improve word and/or phrase recognition.
  • the particulars for each user so trained can be stored in a respective voice (user) profile, stored in memory 62 .
  • the currently active voice profile is listed in dashed-line box 89 .
  • each user can have unique commands, which may also be stored in the respective voice profile.
  • the language need not be English, and can be other languages. This flexibility as to language choice enlarges the audience of users who can use the device 26 a .
  • the voice recognition feature presents a number of advantages, including the fact that the physician 16 does not have to remove his/her hands from the catheter or other medical device being manipulated. In addition, the absence of contact or need to touch computer 26 a maintains a sterile condition.
  • the voice recognition feature can also be used either alone or in combination with other technologies.
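The voice-command handling described above can be pictured as a small dispatcher: a defined phrase set (stored per user profile) is consulted only while the feature is enabled, and dedicated phrases toggle the feature on and off. The class, the specific enable/disable phrases, and the example vocabulary are illustrative assumptions, not the product's actual command set.

```python
# Hypothetical sketch of gating a defined set of spoken commands behind
# enable/disable phrases, with a per-user phrase-to-command profile.
class VoiceCommandGate:
    def __init__(self, profile_phrases: dict,
                 enable_phrase: str = "start listening",
                 disable_phrase: str = "stop listening") -> None:
        self.profile_phrases = {k.lower(): v for k, v in profile_phrases.items()}
        self.enable_phrase = enable_phrase
        self.disable_phrase = disable_phrase
        self.listening = False

    def handle(self, spoken: str):
        """Return a command for a recognized phrase, or None."""
        text = spoken.strip().lower()
        if text == self.enable_phrase:
            self.listening = True
            return None
        if text == self.disable_phrase:
            self.listening = False
            return None
        return self.profile_phrases.get(text) if self.listening else None


# A physician-specific voice profile maps phrases to commands.
gate = VoiceCommandGate({"zoom in": "ZOOM_IN", "add lesion": "PLACE_LESION_MARKER"})
gate.handle("start listening")
print(gate.handle("zoom in"))  # ZOOM_IN
```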
  • UI logic 64 also presents a “Settings” button 82 .
  • when the "Settings" button 82 is selected, UI logic 64 generates another screen display that allows the user to adjust and/or set/reset various settings associated with the application currently selected.
  • the “Settings” button can also allow adjustment of parameters that are more global in nature (i.e., apply to more than one application). For example only, through “Settings”, the physician or another user can edit all of the phrases associated with a particular physician or specify a timeout (i.e., the elapsed amount of time, after which the computer will stop listening (or not) for voice commands). The physician or another user can also edit miscellaneous parameters, such as communication settings and the like.
  • FIG. 3B is an isometric view of a sterile drape 88 configured to protect the touch panel computer 26 a of FIG. 3A from contamination and to maintain the physician's sterility. Conventional materials and construction techniques can be used to make drape 88 .
  • FIG. 5 is a plan view of touch panel computer 26 a showing a different application-specific user interface, now relating to EP monitoring and recording system 38 (i.e., “EP-WorkMate”).
  • UI logic 64 produces the same application-selection group 70 of buttons along the top of the touch panel, for quick and easy movement by the user between applications.
  • a second, common-tasks group of buttons, designated as group 90 are shown below group 70 .
  • the three buttons labeled “Record”, “Update”, and “Add Map Point” can execute the identified function.
  • additional buttons are shown, grouped by function, for example the signals-adjustment group 92 , the events group 94 , the timer group 96 and the print group 98 . It should be understood that variations are possible, depending on the items that can be adjusted or controlled on the destination system. It warrants emphasizing that UI logic 64 thus presents a unique user interface tailored to the requirements of the particular application selected. Each group includes items that are commonly asked for by the physician.
  • the Speed +/− buttons can be used to change the viewed waveform sweep speed as the physician may need more or less detail; the Page +/− buttons can be used to change the page of signals being viewed (e.g., from surface ECG signals to intracardiac signals); and the Amplitude +/− buttons can be used to change the signal amplitudes up or down.
  • the enumerated Events buttons cause a mark to be created in the patient charting log to indicate a noteworthy (i.e., important) item or event, such as the patient was just defibrillated or entered a tachy-arrhythmia.
  • the timer buttons can be used to keep track of such periods of time, for example, such as a certain time after an ablation (e.g., 30 minutes) to verify that the ablation procedure is still effective.
  • various print buttons are provided so as to avoid requiring a physician to verbally indicate (e.g., by way of shouting out “print that document to the case” or the like) and to include such documents in a final report.
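The Speed, Page, and Amplitude +/− buttons amount to bounded incremental adjustments of the recording system's display parameters. The sketch below shows that pattern; the parameter names, step sizes, and limits are made-up values for illustration, not EP-WorkMate settings.

```python
# Hypothetical sketch of +/- button presses nudging recording-display parameters
# within assumed limits.
STATE = {"sweep_speed_mm_s": 100, "page": 1, "amplitude_gain": 1.0}
STEPS = {"sweep_speed_mm_s": 25, "page": 1, "amplitude_gain": 0.25}
LIMITS = {"sweep_speed_mm_s": (25, 400), "page": (1, 8), "amplitude_gain": (0.25, 8.0)}


def nudge(state: dict, parameter: str, direction: int) -> dict:
    """Apply one +/- press (direction is +1 or -1) and clamp to the allowed range."""
    low, high = LIMITS[parameter]
    state[parameter] = max(low, min(high, state[parameter] + direction * STEPS[parameter]))
    return state


# "Speed +" pressed once: the viewed waveform sweep speed increases by one step.
print(nudge(STATE, "sweep_speed_mm_s", +1))
```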
  • FIG. 6 is a plan view of touch panel computer 26 a showing in exemplary fashion a further, different application-specific user interface relating to the ultrasound imaging system 32 (“ViewMate”).
  • the user interface presented in FIG. 6 repeats the common, application-selection group of buttons, designated group 70 .
  • a further group of buttons and adjustment mechanisms are located in group 100 .
  • the controls (buttons, sliders) provided for this user interface completely eliminate the need to have a separate ultrasound keyboard to control the console.
  • the user interface shown can be different, depending on the kind of machine being controlled, but at a minimum may typically provide a way to control the receive gain, the depth setting, the focus zone, the TGC (i.e., time gain compensation) curve, the monitoring mode (e.g., B, M, color Doppler, Doppler), image recording, as well as other image attributes and states.
  • trackpad object 101 is shown in the center of the user interface.
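Gathering the minimum controls listed above into a single state object gives a sense of what the bedside ultrasound interface has to manage. The field names, ranges, TGC representation, and mode strings below are illustrative assumptions rather than ViewMate parameters.

```python
# Hypothetical sketch of an ultrasound console state covering the controls named
# above: receive gain, depth, focus zone, TGC curve, monitoring mode, recording.
ULTRASOUND_STATE = {
    "receive_gain_db": 50,
    "depth_cm": 12,
    "focus_zone_cm": 6,
    "tgc_curve": [0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.0],  # near field -> far field
    "mode": "B",
    "recording": False,
}


def set_mode(state: dict, mode: str) -> dict:
    """Switch the monitoring mode from the bedside user interface."""
    if mode not in {"B", "M", "color Doppler", "Doppler"}:
        raise ValueError(f"Unsupported mode: {mode}")
    state["mode"] = mode
    return state


print(set_mode(ULTRASOUND_STATE, "color Doppler")["mode"])
```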
  • UI logic 64 can be configured to provide additional and/or substitute functions, such as, without limitation, (1) map creation; (2) collecting points; (3) segmenting regions by anatomy; (4) map view (rotate and zoom); (5) select/manipulate a number of maps and view each; (6) selection of signal trace display; (7) adjust EP signal amplitude; (8) sweep speed; (9) provide single button (or touch, multi-touch, gesture) for recording a segment, placing an event marker, and/or placing a lesion marker.
  • the screen layouts in the illustrative embodiment are exemplary only and not limiting in nature.
  • the UI logic 64 can thus implement alternative screen layouts for interaction by the user.
  • while the screen displays in FIGS. 3A, 5 and 6 show an approach that incorporates the top-level menu items on every screen, multi-level menus can also be used.
  • the screen layouts can be arranged such that a user descends down a series of screens to further levels of control.
  • a “Back” button or the like can be provided.
  • a “Home” button can be provided.
  • UI logic 64 can be configured for bi-directional display of information, for example, on the touch-responsive display panel.
  • in the "EnSite" user interface ( FIG. 3A ), for example, the user interface provided by UI logic 64 can allow the user to drag his or her finger on the panel to rotate the model.
  • the display of the model provides context with respect to the act of dragging.
  • Other information can be displayed as well, such as a waveform.
  • all or a portion of the items/windows displayed on monitor 20 (see, e.g., the figures described above), such as display area or window 52 , may be displayed on the touch-responsive display panel, allowing the physician or other user to directly modify the features of window 52 at the patient's bedside.
  • Other display areas/windows, such as windows 50 , 54 , 56 , 58 , and/or 48 (see FIG. 2 ) may also be displayed and/or modified on the touch-panel display panel.
  • One further example involves displaying feedback information or messages originating from the various devices or systems back to the touch-responsive display panel.
  • the UI logic 64 can configure any of the user-interfaces to have a message area, which can show informational messages, warning messages or critical error messages for viewing by the user.
  • the message area feature provides a way to immediately alert the physician to such messages, rather than the physician having to watch for messages on multiple displays.
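A sketch of the message-area idea: feedback arriving from the connected systems is ranked by severity so the most urgent items reach the physician's panel first. The severity names follow the informational/warning/critical wording above; the sorting rule, message format, and display limit are assumptions.

```python
# Hypothetical sketch of selecting the most urgent system messages for the
# bedside panel's message area.
SEVERITY_ORDER = {"critical": 0, "warning": 1, "info": 2}


def messages_for_display(messages: list, limit: int = 3) -> list:
    ranked = sorted(messages, key=lambda m: SEVERITY_ORDER[m["severity"]])
    return [f"[{m['severity'].upper()}] {m['source']}: {m['text']}" for m in ranked[:limit]]


incoming = [
    {"severity": "info", "source": "EnSite", "text": "Point collected"},
    {"severity": "critical", "source": "Ablation", "text": "Irrigation flow low"},
]
print(messages_for_display(incoming))  # critical message listed first
```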
  • FIG. 7A is a diagrammatic and block diagram view of a second embodiment of the bedside interface device, comprising an electronic wand system 26 b .
  • wand system 26 b is configured to allow the EP physician to take control, bedside of the patient, of an EP diagnostic or therapeutic system, such as the electro-anatomic mapping system 34 .
  • the wand system 26 b includes a wireless remote control portion 102 , an optical emitter portion 104 , and a base interface 28 b , which may be coupled to the desired, target EP system through either a wired or wireless connection.
  • the wand system 26 b incorporates remote control technology, and includes the ability to detect and interpret motion of the remote control indicative of an EP physician's command or other instruction, detect and interpret key-presses on the remote control, and/or detect and interpret motion/keypress combinations.
  • wand system 26 b may be configured with a disposable remote control portion 102 , with a reusable remote control portion 102 that is contained within an enclosure compatible with sterilization procedures, with a reusable remote control portion 102 adapted to be secured in a sterilization-compatible wrapper, or with a reusable remote control portion 102 that is encased in a sterile but disposable wrapper.
  • remote control portion 102 may include an optical detector 106 , an electronic processor 108 , a memory 110 , an optional accelerometer 112 and a wireless transmitter/receiver 114 .
  • the processor 108 is configured to execute a control program that is stored in memory 110 , to achieve the functions described below.
  • the optical emitter 104 is configured to emit a light pattern 105 that can be detected and recognized by optical detector 106 .
  • the light pattern may be a pair of light sources spaced apart by a predetermined, known distance.
  • the control program in remote 102 can be configured to assess movement of the light pattern 105 as detected by detector 106 (e.g., by assessing a time-based sequence of images captured by detector 106 ).
  • processor 108 can be configured to determine the locations of the light sources (in pixel space).
  • the control program in remote 102 may only discern the light pattern 105 itself (e.g., the locations in pixel space) and transmit this information to base interface 28 b , which in turn assesses the movement of the detected light pattern in order to arrive at a description of the motion of the remote 102 .
  • various aspects of the processing may be divided between processor 108 and a processor (not shown) contained in base interface 28 b .
  • the processor 108 communicates with base interface 28 b via the wireless transmitter/receiver 114 , which may be any type of wireless communication method now known or hereafter developed (e.g., such as those technologies or standards branded BluetoothTM, Wi-FiTM, etc.).
  • the processor 108 is configured to transmit wirelessly to interface 28 b the detected keypresses and information concerning the motion of the remote control 102 (e.g., the information about or derived from the images from the optical detector 106 ).
  • the motion of remote control 102 may also be determined, or supplemented by, readings from accelerometer 112 (which may be single-axis or multi-axis, such as a 3-axis accelerometer). In some instances, rapid motion may be better detected using an accelerometer than using optical methods.
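To illustrate how motion of the remote might be derived from the detected light pattern, the sketch below summarizes the two light-source positions (in pixel space) per frame and compares consecutive frames: the midpoint shift suggests pointing or panning, the change in apparent spacing suggests moving toward or away from the emitter, and the line angle suggests roll. This geometry, and any blending with accelerometer readings, is an assumption for illustration rather than the disclosed algorithm.

```python
# Hypothetical sketch: frame-to-frame motion of remote 102 estimated from the
# pixel-space positions of the two light sources in pattern 105.
import math
from typing import Tuple

Pixel = Tuple[float, float]


def pattern_pose(p1: Pixel, p2: Pixel) -> dict:
    """Summarize one detected frame: midpoint, apparent spacing, and roll angle."""
    mid = ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)
    spacing = math.hypot(p2[0] - p1[0], p2[1] - p1[1])  # shrinks as the remote backs away
    roll = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))
    return {"midpoint": mid, "spacing_px": spacing, "roll_deg": roll}


def motion_between(a: dict, b: dict) -> dict:
    """Estimate pan, zoom, and roll change between two frames."""
    return {
        "pan_px": (b["midpoint"][0] - a["midpoint"][0], b["midpoint"][1] - a["midpoint"][1]),
        "zoom_ratio": b["spacing_px"] / a["spacing_px"],
        "roll_delta_deg": b["roll_deg"] - a["roll_deg"],
    }


f0 = pattern_pose((300.0, 240.0), (340.0, 240.0))
f1 = pattern_pose((310.0, 236.0), (354.0, 236.0))
print(motion_between(f0, f1))
```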
  • electronic wand system 26 b may be similar to (but differing in application, as described herein) a commercially available game controller sold under the trade designation Wii Remote Controller, from Nintendo of America, Inc.
  • Either the remote 102 or the base interface 28 b (or both, potentially in some division of computing labor) is configured to identify a command applicable to the one of the EP diagnostic/therapeutic systems, such as electro-anatomic mapping system 34 , based on the detected motion of the remote 102 .
  • the command may be identified based on a key press, or a predetermined motion/key press combination.
  • the wireless remote control 102 is configured to allow an EP physician to issue a wide variety of commands, for example only, any of the commands (e.g., 3D model rotation, manipulation, etc.) described above in connection with touch panel computer 26 a .
  • by encoding at least some of the control through the wireless remote control 102 that the EP physician controls, one or more of the shortcomings of conventional EP labs, as described in the Background, can be minimized or eliminated.
  • electronic wand system 26 b can reduce procedure times as the EP physician will spend less time playing “hot or cold” with the mapping system operator (i.e., the control technician), but instead can set the display to his/her needs throughout the medical procedure.
  • FIG. 7B shows a further embodiment, designated interface device 26 c .
  • Interface device 26 c integrates the remote control 102 described above into the handle of a catheter 115 .
  • the physician need not take his hands off the catheter, but rather can issue direct, physical commands (e.g., via key-presses) while retaining control of the catheter.
  • one or more of the keys or a slider switch on the catheter handle may serve as a safety mechanism to prevent inadvertent activation of one or more commands while operating the catheter.
  • the safety mechanism may be deactivated or otherwise turned off such that the physician can issue commands and then he or she may reactivate or turn on the safety mechanism and resume manipulating the catheter without fear of modifying the view or model shown on an on-screen display, for example.
  • the catheter 115 may further comprise one or more electrodes on a distal portion of the catheter shaft and a manual or motorized steering mechanism (not shown) to enable the distal portion of the catheter shaft to be steered in at least one direction.
  • the catheter handle may be generally symmetric on opposing sides and include identical or nearly identical sets of controls on opposing sides of the handle so that a physician need not worry about which side of the catheter handle contains the keys.
  • the catheter handle may be generally cylindrical in shape and include an annular and/or rotatable control feature for issuing at least one command, again so the physician need not worry about the catheter handle's orientation in his or her hand(s).
  • Exemplary catheters, handles, and steering mechanisms are shown and described in U.S. Pat. No. 5,861,024 to Rashidi, U.S. patent application publication no. 2010/0314031 to Heideman et al., U.S. Pat. No. 7,465,288 to Dudney et al., and U.S. Pat. No. 6,671,533 to Chen et al., each of which is hereby incorporated by reference as though fully set forth herein.
  • FIG. 8 is a diagrammatic and block diagram view of a fourth embodiment of the bedside interface device, comprising a motion capture apparatus 26 d .
  • motion capture apparatus 26 d is configured to allow the EP physician to take control, bedside of the patient, of an EP diagnostic or therapeutic system, such as electro-anatomical mapping system 34 .
  • the motion capture apparatus 26 d includes a capture apparatus 116 having both an optical sub-system 118 and a microphone sub-system 120 where the apparatus 116 is coupled to a base interface 28 b .
  • the apparatus 116 is configured to optically detect the motion or physical gestures of the EP physician or other user when such movements occur within a sensing volume 122 .
  • the base interface 28 b may be coupled to the desired, target EP system through either a wired or wireless connection.
  • the motion capture apparatus 26 d includes the capability to detect hand/arm/leg/trunk/facial motions (e.g., gestures) of the EP physician or other user and translate the detected patterns into a desired command.
  • Apparatus 26 d also includes audio capture and processing capability and thus also has the capability to detect speech and translate the same into desired commands.
  • apparatus 26 d is configured to detect and interpret combinations and sequences of gestures and speech into desired commands.
  • the base interface 28 b is configured to communicate the commands (e.g., rotation, zoom, pan of a 3D anatomical model) to the appropriate EP diagnostic or therapeutic system (e.g., the electro-anatomic mapping system 34 ).
  • the motion capture apparatus 26 d may comprise commercially available components, for example, the Kinect™ game control system, available from Microsoft, Redmond, Wash., USA.
  • In an embodiment, a Kinect™ software development kit (SDK) provides rich application programming interfaces (API's).
  • the SDK allows access to raw sensor streams (e.g., depth sensor, color camera sensor, and four-element microphone array), skeletal tracking, advanced audio (i.e., integration with Windows speech recognition) as well as other features.
  • the motion capture apparatus 26 d can reduce procedure times.
  • the motion capture apparatus 26 d can be used in concert with sensors and/or emitters in a sterile glove to assist the apparatus 26 d to discriminate commands intended to be directed to one of the EP systems, versus EP physician hand movements that result from his/her manipulation of the catheter or medical device, versus other movement in the EP lab in general.
  • the motion capture apparatus 26 d may discriminate such commands by being “activated” by a user when a specific verbal command is issued (e.g., “motion capture on”) and then “deactivated” by the user when another specific verbal command is issued (e.g., “motion capture off”).
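  • By way of illustration, a minimal sketch of the activation behavior described above is given below, assuming a small gesture vocabulary: gestures are translated into mapping-system commands only between a “motion capture on” and a “motion capture off” verbal command, so that catheter manipulation and other movement in the lab is not misread as a command. The gesture names and command parameters are illustrative assumptions.

```python
# Illustrative gesture vocabulary; the actual patterns are defined in advance
# to correspond to commands for the mapping system.
GESTURE_TO_COMMAND = {
    "swipe_left": ("rotate_view", {"axis": "y", "degrees": -15}),
    "swipe_right": ("rotate_view", {"axis": "y", "degrees": +15}),
    "hands_apart": ("zoom", {"factor": 1.2}),
    "hands_together": ("zoom", {"factor": 0.8}),
}


class MotionCaptureController:
    def __init__(self, send_to_mapping_system):
        self._send = send_to_mapping_system
        self._active = False  # gestures are ignored until verbally activated

    def on_speech(self, phrase):
        if phrase == "motion capture on":
            self._active = True
        elif phrase == "motion capture off":
            self._active = False

    def on_gesture(self, gesture):
        # Gestures detected while deactivated (e.g., catheter manipulation or
        # general movement in the lab) are not treated as commands.
        if self._active and gesture in GESTURE_TO_COMMAND:
            self._send(*GESTURE_TO_COMMAND[gesture])
```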
  • FIGS. 9-10 are diagrammatic views of fifth and sixth embodiments of the bedside interface device, comprising touch responsive devices.
  • FIGS. 9 and 10 show touch-screen mouse pad devices 26 e and 26 f , respectively. These devices can be covered in a sterile bag.
  • the EP physician 16 can move the mouse cursor from application to application and control each such application independently.
  • Devices 26 e , 26 f may comprise conventional apparatus known in the art.
  • FIG. 11 is a diagrammatic view of a seventh embodiment of the bedside interface device comprising a customized joystick 26 g .
  • Joystick 26 g can also be covered in a sterile bag.
  • the device 26 g can be used to provide application-specific control of particular application function(s), such as rotating a 3D model (system 34), adding lesion markers, and the like.
  • FIGS. 12-13 are diagrammatic views of eighth and ninth embodiments of the bedside interface device comprising holographic mouse and keyboard input devices, respectively.
  • Holographic mouse 26 h deploys light beam pattern 124 , which is used by the mouse 26 h to acquire user input (i.e., movement of the physician's finger, instead of moving a conventional mouse). The movement input can be used in the same manner as that obtained from a conventional mouse.
  • Holographic keyboard 26 i also deploys a light beam pattern 126 corresponding to a keyboard.
  • a physician's finger can be used to “select” the key much in the same manner as a conventional keyboard, but without any physical contact.
  • Devices 26 h , 26 i have the advantage of being sterile without any disposables, and can incorporate wireless communications and may be powered using batteries (i.e., no cables needed).
  • primary control by the physician in manipulating or interacting with the mapping system may be through use of voice control alone (i.e., a microphone coupled with voice recognition logic), apart from its inclusion with other modes or devices for user interaction described above.
  • the physician can be equipped with headgear that monitors head movements to determine at what location on the screen/monitor the physician is looking. In effect, such headgear can act as a trackball to move or otherwise manipulate an image (or view of a model) on the monitor in accordance with the physician's head movements.
  • the physician can be equipped with headgear that monitors head movements and/or also monitors brainwave patterns (e.g., to record a user electroencephalogram (EEG)).
  • Such monitored data can be analyzed to derive or infer user input or commands for controlling an image (or view of a model), as described above.
  • An EEG-based embodiment may comprise conventional apparatus known in the art, for example, commercially available products respectively sold under the trade designation MindWave™ headset from NeuroSky, Inc., San Jose, Calif., USA, or the Emotiv EPOC™ personal interface neuroheadset from Emotiv, Kwun Tong, Hong Kong.
  • the physician can be equipped with an eye tracking apparatus, wherein monitored eye movements constitute the user input to be interpreted by the system (e.g., the eye movements can be interpreted as a cursor movement or other command).
  • a catheter movement controller (not shown) described above may be incorporated into a larger robotic catheter guidance and control system, for example, as seen by reference to U.S. application Ser. No. 12/751,843 filed Mar. 31, 2010 entitled ROBOTIC CATHETER SYSTEM (published as U.S. patent application publication no. 2010/0256558), owned by the common assignee of the present invention and hereby incorporated by reference in its entirety as though fully set forth herein.
  • Such a robotic catheter system may be configured to manipulate and maneuver catheters within a lumen or a cavity of a human body, while the bedside interface devices described herein can be used to access and control the EP diagnostic and/or therapeutic systems.
  • a bedside interface device as described herein may also be used to access and control the robotic catheter system.
  • an article of manufacture includes a computer storage medium having a computer program encoded thereon, where the computer program includes code for acquiring user input based on at least one of a plurality of input modes, such as by touch, multi-touch, gesture, motion pattern, voice recognition and the like, and identifying one or more commands or requests for an EP diagnostic and/or therapeutic system.
  • Such embodiments may be configured to execute on one or more processors, which may be integrated into a single system or distributed over and connected together through a communications network, where the network may be wired or wireless.
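  • For illustration only, the sketch below suggests the kind of dispatch logic such a computer program could contain: an acquired user input, tagged with its input mode, is resolved to a command or request for a particular EP diagnostic and/or therapeutic system. The event structure, phrases, and command names are assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class InputEvent:
    mode: str      # "touch", "multi_touch", "gesture", "motion", "voice", ...
    payload: dict  # mode-specific data (button name, phrase, coordinates, etc.)


def identify_command(event):
    """Resolve an acquired user input to (target system, command, arguments)."""
    if event.mode == "voice" and event.payload.get("phrase") == "record that":
        return ("recording_system", "record_segment", {})
    if event.mode == "touch" and event.payload.get("button") == "Add Lesion":
        return ("mapping_system", "place_lesion_marker", {})
    if event.mode == "gesture" and event.payload.get("name") == "swipe_right":
        return ("mapping_system", "rotate_view", {"degrees": 15})
    return None  # unrecognized input is simply ignored


# Example: a spoken phrase is resolved to a command for the recording system.
command = identify_command(InputEvent("voice", {"phrase": "record that"}))
```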
  • an electronic control unit as described above may include conventional processing apparatus known in the art, capable of executing preprogrammed instructions stored in an associated memory, all performing in accordance with the functionality described herein. It is contemplated that the methods described herein may be programmed, with the resulting software being stored in an associated memory and where so described, may also constitute the means for performing such methods. Implementation of an embodiment of the invention, in software, in view of the foregoing enabling description, would require no more than routine application of programming skills by one of ordinary skill in the art. Such a system may further be of the type having both ROM, RAM, a combination of non-volatile and volatile (modifiable) memory so that the software can be stored and yet allow storage and processing of dynamically produced data and/or signals.
  • joinder references do not necessarily infer that two elements are directly connected and in fixed relation to each other. It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting. Changes in detail or structure may be made without departing from the spirit of the invention as defined in the appended claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Gynecology & Obstetrics (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Multimedia (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

In an electrophysiology (EP) lab, a bedside interface device allows an EP physician to directly control various diagnostic and therapeutic systems, including an electro-anatomic mapping system. The bedside interface device can include a computer with wireless communication capability as well as a touch-responsive display panel and voice recognition. The bedside interface device can also be a hand-graspable wireless remote control device that is configured to detect motions or gestures made with the remote control by the physician, allowing the physician to directly interact with the mapping system. The bedside interface device can also be a motion capture camera configured to determine motion patterns of the physician's arms, legs, trunk, face and the like, which are defined in advance to correspond to commands for the mapping system. The bedside interface device may also include voice recognition capabilities to allow a physician to directly issue verbal commands to the mapping system.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application is a continuation patent application of U.S. patent application Ser. No. 13/208,924 (the '924 application), filed 12 Aug. 2011. The '924 application is hereby incorporated by reference in its entirety as though fully set forth herein.
  • BACKGROUND OF THE INVENTION
  • a. Field of the Invention
  • The instant disclosure relates generally to electrophysiology lab integration, and more particularly to user interfaces and devices for electrophysiology lab diagnostic and therapeutic equipment.
  • b. Background Art
  • It is known to provide an electrophysiology lab in a medical facility. Such a lab may have use of a wide variety of diagnostic and therapeutic equipment useful in rendering medical service to a patient, such as imaging systems (e.g., fluoroscopy, intracardiac echocardiography, etc.), an electro-anatomic visualization, mapping and navigation system, ablation energy sources (e.g., radio frequency (RF) ablation generator), a recording system (e.g., for ECG, cardiac signals, etc.), a cardiac stimulator and the like. In a typical configuration, as seen by reference to FIG. 1, a procedure room 10 (i.e., a sterile environment) may have an associated control area or room 12, which is commonly outfitted with one or more control stations 14 1, 14 2, . . . 14 n that are operated by one or more control technicians. Each control station may include a respective display monitor, keyboard and mouse for use by the technician. Depending on the lab setup, the control station(s) may be across the room, or outside of the procedure room 10 completely, perhaps configured with a common window to allow the technician(s) to observe the procedure room through the window. These control station(s) allow access to and may be used to control the diagnostic and therapeutic equipment mentioned above.
  • In conventional practice, an electrophysiology (EP) physician 16 is scrubbed into a sterile procedure and typically manipulates one or more catheters (not shown) in a sterile drape covered body of the patient 18. The physician's sterile gloved hands are typically engaged with the catheter handle and shaft next to the patient and he or she is therefore unable to directly make changes himself to any of the EP systems. The procedure room 10 typically includes one or more monitors (e.g., an integrated multi-display monitor 20 is shown) arranged so that the physician 16 can see the monitor 20 on which is displayed various patient information being produced by the diagnostic and therapeutic equipment mentioned above. In FIG. 1, multiple applications, for example, an electro-anatomic mapping application (e.g., EnSite Velocity™) and an EP signal acquisition and recording application, direct a visual output to a respective display area of monitor 20. When changes to an application are needed, the physician 16 verbalizes such commands to the control technicians in the control area/room 12 who are working at the various control stations 14 1, 14 2, . . . 14 n. The multiple technicians at multiple control stations use multiple keyboard/mouse sets to control the multiple applications. The verbal commands between the physician and the technician occur throughout the procedure.
  • For example, the EP physician 16 can verbally communicate (i.e., to the control technician—a mapping system operator) the desired view of the map to be displayed, when to collect points, when to separate anatomic locations, and other details of creating and viewing an anatomic map. The EP physician 16 can also communicate which signal traces to show, the desired amplitude, when to drop a lesion marker, and when to record a segment, to name a few. Where the technician is in a separate room, communication can be facilitated using radio.
  • While some commands are straightforward, for example, “LAO View”, “record that” and “stop pacing”, other commands are not as easy to clearly communicate. For example, how much rotation of a model the command “rotate a little to the right” means can be different as between the physician and the technician. This type of command therefore involves a question of degree. Also, depending on the physician-technician relationship, other requests related to the mapping system views and setup can be misinterpreted. For example, a request to “rotate right” may mean to rotate the model right (i.e., rotate view left) when originating from one physician but can alternatively mean rotate view right (i.e., rotate model left) when coming from another physician. This type of command therefore involves physician-technician agreement as to convention. Furthermore, implementation of requests for event markers, segment recordings, lesion markers and the like can be delayed by the time it takes the technician to hear, understand and act on a physician's command. Ambient discussions and/or equipment noise in and around the EP lab can increase this delay.
  • There is therefore a need for improvements in EP lab integration that minimize or eliminate one or more of the problems set forth above.
  • BRIEF SUMMARY OF THE INVENTION
  • One advantage of the methods and apparatuses described, depicted and claimed herein is that they provide an EP physician with the capability of directly controlling an EP diagnostic or therapeutic system, such as an electro-anatomic mapping system. This capability eliminates the need for the physician to first communicate his/her wishes to a control technician, who in turn must hear, interpret and act on the physician's command. The improved control paradigm results in reduced times for medical procedures.
  • A device for allowing a user to control an electro-anatomic mapping system includes an electronic control unit (ECU) and input means, using the ECU, for acquiring a user input with respect to a view of an anatomical model of at least a portion of a body of a patient. The user input is selected from the group comprising a user touch, a user multi-touch, a user gesture, a verbal command, a motion pattern of a user-controlled object, a user motion pattern and a user electroencephalogram. The ECU is configured to communicate the acquired input to the mapping system for further processing.
  • In an embodiment, the acquired user input can correspond to any of a variety of mapping systems commands, for example only at least one of: (1) creating a map with respect to the view; (2) collecting points with respect to the view; (3) segmenting regions by anatomy with respect to the view; (4) rotating the view; (5) enlarging or reducing a portion of the view; (6) panning the view; (7) selecting one of a plurality of maps for the view; (8) selecting a signal trace; (9) adjusting a signal amplitude; (10) adjusting a sweep speed; (11) recording a segment; (12) placing an event marker; (13) placing a lesion marker with respect to the view; (14) activating a replay feature of a stored, temporally varying physiologic parameter and (15) activating a replay of a stored video clip.
  • In an embodiment, the input means includes a touch-responsive display panel coupled to the ECU. The input means also includes user interface logic (executed by the ECU) configured to display a user interface on the touch-responsive display panel. The user interface logic is further configured to allow a user to interact with the touch-responsive panel for acquiring the above-mentioned user input with respect to the anatomical model. The user interface in combination with the touch-panel allows the user to provide input by way of touch, multi-touch, and gesture. In a further embodiment, the device further includes voice recognition logic configured to recognize a set of predefined verbal commands spoken by the user (e.g., the physician). In a still further embodiment, the device includes wireless communications functionality, improving portability of the device within a procedure room or the control room. In a still further embodiment, the user interface logic is configured to present a plurality of application-specific user interfaces associated with a plurality of different diagnostic or therapeutic systems. Through this capability, the user can rapidly switch between application-specific user interfaces (e.g., such as that for an electro-anatomic mapping system, an EP recording system, an ultrasound imaging system, a cardiac stimulator, etc.), while remaining bedside of the patient, and without needing to communicate via a control technician.
  • In another embodiment, the input means includes a remote control having a handle configured to be grasped by the user. The remote control includes logic configured to acquire the above-mentioned user input. The user input may include user-controlled motion patterns of the remote control, as well as user key-presses on the remote control. The device is also configured to communicate the acquired user input to the mapping system.
  • In yet another embodiment, the input means includes a motion capture apparatus configured to acquire imaging of movements of the user. The device includes logic configured to identify a motion pattern using the acquired imaging from the motion capture apparatus. The logic is further configured to produce a command, based on the identified motion pattern, and communicate the command to the electro-anatomic mapping system for further processing. The motion capture apparatus provides the capability of receiving input by way of physician gestures (e.g., hand, arm, leg, trunk, facial, etc.). In a further embodiment, the device further includes voice recognition logic configured to identify verbal commands spoken by the user.
  • Corresponding methods are also presented.
  • The foregoing and other aspects, features, details, utilities, and advantages of the present disclosure will be apparent from reading the following description and claims, and from reviewing the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram view of a conventional electrophysiology lab having a sterile procedure room and an associated control room.
  • FIG. 2 is a block diagram view of an embodiment of an electrophysiology lab having a bedside interface device for controlling diagnostic and therapeutic equipment.
  • FIG. 3A is a plan view of a first embodiment of a bedside interface device comprising a touch panel computer, suitable for use in the EP lab of FIG. 2, and showing a first application-specific user interface.
  • FIG. 3B is an isometric view of a sterile drape configured to isolate the touch panel computer of FIG. 3A.
  • FIG. 4A is a view of a monitor shown in FIG. 2, showing multiple inset displays associated with a plurality of diagnostic and/or therapeutic systems.
  • FIG. 4B is a view of the monitor of FIG. 4A, showing a zoomed-in window of the display associated with an electro-anatomic mapping system.
  • FIG. 5 is a plan view of the touch panel computer of FIG. 3A showing a second application-specific user interface.
  • FIG. 6 is a plan view of the touch panel computer of FIG. 3A showing a third application-specific user interface.
  • FIG. 7A is a diagrammatic and block diagram view of a second embodiment of the bedside interface device comprising an electronic wand system.
  • FIG. 7B is a diagrammatic view of a third embodiment of the bedside interface device wherein a catheter is integrated with the remote control portion of FIG. 7A.
  • FIG. 8 is a diagrammatic and block diagram view of a fourth embodiment of the bedside interface device comprising a motion capture apparatus.
  • FIGS. 9-10 are diagrammatic views of fifth and sixth embodiments of the bedside interface device comprising touch responsive surface devices that can be covered in a sterile bag.
  • FIG. 11 is a diagrammatic view of a seventh embodiment of the bedside interface device comprising a customized joystick that can be covered in a sterile bag.
  • FIGS. 12-13 are diagrammatic views of eighth and ninth embodiments of the bedside interface device comprising holographic mouse and keyboard input devices, respectively.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Referring now to the drawings wherein like reference numerals are used to identify identical or similar components in the various views, FIG. 2 is a diagrammatic overview of an electrophysiology (EP) laboratory in which embodiments of the present invention may be used. FIG. 2 shows a sterile procedure room 10 where an EP physician 16 is set to perform one or more diagnostic and/or therapeutic procedures. It should be understood that the separate control area/room 12 of FIG. 1 (not shown in FIG. 2) may continue to be used in conjunction with the bedside interface device to be described below. FIG. 2 also shows multi-display monitor 20 as well as a procedure table or bed 22. While procedure room 10 may include multiple, individual monitors, monitor 20 may be a multi-display monitor configured to display a plurality of different input channels in respective display areas on the monitor. In an embodiment, the monitor 20 may be a commercially available product sold under the trade designation VantageView™ from St. Jude Medical, Inc. of St. Paul, Minn., USA, which can have a 3840×2160 Quad-HD screen resolution with the flexibility to accept up to sixteen (16) digital or analog image inputs while displaying up to eight (8) images on one screen at one time. The procedure table 22, which may be of conventional construction, is configured to receive a patient (not shown) on whom diagnostic and/or therapeutic procedure(s) are to be performed.
  • FIG. 2 further shows means or apparatus 24 for facilitating physician interaction with one or more diagnostic and/or therapeutic systems. Means or apparatus 24 includes a bedside interface device 26 and optionally one or more base interfaces 28. Means or apparatus 24 provides the mechanism for the EP physician 16 to directly interact with such systems without the need for the intermediate step of verbalizing commands to a control technician, as described in connection with FIG. 1. In this regard, bedside interface device 26 is configured to present a user interface or other input logic with which the user (e.g., the EP physician 16) can directly interact or from which an input can be acquired. In multiple embodiments, various modes of interaction are presented, such as interaction via a user touch, a user multi-touch, a user gesture, a verbal command, a motion pattern of a user-controlled device, a user motion pattern and a user electroencephalogram. In addition, bedside interface device 26 can be configured to communicate with one or more of the diagnostic/therapeutic systems either wirelessly (as shown) or via a wired connection (not shown).
  • The base interface 28 is configured to interpret and/or facilitate directing the input acquired by the bedside interface device 26 to the appropriate one or more diagnostic and/or therapeutic systems (e.g., an electro-anatomic mapping system). In an embodiment, base interface 28 is centralized (as shown), wherein all communications with bedside device 26 occur through base interface 28. In a further embodiment, base interface 28 may be functionally distributed, wherein interface functions are located within each diagnostic or therapeutic system. In a still further embodiment, communications between bedside interface 26 and certain ones of the diagnostic/therapeutic systems can be centralized, while communications with other ones of the diagnostic/therapeutic systems can occur directly (i.e., separately).
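  • As an illustration of the centralized variant described above, the following sketch routes commands acquired by a bedside interface device to a registered target system; the registry keys and the handle() call are assumptions made for the example rather than a prescribed interface.

```python
class BaseInterface:
    """Illustrative centralized router between a bedside device and EP systems."""

    def __init__(self):
        self._systems = {}  # e.g., "mapping", "recording", "ultrasound", ...

    def register(self, name, system):
        self._systems[name] = system

    def route(self, target, command, **kwargs):
        system = self._systems.get(target)
        if system is None:
            raise KeyError("no such target system: " + target)
        # Each registered system object is assumed to expose a handle() method.
        return system.handle(command, **kwargs)


# In the distributed variant, each diagnostic or therapeutic system would host
# its own interface and the bedside device would address it directly.
```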
  • The means or apparatus 24 addresses a number of the shortcomings of the conventional practice as described in the Background. For example, means or apparatus 24 allows the EP physician 16 to directly input levels of degree, for example, how much to rotate a view, as opposed to trying to verbally communicate “how much” to a control technician. Further, the use of means or apparatus 24 avoids the potential confusion that can sometimes occur between the EP physician and the control technician as to convention (i.e., does “rotate right” mean rotate the view or the model?). In addition, the use of means or apparatus 24 reduces or eliminates the inherent time delay between the time when the EP physician verbally issues a command and the time when the command is understood and acted upon by the technician.
  • With continued reference to FIG. 2, the physician 16 will typically have access to a plurality of diagnostic and/or therapeutic systems in order to perform one or more medical procedures. In the illustrative embodiment, the physician 16 may have access to a first imaging system, such as a fluoroscopic imaging system 30, a second imaging system, such as an intracardiac ultrasound or echocardiography (ICE) imaging system 32, an electro-anatomic positioning, mapping, and visualization system 34, a further positioning system, such as a medical positioning system (magnetic-field based) 36, a patient data (electrophysiological (EP) data) monitoring and recording system 38, a cardiac stimulator 40, an EP data editing/monitoring system 42 and an ablation system 44. FIG. 2 schematically shows a communication mechanism 46 which facilitates communication between and among the various systems described above. It should be understood, however, that the communications mechanism 46 may not necessarily function to enable communications between each and every system shown.
  • The fluoroscopic imaging system 30 may comprise conventional apparatus known in the art, for example, single plane or bi-plane configurations. A display area 48 that is shown on monitor 20 corresponds to the display output of fluoroscopic imaging system 30.
  • The intracardiac ultrasound and/or intracardiac echocardiography (ICE) imaging system 32 may also comprise conventional apparatus known in the art. For example, in one embodiment, the system 32 may comprise a commercial system available under the trade designation ViewMate™ Z intracardiac ultrasound system compatible with a ViewFlex™ PLUS intracardiac echocardiography (ICE) catheter, from St. Jude Medical, Inc. of St. Paul, Minn., USA. The system 32 is configured to provide real-time image guidance and visualization, for example, of the cardiac anatomy. Such high fidelity images can be used to help direct diagnosis or therapy during complex electrophysiology procedures. A display area 50 that is shown on monitor 20 corresponds to the display output of the ultrasound imaging system 32.
  • The system 34 is configured to provide many advanced features, such as visualization, mapping, navigation support and positioning (i.e., determine a position and orientation (P&O) of a sensor-equipped medical device, for example, a P&O of a distal tip portion of a catheter). Such functionality can be provided as part of a larger visualization, mapping and navigation system, for example, an ENSITE VELOCITY™ cardiac electro-anatomic mapping system running a version of EnSite NavX™ navigation and visualization technology software commercially available from St. Jude Medical, Inc., of St. Paul, Minn. and as also seen generally by reference to U.S. Pat. No. 7,263,397 entitled “METHOD AND APPARATUS FOR CATHETER NAVIGATION AND LOCATION AND MAPPING IN THE HEART” to Hauck et al., or U.S. Patent Publication No. 2007/0060833 A1 to Hauck entitled “METHOD OF SCALING NAVIGATION SIGNALS TO ACCOUNT FOR IMPEDANCE DRIFT IN TISSUE”, both owned by the common assignee of the present invention, and both hereby incorporated by reference in their entireties as though fully set forth herein. System 34 can be configured to perform further advanced functions, such as motion compensation and adjustment functions. Motion compensation may include, for example, compensation for respiration-induced patient body movement, as described in copending U.S. patent application Ser. No. 12/980,515, entitled “DYNAMIC ADAPTIVE RESPIRATION COMPENSATION WITH AUTOMATIC GAIN CONTROL”, which is hereby incorporated by reference in its entirety as though fully set forth herein. System 34 can be used in connection with or for various medical procedures, for example, EP studies or cardiac ablation procedures.
  • System 34 is further configured to generate and display three dimensional (3D) cardiac chamber geometries or models, display activation timing and voltage data to identify arrhythmias, and to generally facilitate guidance of catheter movement in the body of the patient. For example, a display area 52 that is shown on monitor 20 corresponds to the display output of system 34 and can be viewed by physician 16 during a procedure; it can visually communicate information of interest or need to the physician. The display area 52 in FIG. 2 shows a 3D cardiac model, which, as will be described below in greater detail, may be modified (i.e., rotated, zoomed, etc.) pursuant to commands given directly by physician 16 via the bedside interface device 26.
  • System 36 is configured to provide positioning information with respect to suitably configured medical devices (i.e., those including a positioning sensor). System 36 may use, at least in part, a magnetic field based localization technology, comprising conventional apparatus known in the art, for example, as seen by reference to U.S. Pat. No. 7,386,339 entitled “MEDICAL IMAGING AND NAVIGATION SYSTEM”, U.S. Pat. No. 6,233,476 entitled “MEDICAL POSITIONING SYSTEM”, and U.S. Pat. No. 7,197,354 entitled “SYSTEM FOR DETERMINING THE POSITION AND ORIENTATION OF A CATHETER”, all of which are hereby incorporated by reference in their entirety as though fully set forth herein. System 36 may comprise a gMPS™ medical positioning system commercially offered by MediGuide Ltd. of Haifa, Israel and now owned by St. Jude Medical, Inc. of St. Paul, Minn., USA. System 36 may alternatively comprise variants, which employ magnetic field generator operation, at least in part, such as a combination magnetic field and current field-based system such as the CARTO™ 3 System available from Biosense Webster, and as generally shown with reference to one or more of U.S. Pat. No. 6,498,944 entitled “Intrabody Measurement,” U.S. Pat. No. 6,788,967 entitled “Medical Diagnosis, Treatment and Imaging Systems,” and U.S. Pat. No. 6,690,963 entitled “System and Method for Determining the Location and Orientation of an Invasive Medical Instrument,” the entire disclosures of which are incorporated herein by reference as though fully set forth herein.
  • EP monitoring and recording system 38 is configured to receive, digitize, display and store electrocardiograms, invasive blood pressure waveforms, marker channels, and ablation data. System 38 may comprise conventional apparatus known in the art. In one embodiment, system 38 may comprise a commercially available product sold under the trade designation EP-WorkMate™ from St. Jude Medical, Inc. of St. Paul, Minn., USA. The system 38 can be configured to record a large number of intracardiac channels, may be further configured with an integrated cardiac stimulator (shown in FIG. 2 as stimulator 40), as well as offering storage and retrieval capabilities of an extensive database of patient information. Display areas 54, 56 shown on monitor 20 correspond to the display output of EP monitoring and recording system 38.
  • Cardiac stimulator 40 is configured to provide electrical stimulation of the heart during EP studies. Stimulator 40 can be provided in either a stand-alone configuration, or can be integrated with EP monitoring and recording system 38, as shown in FIG. 2. Stimulator 40 is configured to allow the user to initiate or terminate tachy-arrhythmias manually or automatically using preprogrammed modes of operation. Stimulator 40 may comprise conventional apparatus known in the art. In an embodiment, stimulator 40 can comprise a commercially available cardiac stimulator sold under the trade designation EP-4™ available from St. Jude Medical, Inc. of St. Paul, Minn., USA. The display area 58 shown on monitor 20 corresponds to the display output of the cardiac stimulator 40.
  • EP data editing/monitoring system 42 is configured to allow editing and monitoring of patient data (EP data), as well as charting, analysis, and other functions. System 42 can be configured for connection to EP data recording system 38 for real-time patient charting, physiological monitoring, and data analysis during EP studies/procedures. System 42 may comprise conventional apparatus known in the art. In an embodiment, system 42 may comprise a commercially available product sold under the trade designation EP-NurseMate™ available from St. Jude Medical, Inc. of St. Paul, Minn., USA.
  • To the extent the medical procedure involves tissue ablation (e.g., cardiac tissue ablation), ablation system 44 can be provided. The ablation system 44 may be configured with various types of ablation energy sources that can be used in or by a catheter, such as radio-frequency (RF), ultrasound (e.g. acoustic/ultrasound or HIFU), laser, microwave, cryogenic, chemical, photo-chemical or other energy used (or combinations and/or hybrids thereof) for performing ablative procedures. RF ablation embodiments may and typically will include other structure(s) not shown in FIG. 2, such as one or more body surface electrodes (skin patches) for application onto the body of a patient (e.g., an RF dispersive indifferent electrode/patch), an irrigation fluid source (gravity feed or pump), and an RF ablation generator (e.g., such as a commercially available unit sold under the model number IBI-1500T RF Cardiac Ablation Generator, available from St. Jude Medical, Inc.).
  • FIG. 3A is a plan view of a first embodiment of a bedside interface device comprising a computer 26 a, suitable for use in the EP lab of FIG. 2, and showing a first application-specific user interface. The computer 26 a includes a touch-responsive display panel and thus may be referred to hereinafter sometimes as a touch panel computer. The touch panel computer 26 a, as shown in inset in FIG. 3A, includes an electronic control unit (ECU) having a processor 60 and a computer-readable memory 62, user interface (UI) logic 64 stored in the memory 62 and configured to be executed by processor 60, a microphone 66 and voice recognition logic 68. In an embodiment, voice recognition logic 68 is also stored in memory 62 and is configured to be executed by processor 60. In an embodiment, the touch panel computer 26 a is configured for wireless communication to base interface 28 (best shown in FIG. 2). In addition, the touch panel computer 26 a is configured to draw operating power at least from a battery-based power source—eliminating the need for a power cable. The resulting portability (i.e., no cables needed for either communications or power) allows touch panel computer 26 a to be carried around by the EP physician 16 or other lab staff to provide control over the linked systems (described below) while moving throughout the procedure room 10 or even the control room 12. In another embodiment, touch panel computer 26 a can be wired for one or both of communications and power, and can also be fixed to the bedrail or in the sterile field.
  • In the illustrated embodiment, the UI logic 64 is configured to present a plurality of application-specific user interfaces, each configured to allow a user (e.g., the EP physician 16) to interact with a respective one of a plurality of diagnostic and/or therapeutic systems (and their unique interface or control applications). As shown in FIG. 3A, the UI logic 64 is configured to present on the touch panel surface of computer 26 a a plurality of touch-sensitive objects (i.e., “buttons”, “flattened joystick”, etc.), to be described below. In the illustrative embodiment, the UI logic 64 produces a first, application-selection group of buttons, designated as group 70, which are located near the top of the touch panel. Each of the buttons in group 70 is associated with a respective diagnostic and/or therapeutic system (and the control or interface application therefor). For example, the six buttons labeled “EnSite”, “WorkMate”, “EP4”, “NurseMate”, “MediGuide”, and “ViewMate” correspond to electro-anatomic mapping system 34 (for mapping control), EP recording system 38 (for patient data recording control), stimulator 40 (for stimulator control), EP data editing and monitoring system 42 (for charting), positioning system 36 (for positioning control) and ultrasound imaging system 32 (for ultrasound control), respectively.
  • When a user selects one of the buttons in group 70, the UI logic 64 configures the screen display of computer 26 a with an application-specific user interface tailored for the control of and interface with the particular EP system selected by the user. In FIG. 3A, the “EnSite” system is selected, so the UI logic 64 alters the visual appearance of the “EnSite” button so that it is visually distinguishable from the other, non-selected buttons in group 70. For example, when selected, the “EnSite” button may appear depressed or otherwise shaded differently than the other, non-selected buttons in group 70. This always lets the user know what system is selected. The UI logic 64, in an embodiment, also maintains the application-selection buttons in group 70 at the top of the screen regardless of the particular application selected by the user. This arrangement allows the user to move from system (application) to system (application) quickly and control each one independently.
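  • For illustration only, the sketch below captures the application-selection behavior described above: choosing one of the group 70 buttons records the selection (so it can be rendered as depressed or shaded) and swaps in the application-specific interface for that system. The class structure and callback are assumptions made for the example.

```python
APPLICATIONS = ("EnSite", "WorkMate", "EP4", "NurseMate", "MediGuide", "ViewMate")


class ApplicationSelector:
    def __init__(self, build_interface):
        # build_interface(name) returns the application-specific screen layout.
        self._build_interface = build_interface
        self.selected = None  # rendered as the depressed/shaded button in group 70

    def select(self, name):
        if name not in APPLICATIONS:
            raise ValueError("unknown application: " + name)
        self.selected = name
        # Group 70 stays at the top of the screen; only the lower portion of the
        # layout changes to the interface tailored for the chosen system.
        return self._build_interface(name)
```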
  • With continued reference to FIG. 3A, UI logic 64 presents an application-specific user interface tailored and optimized for control of and interaction with system 34. This user interface includes a second, common-task group of selectable buttons, designated group 72, a third, view-mode group of selectable buttons, designated group 74, a fourth, view-select group of selectable buttons, designated group 76, a flattened joystick 78 configured to receive view-manipulation input from the user, a voice recognition control button 80, and a settings button 82. Each group will be addressed in turn.
  • The second group 72 of buttons includes a listing of common tasks performed by an EP physician when interacting with system 34. Each of the buttons in group 72 is associated with a respective task (and resulting action). For example, the five buttons in group 72 are labeled “Zoom In”, “Zoom Out”, “Add Lesion”, “Freeze Point”, and “Save Point”. The “Zoom In” and “Zoom Out” buttons allow the user to adjust the apparent size of the 3D model displayed on monitor 20 (i.e., enlarging or reducing the 3D model on the monitor).
  • For example, FIG. 4A is a view of the monitor 20 of FIG. 2, showing multiple inset displays for different applications, where the display area (window) 52 1 shows the EnSite™ display output of a 3D electro-anatomic model at a first magnification level. FIG. 4B is a further view of monitor 20, showing a zoomed-in view of the same display area (window), now designated 52 2, which has an increased magnification level and thus apparent size. This change of course allows the physician to see details in window 52 2 that may not be easy to see in window 52 1.
  • Referring again to FIG. 3A, the “Add Lesion” button is configured to add a lesion marker to the 3D model. Other commands can be also be executed using the “Freeze Point” and “Save Point” buttons. It should be understood that variations are possible.
  • Each of the buttons in group 74 is associated with a respective display mode, which alters the display output of system 34 to suit the wishes of the physician. For example, the three selectable buttons labeled “Dual View”, “Right View”, and “Map View” re-configure the display output of system 34, as will appear on monitor 20.
  • Each of the buttons in group 76 is associated with a respective viewpoint from which the 3D electro-anatomic model is “viewed” (i.e., as shown in window 52 on monitor 20). Three of the five selectable buttons, namely those labeled “LAO”, “AP”, and “RAO”, allow the user to reconfigure the view point from which the 3D electro-anatomic model is viewed (i.e., left anterior oblique, anterior-posterior, right anterior oblique, respectively). The remaining two buttons, namely those labeled “Center at Surface” and “Center at Electrode” allow the user to invoke, respectively, the following functions: (1) center the anatomy shape in the middle of the viewing area; and (2) center the current mapping electrode or electrodes in the middle of the viewing area.
  • The flattened joystick 78 is a screen object that allows the user to rotate the 3D model displayed in the window 52. In addition, as the point of contact (i.e., physician's finger) with the joystick object 78 moves from the center or neutral position, for example at point 83, towards the outer perimeter (e.g., through point 84 to point 86), the magnitude of the input action increases. For example, the acceleration of rotation of the model or cursor will increase. While FIG. 3A shows the joystick object 78 as having three (3) gradations or concentric bands, it should be appreciated that this is for clarity only and not limiting in number. For example, in an embodiment, a relatively larger number of gradations or bands, such as ten (10), may be provided so as to effectively provide for a substantially continuous increase in sensitivity (or magnitude) as the point of contact moves toward the outer radius. In another embodiment, a single gradient may be continuous from the center position, point 83, to the outer edge of the joystick object 78, with the centermost portion of the gradient being the brightest in intensity or color and the outermost portion of the gradient being the darkest in intensity or color, for example. In yet another embodiment, a single gradient may be continuous from the center position, point 83, to the outer edge of the joystick object 78, with the centermost portion of the gradient being the darkest in intensity or color and the outermost portion of the gradient being brightest in intensity or color, for example.
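  • A minimal sketch of the joystick mapping described above is shown below, assuming a continuous gradient: the rotation direction follows the angle of the touch point and the rotation speed grows from zero at the center position to a maximum at the outer edge. The scaling constant and returned fields are illustrative assumptions.

```python
import math


def joystick_to_rotation(x, y, joystick_radius, max_speed_deg_per_s=90.0):
    """Map a touch at (x, y), measured from the joystick center, to a rotation.

    The rotation direction follows the angle of the touch point; the angular
    speed grows continuously from zero at the center to max_speed_deg_per_s
    at the outer edge of the flattened joystick.
    """
    r = math.hypot(x, y)
    if r == 0:
        return None  # neutral position: no rotation requested
    speed = min(r / joystick_radius, 1.0) * max_speed_deg_per_s
    direction = math.degrees(math.atan2(y, x))
    return {"direction_deg": direction, "speed_deg_per_s": speed}
```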
  • In a further embodiment, UI logic 64 can be further configured to present an additional button labeled “Follow Me” (not shown), which, when selected by the user, configures the electro-anatomic mapping system 34 for “follow me” control. This style of control is not currently available using a conventional keyboard and mouse interface. For “follow me” control, UI logic 64 is configured to receive a rotation input from the user via the touch panel (e.g., joystick 78); however, the received input is interpreted by system 34 as a request to rotate the endocardial surface rendering (the “map”) while maintaining the mapping catheter still or stationary on the display. In an embodiment, the physician can set the position and orientation of the mapping catheter, where it will remain stationary after the “Follow Me” button is selected.
  • Another feature of the touch panel computer 26 a is that it incorporates, in an embodiment, voice recognition technology. As described above, computer 26 a includes microphone 66 for capturing speech (audio) and voice recognition logic 68 for analyzing the captured speech to extract or identify spoken commands. The voice recognition feature can be used in combination with the touch panel functionality of computer 26 a. The microphone 66 may comprise conventional apparatus known in the art, and can be a voice recognition optimized microphone particularly adapted for use in speech recognition applications (e.g., an echo-cancelling microphone). Voice recognition logic 68 may comprise conventional apparatus known in the art. In an embodiment, voice recognition logic 68 may be a commercially available component, such as software available under the trade designation DRAGON DICTATION™ speech recognition software.
  • In an embodiment, computer 26 a is configured to recognize a defined set of words or phrases adapted to control various functions of the multiple applications that are accessible or controllable by computer 26 a. The voice recognition feature can itself be configured to recognize unique words or phrases to selectively enable or disable the voice recognition feature. Alternatively (or in addition), a button, such as button 80 in FIG. 3A, can be used to enable or disable the voice recognition feature. In this regard, the enable/disable button can be either a touch-sensitive button (i.e., screen object) or a hardware button.
  • Voice recognition logic 68 is configured to interact with the physician or other user to “train” the logic (e.g., having the user speak known words) so as to improve word and/or phrase recognition. The particulars for each user so trained can be stored in a respective voice (user) profile, stored in memory 62. For example, in FIG. 3A, the currently active voice profile is listed in dashed-line box 89. In an embodiment, each user can have unique commands, which may also be stored in the respective voice profile. In a further embodiment, the language need not be English, and can be other languages. This flexibility as to language choice enlarges the audience of users who can use the device 26 a. The voice recognition feature presents a number of advantages, including the fact that the physician 16 does not have to remove his/her hands from the catheter or other medical device being manipulated. In addition, the absence of contact or need to touch computer 26 a maintains a sterile condition. The voice recognition feature can also be used either alone or in combination with other technologies.
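  • For illustration only, the following sketch shows one way the voice recognition feature described above could be organized: a per-user voice profile supplies the defined set of phrases, and listening can be toggled by reserved phrases (or, equivalently, by button 80). The specific phrases and commands shown are assumptions made for the example.

```python
class VoiceCommandRecognizer:
    def __init__(self, profile_phrases):
        # profile_phrases: phrase -> command mapping loaded from the active
        # voice (user) profile stored in memory 62.
        self._phrases = profile_phrases
        self.enabled = False

    def on_phrase(self, phrase):
        # Reserved phrases toggle listening; button 80 could trigger the same
        # transitions from the touch panel.
        if phrase == "start listening":
            self.enabled = True
            return None
        if phrase == "stop listening":
            self.enabled = False
            return None
        if self.enabled:
            return self._phrases.get(phrase)  # None if not a defined command
        return None


# Example profile for one physician (phrases and commands are illustrative).
recognizer = VoiceCommandRecognizer({
    "rotate right": ("rotate_view", {"degrees": 15}),
    "add lesion": ("place_lesion_marker", {}),
})
```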
  • With continued reference to FIG. 3A, UI logic 64 also presents a “Settings” button 82. When the “Settings” button 82 is selected, UI logic 64 generates another screen display that allows the user to adjust and/or set/reset various settings associated with the application currently selected. In an embodiment, the “Settings” button can also allow adjustment of parameters that are more global in nature (i.e., apply to more than one application). For example only, through “Settings”, the physician or another user can edit all of the phrases associated with a particular physician or specify a timeout (i.e., the elapsed amount of time, after which the computer will stop listening (or not) for voice commands). The physician or another user can also edit miscellaneous parameters, such as communication settings and the like.
  • FIG. 3B is an isometric view of a sterile drape 88 configured to protect the touch panel computer 26 a of FIG. 3A from contamination and to maintain the physician's sterility. Conventional materials and construction techniques can be used to make drape 88.
  • FIG. 5 is a plan view of touch panel computer 26 a showing a different application-specific user interface, now relating to EP monitoring and recording system 38 (i.e., “EP-WorkMate”). In the illustrative embodiment, UI logic 64 produces the same application-selection group 70 of buttons along the top of the touch panel, for quick and easy movement by the user between applications. A second, common-tasks group of buttons, designated as group 90, are shown below group 70. For example, the three buttons labeled “Record”, “Update”, and “Add Map Point” can execute the identified function. Likewise, additional groups of buttons are shown, grouped by function, for example the signals-adjustment group 92, the events group 94, the timer group 96 and the print group 98. It should be understood that variations are possible, depending on the items that can be adjusted or controlled on the destination system. It warrants emphasizing that UI logic 64 thus presents a unique user interface tailored to the requirements of the particular application selected. Each group includes items that are commonly asked for by the physician. For example, in the signals group 92, the Speed +/− buttons can be used to change the viewed waveform sweep speed as the physician may need more or less detail; the Page +/− buttons can be used to change the page of signals being viewed (e.g., from surface ECG signals to intracardiac signals); and the Amplitude +/− buttons can be used to change the signal amplitudes up or down. As a further example, in the Events group 94, the enumerated Events buttons cause a mark to be created in the patient charting log to indicate a noteworthy (i.e., important) item or event, such as the patient was just defibrillated or entered a tachy-arrhythmia. Note that these items are all user definable and speakable (capable of being tied to the voice recognition function). The physician also needs to keep track of certain periods of time. Thus, in the Timer group 96, the timer buttons can be used to keep track of such periods of time, for example, such as a certain time after an ablation (e.g., 30 minutes) to verify that the ablation procedure is still effective. Finally, regarding the print group 98, various print buttons are provided so as to avoid requiring a physician to verbally indicate (e.g., by way of shouting out “print that document to the case” or the like) and to include such documents in a final report.
  • FIG. 6 is a plan view of touch panel computer 26 a showing in exemplary fashion a further, different application-specific user interface relating to the ultrasound imaging system 32 (“ViewMate”). As with the other application-specific user interfaces, the user interface presented in FIG. 6 repeats the common, application-selection group of buttons, designated group 70. A further group of buttons and adjustment mechanisms are located in group 100. The controls (buttons, sliders) provided for this user interface completely eliminate the need to have a separate ultrasound keyboard to control the console. The user interface shown can differ depending on the kind of machine being controlled, but at a minimum may typically provide a way to control the receive gain, the depth setting, the focus zone, the TGC (i.e., time gain compensation) curve, the monitoring mode (e.g., B, M, color Doppler, Doppler), image recording, as well as other image attributes and states. Note that trackpad object 101 is shown in the center of the user interface. The capability provided by UI logic 64 to rapidly switch applications and present to the bedside user an application-specific user interface minimizes or eliminates many of the shortcomings set forth in the Background.
  • It should be understood that variations in UI logic 64 are possible. For example, certain applications can be linked (in software) so that multiple applications can be controlled with a single command (e.g., the Record command). In another embodiment, UI logic 64 can be configured to provide additional and/or substitute functions, such as, without limitation, (1) map creation; (2) collecting points; (3) segmenting regions by anatomy; (4) map view (rotate and zoom); (5) select/manipulate a number of maps and view each; (6) selection of signal trace display; (7) adjust EP signal amplitude; (8) sweep speed; (9) provide single button (or touch, multi-touch, gesture) for recording a segment, placing an event marker, and/or placing a lesion marker.
  • It should be further understood that the screen layouts in the illustrative embodiment are exemplary only and not limiting in nature. The UI logic 64 can thus implement alternative screen layouts for interaction by the user. For example, while the screen displays in FIGS. 3A, 5 and 6 show an approach that incorporates the top level menu items on every screen, multi-level menus can also be used. For example, the screen layouts can be arranged such that a user descends down a series of screens to further levels of control. To return to upper levels (and to the “home” screen), a “Back” button or the like can be provided. Alternatively, a “Home” button can be provided.
  • In a still further embodiment, UI logic 64 can be configured for bi-directional display of information, for example, on the touch-responsive display panel. As one example, the “EnSite” user interface (FIG. 3A) can be configured so that the EnSite™ model is sent to the computer 26 a and displayed on the touch-responsive display panel. The user interface provided by UI logic 64 can allow the user to drag his or her finger on the panel to rotate the model. The display of the model provides context with respect to the act of dragging. Other information can be displayed as well, such as a waveform. In various embodiments, all or a portion of the items/windows displayed on monitor 20 (see, e.g., FIGS. 2, 4A, and 4B) may be displayed or mirrored on the touch-responsive display panel. For example, display area or window 52 may be displayed on the touch-responsive display panel allowing the physician or other user to directly modify the features of window 52 at the patient's bedside. Other display areas/windows, such as windows 50, 54, 56, 58, and/or 48 (see FIG. 2) may also be displayed and/or modified on the touch-panel display panel. One further example involves displaying feedback information or messages originating from the various devices or systems back to the touch-responsive display panel. In this regard, the UI logic 64 can configure any of the user-interfaces to have a message area, which can show informational messages, warning messages or critical error messages for viewing by the user. The message area feature provides a way to immediately alert the physician to such messages, rather than the physician having to watch for messages on multiple displays.
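  • As an illustration of the message area described above, the sketch below collects informational, warning, and critical error messages from the connected systems into a single area of the bedside user interface; the severity levels and ordering rule are assumptions made for the example.

```python
from enum import Enum


class Severity(Enum):
    INFO = 1
    WARNING = 2
    CRITICAL = 3


class MessageArea:
    """Illustrative message area for the bedside user interface."""

    def __init__(self, max_messages=5):
        self._messages = []
        self._max = max_messages

    def post(self, source, severity, text):
        # Feedback from any connected system is surfaced in one place, so the
        # physician need not watch multiple displays for messages.
        self._messages.append((severity, "[" + source + "] " + text))
        self._messages = self._messages[-self._max:]

    def render(self):
        # Critical error messages are listed first so they are seen immediately.
        ordered = sorted(self._messages, key=lambda m: m[0] is not Severity.CRITICAL)
        return [text for _, text in ordered]
```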
  • FIG. 7A is a diagrammatic and block diagram view of a second embodiment of the bedside interface device, comprising an electronic wand system 26 b. As with touch panel computer 26 a, wand system 26 b is configured to allow the EP physician to take control, bedside of the patient, of an EP diagnostic or therapeutic system, such as the electro-anatomic mapping system 34. The wand system 26 b includes a wireless remote control portion 102, an optical emitter portion 104, and a base interface 28 b, which may be coupled to the desired, target EP system through either a wired or wireless connection. The wand system 26 b incorporates remote control technology, and includes the ability to detect and interpret motion of the remote control indicative of an EP physician's command or other instruction, detect and interpret key-presses on the remote control, and/or detect and interpret motion/keypress combinations.
  • Since the wand system 26 b is contemplated as being used in the sterile procedure room, multiple embodiments are contemplated for avoiding contamination. In this regard, wand system 26 b may be configured with a disposable remote control portion 102, with a reusable remote control portion 102 that is contained within an enclosure compatible with sterilization procedures, with a reusable remote control portion 102 adapted to be secured in a sterilization-compatible wrapper, or with a reusable remote control portion 102 that is encased in a sterile but disposable wrapper.
  • With continued reference to FIG. 7A, remote control portion 102 may include an optical detector 106, an electronic processor 108, a memory 110, an optional accelerometer 112 and a wireless transmitter/receiver 114. The processor 108 is configured to execute a control program that is stored in memory 110, to achieve the functions described below. The optical emitter 104 is configured to emit a light pattern 105 that can be detected and recognized by optical detector 106. For example, the light pattern may be a pair of light sources spaced apart by a predetermined, known distance. The control program in remote 102 can be configured to assess movement of the light pattern 105 as detected by detector 106 (e.g., by assessing a time-based sequence of images captured by detector 106). For example, in the exemplary light pattern described above, processor 108 can be configured to determine the locations of the light sources (in pixel space). In an embodiment, the control program in remote 102 may only discern the light pattern 105 itself (e.g., the locations in pixel space) and transmit this information to base interface 28 b, which in turn assesses the movement of the detected light pattern in order to arrive at a description of the motion of the remote 102. In a still further embodiment, various aspects of the processing may be divided between processor 108 and a processor (not shown) contained in base interface 28 b. The processor 108 communicates with base interface 28 b via the wireless transmitter/receiver 114, which may employ any type of wireless communication method now known or hereafter developed (e.g., such as those technologies or standards branded Bluetooth™, Wi-Fi™, etc.). The processor 108 is configured to transmit wirelessly to interface 28 b the detected keypresses and information concerning the motion of the remote control 102 (e.g., the information about or derived from the images from the optical detector 106). In an embodiment, the motion of remote control 102 may also be determined by, or supplemented with, readings from accelerometer 112 (which may be single-axis or multi-axis, such as a 3-axis accelerometer). In some instances, rapid motion may be better detected using an accelerometer than using optical methods. In an embodiment, electronic wand system 26 b may be similar to (but differing in application, as described herein) a commercially available game controller sold under the trade designation Wii Remote Controller, from Nintendo of America, Inc.
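  • A minimal sketch of assessing motion from the detected light pattern 105 follows: successive detector frames are compared by tracking the midpoint of the two light sources in pixel space. The frame format and numbers are illustrative assumptions only.

    def pattern_center(points):
        # Midpoint of the two detected light sources (pixel coordinates).
        (x1, y1), (x2, y2) = points
        return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

    def estimate_motion(prev_points, curr_points):
        # Apparent displacement of the light pattern between successive frames;
        # accelerometer readings could be blended in to refine rapid motion.
        px, py = pattern_center(prev_points)
        cx, cy = pattern_center(curr_points)
        return {"dx": cx - px, "dy": cy - py}

    prev = [(310, 240), (330, 240)]   # earlier detector frame
    curr = [(295, 232), (315, 232)]   # later detector frame
    print(estimate_motion(prev, curr))   # -> {'dx': -15.0, 'dy': -8.0}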
  • Either the remote 102 or the base interface 28 b (or both, potentially in some division of computing labor) is configured to identify a command applicable to one of the EP diagnostic/therapeutic systems, such as electro-anatomic mapping system 34, based on the detected motion of the remote 102. Alternatively, the command may be identified based on a key press, or a predetermined motion/key press combination. Once the remote 102 and/or interface 28 b identifies the command, it is transmitted to the appropriate EP system. In an electro-anatomic mapping system embodiment, the wireless remote control 102 is configured to allow an EP physician to issue a wide variety of commands, for example only, any of the commands (e.g., 3D model rotation, manipulation, etc.) described above in connection with touch panel computer 26 a. By encoding at least some of the control through the wireless remote control 102 that the EP physician controls, one or more of the shortcomings of conventional EP labs, as described in the Background, can be minimized or eliminated. As with touch panel computer 26 a, electronic wand system 26 b can reduce procedure times as the EP physician will spend less time playing “hot or cold” with the mapping system operator (i.e., the control technician), but instead can set the display to his/her needs throughout the medical procedure.
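  • Command identification from a motion/key-press combination might, for example, match the detected displacement and pressed keys against a small rule set. The key names, thresholds, and command names in this Python sketch are assumptions.

    def identify_command(motion, keys_pressed):
        # Return a command for the target EP system, or None if nothing matches.
        if "rotate_key" in keys_pressed and abs(motion["dx"]) > 5:
            return {"target": "mapping_system", "command": "rotate_model",
                    "amount": motion["dx"]}
        if "zoom_key" in keys_pressed and abs(motion["dy"]) > 5:
            return {"target": "mapping_system", "command": "zoom_model",
                    "amount": -motion["dy"]}
        if "marker_key" in keys_pressed:
            return {"target": "mapping_system", "command": "place_lesion_marker"}
        return None

    print(identify_command({"dx": -15.0, "dy": -8.0}, {"rotate_key"}))
    # -> {'target': 'mapping_system', 'command': 'rotate_model', 'amount': -15.0}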
  • FIG. 7B shows a further embodiment, designated interface device 26 c. Interface device 26 c integrates the remote control 102 described above into the handle of a catheter 115. Through the foregoing, the physician need not take his hands off the catheter, but rather can issue direct, physical commands (e.g., via key-presses) while retaining control of the catheter. Additionally, one or more of the keys or a slider switch on the catheter handle may serve as a safety mechanism to prevent inadvertent activation of one or more commands while operating the catheter. In such an embodiment, after advancing the catheter into a patient's body, the safety mechanism may be deactivated or otherwise turned off such that the physician can issue commands and then he or she may reactivate or turn on the safety mechanism and resume manipulating the catheter without fear of modifying the view or model shown on an on-screen display, for example. The catheter 115 may further comprise one or more electrodes on a distal portion of the catheter shaft and a manual or motorized steering mechanism (not shown) to enable the distal portion of the catheter shaft to be steered in at least one direction. In at least one embodiment, the catheter handle may be generally symmetric on opposing sides and include identical or nearly identical sets of controls on opposing sides of the handle so that a physician need not worry about which side of the catheter handle contains the keys. In another embodiment, the catheter handle may be generally cylindrical in shape and include an annular and/or rotatable control feature for issuing at least one command, again so the physician need not worry about the catheter handle's orientation in his or her hand(s). Exemplary catheters, handles, and steering mechanisms are shown and described in U.S. Pat. No. 5,861,024 to Rashidi, U.S. patent application publication no. 2010/0314031 to Heideman et al., U.S. Pat. No. 7,465,288 to Dudney et al., and U.S. Pat. No. 6,671,533 to Chen et al., each of which is hereby incorporated by reference as though fully set forth herein.
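  • The safety mechanism could, for instance, gate key-press commands so they are forwarded only while the lock is disengaged. The class and method names in the following sketch are assumptions, offered only to make the engage/disengage sequence concrete.

    class HandleSafetyLock:
        def __init__(self, send):
            self._send = send       # callable that forwards a command
            self._locked = True     # engaged by default while steering the catheter

        def disengage(self):
            self._locked = False

        def engage(self):
            self._locked = True

        def press(self, command):
            # Forward a key-press command only while the safety lock is off.
            if self._locked:
                return False        # ignored; physician is manipulating the catheter
            self._send(command)
            return True

    lock = HandleSafetyLock(send=lambda c: print("sent:", c))
    lock.press("rotate_model")      # ignored (lock engaged)
    lock.disengage()
    lock.press("rotate_model")      # forwarded to the mapping system
    lock.engage()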
  • FIG. 8 is a diagrammatic and block diagram view of a fourth embodiment of the bedside interface device, comprising a motion capture apparatus 26 d. As with touch panel computer 26 a, wand system 26 b and integrated system 26 c, motion capture apparatus 26 d is configured to allow the EP physician to take control, bedside of the patient, of an EP diagnostic or therapeutic system, such as electro-anatomic mapping system 34. The motion capture apparatus 26 d includes a capture apparatus 116 having both an optical sub-system 118 and a microphone sub-system 120, wherein the apparatus 116 is coupled to a base interface 28 b. The apparatus 116 is configured to optically detect the motion or physical gestures of the EP physician or other user when such movements occur within a sensing volume 122. The base interface 28 b may be coupled to the desired, target EP system through either a wired or wireless connection.
  • The motion capture apparatus 26 d includes the capability to detect hand/arm/leg/trunk/facial motions (e.g., gestures) of the EP physician or other user and translate the detected patterns into a desired command. Apparatus 26 d also includes audio capture and processing capability and thus also has the capability to detect speech and translate the same into desired commands. In an embodiment, apparatus 26 d is configured to detect and interpret combinations and sequences of gestures and speech into desired commands. The base interface 28 b is configured to communicate the commands (e.g., rotation, zoom, pan of a 3D anatomical model) to the appropriate EP diagnostic or therapeutic system (e.g., the electro-anatomic mapping system 34). In an embodiment, the motion capture apparatus 26 d may comprise commercially available components, for example, the Kinect™ game control system, available from Microsoft, Redmond, Wash., USA. A so-called Kinect™ software development kit (SDK) is available, which includes drivers, rich application programming interfaces (APIs), and other content that enables access to the capabilities of the Kinect™ device. In particular, the SDK allows access to raw sensor streams (e.g., depth sensor, color camera sensor, and four-element microphone array), skeletal tracking, and advanced audio (i.e., integration with Windows speech recognition), as well as other features.
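  • Translating combined gesture and speech input into a command might look like the following sketch, in which a recognized gesture event and speech token are matched against a small rule set. The event names and command structures are assumptions and do not reflect the actual Kinect™ SDK interfaces.

    def interpret(gesture_event, speech_token):
        # Match a (gesture, speech) pair against a small command rule set.
        if gesture_event == "swipe_right" and speech_token == "rotate":
            return {"target": "mapping_system", "command": "rotate_model",
                    "yaw_deg": 15.0}
        if gesture_event == "hands_apart" and speech_token == "zoom":
            return {"target": "mapping_system", "command": "zoom_in"}
        if speech_token == "record":
            return {"target": "recording_system", "command": "start_segment"}
        return None

    print(interpret("swipe_right", "rotate"))
    # -> {'target': 'mapping_system', 'command': 'rotate_model', 'yaw_deg': 15.0}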
  • Since there is no contact contemplated by EP physician 16 during use of motion capture apparatus 26 d, contamination and subsequent sterilization issues are eliminated or reduced. In addition, the lack of contact with apparatus 26 d for control purposes allows the EP physician to keep his hands on the catheter or other medical device(s) being manipulated during an EP procedure. By encoding at least some of the control through the motion capture apparatus 26 d, with which the EP physician interacts, one or more of the shortcomings of conventional EP labs, as described in the Background, can be minimized or eliminated. As with the previous embodiments, the motion capture apparatus 26 d can reduce procedure times.
  • It should be understood that variations are possible. For example, the motion capture apparatus 26 d can be used in concert with sensors and/or emitters in a sterile glove to assist the apparatus 26 d to discriminate commands intended to be directed to one of the EP systems, versus EP physician hand movements that result from his/her manipulation of the catheter or medical device, versus other movement in the EP lab in general. In another embodiment, the motion capture apparatus 26 d may discriminate such commands by being “activated” by a user when a specific verbal command is issued (e.g., “motion capture on”) and then “deactivated” by the user when another specific verbal command is issued (e.g., “motion capture off”).
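  • The verbal activation scheme could be realized, for example, as a gate flag toggled by the “motion capture on”/“motion capture off” phrases, with gesture-derived commands passed through only while the flag is set; the following sketch is illustrative only.

    class GestureGate:
        def __init__(self):
            self._active = False

        def on_speech(self, phrase):
            # Toggle the gate on the specific verbal commands.
            if phrase == "motion capture on":
                self._active = True
            elif phrase == "motion capture off":
                self._active = False

        def on_gesture(self, command):
            # Pass gesture-derived commands through only while the gate is active.
            return command if self._active else None

    gate = GestureGate()
    print(gate.on_gesture("rotate_model"))    # None -- gate inactive
    gate.on_speech("motion capture on")
    print(gate.on_gesture("rotate_model"))    # 'rotate_model'
    gate.on_speech("motion capture off")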
  • FIGS. 9-10 are diagrammatic views of fifth and sixth embodiments of the bedside interface device, comprising touch responsive devices. FIGS. 9 and 10 show touch-screen mouse pad devices 26 e and 26 f, respectively. These devices can be covered in a sterile bag. The EP physician 16 can move the mouse cursor from application to application and control each such application independently. Devices 26 e, 26 f may comprise conventional apparatus known in the art.
  • FIG. 11 is a diagrammatic view of a seventh embodiment of the bedside interface device comprising a customized joystick 26 g. Joystick 26 g can also be covered in a sterile bag. The device 26 g can be used to provide application-specific control of particular application functions, such as rotating a 3D model (system 34), adding lesion markers, and the like.
  • FIGS. 12-13 are diagrammatic views of eighth and ninth embodiments of the bedside interface device comprising holographic mouse and keyboard input devices, respectively. Holographic mouse 26 h deploys light beam pattern 124, which is used by the mouse 26 h to acquire user input (i.e., movement of the physician's finger, instead of moving a conventional mouse). The movement input can be used in the same manner as that obtained from a conventional mouse. Holographic keyboard 26 i also deploys a light beam pattern 126 corresponding to a keyboard. A physician's finger can be used to “select” the key much in the same manner as a conventional keyboard, but without any physical contact. Devices 26 h, 26 i have the advantage of being sterile without any disposables, and can incorporate wireless communications and may be powered using batteries (i.e., no cables needed).
  • It should be understood that variations are possible. For example, in a further embodiment, primary control by the physician in manipulating or interacting with the mapping system may be through use of voice control alone (i.e., a microphone coupled with voice recognition logic), apart from its inclusion with other modes or devices for user interaction described above. In a still further embodiment, the physician can be equipped with headgear that monitors head movements to determine at what location on the screen/monitor the physician is looking. In effect, such headgear can act as a trackball to move or otherwise manipulate an image (or view of a model) on the monitor in accordance with the physician's head movements. In a yet further embodiment, the physician can be equipped with headgear that monitors head movements and/or also monitors brainwave patterns (e.g., to record a user electroencephalogram (EEG)). Such monitored data can be analyzed to derive or infer user input or commands for controlling an image (or view of a model), as described above. An EEG-based embodiment may comprise conventional apparatus known in the art, for example, commercially available products respectively sold under the trade designation MindWave™ headset from NeuroSky, Inc., San Jose, Calif., USA, or the Emotiv EPOC™ personal interface neuroheadset from Emotiv, Kwun Tong, Hong Kong. In a still further embodiment, the physician can be equipped with an eye tracking apparatus, wherein monitored eye movements constitute the user input to be interpreted by the system (e.g., the eye movements can be interpreted as a cursor movement or other command).
  • It should also be appreciated that while the foregoing description pertains to an EP physician manually controlling a catheter through the use of a manually-actuated handle or the like, other configurations are possible, such as robotically-actuated embodiments. For example, a catheter movement controller (not shown) described above may be incorporated into a larger robotic catheter guidance and control system, for example, as seen by reference to U.S. application Ser. No. 12/751,843 filed Mar. 31, 2010 entitled ROBOTIC CATHETER SYSTEM (published as U.S. patent application publication no. 2010/0256558), owned by the common assignee of the present invention and hereby incorporated by reference in its entirety as though fully set forth herein. Such a robotic catheter system may be configured to manipulate and maneuver catheters within a lumen or a cavity of a human body, while the bedside interface devices described herein can be used to access and control the EP diagnostic and/or therapeutic systems. In at least one embodiment, a bedside interface device as described herein may also be used to access and control the robotic catheter system.
  • In accordance with another embodiment, an article of manufacture includes a computer storage medium having a computer program encoded thereon, where the computer program includes code for acquiring user input based on at least one of a plurality of input modes, such as by touch, multi-touch, gesture, motion pattern, voice recognition and the like, and identifying one or more commands or requests for an EP diagnostic and/or therapeutic system. Such embodiments may be configured to execute on one or more processors, which may be integrated into a single system or distributed over, and connected together through, a communications network, where the network may be wired or wireless.
  • It should be understood that while the foregoing description describes various embodiments of a bedside interface device in the context of the practice of electrophysiology, and specifically catheterization, the teachings are not so limited and can be applied to other clinical settings.
  • It should be understood that an electronic control unit as described above may include conventional processing apparatus known in the art, capable of executing preprogrammed instructions stored in an associated memory, all performing in accordance with the functionality described herein. It is contemplated that the methods described herein may be programmed, with the resulting software being stored in an associated memory and, where so described, may also constitute the means for performing such methods. Implementation of an embodiment of the invention, in software, in view of the foregoing enabling description, would require no more than routine application of programming skills by one of ordinary skill in the art. Such a system may further be of the type having both ROM and RAM, i.e., a combination of non-volatile and volatile (modifiable) memory, so that the software can be stored and yet allow storage and processing of dynamically produced data and/or signals.
  • Although numerous embodiments of this invention have been described above with a certain degree of particularity, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this invention. All directional references (e.g., plus, minus, upper, lower, upward, downward, left, right, leftward, rightward, top, bottom, above, below, vertical, horizontal, clockwise, and counterclockwise) are only used for identification purposes to aid the reader's understanding of the present invention, and do not create limitations, particularly as to the position, orientation, or use of the invention. Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily imply that two elements are directly connected and in fixed relation to each other. It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting. Changes in detail or structure may be made without departing from the spirit of the invention as defined in the appended claims.

Claims (19)

1.-20. (canceled)
21. A device for allowing a user to communicate with a plurality of electrophysiological systems, comprising:
an electronic control unit, a display panel, and a microphone;
user interface logic stored in a memory configured to be executed by said electronic control unit and configured to display on said display panel a user interface which includes a first group of buttons corresponding to a plurality of electrophysiological (EP) diagnostic and therapeutic systems; and
voice recognition logic stored in said memory configured to be executed by said electronic control unit and configured to analyze user speech input captured by said microphone to identify a user-spoken command;
wherein said user interface logic is configured to allow the user to select one button from said first group of buttons according to said identified command to thereby select a corresponding one of said plurality of EP systems and to present, in response to said user selection, an application-specific user interface on said display panel that enables access to and control of said one user-selected EP system while maintaining said display of said first group of buttons;
wherein said user interface logic is further configured to allow the user to interact with said application-specific user interface, said voice recognition logic being configured to identify a further spoken command with respect to said application-specific user interface wherein said electronic control unit is configured to communicate said further spoken command to the user-selected one EP system.
22. The device of claim 21 where said application-specific user interface of said one user-selected EP system comprises at least a second group of buttons displayed on said display panel that is different from said first group of buttons.
23. The device of claim 21 wherein said user interface logic is configured to present on said display panel, for each one of said plurality of EP systems when selected by the user, a respective application-specific user interface that enables access to and control of said user-selected EP system.
24. The device of claim 21 wherein said display panel comprises a touch-responsive display panel, and wherein said user interface logic is configured to receive from the user a user touch input from said touch-responsive display panel, said user interface logic being further configured to allow the user to select one button from said first group of buttons according to said user touch input.
25. The device of claim 21 wherein said user interface logic is further configured to switch between respective application-specific user interfaces via a common interface displayed on said display panel, wherein said common interface includes said first group of buttons.
26. The device of claim 21 wherein said electronic control unit communicates said identified command wirelessly.
27. The device of claim 21 wherein the user interface logic is further configured to alter an appearance of said one user-selected button of said first group of buttons so as to be visually distinguishable from remaining, non-selected buttons of said first group of buttons, thereby visually indicating to the user which corresponding EP system has been selected.
28. The device of claim 27 wherein said one user-selected button is altered so as to have one of a depressed appearance and shaded appearance.
29. The device of claim 21 further comprising a user profile stored in said memory and associated with the user, wherein said voice recognition logic is further configured to identify said command using said user profile.
30. The device of claim 29 wherein the user is a first user and the user profile is a first user profile, further comprising a second user and a second user profile associated with the second user wherein said second user profile is stored in said memory.
31. The device of claim 30 wherein said voice recognition logic is configured to identify a spoken command of the second user by using said second profile.
32. The device of claim 30 wherein each of the first and second users have unique commands associated therewith stored in respective first and second user profiles.
33. The device of claim 30 wherein the currently active user profile is displayed on said display panel.
34. The device of claim 21 wherein said user interface logic is further configured to allow the user to enable or disable the operation of said voice recognition logic.
35. The device of claim 21 further comprising a sterile drape configured to protect said display panel from contamination.
36. The device of claim 21 wherein said plurality of EP systems includes an electro-anatomic mapping system, an EP monitoring and recording system, a cardiac stimulator, an EP data editing system, a medical positioning system, and an imaging system.
37. A device for allowing a user to communicate with a plurality of electrophysiological systems, comprising:
an electronic control unit, a touch-responsive display panel, and a microphone;
user interface logic stored in a memory configured to be executed by said electronic control unit and configured to display on said display panel a user interface which includes a common interface comprising a first group of buttons corresponding to a plurality of electrophysiological (EP) diagnostic and therapeutic systems, said user interface logic is further configured to receive from the user a user touch input from said touch-responsive display panel;
voice recognition logic stored in said memory configured to be executed by said electronic control unit and configured to analyze user speech input captured by said microphone to identify a user-spoken command;
wherein said user interface logic is configured to allow the user to select one button from said common interface using one of (i) said user interface logic according to said user touch, and (ii) said voice recognition logic according to said identified user-spoken command, to thereby select a corresponding one of said plurality of EP systems and to present, in response to said user selection, an application-specific user interface on said display panel that enables access to and control of said one user-selected EP system while maintaining said display of said common interface;
wherein said user interface logic is further configured to allow the user to interact with said application-specific user interface, said voice recognition logic being configured to identify a further spoken command with respect to said application-specific user interface wherein said electronic control unit is configured to communicate said further spoken command to the user-selected one EP system.
38. A device for allowing a user to communicate with a plurality of electrophysiological systems, comprising:
an electronic control unit coupled to a memory;
a microphone;
an input means for acquiring a user input comprising a display panel;
user interface logic stored in said memory configured to be executed by said electronic control unit and configured to display on said display panel a user interface which includes a first group of buttons corresponding to a plurality of electrophysiological (EP) diagnostic and therapeutic systems; and
voice recognition logic stored in said memory configured to be executed by said electronic control unit and configured to analyze user speech input captured by said microphone to identify a user-spoken command;
wherein said user interface logic is configured to allow the user to select one button from said first group of buttons according to said identified command to thereby select a corresponding one of said plurality of EP systems and to present, in response to said user selection, an application-specific user interface on said display panel that enables access to and control of said one user-selected EP system while maintaining said display of said first group of buttons;
wherein said user interface logic is further configured to allow the user to interact with said application-specific user interface and obtain a further command taken with respect to said application-specific user interface, and wherein said electronic control unit is configured to communicate said further command to said user-selected one EP system.
US15/144,135 2011-08-12 2016-05-02 User interface devices for electrophysiology lab diagnostic and therapeutic equipment Abandoned US20160320930A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/144,135 US20160320930A1 (en) 2011-08-12 2016-05-02 User interface devices for electrophysiology lab diagnostic and therapeutic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/208,924 US9330497B2 (en) 2011-08-12 2011-08-12 User interface devices for electrophysiology lab diagnostic and therapeutic equipment
US15/144,135 US20160320930A1 (en) 2011-08-12 2016-05-02 User interface devices for electrophysiology lab diagnostic and therapeutic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/208,924 Continuation US9330497B2 (en) 2009-07-22 2011-08-12 User interface devices for electrophysiology lab diagnostic and therapeutic equipment

Publications (1)

Publication Number Publication Date
US20160320930A1 true US20160320930A1 (en) 2016-11-03

Family

ID=47677954

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/208,924 Active 2035-02-13 US9330497B2 (en) 2009-07-22 2011-08-12 User interface devices for electrophysiology lab diagnostic and therapeutic equipment
US15/144,135 Abandoned US20160320930A1 (en) 2011-08-12 2016-05-02 User interface devices for electrophysiology lab diagnostic and therapeutic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/208,924 Active 2035-02-13 US9330497B2 (en) 2009-07-22 2011-08-12 User interface devices for electrophysiology lab diagnostic and therapeutic equipment

Country Status (2)

Country Link
US (2) US9330497B2 (en)
WO (1) WO2013025257A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140255890A1 (en) * 2013-03-07 2014-09-11 Hill-Rom Services, Inc. Patient support apparatus with physical therapy system
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8446378B2 (en) * 2008-04-16 2013-05-21 Nike, Inc. Athletic performance user interface for mobile device
CN103034337A (en) * 2011-09-30 2013-04-10 Ge医疗系统环球技术有限公司 Keyboard input device and manufacturing method thereof
US9940845B2 (en) * 2012-11-14 2018-04-10 i-Human Patients, Inc. Electronic healthcare education system and method
KR20140099111A (en) * 2013-02-01 2014-08-11 삼성전자주식회사 Method for control a camera apparatus and the camera apparatus
CN103546585B (en) * 2013-11-19 2016-08-17 上海华东汽车信息技术有限公司 Wireless remote refreshes diagnostic system and method
WO2015130954A1 (en) * 2014-02-26 2015-09-03 Nantomics, Llc Secured mobile genome browsing devices and methods therefor
US9274620B2 (en) * 2014-04-09 2016-03-01 Wei-Chih Cheng Operating system with shortcut touch panel having shortcut function
US10026328B2 (en) * 2014-10-21 2018-07-17 i-Human Patients, Inc. Dynamic differential diagnosis training and evaluation system and method for patient condition determination
US9921805B2 (en) * 2015-06-17 2018-03-20 Lenovo (Singapore) Pte. Ltd. Multi-modal disambiguation of voice assisted input
US9925015B2 (en) 2015-07-17 2018-03-27 Thincubus, LLC Wearable protective articles donning system
US20170065254A1 (en) * 2015-09-04 2017-03-09 National Tsing Hua University Imaging agent delivery method and system thereof
US10460024B2 (en) * 2016-01-05 2019-10-29 Adobe Inc. Interactive electronic form workflow assistant that guides interactions with electronic forms in a conversational manner
US10657200B2 (en) 2016-01-05 2020-05-19 Adobe Inc. Proactive form guidance for interacting with electronic forms
CN109997196B (en) * 2016-11-25 2024-02-23 霍罗吉克公司 Medical care information manipulation and visualization controller
EP3360486A1 (en) * 2017-02-13 2018-08-15 Koninklijke Philips N.V. Ultrasound evaluation of anatomical features
AU2018266132A1 (en) * 2017-05-09 2019-10-10 Boston Scientific Scimed, Inc. Operating room devices, methods, and systems
WO2019055115A1 (en) * 2017-09-18 2019-03-21 St. Jude Medical, Cardiology Division, Inc. System and method for sorting electrophysiological signals from multi-dimensional catheters
US11040214B2 (en) 2018-03-01 2021-06-22 West Affum Holdings Corp. Wearable cardioverter defibrillator (WCD) system having main UI that conveys message and peripheral device that amplifies the message
US11672934B2 (en) 2020-05-12 2023-06-13 Covidien Lp Remote ventilator adjustment
USD995539S1 (en) 2021-03-03 2023-08-15 GE Precision Healthcare LLC Display screen or portion thereof with graphical user interface
USD1029847S1 (en) * 2021-03-24 2024-06-04 Neutrace Inc. Display screen or portion thereof with a graphical user interface for a medical device
USD1087979S1 (en) * 2022-10-06 2025-08-12 Ventis Medical, Inc. Display screen with a graphical user interface
JP2024101281A (en) * 2023-01-17 2024-07-29 コニカミノルタ株式会社 Image display device, image display method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6985865B1 (en) * 2001-09-26 2006-01-10 Sprint Spectrum L.P. Method and system for enhanced response to voice commands in a voice command platform
US20090138280A1 (en) * 2007-11-26 2009-05-28 The General Electric Company Multi-stepped default display protocols
US20100053085A1 (en) * 2008-08-29 2010-03-04 Siemens Medical Solutions Usa, Inc. Control System for Use Within a Sterile Environment
US20100131482A1 (en) * 2008-11-26 2010-05-27 General Electric Company Adaptive user interface systems and methods for healthcare applications

Family Cites Families (195)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3091130A (en) 1960-06-27 1963-05-28 Morse Instr Co Single lever control for multiple actions
US3605725A (en) 1968-08-07 1971-09-20 Medi Tech Inc Controlled motion devices
US3893449A (en) 1973-12-21 1975-07-08 Nasa Reference apparatus for medical ultrasonic transducer
US4160508A (en) 1977-08-19 1979-07-10 Nasa Controller arm for a remotely related slave arm
SE419421B (en) 1979-03-16 1981-08-03 Ove Larson RESIDENTIAL ARM IN SPECIAL ROBOT ARM
US4348556A (en) 1981-03-30 1982-09-07 Gettig Engineering & Manufacturing Co. Multi-position switch
US4758222A (en) 1985-05-03 1988-07-19 Mccoy William C Steerable and aimable catheter
US4543090A (en) 1983-10-31 1985-09-24 Mccoy William C Steerable and aimable catheter
DE3404047A1 (en) 1984-02-06 1985-08-08 Siemens AG, 1000 Berlin und 8000 München CONTROL STAFF
JPS60221280A (en) 1984-04-19 1985-11-05 三菱電機株式会社 Industrial robot hand device
US4974151A (en) 1985-02-21 1990-11-27 International Business Machines Corporation Configuration capability for devices in an open system having the capability of adding or changing devices by user commands
US4784042A (en) 1986-02-12 1988-11-15 Nathaniel A. Hardin Method and system employing strings of opposed gaseous-fluid inflatable tension actuators in jointed arms, legs, beams and columns for controlling their movements
US5078140A (en) 1986-05-08 1992-01-07 Kwoh Yik S Imaging device - aided robotic stereotaxis system
US4802487A (en) 1987-03-26 1989-02-07 Washington Research Foundation Endoscopically deliverable ultrasound imaging system
US4884557A (en) 1987-05-15 1989-12-05 Olympus Optical Co., Ltd. Endoscope for automatically adjusting an angle with a shape memory alloy
GB2211280B (en) 1987-10-16 1991-10-30 Daco Scient Limited Joystick
US5303148A (en) 1987-11-27 1994-04-12 Picker International, Inc. Voice actuated volume image controller and display controller
US4962448A (en) 1988-09-30 1990-10-09 Demaio Joseph Virtual pivot handcontroller
US5449345A (en) 1989-03-17 1995-09-12 Merit Medical Systems, Inc. Detachable and reusable digital control unit for monitoring balloon catheter data in a syringe inflation system
US5661253A (en) 1989-11-01 1997-08-26 Yamaha Corporation Control apparatus and electronic musical instrument using the same
US5107080A (en) 1989-12-01 1992-04-21 Massachusetts Institute Of Technology Multiple degree of freedom damped hand controls
US6413234B1 (en) 1990-02-02 2002-07-02 Ep Technologies, Inc. Assemblies for creating compound curves in distal catheter regions
US5298930A (en) 1990-05-25 1994-03-29 Olympus Optical Co., Ltd. Camera and film winding mechanism thereof
US5170817A (en) 1991-04-03 1992-12-15 Sherwood Medical Company Support device for fluid delivery system and case therefore
US5339799A (en) 1991-04-23 1994-08-23 Olympus Optical Co., Ltd. Medical system for reproducing a state of contact of the treatment section in the operation unit
JPH05184526A (en) 1991-09-17 1993-07-27 Olympus Optical Co Ltd Bending mechanism for flexible tube
US5238005A (en) 1991-11-18 1993-08-24 Intelliwire, Inc. Steerable catheter guidewire
US6850252B1 (en) 1999-10-05 2005-02-01 Steven M. Hoffberg Intelligent electronic appliance system and method
CA2128606C (en) 1992-01-21 2008-07-22 Philip S. Green Teleoperator system and method with telepresence
ES2098729T3 (en) 1992-03-25 1997-05-01 Penny & Giles Blackwood Ltd CONTROL LEVER CONTROLLERS.
US5318525A (en) 1992-04-10 1994-06-07 Medtronic Cardiorhythm Steerable electrode catheter
US6290683B1 (en) 1992-04-29 2001-09-18 Mali-Tech Ltd. Skin piercing needle assembly
AU675077B2 (en) 1992-08-14 1997-01-23 British Telecommunications Public Limited Company Position location system
US5441483A (en) 1992-11-16 1995-08-15 Avitall; Boaz Catheter deflection control
US5389073A (en) 1992-12-01 1995-02-14 Cardiac Pathways Corporation Steerable catheter with adjustable bend location
JPH06314103A (en) 1993-04-30 1994-11-08 Fujitsu Ltd Controller and passive sensing device
US5410638A (en) 1993-05-03 1995-04-25 Northwestern University System for positioning a medical instrument within a biotic structure using a micromanipulator
US5396266A (en) 1993-06-08 1995-03-07 Technical Research Associates, Inc. Kinesthetic feedback apparatus and method
JPH06344285A (en) 1993-06-08 1994-12-20 Toshiba Corp robot
US5545200A (en) 1993-07-20 1996-08-13 Medtronic Cardiorhythm Steerable electrophysiology catheter
US5607462A (en) 1993-09-24 1997-03-04 Cardiac Pathways Corporation Catheter assembly, catheter and multi-catheter introducer for use therewith
US5876325A (en) 1993-11-02 1999-03-02 Olympus Optical Co., Ltd. Surgical manipulation system
US5623582A (en) 1994-07-14 1997-04-22 Immersion Human Interface Corporation Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects
US5706827A (en) 1994-09-21 1998-01-13 Scimed Life Systems, Inc. Magnetic lumen catheter
GB2295662A (en) 1994-11-28 1996-06-05 Wah Leung Chan Joystick eg for video games
US5882206A (en) 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
JPH08280709A (en) 1995-04-18 1996-10-29 Olympus Optical Co Ltd Display device for operation
EP0829066B1 (en) 1995-06-02 2001-03-14 Gerhard Wergen Analogue control element
US5691898A (en) 1995-09-27 1997-11-25 Immersion Human Interface Corp. Safe and low cost computer peripherals with force feedback for consumer applications
DE69637531D1 (en) 1995-06-07 2008-06-26 Stanford Res Inst Int Surgical manipulator for a remote-controlled robot system
US5630783A (en) 1995-08-11 1997-05-20 Steinberg; Jeffrey Portable cystoscope
US5828813A (en) 1995-09-07 1998-10-27 California Institute Of Technology Six axis force feedback input device
US5784542A (en) 1995-09-07 1998-07-21 California Institute Of Technology Decoupled six degree-of-freedom teleoperated robot system
US6219032B1 (en) 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US6063095A (en) 1996-02-20 2000-05-16 Computer Motion, Inc. Method and apparatus for performing minimally invasive surgical procedures
US5807377A (en) 1996-05-20 1998-09-15 Intuitive Surgical, Inc. Force-reflecting surgical instrument and positioning mechanism for performing minimally invasive surgery with enhanced dexterity and sensitivity
US5854622A (en) 1997-01-17 1998-12-29 Brannon; Daniel J. Joystick apparatus for measuring handle movement with six degrees of freedom
JPH10216238A (en) 1997-02-05 1998-08-18 Mitsubishi Cable Ind Ltd Bending mechanism
US5861024A (en) 1997-06-20 1999-01-19 Cardiac Assist Devices, Inc Electrophysiology catheter and remote actuator therefor
US6123699A (en) 1997-09-05 2000-09-26 Cordis Webster, Inc. Omni-directional steerable catheter
US6281651B1 (en) 1997-11-03 2001-08-28 Immersion Corporation Haptic pointing devices
US6088019A (en) 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
EP0973440B1 (en) 1998-01-22 2006-08-02 Biosense Webster, Inc. Intrabody measurement
US6692485B1 (en) 1998-02-24 2004-02-17 Endovia Medical, Inc. Articulated apparatus for telemanipulator system
US20020087048A1 (en) 1998-02-24 2002-07-04 Brock David L. Flexible instrument
US6233504B1 (en) 1998-04-16 2001-05-15 California Institute Of Technology Tool actuation and force feedback on robot-assisted microsurgery system
US7263397B2 (en) 1998-06-30 2007-08-28 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for catheter navigation and location and mapping in the heart
US6319227B1 (en) 1998-08-05 2001-11-20 Scimed Life Systems, Inc. Automatic/manual longitudinal position translator and rotary drive system for catheters
US6113395A (en) 1998-08-18 2000-09-05 Hon; David C. Selectable instruments with homing devices for haptic virtual reality medical simulation
US6142940A (en) 1998-10-06 2000-11-07 Scimed Life Systems, Inc. Control panel for intravascular ultrasonic imaging system
US7193521B2 (en) 1998-10-29 2007-03-20 Medtronic Minimed, Inc. Method and apparatus for detecting errors, fluid pressure, and occlusions in an ambulatory infusion pump
US6040758A (en) 1998-11-06 2000-03-21 Midway Games Inc. Potentiometer mounting clip for a joystick controller
US6396266B1 (en) 1998-11-25 2002-05-28 General Electric Company MR imaging system with interactive MR geometry prescription control
US6799065B1 (en) 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US7107539B2 (en) 1998-12-18 2006-09-12 Tangis Corporation Thematic response to a computer user's context, such as by a wearable personal computer
US7386339B2 (en) 1999-05-18 2008-06-10 Mediguide Ltd. Medical imaging and navigation system
US6233476B1 (en) 1999-05-18 2001-05-15 Mediguide Ltd. Medical positioning system
US6709667B1 (en) 1999-08-23 2004-03-23 Conceptus, Inc. Deployment actuation system for intrafallopian contraception
US7333648B2 (en) 1999-11-19 2008-02-19 General Electric Company Feature quantification from multidimensional image data
US20010025183A1 (en) 2000-02-25 2001-09-27 Ramin Shahidi Methods and apparatuses for maintaining a trajectory in sterotaxi for tracking a target inside a body
US8888688B2 (en) 2000-04-03 2014-11-18 Intuitive Surgical Operations, Inc. Connector device for a controllable instrument
US6468203B2 (en) 2000-04-03 2002-10-22 Neoguide Systems, Inc. Steerable endoscope and improved method of insertion
US6869390B2 (en) 2000-06-05 2005-03-22 Mentor Corporation Automated implantation system for radioisotope seeds
US6540685B1 (en) 2000-11-09 2003-04-01 Koninklijke Philips Electronics N.V. Ultrasound diagnostic device
GB0100729D0 (en) 2001-01-11 2001-02-21 Rehab Robotics Ltd Robotic arrangement
US7766894B2 (en) 2001-02-15 2010-08-03 Hansen Medical, Inc. Coaxial catheter system
WO2002087676A2 (en) 2001-04-27 2002-11-07 C.R. Bard, Inc. Electrophysiology catheter for mapping and/or ablation
DK1389958T3 (en) 2001-05-06 2009-01-12 Stereotaxis Inc Catheter delivery system
US20020184055A1 (en) 2001-05-29 2002-12-05 Morteza Naghavi System and method for healthcare specific operating system
US20040243147A1 (en) 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
JP2003024336A (en) 2001-07-16 2003-01-28 Hitachi Ltd Operation instrument
US6728599B2 (en) 2001-09-07 2004-04-27 Computer Motion, Inc. Modularity system for computer assisted surgery
US6785358B2 (en) 2001-10-09 2004-08-31 General Electric Company Voice activated diagnostic imaging control user interface
US6671533B2 (en) 2001-10-11 2003-12-30 Irvine Biomedical Inc. System and method for mapping and ablating body tissue of the interior region of the heart
US6839612B2 (en) 2001-12-07 2005-01-04 Institute Surgical, Inc. Microwrist system for surgical procedures
FR2833367B1 (en) 2001-12-10 2004-01-30 Commissariat Energie Atomique CONTROL DEVICE WITH TENSILE CABLES
US6869010B2 (en) 2001-12-28 2005-03-22 Xerox Corporation In-line automated dual or selective multi-hole punch
US6968223B2 (en) 2002-02-01 2005-11-22 Ge Medical Systems Global Technology Company, Llc System and method for wireless voice control of an interventional or diagnostic medical device
US7311705B2 (en) 2002-02-05 2007-12-25 Medtronic, Inc. Catheter apparatus for treatment of heart arrhythmia
AU2003281217A1 (en) 2002-07-09 2004-01-23 Saeed Behzadipour Light weight parallel manipulators using active/passive cables
JP3973504B2 (en) 2002-07-15 2007-09-12 株式会社日立製作所 Tow positioning device
US7630752B2 (en) 2002-08-06 2009-12-08 Stereotaxis, Inc. Remote control of medical devices using a virtual device interface
NZ521094A (en) 2002-08-30 2005-02-25 Holmes Solutions Ltd Apparatus for testing tension of elongated flexible member
JP4712385B2 (en) 2002-09-06 2011-06-29 ヒル−ロム サービシーズ,インコーポレイティド Hospital bed
JP2004208922A (en) 2002-12-27 2004-07-29 Olympus Corp Medical apparatus, medical manipulator and control process for medical apparatus
GB2397177B (en) 2003-01-11 2006-03-08 Eleksen Ltd Manually deformable input device
KR100526741B1 (en) 2003-03-26 2005-11-08 김시학 Tension Based Interface System for Force Feedback and/or Position Tracking and Surgically Operating System for Minimally Incising the affected Part Using the Same
US20040199052A1 (en) 2003-04-01 2004-10-07 Scimed Life Systems, Inc. Endoscopic imaging system
US7247139B2 (en) 2003-09-09 2007-07-24 Ge Medical Systems Global Technology Company, Llc Method and apparatus for natural voice control of an ultrasound machine
US7195599B2 (en) 2003-10-22 2007-03-27 Medtronic Vascular, Inc. Instrumented catheter with distance compensation to sense vulnerable plaque
US20070276214A1 (en) 2003-11-26 2007-11-29 Dachille Frank C Systems and Methods for Automated Segmentation, Visualization and Analysis of Medical Images
US8164573B2 (en) 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
US8046049B2 (en) 2004-02-23 2011-10-25 Biosense Webster, Inc. Robotically guided catheter
WO2005087128A1 (en) 2004-03-05 2005-09-22 Hansen Medical, Inc. Robotic catheter system
US20060100610A1 (en) 2004-03-05 2006-05-11 Wallace Daniel T Methods using a robotic catheter system
US8052636B2 (en) 2004-03-05 2011-11-08 Hansen Medical, Inc. Robotic catheter system and methods
JP3999214B2 (en) 2004-03-31 2007-10-31 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー MEDICAL INFORMATION DISPLAY METHOD, DEVICE, AND PROGRAM
JP3922284B2 (en) 2004-03-31 2007-05-30 有限会社エスアールジェイ Holding device
US10258285B2 (en) 2004-05-28 2019-04-16 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for automated creation of ablation lesions
US9782130B2 (en) 2004-05-28 2017-10-10 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system
US8755864B2 (en) 2004-05-28 2014-06-17 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for diagnostic data mapping
US8528565B2 (en) 2004-05-28 2013-09-10 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for automated therapy delivery
US7197354B2 (en) 2004-06-21 2007-03-27 Mediguide Ltd. System for determining the position and orientation of a catheter
WO2006046084A1 (en) 2004-09-20 2006-05-04 Nokia Corporation Foldable cellular phone device
US20060089637A1 (en) 2004-10-14 2006-04-27 Werneth Randell L Ablation catheter
US7736384B2 (en) 2005-01-07 2010-06-15 Rex Medical, L.P. Cartridge for vascular device
US20070172803A1 (en) 2005-08-26 2007-07-26 Blake Hannaford Skill evaluation
US7963288B2 (en) * 2005-05-03 2011-06-21 Hansen Medical, Inc. Robotic catheter system
US7742803B2 (en) 2005-05-06 2010-06-22 Stereotaxis, Inc. Voice controlled user interface for remote navigation systems
US8257302B2 (en) 2005-05-10 2012-09-04 Corindus, Inc. User interface for remote control catheterization
US20070016008A1 (en) 2005-06-23 2007-01-18 Ryan Schoenefeld Selective gesturing input to a surgical navigation system
US7465288B2 (en) 2005-06-28 2008-12-16 St. Jude Medical, Atrial Fibrillation Division, Inc. Actuation handle for a catheter
US20070005002A1 (en) 2005-06-30 2007-01-04 Intuitive Surgical Inc. Robotic surgical instruments for irrigation, aspiration, and blowing
EP1907041B1 (en) 2005-07-11 2019-02-20 Catheter Precision, Inc. Remotely controlled catheter insertion system
JP2009507617A (en) 2005-09-14 2009-02-26 ネオガイド システムズ, インコーポレイテッド Method and apparatus for performing transluminal and other operations
US7643862B2 (en) 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
US7885707B2 (en) 2005-09-15 2011-02-08 St. Jude Medical, Atrial Fibrillation Division, Inc. Method of scaling navigation signals to account for impedance drift in tissue
JP4763420B2 (en) 2005-10-27 2011-08-31 オリンパスメディカルシステムズ株式会社 Endoscope operation assistance device
US7945546B2 (en) 2005-11-07 2011-05-17 Google Inc. Local search and mapping for mobile devices
DE102005054575B3 (en) 2005-11-16 2007-04-26 Deutsches Zentrum für Luft- und Raumfahrt e.V. Robot arm regulating method, for medical engineering, involves utilizing redundancy of hinges to optimize quality factor to adjust hinges at angle that is perpendicular to instrument axis, where force caused by regulating hinges is zero
US8190238B2 (en) 2005-12-09 2012-05-29 Hansen Medical, Inc. Robotic catheter system and methods
EP2289455B1 (en) 2005-12-30 2019-11-13 Intuitive Surgical Operations, Inc. Modular force sensor
EP1815950A1 (en) 2006-02-03 2007-08-08 The European Atomic Energy Community (EURATOM), represented by the European Commission Robotic surgical system for performing minimally invasive medical procedures
US9910497B2 (en) 2006-02-08 2018-03-06 Oblong Industries, Inc. Gestural control of autonomous and semi-autonomous systems
EP1986563B1 (en) 2006-02-22 2012-12-26 Hansen Medical, Inc. System and apparatus for measuring distal forces on a working instrument
US8989528B2 (en) 2006-02-22 2015-03-24 Hansen Medical, Inc. Optical fiber grating sensors and methods of manufacture
US20080091169A1 (en) 2006-05-16 2008-04-17 Wayne Heideman Steerable catheter using flat pull wires and having torque transfer layer made of braided flat wires
ATE472980T1 (en) 2006-05-17 2010-07-15 Hansen Medical Inc ROBOTIC INSTRUMENT SYSTEM
WO2007136769A2 (en) 2006-05-19 2007-11-29 Mako Surgical Corp. Method and apparatus for controlling a haptic device
EP2040635A1 (en) 2006-06-14 2009-04-01 MacDonald Dettwiler & Associates Inc. Surgical manipulator with right-angle pulley drive mechanisms
US20080058595A1 (en) 2006-06-14 2008-03-06 Snoke Phillip J Medical device introduction systems and methods
US9579088B2 (en) 2007-02-20 2017-02-28 Board Of Regents Of The University Of Nebraska Methods, systems, and devices for surgical visualization and device manipulation
US20080013809A1 (en) 2006-07-14 2008-01-17 Bracco Imaging, Spa Methods and apparatuses for registration in image guided surgery
US20080112842A1 (en) 2006-11-09 2008-05-15 Advanced Medical Optics, Inc. Monitor drape with vents
US8543338B2 (en) 2007-01-16 2013-09-24 Simbionix Ltd. System and method for performing computerized simulations for image-guided procedures using a patient specific model
US20080249536A1 (en) 2007-02-15 2008-10-09 Hansen Medical, Inc. Interface assembly for controlling orientation of robotically controlled medical instrument
KR100954594B1 (en) 2007-02-23 2010-04-26 (주)티피다시아이 Virtual keyboard input system using pointing device used in digital equipment
US8560118B2 (en) 2007-04-16 2013-10-15 Neuroarm Surgical Ltd. Methods, devices, and systems for non-mechanically restricting and/or programming movement of a tool of a manipulator along a single axis
WO2008133956A2 (en) 2007-04-23 2008-11-06 Hansen Medical, Inc. Robotic instrument control system
US8317711B2 (en) 2007-06-16 2012-11-27 St. Jude Medical, Atrial Fibrillation Division, Inc. Oscillating phased-array ultrasound imaging catheter system
JP4092365B2 (en) 2007-07-05 2008-05-28 株式会社東芝 Medical manipulator
TW200907764A (en) 2007-08-01 2009-02-16 Unique Instr Co Ltd Three-dimensional virtual input and simulation apparatus
WO2009021179A1 (en) 2007-08-09 2009-02-12 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
EP2626030A3 (en) 2007-08-14 2017-03-08 Koninklijke Philips N.V. Robotic instrument systems and methods utilizing optical fiber sensors
KR101442542B1 (en) 2007-08-28 2014-09-19 엘지전자 주식회사 Input device and portable terminal having the same
US9050120B2 (en) 2007-09-30 2015-06-09 Intuitive Surgical Operations, Inc. Apparatus and method of user interface with alternate tool mode for robotic surgical tools
JP5154961B2 (en) 2008-01-29 2013-02-27 テルモ株式会社 Surgery system
US8926511B2 (en) 2008-02-29 2015-01-06 Biosense Webster, Inc. Location system with virtual touch screen
US8641663B2 (en) 2008-03-27 2014-02-04 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system input device
US8317745B2 (en) 2008-03-27 2012-11-27 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter rotatable device cartridge
US9161817B2 (en) 2008-03-27 2015-10-20 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
US8317744B2 (en) 2008-03-27 2012-11-27 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter manipulator assembly
US8343096B2 (en) 2008-03-27 2013-01-01 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
EP2331203B1 (en) * 2008-04-29 2016-03-23 Medtronic, Inc. Therapy program modification based on therapy guidelines
CN101615102A (en) 2008-06-26 2009-12-30 鸿富锦精密工业(深圳)有限公司 Touch screen based input method
KR101840405B1 (en) 2008-07-10 2018-03-20 리얼 뷰 이미징 리미티드 Broad viewing angle displays and user interfaces
US8332072B1 (en) 2008-08-22 2012-12-11 Titan Medical Inc. Robotic hand controller
EP2320990B2 (en) 2008-08-29 2023-05-31 Corindus, Inc. Catheter control system and graphical user interface
US8390438B2 (en) 2008-09-24 2013-03-05 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system including haptic feedback
US8315720B2 (en) 2008-09-26 2012-11-20 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US20100079386A1 (en) 2008-09-30 2010-04-01 Scott Steven J Human-machine interface having multiple touch combinatorial input
JP5587331B2 (en) 2008-11-21 2014-09-10 ストライカー コーポレイション Wireless operating room communication system
WO2010068783A1 (en) 2008-12-12 2010-06-17 Corindus Inc. Remote catheter procedure system
TWI378382B (en) 2009-02-13 2012-12-01 Htc Corp Method, apparatus and computer program product for preventing on-screen buttons from being mistakenly touched
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
EP2408509B1 (en) 2009-03-18 2023-08-09 Corindus, Inc. Remote catheter system with steerable catheter
CN107510506A (en) 2009-03-24 2017-12-26 伊顿株式会社 Utilize the surgical robot system and its control method of augmented reality
US8996173B2 (en) 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US9734285B2 (en) * 2010-05-20 2017-08-15 General Electric Company Anatomy map navigator systems and methods of use
US20120133600A1 (en) * 2010-11-26 2012-05-31 Hologic, Inc. User interface for medical image review workstation
US20130154913A1 (en) 2010-12-16 2013-06-20 Siemens Corporation Systems and methods for a gaze and gesture interface
US8920368B2 (en) 2011-12-22 2014-12-30 St. Jude Medical, Atrial Fibrillation Division, Inc. Multi-user touch-based control of a remote catheter guidance system (RCGS)
US9625993B2 (en) 2012-01-11 2017-04-18 Biosense Webster (Israel) Ltd. Touch free operation of devices by use of depth sensors
US9931154B2 (en) 2012-01-11 2018-04-03 Biosense Webster (Israel), Ltd. Touch free operation of ablator workstation by use of depth sensors


Also Published As

Publication number Publication date
US9330497B2 (en) 2016-05-03
WO2013025257A1 (en) 2013-02-21
US20130041243A1 (en) 2013-02-14

Similar Documents

Publication Publication Date Title
US9330497B2 (en) User interface devices for electrophysiology lab diagnostic and therapeutic equipment
US10357322B2 (en) System and method for controlling a remote medical device guidance system in three-dimensions using gestures
US11960665B2 (en) Systems and methods of steerable elongate device
JP7075439B2 (en) Cardiac analysis user interface system and method
US8368649B2 (en) Control system for use within a sterile environment
US12114955B2 (en) Dynamic scaling of surgical manipulator motion based on surgeon stress parameters
US20230157757A1 (en) Extended Intelligence for Pulmonary Procedures
US8923959B2 (en) Methods and system for real-time cardiac mapping
US9241768B2 (en) Intelligent input device controller for a robotic catheter system
EP2096523B1 (en) Location system with virtual touch screen
US20190164633A1 (en) System and method for interactive event timeline
US8708902B2 (en) Catheter configuration interface and related system
JP7282495B2 (en) Interactive display of selected ECG channel
US20140058246A1 (en) System and methods for real-time cardiac mapping
JP2018518244A (en) Ultrasonic sequencing system and method
EP2931160B1 (en) Position determination apparatus
US20140171785A1 (en) Recognizing which instrument is currently active
US20210212773A1 (en) System and method for hybrid control using eye tracking
EP3944834A1 (en) Navigation operation instructions
Kogkas A Gaze-contingent Framework for Perceptually-enabled Applications in Healthcare

Legal Events

Date Code Title Description
AS Assignment

Owner name: ST. JUDE MEDICAL, ATRIAL FIBRILLATION DIVISION, IN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BYRD, CHARLES B.;BETZLER, ERIC;DANI, SANDEEP;AND OTHERS;SIGNING DATES FROM 20111010 TO 20111018;REEL/FRAME:039604/0465

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION