WO2023091570A1 - Automated triggers for electrophysiology case support - Google Patents

Automated triggers for electrophysiology case support

Info

Publication number
WO2023091570A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
computing apparatus
instructions
workflow
request
Application number
PCT/US2022/050247
Other languages
English (en)
Inventor
Hae Won Lim
Gregory Scott Brumfield
William E. ROWLAND
Qingguo Zeng
Quin LOU
David A. Simon
Timothy G. Laske
Original Assignee
Cardioinsight Technologies Inc.
Priority claimed from US 17/986,473 (published as US 2023/0162854 A1)
Application filed by Cardioinsight Technologies Inc.
Publication of WO2023091570A1

Classifications

    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/67: ICT specially adapted for the remote operation of medical equipment or devices
    • G16H 80/00: ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • A61B 5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/353: Detecting P-waves
    • A61B 5/367: Electrophysiological study [EPS], e.g. electrical activation mapping or electro-anatomical mapping
    • A61B 5/7203: Signal processing specially adapted for physiological signals, for noise prevention, reduction or removal
    • A61B 5/7246: Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • A61B 5/7282: Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B 5/287: Holders for multiple electrodes, e.g. electrode catheters for electrophysiological study [EPS]
    • A61B 5/6852: Catheters (sensors mounted on an invasive device brought into contact with an internal body part)

Definitions

  • the present technology is generally related to automated triggers for case support that can occur during electrophysiology procedures.
  • Electrophysiology procedures are used to analyze, diagnose and/or treat cardiac electrical activities. Electrophysiology procedures usually take place in an electrophysiology (EP) lab or a catheterization (Cath) lab at a hospital or other medical facility.
  • an EP mapping procedure can be performed as an invasive procedure in which one or more electrode catheters are placed in or on the heart to measure electrophysiology signals.
  • the EP mapping procedure may be performed using a non-invasive arrangement of electrodes distributed across an outer surface of the patient’s body (e.g., on the thorax).
  • a clinical specialist may be present to provide guidance and technical support before, during, and/or after the EP procedure. The use of clinical specialists at the site of an EP procedure can result in significant expense due to salaries and travel and often is an inefficient use of the mapping specialist's time, especially in simple cases or when a case is canceled.
  • the techniques of this disclosure generally relate to systems and methods for implementing automated triggers for case support during electrophysiology procedures.
  • the present disclosure provides one or more non-transitory machine-readable media to store data and instructions executable by one or more processors.
  • the instructions are programmed to analyze workflow data for a given phase of a plurality of phases of an ongoing electrophysiology (EP) workflow implemented using a first computing apparatus.
  • the instructions can also determine at least one support event based on the analysis of the workflow data and send a request to at least one remote support specialist responsive to determining the at least one support event.
  • the instructions further can establish a communication link between the first computing apparatus and a second computing apparatus, which is associated with a respective remote support specialist, and enable user interaction with and control of the machine-readable EP workflow instructions on the first computing apparatus responsive to a user input by the respective remote support specialist at the second computing apparatus.
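  • As an illustrative sketch of the sequence summarized above (not the patented implementation), the following Python code shows how an automated trigger loop might be organized; the names (WorkflowPhase, SupportEvent, detect_event, send_request) are hypothetical.
```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Optional


class WorkflowPhase(Enum):
    PRE_PROCEDURE = auto()
    INTRA_PROCEDURE = auto()


@dataclass
class SupportEvent:
    phase: WorkflowPhase
    reason: str       # e.g. "low channel integrity", "ambiguous map"
    details: dict


def run_trigger_loop(get_workflow_data: Callable[[], dict],
                     detect_event: Callable[[dict, WorkflowPhase], Optional[SupportEvent]],
                     send_request: Callable[[SupportEvent], None],
                     phase: WorkflowPhase) -> None:
    """Analyze workflow data for the current phase; if a support event is
    detected, send a request to one or more remote support specialists."""
    data = get_workflow_data()
    event = detect_event(data, phase)
    if event is not None:
        # A communication link and remote-control session would then be
        # established once a specialist accepts the request.
        send_request(event)
```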
  • FIG. 1 is a flow diagram that illustrates a method for triggering remote support for an electrophysiology encounter.
  • FIG. 2 is a block diagram that illustrates a system configured to implement automated triggers for electrophysiology case support.
  • FIG. 3 is a block diagram that illustrates software code programmed to implement automated triggers for electrophysiology case support.
  • FIG. 4 is a block diagram that illustrates further software functions programmed to implement the event detector of FIG. 3.
  • FIG. 5 is a block diagram that illustrates an example system architecture to implement automated triggers for electrophysiology case support.
  • FIG. 6 is a block diagram that illustrates another example system architecture with multiple remote specialists to implement electrophysiology case support.
  • a computing apparatus (e.g., an EP computer workstation) can be configured to execute machine-readable instructions to implement an EP workflow session for a patient.
  • the EP workflow session can have a plurality of phases, such as a pre-procedure phase and an intraprocedure (e.g., mapping and analysis) phase.
  • the pre-procedure phase can involve electrodes being applied to the patient for measuring EP signals and geometry data being generated (e.g., from imaging or other geometry generating mechanism) to enable reconstruction of EP signals on a surface of interest.
  • the intraprocedure phase can involve acquiring and processing of measured EP signals, selecting signals of interest and generating one or more EP maps on the surface of interest.
  • Systems and methods disclosed herein can be configured to implement automated triggers for support by a remote support specialist.
  • one or more support events can be determined based on an analysis of the workflow data for a given phase of the plurality of phases.
  • the workflow data can relate to a user condition (e.g., stress level, body temperature or the like), user actions (e.g., the user performing illogical steps), or physiological changes in the patient (e.g., an unexpected rhythm change or a change to a rhythm type related to a condition in which the user is not an expert).
  • the determination of a given support event can be used as a trigger for issuing a request for a remote support specialist.
  • a request (e.g., a notification or message) can be sent to one or more remote support specialists in response to the support event, depending on the workflow data and the phase of the EP workflow.
  • the support event may be detected based on determining an exception, complexity or ambiguity from the workflow data.
  • responsive to a respective remote support specialist responding to the request (e.g., accepting the request from a second computing apparatus), a communication link can be established between the first computing apparatus and the second computing apparatus.
  • the first computing apparatus can be configured to enable remote user interaction with and control of the machine-readable EP workflow instructions on the first computing apparatus responsive to user inputs by the respective remote support specialist at the second computing apparatus.
  • the control of the EP workflow instructions can be transferred from the local operator to the remote support specialist.
  • the remote support specialist can remotely control EP workflow instructions on the first computing apparatus in response to user input data provided at the second computing apparatus, such as to execute an upcoming event in the EP workflow and/or to revise (or regenerate) previously generated workflow data.
  • the types of controls that may be implemented can vary depending on the phase of the EP workflow and can also be exercised at the discretion of the remote support specialist.
  • a bidirectional video and/or audio connection can also be used (e.g., over the same or another communications link) to enable real-time verbal communication between the remote support specialist and the local operator.
  • the bidirectional video and/or audio connection can be implemented independently from the remote control function, which is activated responsive to determining a support event. Alternatively, the bidirectional video and/or audio connection can be provided each time a support event is detected.
  • the systems and methods disclosed herein thus can reduce costs in the overall EP workflow session because a reduced number of mapping specialists may be used to service a greater number of EP workflow sessions without requiring travel.
  • a given remote specialist further can implement remote control from the second computing apparatus for multiple instances of EP workflow sessions running on respective computing apparatuses (e.g., for different patients), which remote control functions can be implemented concurrently or sequentially.
  • the approach described herein can enable worldwide assistance by connecting specific specialists having particular expertise relating to a situation or problem encountered.
  • FIG. 1 is a flow diagram that illustrates an example method 100 for implementing automated triggers for case support by a remote support specialist. While for purposes of simplicity of explanation, the example method of FIG. 1 is shown and described as executing serially, the example method 100 is not limited by the illustrated order, as some actions could in other examples occur in different orders, multiple times and/or concurrently from that shown and described herein.
  • the method 100 is implemented as machine-readable instructions executed by a processor, such as by a computing apparatus (e.g., a local computer, a cloud-based computer or a computing apparatus that includes hardware and/or software distributed between a local premise and a cloud computing architecture).
  • the method 100 can be implemented by instructions being executed by one or more processors of a cardiac mapping system, such as the CardioInsight® mapping products (e.g., mapping vest and workstation) available from Medtronic of Minneapolis, MN USA.
  • the method 100 includes analyzing workflow data during an EP workflow session, such as workflow data for a given phase of the plurality of phases of the EP workflow session.
  • the workflow data can include any data relating to a given electrophysiology (EP) workflow session that is being implemented for a respective patient using a first computing apparatus (e.g., a local computer or workstation).
  • workflow data can include patient data (e.g., patient demographic information, health history information and the like), image data (e.g., 3D anatomical image of the patient, such as including an arrangement of sensors on the patient’s thorax), geometry data (e.g., describing spatial relationship between a patient’s body surface and a surface of interest) and electrophysiological data (e.g., EP signal measurements) for a patient.
  • the workflow data can also relate to the user, such as a user condition and/or user actions (or omissions).
  • the workflow data can be obtained during the EP workflow session or prior to the workflow session.
  • workflow data describing a user condition can be based on determining a level of user stress (e.g., above a stress threshold).
  • Workflow data describing user actions can include the user performing illogical steps within the software (e.g., indicating confusion or a lack of knowledge about how to operate the system).
  • the system can track and store information, as part of the workflow data, representative of an experience and/or expertise of users and specialists.
  • the system can determine a support event based on workflow data indicating that a user (or local specialist) is not an expert or otherwise lacks experience related to a patient condition (e.g., an unexpected rhythm change or a change to a rhythm type).
  • a support event is determined based on the analysis of the workflow data.
  • the support event can represent any condition or part of the EP workflow session in response to which assistance from a remote specialist is to be requested.
  • a given EP workflow session can include predetermined portions of the workflow session, each of which is represented by workflow status data indicating when such portion of the workflow is active or is expected to be active (e.g., it will be implemented next in the workflow).
  • Certain portions of the workflow thus can be designated as requiring review and/or approval by a mapping specialist before proceeding to a next portion of the session, and a request (e.g., a notification) can be sent to obtain such approval from a remote specialist in response to detecting (based on the workflow status data specifying) when such portion of the workflow session is approaching or imminent.
  • One example of a portion of the workflow that can determine a support event is a final check of one or more maps or other generated EP data that is expected to be used to drive an intervention (e.g., treatment or therapy or implant).
  • the system may allow the user to define which step or combination of steps will automatically generate a request for support based upon the expertise of the local team and/or user.
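  • As a minimal sketch of such a user-defined configuration (the step names and structure below are hypothetical, not part of the disclosure), a site could map workflow steps to a flag indicating whether reaching that step should automatically generate a support request:
```python
# Hypothetical, user-editable configuration: which workflow steps should
# automatically trigger a request for remote support, given local expertise.
SUPPORT_TRIGGER_STEPS = {
    "segmentation_review": True,      # local team has limited imaging experience
    "channel_integrity_check": False,
    "final_map_check": True,          # always reviewed before driving an intervention
}


def step_requires_support(step_name: str) -> bool:
    """Return True if reaching this step should generate a support request."""
    return SUPPORT_TRIGGER_STEPS.get(step_name, False)
```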
  • Support events further can also be determined from analysis of workflow data that includes EP information that is generated or derived during a respective phase in overall EP workflow session.
  • support events can include determining a need for assistance in anatomic segmentation of pre-procedure images in support of a given patient’s case based on detecting an atypical anatomy for the patient (e.g., from image analysis or patient condition data).
  • a confidence value can be computed with respect to the segmented image data, and a confidence value that is below a threshold value can specify a support event.
  • a support event regarding image segmentation can be determined based on the image segmentation process taking an amount of time that exceeds a normal time period (e.g., which may be a statistical value or be an operator specific value).
  • the pre-procedure phase can also determine support events associated with the improper placement and/or contact of sensors placed on the patient’s body surface such as based on a determination of low channel integrity.
  • support events can also be determined at 104 based on which one or more mapping functions the operator has invoked in response to a user input.
  • the operator data can specify a qualification (or experience) level with respect to each of the available mapping and analysis functions implemented by the computing apparatus.
  • a support event thus can be determined at 104 in response to determining that the operator has selected a mapping or analysis function for which the operator has no experience or has an experience level that is below a threshold value.
  • a support event can be determined at 104 in response to determining that an electroanatomic map may be ambiguous (e.g., in which the map is determined to have a low confidence and/or incoherent portions).
  • the method 100 can access stored electrophysiological data that is provided based on electrophysiological signals measured from a plurality of sensors positioned on an outer surface of the patient’s body. Electrophysiological signals can be reconstructed on a surface of interest within the patient’s body (e.g., by solving the inverse problem) based on the electrophysiological data and geometry data. Examples of inverse algorithms that can be utilized to reconstruct the EP signals on a surface of interest are disclosed in U.S. Pat. Nos. 7,983,743 and 6,772,004, which are incorporated herein by reference.
  • An indication of noise for the reconstructed electrophysiological signals on the surface of interest further can be computed and, if the indication of noise for the reconstructed electrophysiological signals exceeds a noise threshold, the method can determine a support event. Additionally, one or more graphical maps can be generated based on the reconstructed EP signals on the surface of interest, and a physiological coherence can be calculated from the graphical map. If the physiological coherence (e.g., noise) computed for the graphical map describes a coherence value that is below a respective threshold, the method can determine the support event.
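  • A minimal sketch of such a check is shown below, assuming the reconstructed signals are held in a NumPy array; the noise and coherence metrics and their thresholds are placeholders, not the disclosed algorithms:
```python
import numpy as np


def reconstruction_support_event(reconstructed: np.ndarray,
                                 noise_threshold: float = 0.2,
                                 coherence_threshold: float = 0.6):
    """Illustrative check on reconstructed EP signals (nodes x samples).

    Uses a crude high-frequency residual as the 'indication of noise' and the
    mean correlation of neighboring reconstructed nodes as a stand-in for
    'physiological coherence'; both metrics and thresholds are placeholders.
    """
    # High-frequency residual relative to total signal energy.
    smoothed = np.apply_along_axis(
        lambda x: np.convolve(x, np.ones(5) / 5, mode="same"), 1, reconstructed)
    noise_index = np.linalg.norm(reconstructed - smoothed) / np.linalg.norm(reconstructed)

    # Mean correlation between adjacent reconstructed nodes.
    corr = np.corrcoef(reconstructed)
    coherence = float(np.mean(np.diag(corr, k=1)))

    if noise_index > noise_threshold or coherence < coherence_threshold:
        return {"reason": "noisy or incoherent reconstruction",
                "noise_index": float(noise_index),
                "coherence": coherence}
    return None
```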
  • the method includes sending a request to one or more remote support specialists responsive to determining at least one support event (at 104).
  • the request can be sent from the computing apparatus (e.g., which is executing instructions to perform the method 100) directly to one or more remote support specialists, or the request can be sent to a server (or other messaging platform) to manage sending the request to one or more remote support specialists.
  • One or more remote support specialists can receive the request at respective computing apparatuses, which can vary depending on the protocol used to send the request to the remote support specialists.
  • the request can be sent as an email (e.g., using Simple Mail Transfer Protocol (SMTP), Internet Message Access Protocol (IMAP), or other email protocol), text message (e.g., using short message service (SMS) protocol) or other messaging formats.
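  • As one hedged example of the email option, the Python standard library's smtplib could be used to send such a request; the host, sender and recipients below are placeholders:
```python
import smtplib
from email.message import EmailMessage


def send_support_request(event_summary: str,
                         recipients: list,
                         smtp_host: str = "smtp.example.org") -> None:
    """Send a support request as an email; host and recipients are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = "EP workflow support requested"
    msg["From"] = "ep-workstation@example.org"
    msg["To"] = ", ".join(recipients)
    msg.set_content(
        "A support event was detected during an EP workflow session.\n"
        f"Summary: {event_summary}\n"
        "Follow the secure link provided by the messaging service to accept."
    )
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```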
  • the second computing apparatus which is used by the remote support specialist, can be a smart phone or device, a desktop computer, a laptop computer, a tablet computer or a workstation.
  • the second computing apparatus includes a user input device (e.g., touchscreen, keyboard, mouse and/or other) to enable the remote support specialist to accept the request in response to a user input through the user input device.
  • the method 100 includes establishing a communication link between the first computing apparatus and the second computing apparatus, which is associated with a respective remote support specialist.
  • the second computing apparatus can be the same computing apparatus at which the specialist received the request that was sent at 106. Alternatively, the second computing apparatus can be different from the computing apparatus at which the specialist received the request that was sent at 106.
  • the communications link can provide a communications path through one or more networks, which can include local area networks, wide area networks or a combination of networks.
  • the communications path can further include one or more of wireless data communications (e.g., Bluetooth, WiFi, cellular data) and/or communications over physical paths (e.g., electrically conductive wires or traces and/or optical fibers).
  • the communications path can include one or more secure channels, such as using a secure shell (SSH) protocol, transport layer security and/or secure socket layer (SSL), to provide for bi-directional communications between the respective computing apparatuses.
  • the first computing device includes remote desktop software (e.g., machine readable instructions) programmed to implement remote desktop protocol (RDP) or other application sharing protocol (commercially available or proprietary) that is tunneled over SSH.
  • the second computing apparatus thus connects to the first computing apparatus through the communications link.
  • an unencrypted protocol can be used to transmit information between the first and second computing apparatuses.
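  • As an illustration of the SSH-tunneled application sharing described above (not the disclosed implementation), a remote specialist's machine could forward a local port to the EP workstation's remote desktop port using the standard ssh client; host names and ports are placeholders:
```python
import subprocess


def open_rdp_over_ssh(user: str = "specialist",
                      ep_host: str = "ep-workstation.example.org",
                      local_port: int = 3389,
                      rdp_port: int = 3389) -> subprocess.Popen:
    """Forward a local port to the EP workstation's RDP port over SSH.

    Equivalent to: ssh -N -L <local_port>:localhost:<rdp_port> <user>@<host>
    The remote desktop client then connects to localhost:<local_port>.
    """
    return subprocess.Popen([
        "ssh", "-N",
        "-L", f"{local_port}:localhost:{rdp_port}",
        f"{user}@{ep_host}",
    ])
```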
  • the remote support specialist can know the type of support that is needed at the current stage of the EP workflow session in advance.
  • a bidirectional video and/or audio connection is established, through which respective users can discuss current circumstances or other issues relating to the EP workflow session, including while rendering the remote support.
  • the method includes enabling user interaction with and control of the machine-readable EP workflow instructions running on the first computing apparatus responsive to user input by the respective remote support specialist at the second computing apparatus.
  • the first computing apparatus implements the application sharing code to provide the second computing apparatus (e.g., the remote support specialist) a graphical user interface through which the remote support specialist can access and utilize the EP workflow software system running on the first computing apparatus.
  • control is transferred to the remote support specialist in response to a user input by the operator at the first computing apparatus.
  • control is transferred to the remote support specialist automatically responsive to the remote specialist invoking activation instructions provided with the request.
  • the first computing apparatus can receive instructions in response to user input instructions provided at the second computing apparatus to control execution of the EP workflow instructions and/or workflow data at the first computing apparatus for an upcoming event or, if some instructions or functions need to be re-executed, to redo some or all of a past event. Additionally, in some examples, after the control is transferred, some level of controls can remain active at each of the first and second computing apparatuses (e.g., some or all control remains with the local operator), which further can enable collaboration in response to user inputs provided at each of the computing apparatuses.
  • the graphics and control afforded the remote support specialist at the second (remote) computing apparatus can be the same or comparable to that which are rendered locally on the first computing apparatus (e.g., as if the specialist were co-located with the operator). After the remote support specialist has rendered assistance, full control may be returned to the local operator at the first computing apparatus. Alternatively, when the control is transferred, controls can remain active at second computing apparatus until the communication link has been terminated (e.g., in response to a user input by either user).
  • FIG. 2 illustrates an example computing system 200 that can implement the method 100.
  • the description of FIG. 2 also refers to the method of FIG. 1.
  • Further examples of support events that can be determined, as well as further examples of workflow instructions that can be controlled by the support specialist through the communications link and included in the method 100, may be better appreciated with reference to the following description of FIGS. 2-8.
  • the computing system 200 of FIG. 2 includes an EP system 202.
  • the EP system 202 can be implemented as any system configured to perform stimulation, perform ablation, measure EP signals, and/or perform mapping based on signal measurements, such as can be used in an EP laboratory. Examples of some types of EP system 202 are disclosed in U.S. Patent Nos.
  • the system includes a sensor array 204 that includes one or more sensors configured to measure EP signals from a patient's body.
  • the sensor array 204 includes an arrangement of body surface sensors to measure EP signals non-invasively from an external body surface.
  • One example of an arrangement of body surface sensors that can be used to implement the sensor array 204 is disclosed in U.S. Patent No. 9,655,561, which is incorporated herein by reference. Other arrangements of body surface sensors can be used in other examples.
  • the sensor array 204 can measure EP signals from within the patient’s body, such as from a cardiac surface (e.g., by contact or non-contact sensors).
  • the sensor array 204 includes a plurality of input channels configured to receive electrical signals from respective electrical sensors (e.g., electrodes).
  • the input channels provide electrical signals representing EP signal measurements according to the location where the sensors are placed (e.g., non-invasively on the body surface and/or invasively within the patient’s body).
  • Each of the electrical sensors in the array 204 is coupled to an amplifier/interface 206, such as through one or more connectors.
  • the amplifier/interface 206 thus can receive signal measurements from any sensors in the array.
  • the amplifier/interface 206 can be configured to amplify the signals from each of the sensors and provide a set of amplified electrical signals via an output to a computing apparatus 208.
  • the amplifier/interface 206 can include signal processing circuitry, such as for filtering signals to remove noise, and/or circuitry configured to convert the measured EP signals to corresponding digital signals.
  • the computing apparatus 208 includes one or more processors 210, memory 212 and a communications interface 214.
  • the memory 212 can include one or more forms of memory (e.g., one or more non-transitory computer readable media) configured to store data and machine readable instructions that can be accessed by the processor 210 through a bus structure.
  • the processor thus can access data and execute respective instructions, including EP measurement and analysis software instructions, to perform functions and methods disclosed herein (see, e.g., FIGS. 1, 3 and 4).
  • a display 216 can be coupled with the computing apparatus 208.
  • the display can be coupled to a video interface (e.g., hardware and software) of the computing apparatus 208 through a cable.
  • the display can be integral with the computing apparatus 208.
  • the hardware and software implemented by the computing apparatus 208 are configured to control output data (e.g., information and graphics) that is provided to the display 216 for visualization.
  • the display 216 can include one or more of a monitor, projector, virtual reality headset or other type of display device.
  • the system 202 can also include one or more input devices 218, such as a pointing device (e.g., a mouse or touch screen) and/or other input device (e.g., a keyboard or gesture control).
  • a user thus can use the input devices 218 to interact with and control the computing apparatus 208 and instructions (e.g., GUI or other functions and methods) executed by the processor 210.
  • instructions in the memory 212 include a user interface (e.g., a graphical user interface) that can enable a user to control the data acquisition process, analysis and generation of EP maps and other outputs.
  • the display 216 may present the GUI and other visualization, and the user can enter commands and information into the computing apparatus through the input device 218 to enable interaction with the GUI and other functions and methods.
  • the EP system 202 can also include a camera.
  • the camera can include a microphone and be configured to capture real-time video and audio according to the field of view.
  • the video and audio can include a local operator at the site of the system 202, such as for use during a video session with one or more remote support specialists (at another computing apparatus 222).
  • the audio and/or video can be compressed to facilitate streaming thereof over a communications link.
  • the communications interface 214 is configured to connect the computing apparatus to a network 224 through a physical or wireless link, shown as 226.
  • the network 224 can include one or more local area networks (LAN), a wide area network (WAN), such as the Internet, and/or any other type or combination of computer networks.
  • the computing apparatus 208 can include more than one communications interface 214 to enable connections to the network 224 through one or more respective communication links according to the physical and data link layer (e.g., Ethernet, WiFi, cellular data) being implemented by the respective interface(s).
  • In response to the computing apparatus determining a support event based on workflow data (e.g., stored in memory 212), a request is sent to one or more remote support specialists through the link (or through multiple links) 226.
  • one or more remote support specialists can each have access to one or more remote computing apparatus 222.
  • the remote computing apparatus 222 can include a processor 228, memory 230 and a communications interface 232.
  • the computing apparatus 222 can be coupled to the network 224 through a respective communications link, shown as 233.
  • a secure tunnel (e.g., using secure shell (SSH)) can be established between the computing apparatus 222 and the computing apparatus 208 over the respective communications links.
  • the computing apparatus 222 can have the same or different configuration or form factor from the computing apparatus 208. Additionally, the computing apparatus 222 can include or be coupled to a display 234 as well as one or more user input devices 236. In some examples, the computing apparatus 222 can include or be coupled to a camera 238, such as to provide for bi-directional communications with the computing apparatus 208 during an EP workflow session.
  • the memory 230 can include instructions and data (e.g., a remote desktop client) configured to view and control the EP system 202 from the remote computing apparatus 222 (e.g., through a secure connection) in response to user inputs entered through the user input device 236.
  • FIG. 3 is a block diagram of the memory 212 that illustrates an example of data and instructions that can be stored in the memory.
  • the data includes workflow data 302.
  • the workflow data 302 can include any information associated with and/or describing electrical signal measurements, the sensing function or location of electrical sensors, one or more patient conditions, operator characteristics as well as other information that is input into the computing apparatus or derived from such information for the given EP workflow session.
  • the workflow data includes patient data 304, image data 306, geometry data 308 and EP data 310.
  • Patient data 304 can include patient demographics, such as can be retrieved from an electronic health record or entered by the operator or assistant.
  • the patient data 304 can describe a known physiological condition or disease, and can include electrical and/or anatomical features.
  • the patient data can specify special circumstances indicative of a likely more complex EP study due to complicating patient conditions.
  • the patient condition can describe an abnormal heart condition (e.g., Brugada syndrome).
  • the patient condition data can specify a structural complicating issue, such as a congenital or other deformity, scar tissue from previous procedure or injury or an implanted medical device (e.g., pacemaker and/or defibrillator).
  • the image data 306 can include two-dimensional or three-dimensional image data acquired for the patient using a medical imaging modality.
  • imaging modalities include ultrasound, computed tomography (CT), 3D Rotational angiography (3DRA), magnetic resonance imaging (MRI), x-ray, positron emission tomography (PET), fluoroscopy, and the like.
  • the image data 306 is acquired using an imaging modality while a sensor array 204 is placed on the patient’s body. In this way, the resulting image(s) in imaging data include both the patient’s anatomy and the sensors in the array 204.
  • the geometry data 308 is derived from the image data 306 to describe the spatial relationship for portions of the patient anatomy and the sensors in the array 204.
  • the geometry data 308 can include a representation of one or more geometrical surfaces, such as including a portion of an outer surface of the patient’s body and a surface of interest within the patient’s body.
  • the geometry data 308 can describe 3D spatial coordinates for each of the sensors (e.g., centroids of respective sensors) as well as describe a surface of interest within the patient’s body in a common coordinate system.
  • the sensors can include body surface sensors in the sensor array 204 as well as one or more invasive sensors positioned within the patient’s body (e.g., affixed to a catheter or other probe).
  • the surface of interest can correspond to a three-dimensional surface geometry corresponding to a surface of the patient's heart, which surface can be epicardial and/or endocardial.
  • the surface of interest can correspond to a geometric surface that resides between the epicardial surface of a patient's heart and the surface of the patient's body where a sensor array has been positioned.
  • the geometry data 308 can correspond to actual patient anatomical geometry, a preprogrammed generic model, or a combination thereof (e.g., a model that is modified based on patient anatomy).
  • the geometry data 308 may be in the form of a graphical representation of the patient's torso.
  • a geometry manager 312 can be configured to manage and generate the geometry data 308.
  • the geometry manager 312 includes or otherwise is configured to use a segmentation control programmed to extract and segment anatomical features, including one or more organs and other structures, from a digital image set corresponding to the image data 306.
  • the segmentation control 314 can perform thresholding or other operations to identify respective surfaces and surface boundaries in the image data 306, including for the surface of interest and the outer surface where the sensors are positioned.
  • the segmentation control 314 can be programmed to extract and segment each of the sensors in the sensor array 204, such as using an automatic segmentation method and/or in response to a user input, and store respective sensor locations in the geometry data 308.
  • Other non-imaging based techniques can also be utilized to obtain the position of the sensors in the sensor array, such as a digitizer or manual measurements, and the sensor locations can be registered in the spatial coordinate system with the surface of interest.
  • the geometry manager 312 can be programmed to identify locations on the surface of interest from the image data 306, such as in response to a user input or through automated extraction and segmentation methods. Resulting spatial coordinates for the sensor locations and surface(s) of interest can be stored as the geometry data 308.
  • the geometry manager 312 can provide the geometry data 308 as a mathematical model (e.g., a mesh having respective vertices connected by line segments to define each surface), such as can be a generic model or a model that has been constructed based on the image data 306 for the patient.
  • Appropriate anatomical or other landmarks, including locations for the sensors in the sensor array, can be identified and stored in the geometry data 308. The identification of such landmarks can be done manually (e.g., in response to a user input) or automatically (e.g., via image processing techniques) and, in some examples, can be implemented by segmentation control 314 or other functions in the geometry manager 312.
  • the EP data 310 includes electrophysiological signals measured by the sensors of the array 204.
  • the EP data 310 can also include electrophysiological signals measured by one or more sensors within the patient’s body.
  • the electrophysiological signals are acquired in real-time, such as during a procedure or study.
  • the EP data 310 can correspond to a real time data flow that can be acquired by non-invasive (e.g., body surface) sensors during a procedure such as during an electrophysiological study as well as during a treatment procedure that can include cardiac ablation.
  • the EP data 310 includes electrophysiological measurements acquired over an extended period of time prior to a procedure, such as by a Holter monitoring system or the like.
  • a signal processing function 316 can be applied to the EP data 310.
  • the EP data can be raw EP signals or can have had pre-processing applied.
  • pre-processing can include line-filtering, offset correction, analog-to- digital conversion and the like to remove selected noise components from the respective input channels such that the EP data 310 includes digital representations of the pre- processed signals for each input channel.
  • the signal processing function 316 can be configured to perform additional filtering and signal analysis methods, such as can be configured in response to a user input (e.g., though user input device 218).
  • the filtering can include notch, bandpass, high-pass and/or low-pass filtering.
  • Signal analysis can include fast Fourier transform, and frequency domain analysis. Other signal processing functions can be implemented in other examples.
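  • As a sketch of such filtering, assuming SciPy is available (the sampling rate and cutoff frequencies below are placeholders rather than values from the disclosure):
```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt


def preprocess_channel(signal: np.ndarray, fs: float = 1000.0) -> np.ndarray:
    """Apply a 60 Hz notch filter followed by a 0.5-100 Hz bandpass.

    The sampling rate and cutoff frequencies are illustrative placeholders.
    """
    # Notch out power-line interference at 60 Hz.
    b_notch, a_notch = iirnotch(w0=60.0, Q=30.0, fs=fs)
    notched = filtfilt(b_notch, a_notch, signal)

    # Bandpass to a physiologically relevant band.
    b_bp, a_bp = butter(N=4, Wn=[0.5, 100.0], btype="bandpass", fs=fs)
    return filtfilt(b_bp, a_bp, notched)
```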
  • the workflow data 302 can also include operator data for one or more operators utilizing the EP system 202.
  • a user-operator can use the user input device 218 to login to the computing apparatus 208 and to an EP application, which can retrieve operator characteristics from memory (local or remote).
  • the operator can manually enter relevant characteristics through the user input device 218.
  • Operator characteristics can include a qualification (or training) level of the operator with respect to each phase of the EP workflow session or conditions of the operator that can be sensed during the session.
  • an operator profile can be retrieved based on the operator login information, which includes respective qualification levels.
  • cameras 220 can be used to monitor operator gestures and/or facial expressions (e.g., change in skin hue, temperature changes, heart rate changes) that are indicative of stress and/or anxiety.
  • the instructions also include a map generator 318 configured to generate a graphical map representing reconstructed electrophysiological signals on the surface of interest for one or more intervals.
  • the map generator 318 can include a reconstruction engine 320 configured to reconstruct the electrophysiological signals on a surface of interest based on the geometry data 308 and the EP data 310.
  • the reconstruction engine 320 can be configured to compute an inverse solution to reconstruct electrophysiological signals on the surface of interest within the body of the patient based on the electrophysiological data 310 and the geometry data 308.
  • solutions to the inverse problem include a boundary element method (BEM) or a method of fundamental solution (MFS).
  • the inverse calculation can employ a transformation matrix, which is derived from the geometry data 308, to reconstruct electrical activity sensed on the patient's body onto the surface of interest.
  • Examples of inverse algorithms that can be utilized by the reconstruction engine 320 to implement the inverse solution include those disclosed in U.S. Pat. Nos. 7,983,743 and 6,772,004, each of which is incorporated herein by reference.
  • the reconstruction engine 320 thus can reconstruct body surface electrical activity measured via the sensors (e.g., the sensor array) on a body of the patient onto a multitude of locations (e.g., nodes) distributed across the surface of interest (e.g., an epicardial and/or endocardial surface).
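  • The disclosure cites specific inverse algorithms (e.g., BEM or MFS in the incorporated patents); purely as a generic illustration, a zero-order Tikhonov-regularized solution of the forward relation b = A x is sketched below, where A is a transfer matrix derived from the geometry data:
```python
import numpy as np


def reconstruct_surface_potentials(A: np.ndarray,
                                   body_surface: np.ndarray,
                                   lam: float = 1e-3) -> np.ndarray:
    """Zero-order Tikhonov inverse: x = argmin ||A x - b||^2 + lam ||x||^2.

    A            : transfer matrix (body-surface nodes x cardiac-surface nodes)
    body_surface : measured potentials (body-surface nodes x time samples)
    Returns reconstructed potentials on the surface of interest
    (cardiac-surface nodes x time samples). The regularization parameter
    is a placeholder; in practice it would be chosen per interval.
    """
    n = A.shape[1]
    lhs = A.T @ A + lam * np.eye(n)
    rhs = A.T @ body_surface
    return np.linalg.solve(lhs, rhs)
```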
  • the map generator 318 further may compute or derive one or more graphical maps from the reconstructed electrophysiological signals to provide an output on the display 216 and visualize features of the reconstructed signals across the surface of interest for one or more respective time intervals.
  • the map can include a potential map, a butterfly map, a phase map, a propagation map, and/or an activation map or other map, which type of map can be selected in response to a user input (e.g., though input device 218).
  • the EP system 202 can be configured to provide the map (or maps) and can modify features of such maps, such as in response to a user input selecting the type of map, one or more output parameters or signal intervals for processing and inclusion in the selected map.
  • the map generator 318 can generate the map from real-time EP signals that are being acquired, or the map can be generated for a prior time interval, such as can be selected in response to a user input (e.g., using input device 218).
  • the instructions also include an event detector 322 programmed to detect a support event associated with the EP workflow based on the workflow data 302.
  • the event detector 322 can be programmed to detect support events based on the workflow data 302 for one or more phases of the EP workflow process, such as including patient/sensor setup, segmentation of imaging data, generating geometry data, EP data acquisition and mapping phases. Thus, the event detector 322 can generate support event data to describe the detected event.
  • the support event data can be stored in the memory 212, such as part of log for the EP workflow.
  • the support event data can include an event name, supporting workflow data (or the results of analyzing the workflow data) to specify the cause (or causes) that triggered the event.
  • Additional workflow data 302 can be stored in memory 212 as part of the support event data for the event, including logistic information (e.g., time, date, etc.), patient information (e.g., patient ID, condition, health information, etc.), operator information (e.g., operator name/ID, affiliation, etc.) and the like.
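  • A minimal sketch of such a log entry (field names below are illustrative only, not the disclosed data model) might look like:
```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class SupportEventRecord:
    """Illustrative log entry stored with the workflow data for a support event."""
    event_name: str                 # e.g. "low channel integrity"
    phase: str                      # e.g. "pre-procedure", "mapping"
    cause: dict                     # supporting workflow data / analysis results
    timestamp: datetime = field(default_factory=datetime.utcnow)
    patient_id: str = ""            # de-identified where required
    operator_id: str = ""
```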
  • the event detector can provide an indication of the detected support event to the operator, such as by providing a text, graphical and/or audio message on the display or other output of the system 202.
  • a graphical element can be provided on a GUI of the display 216 to notify the operator of the detected support event.
  • the graphical element can be a button or other GUI element that the operator can activate in response to a user input (e.g., using user input device 218), such as to decline the request, accept the request and/or acknowledge the support event.
  • a support request engine 324 can be programmed to generate and send one or more requests in response to the event detector detecting a support event.
  • the request can include data specifying the support event that the event detector 322 has determined, such as based on the event support data.
  • the request data (and/or event support data) can be used to identify one or more support specialists who are qualified to render remote support for the detected support event.
  • the request can also include a resource locator (e.g., a uniform resource identifier, such as a URL or other link) that can be used (in response to a user input) by a respective remote support specialist, who receives the request, to accept and initiate providing remote support.
  • a support session can be established in response to (or when) the request is generated, in response to detecting the support event or in response to a remote specialist activating the session in response to a user input (e.g., through user input device 236 of a remote computing apparatus 222).
  • the request can also specify operator information (e.g., operator identity, experience level etc.) as well as data describing the patient’s condition.
  • the request engine 324 can be configured to remove any personal and health-related information of the patient as necessary to maintain HIPAA compliance and/or institutional security.
  • the event detector 322 can be programmed to (as a default condition) designate one or more portions of the workflow as requiring review and/or approval by a mapping specialist before proceeding to a next portion of the session.
  • the event detector 322 can trigger a support event, in response to which the support request engine 324 sends a request (e.g., a notification) to obtain approval from a remote specialist.
  • the event detector can trigger the support event before reaching the designated event (e.g., generate the support event when the event is approaching or imminent).
  • Another example of a portion of the workflow that can determine a support event is a final check of one or more maps or other generated EP data that is expected to be used to drive an intervention (e.g., treatment or therapy or implant).
  • the support request engine 324 is resident on the local computing apparatus 208. In other examples, the support request engine 324 resides on a remote computer, such as a server or cloud. In either configuration, the support request engine 324 can be configured to determine to which support specialists each request is sent. Alternatively, such as in the example where the support request engine 324 resides locally on computing apparatus 208, the support request engine 324 can send the request to a message service to control distribution of the request to one or more support specialists.
  • the message service can be fully automated or it can be semi-automated such as using a person to select and control where each request is sent. The request can be sent based on availability, work schedules and other human resource parameters.
  • the support request engine 324 is configured to send the request to the respective remote support specialist based on the remote support specialist being a subject matter expert with respect to a condition of the patient, such as a particular abnormal heart condition.
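  • As a hedged sketch of how a messaging service might match a request to subject-matter experts (the data model and matching rule below are hypothetical, not part of the disclosure):
```python
from dataclasses import dataclass


@dataclass
class Specialist:
    name: str
    expertise: set        # e.g. {"brugada", "atrial fibrillation", "segmentation"}
    available: bool


def select_specialists(specialists: list, condition_tags: set, limit: int = 3) -> list:
    """Return up to `limit` available specialists whose expertise overlaps the
    patient-condition tags attached to the support request."""
    matches = [s for s in specialists
               if s.available and s.expertise & condition_tags]
    # Prefer the largest overlap first.
    matches.sort(key=lambda s: len(s.expertise & condition_tags), reverse=True)
    return matches[:limit]
```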
  • the instructions can also include a user interface control 326.
  • the user interface control 326 is configured to establish a communication link between the first computing apparatus and a second computing apparatus, which is used by a respective remote support specialist.
  • the remote specialist can activate a link (e.g., URL or other resource identifier), which is provided in the request received by the remote support specialist, to accept and activate the request.
  • the request can be sent using a messaging protocol to send the request (e.g., an email, text, SMS or the like) to one or more remote support specialists.
  • the communication link can be established to remotely connect the specialist’s computing apparatus to the EP system 202, such as described herein.
  • the user interface control 326 can also be configured to enable user interaction with and control of the machine-readable EP workflow instructions (in memory 212) executing on the computing apparatus 208 responsive to a user input (through user input device 236) by the respective remote support specialist at the second computing apparatus 222.
  • the computing apparatus 208 of the EP system 202 executes the user interface control 326 to implement an application sharing protocol (e.g., RDP) that is tunneled over SSH through a network architecture.
  • the remote support specialist can control the functions of the EP system 202 running on the local computing apparatus 208.
  • the event detector 322 further can include analysis functions configured to determine the occurrence of one or more support events based on workflow data 302 during respective phases of the EP workflow session.
  • the event detector 322 includes a segmentation quality calculator 402.
  • the segmentation quality calculator 402 is programmed to compute a quality score based on a segmentation performed (e.g., by segmentation control 314) on the image data 306, which was acquired pre-procedure.
  • the segmentation quality calculator 402 is configured to evaluate the segmentation quality based on one or more of following characteristics of the segmented structure or structures: unknown structure and/or shapes (e.g., valves, LADs, CS); smoothness of cardiac surfaces; a relative alignment with CT volume; an alignment with one or more templates; the time period to complete the segmentation, etc.
  • the scoring methods can be updated as more templates and/or more segmented geometries are scored.
  • the segmentation quality calculator 402 can compare the quality score relative to a segmentation quality threshold and determine the occurrence of a segmentation support event responsive to the comparison. For example, consistency of the segmentation can be compared to a particular gray scale threshold on the medical imaging modality used. Alternately or additionally, the rendered anatomy can be compared to a database to seek deviations beyond a particular threshold for surface area, volume, or similar parameters.
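  • A minimal sketch of such a segmentation check, assuming a composite quality score and population statistics are available (all metrics, field names and thresholds below are placeholders):
```python
def segmentation_support_event(quality_score: float,
                               surface_area: float,
                               volume: float,
                               population_stats: dict,
                               quality_threshold: float = 0.7,
                               deviation_limit: float = 3.0):
    """Illustrative segmentation check.

    quality_score    : composite score from smoothness/alignment/time metrics
    population_stats : e.g. {"area_mean": ..., "area_std": ..., "vol_mean": ..., "vol_std": ...}
    Triggers a support event if the score is below threshold or the rendered
    anatomy deviates from the reference database by more than `deviation_limit`
    standard deviations. All values here are placeholders.
    """
    area_z = abs(surface_area - population_stats["area_mean"]) / population_stats["area_std"]
    vol_z = abs(volume - population_stats["vol_mean"]) / population_stats["vol_std"]

    if quality_score < quality_threshold or area_z > deviation_limit or vol_z > deviation_limit:
        return {"reason": "questionable segmentation",
                "quality_score": quality_score,
                "area_z": area_z,
                "volume_z": vol_z}
    return None
```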
  • the instructions in memory 212 can enable the respective remote specialist to implement the segmentation control 314 to perform another image segmentation on the image data 306 through the communication link responsive to a user input at the remote computing apparatus 222.
  • the image data can be segmented remotely to provide segmented image data that is stored in memory 212 of the first computing apparatus 208.
  • the remote support specialist can describe the process and educate the operator to reduce the likelihood of the same support event being triggered again.
  • the event detector 322 can also include a channel integrity function 404.
  • the channel integrity function 404 is programmed to determine one or more indications of channel integrity for each of the input signal channels providing EP signals for respective sensors of array 204.
  • the channel integrity function 404 uses the signal processing function 316 to analyze the EP data 310 to ascertain the integrity of respective sensors positioned on an outer surface of a patient’s body.
  • the signal processing function thus can perform signal processing on the EP data 310 to extract one or more components from the measured electrophysiological signals.
  • the event detector 322 thus can trigger the support event to cause the request to be sent to the respective remote support specialist based on the extracted one or more components.
  • the channel integrity function 404 can be configured to determine a value representing the acceptability and/or integrity of some or each of the respective channels that provide the EP data 310.
  • the channel integrity function 404 can be programmed to determine channel integrity for some or all EP sensors in the array 204 that have been placed on the patient’s body, such as according to any of the approaches described in U.S. Patent No. 9,977,060 and/or U.S. Patent No. 10,874,318, each of which is incorporated herein by reference.
  • the event detector 322 can trigger the support event to cause the request to be sent to the respective remote support specialist based on the determined channel integrity.
  • the event detector 322 can also include a channel noise calculator 406.
  • the signal processing function 316 can be programmed to compute an indication of signal-to-noise ratio (SNR) for respective signal channels (a selected set or all of the channels).
  • the channel noise calculator 406 can evaluate the computed SNR values relative to a threshold to determine if channel noise exceeds a noise threshold. If the noise exceeds the noise threshold, the event detector 322 can trigger the support event to cause the request to be sent to the respective remote support specialist based on the noise level (a noise-trigger and confidence-map sketch follows this list).
  • the channel noise calculator 406 can be programmed (e.g., include a sensor confidence map generator) to generate a confidence map of the respective sensors positioned on the outer surface of the patient’s body based on the computed SNR or other indication of channel noise.
  • the event detector can determine a support event based on the confidence map indicating a low confidence level across the body surface (as compared to a confidence threshold).
  • the confidence threshold can vary depending on the type of mapping functions being implemented and/or the region of interest within the patient’s body. For example, a low confidence in one region of the body surface may adversely affect a map of a first region of the heart (e.g., noisy section of the output map) but have little or no impact on the map within a second (different) region of the heart.
  • the channel noise calculator 406 can invoke the signal processing function 316 to perform signal processing to extract P-waves from the electrophysiological signals based on the amplitude and frequency thereof.
  • the signal processing function 316 can also perform an FFT on the extracted P-waves and analyze their spectral features.
  • the channel noise calculator 406 can be programmed to identify baseline noise in the electrophysiological signals (e.g., by averaging noise across the body surface or a portion thereof) and compare the amplitude of the P-waves with respect to the baseline noise (see the P-wave amplitude check sketched after this list).
  • a support event can be determined if the P-wave amplitudes are too small (e.g., below a threshold).
  • the P-wave threshold can be applied locally or regionally in neighborhoods, such as to account for far-field interference or other localized noise features.
  • the event detector 322 can also include a map evaluator function 410 programmed to determine the occurrence (or non-occurrence) of a support event based on an evaluation of one or more maps generated by the map generator 318.
  • the electrophysiological data 310 is provided based on electrophysiological signals measured from sensors of the array 204 positioned on an outer surface of a patient’s body.
  • the map generator 318 employs reconstruction engine 320 to reconstruct electrophysiological signals on a surface of interest within the patient’s body based on the EP data 310 and geometry data 308.
  • One or more maps can be generated with or derived from the reconstructed EP signals.
  • the map evaluator function 410 can evaluate such maps or derivations thereof to determine whether to trigger a support event.
  • the map evaluator function 410 can include a map noise calculator 412 programmed to compute an indication of noise for the reconstructed electrophysiological signals on the surface of interest.
  • the noise can be a global indication of noise across the surface of interest or it can be computed regionally for respective regions of interest distributed across the surface of interest.
  • the event detector 322 thus can determine a support event to trigger sending a request to respective remote support specialist(s) based on the indication of noise computed for the reconstructed electrophysiological signals.
  • the map noise calculator 412 can be programmed to compute a physiological coherence across a graphical EP map according to arrangement and distribution of pixel values (e.g., set according to a color scale).
  • the event detector 322 thus can determine a support event to trigger sending a request to the respective remote support specialist based on the physiological coherence of the graphical map.
  • the map evaluator function 410 can also include a signal-to-map correlation function 414 programmed to compute a correlation between the EP data 310 and a map generated from the EP data for a common time interval (or intervals).
  • For example, the correlation function 414 can correlate EP signals for a subset of sensors distributed across the body surface with reconstructed signals for a subset of nodes across the surface of interest (a correlation sketch follows this list). The numbers of signals can be the same or different, and may include less than all or up to all of the respective measured and reconstructed signals.
  • the correlation function 414 can invoke the map generator to generate a phase map for the set of reconstructed electrophysiological signals on the surface of interest.
  • the correlation function can compute a correlation between the phase map and the measured EP signals.
  • the event detector 322 can determine a support event to trigger the request engine to send a request to one or more support specialists based on the correlation between the phase map and the measured electrophysiological signals.
  • the support request engine 324 can be programmed to route the request to a respective remote support specialist based on a type of map that the operator has selected from a plurality of available map types (e.g., in response to a user input instruction to the map generator 318).
  • the request can include data specifying a type of the map, such that support specialists having a higher level of experience with certain types of maps can be matched with requests specifying those types of maps.
  • the map generator 318 includes GUI elements to enable an operator of the first computing apparatus 208 to retain or discard each map that is generated in response to a user review input instruction (e.g., using input device 218).
  • the EP application further can include instructions to determine a number of maps retained by the operator relative to a number of maps that the operator has discarded or tagged as being undecided in response to user review input instructions.
  • the event detector 322 further can be programmed to determine a support event and trigger the request engine to send a request to one or more support specialists based on the number or ratio of maps retained relative to maps discarded or tagged as undecided (see the retention-ratio sketch after this list).
  • FIGS. 5 and 6 are block diagrams showing respective system architectures for different scenarios.
  • a given remote specialist is shown to provide remote support from a respective computing apparatus with respect to multiple EP systems being implemented concurrently at different locations.
  • multiple (P) remote support specialists are shown providing remote support from respective computing apparatuses with respect to multiple (M) EP systems being implemented concurrently at different locations, where M and P can be the same or different.
  • the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • the instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • accordingly, the term "processor," as used herein, may refer to any of the foregoing structures or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
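The following sketches relate back to specific bullets in the list above; each is a minimal, hypothetical example rather than a description of the disclosed implementation. First, as referenced in the bullets on the support request engine 324, a request can be routed to a remote support specialist according to subject-matter expertise (e.g., the patient's condition or the selected map type). The Python sketch below uses a simple lookup table; the expertise tags, contact addresses, and fallback queue are illustrative assumptions.

```python
# Hypothetical directory mapping expertise tags to specialist contact handles.
SPECIALIST_DIRECTORY = {
    "atrial_fibrillation": ["af.specialist@example.com"],
    "ventricular_tachycardia": ["vt.specialist@example.com"],
    "phase_map": ["mapping.specialist@example.com"],
}
DEFAULT_SPECIALISTS = ["support.queue@example.com"]  # assumed general support queue

def route_request(tags: list[str]) -> list[str]:
    """Select recipients whose expertise matches the event's tags (condition, map type, etc.)."""
    recipients: list[str] = []
    for tag in tags:
        recipients.extend(SPECIALIST_DIRECTORY.get(tag, []))
    # Fall back to the general queue when no subject matter expert matches.
    return sorted(set(recipients)) or DEFAULT_SPECIALISTS

# Example: a map-quality event raised during an atrial fibrillation case.
print(route_request(["atrial_fibrillation", "phase_map"]))
```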
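As referenced in the bullet on the user interface control 326 and application sharing, a remote-control session can be carried over a protocol such as RDP tunneled through SSH. The sketch below opens such a tunnel by invoking a local ssh client; the gateway host, user name, and port numbers are illustrative assumptions, and the actual network architecture and authentication are as described in the disclosure.

```python
import subprocess

SSH_GATEWAY = "support-gateway.example.com"  # assumed gateway host
LOCAL_PORT = 13389                           # assumed local forwarding port
RDP_PORT = 3389                              # standard RDP port on the EP system

def open_rdp_tunnel() -> subprocess.Popen:
    """Forward LOCAL_PORT on the specialist's machine to the EP system's RDP port over SSH."""
    # -N: no remote command (tunnel only); -L: local port forwarding
    return subprocess.Popen(
        ["ssh", "-N", "-L", f"{LOCAL_PORT}:localhost:{RDP_PORT}", f"support@{SSH_GATEWAY}"]
    )

if __name__ == "__main__":
    tunnel = open_rdp_tunnel()
    print(f"Point the RDP client at localhost:{LOCAL_PORT}")
    # tunnel.terminate() would tear the session down when support ends.
```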
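As referenced in the bullets on the segmentation quality calculator 402, several characteristics of a segmented geometry can be combined into a quality score that is compared to a threshold to decide whether to raise a segmentation support event. The sketch below assumes the individual characteristic scores have already been normalized to 0..1 by upstream code; the weights and threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class SegmentationMetrics:
    surface_smoothness: float          # 0..1, higher is smoother
    ct_alignment: float                # 0..1, agreement with the CT volume
    template_alignment: float          # 0..1, agreement with stored templates
    unknown_structure_fraction: float  # 0..1, fraction of unrecognized structures/shapes

# Illustrative weights; a real calculator would be tuned as more geometries are scored.
WEIGHTS = {"smoothness": 0.35, "ct": 0.30, "template": 0.25, "unknown": -0.20}
QUALITY_THRESHOLD = 0.6  # assumed score below which a support event is raised

def quality_score(m: SegmentationMetrics) -> float:
    score = (WEIGHTS["smoothness"] * m.surface_smoothness
             + WEIGHTS["ct"] * m.ct_alignment
             + WEIGHTS["template"] * m.template_alignment
             + WEIGHTS["unknown"] * m.unknown_structure_fraction)
    return max(0.0, min(1.0, score))

def segmentation_support_event(m: SegmentationMetrics) -> bool:
    """True when the combined quality score falls below the threshold."""
    return quality_score(m) < QUALITY_THRESHOLD
```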
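As referenced in the bullets on the channel noise calculator 406, per-channel SNR can be estimated, summarized as a body-surface confidence map, and compared against thresholds to trigger a noise-related support event. The NumPy sketch below assumes upstream filtering has already separated each channel into a signal estimate and a noise estimate; the thresholds and the confidence scaling are illustrative assumptions.

```python
import numpy as np

SNR_THRESHOLD_DB = 10.0         # assumed minimum acceptable per-channel SNR
LOW_CONFIDENCE_FRACTION = 0.25  # assumed fraction of poor channels that triggers support

def channel_snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """SNR in dB for one channel from its signal and noise estimates."""
    signal_power = np.mean(signal ** 2)
    noise_power = np.mean(noise ** 2) + 1e-12  # guard against division by zero
    return 10.0 * np.log10(signal_power / noise_power)

def confidence_map(signals: np.ndarray, noises: np.ndarray) -> np.ndarray:
    """Per-channel confidence in 0..1; confidence 0.5 corresponds to SNR at the threshold."""
    snrs = np.array([channel_snr_db(s, n) for s, n in zip(signals, noises)])
    return np.clip(snrs / (2.0 * SNR_THRESHOLD_DB), 0.0, 1.0)

def noise_support_event(signals: np.ndarray, noises: np.ndarray) -> bool:
    """Trigger when too large a fraction of channels falls below the SNR threshold."""
    conf = confidence_map(signals, noises)
    return float(np.mean(conf < 0.5)) > LOW_CONFIDENCE_FRACTION
```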
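As referenced in the bullets on P-wave extraction, the amplitude of extracted P-waves can be compared against a baseline-noise estimate, locally or regionally, to decide whether the atrial signal content is adequate. The sketch below assumes P-wave segments and noise segments have already been extracted per channel by the signal processing function; the amplitude-to-noise ratio is an illustrative assumption.

```python
import numpy as np

P_WAVE_TO_NOISE_RATIO = 2.0  # assumed minimum P-wave amplitude relative to baseline noise

def baseline_noise_rms(noise_segments: list[np.ndarray]) -> float:
    """Average RMS noise across channels (or across a region of the body surface)."""
    return float(np.mean([np.sqrt(np.mean(seg ** 2)) for seg in noise_segments]))

def p_wave_support_event(p_waves: list[np.ndarray],
                         noise_segments: list[np.ndarray]) -> bool:
    """Trigger when the median P-wave peak amplitude is too small relative to baseline noise."""
    noise = baseline_noise_rms(noise_segments)
    peak_amplitudes = [float(np.max(np.abs(p))) for p in p_waves]
    return float(np.median(peak_amplitudes)) < P_WAVE_TO_NOISE_RATIO * noise
```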
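As referenced in the bullets on the signal-to-map correlation function 414, measured body-surface signals can be correlated against reconstructed signals (or a phase map derived from them) over a common time interval, with a weak correlation triggering a support request. The sketch below averages a Pearson correlation over paired channels; the sensor-to-node pairing and the threshold are illustrative assumptions.

```python
import numpy as np

CORRELATION_THRESHOLD = 0.5  # assumed minimum acceptable mean correlation

def mean_pairwise_correlation(measured: np.ndarray, reconstructed: np.ndarray) -> float:
    """Mean Pearson correlation over paired signals; each input is (channels x samples)."""
    corrs = []
    for m, r in zip(measured, reconstructed):
        m = m - m.mean()
        r = r - r.mean()
        denom = np.sqrt(np.sum(m ** 2) * np.sum(r ** 2)) + 1e-12
        corrs.append(np.sum(m * r) / denom)
    return float(np.mean(corrs))

def correlation_support_event(measured: np.ndarray, reconstructed: np.ndarray) -> bool:
    """Trigger when measured and reconstructed signals agree poorly over the interval."""
    return mean_pairwise_correlation(measured, reconstructed) < CORRELATION_THRESHOLD
```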
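As referenced in the bullet on retained versus discarded maps, the ratio of maps kept by the operator can serve as a rough indicator of case difficulty and can trigger a support request when it drops too low. The sketch below is a minimal tally with an assumed ratio threshold and an assumed minimum number of reviewed maps before the check applies.

```python
from dataclasses import dataclass

RETENTION_RATIO_THRESHOLD = 0.5  # assumed minimum fraction of reviewed maps retained
MIN_REVIEWED_MAPS = 5            # assumed minimum number of reviews before the check applies

@dataclass
class MapReviewTally:
    retained: int = 0
    discarded: int = 0
    undecided: int = 0

    def record(self, decision: str) -> None:
        """Record an operator review decision: 'retained', 'discarded', or 'undecided'."""
        setattr(self, decision, getattr(self, decision) + 1)

    def support_event(self) -> bool:
        """Trigger once enough maps are reviewed and too few of them are being retained."""
        reviewed = self.retained + self.discarded + self.undecided
        if reviewed < MIN_REVIEWED_MAPS:
            return False
        return (self.retained / reviewed) < RETENTION_RATIO_THRESHOLD
```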

Abstract

One or more non-transitory machine-readable media can store data and instructions executable by one or more processors. The instructions are programmed to analyze workflow data for a given phase of a plurality of phases of an ongoing electrophysiology (EP) workflow implemented using a first computing apparatus. The instructions can also determine at least one support event based on the analysis of the workflow data and send a request to at least one remote support specialist in response to determining the at least one support event. The instructions can further establish a communication link between the first computing apparatus and a second computing apparatus, which is associated with a respective remote support specialist, and enable user interaction with, and control of, the machine-readable EP workflow instructions on the first computing apparatus in response to a user input by the respective remote support specialist at the second computing apparatus.
PCT/US2022/050247 2021-11-19 2022-11-17 Déclencheurs automatisés pour assistance sur cas d'électrophysiologie WO2023091570A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163281310P 2021-11-19 2021-11-19
US63/281,310 2021-11-19
US17/986,473 2022-11-14
US17/986,473 US20230162854A1 (en) 2021-11-19 2022-11-14 Automated triggers for electrophysiology case support

Publications (1)

Publication Number Publication Date
WO2023091570A1 true WO2023091570A1 (fr) 2023-05-25

Family

ID=84689142

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/050247 WO2023091570A1 (fr) 2021-11-19 2022-11-17 Déclencheurs automatisés pour assistance sur cas d'électrophysiologie

Country Status (1)

Country Link
WO (1) WO2023091570A1 (fr)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6772004B2 (en) 1997-07-31 2004-08-03 Case Western Reserve University System and method for non-invasive electrocardiographic imaging
US7983743B2 (en) 2005-07-22 2011-07-19 Case Western Reserve University System and method for noninvasive electrocardiographic imaging (ECGI)
US20080194950A1 (en) * 2007-02-13 2008-08-14 General Electric Company Ultrasound imaging remote control unit
US8478393B2 (en) 2008-11-10 2013-07-02 Cardioinsight Technologies, Inc. Visualization of electrophysiology data
US9078573B2 (en) 2010-11-03 2015-07-14 Cardioinsight Technologies, Inc. System and methods for assessing heart function
US9655561B2 (en) 2010-12-22 2017-05-23 Cardioinsight Technologies, Inc. Multi-layered sensor apparatus
US9977060B2 (en) 2012-05-09 2018-05-22 Cardioinsight Technologies, Inc. Channel integrity detection
US9427166B2 (en) 2012-09-21 2016-08-30 Cardioinsight Technologies, Inc. Physiological mapping for arrhythmia
US10076260B2 (en) 2014-02-04 2018-09-18 Cardioinsight Technologies, Inc. Integrated analysis of electrophysiological data
US10874318B2 (en) 2018-03-06 2020-12-29 Cardioinsight Technologies, Inc. Channel integrity detection and reconstruction of electrophysiological signals
CN111513855A (zh) * 2020-04-28 2020-08-11 绍兴梅奥心磁医疗科技有限公司 心脏内科介入导管手术系统及其应用方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PAUL LICARI: "How to remote control my office computer using Microsoft Teams?", 25 June 2021 (2021-06-25), XP093024547, Retrieved from the Internet <URL:https://web.archive.org/web/20210625130648/https://blog.nettconn.net/how-to-remote-control-my-office-computer-using-microsoft-teams> [retrieved on 20230216] *

Similar Documents

Publication Publication Date Title
US11957470B2 (en) Method and system for evaluation of functional cardiac electrophysiology
US20210022616A1 (en) Method and system for visualization of heart tissue at risk
US9750940B2 (en) System and methods to facilitate providing therapy to a patient
US9610023B2 (en) System and methods for computing activation maps
Berger et al. Single-beat noninvasive imaging of cardiac electrophysiology of ventricular pre-excitation
US20170235915A1 (en) Personalized model with regular integration of data
US11935656B2 (en) Systems and methods for audio medical instrument patient measurements
Parmar et al. Poor scar formation after ablation is associated with atrial fibrillation recurrence
US20190302210A1 (en) System and Method for Phase Unwrapping for Automatic Cine DENSE Strain Analysis Using Phase Predictions and Region Growing
JP2022545355A (ja) 医療機器を識別、ラベル付け、及び追跡するためのシステム及び方法
US20230162854A1 (en) Automated triggers for electrophysiology case support
US20230036977A1 (en) Systems and Methods for Electrocardiographic Mapping and Target Site Identification
WO2023091570A1 (fr) Déclencheurs automatisés pour assistance sur cas d'électrophysiologie
Sra et al. Identifying the third dimension in 2D fluoroscopy to create 3D cardiac maps
Yang et al. Activation recovery interval imaging of premature ventricular contraction
US11694806B2 (en) Systems and methods for grouping brain parcellation data
US20220238203A1 (en) Adaptive navigation and registration interface for medical imaging
US20220335612A1 (en) Automated analysis of image data to determine fractional flow reserve
WO2023014637A1 (fr) Systèmes et procédés de cartographie électrocardiographique et d'identification de site cible
JP2024015883A (ja) 情報処理装置、情報処理方法、プログラム、学習済みモデルおよび学習モデル生成方法
WO2023119137A1 (fr) Cartographie automatisée et/ou traitement de signal en réponse à des caractéristiques de signal cardiaque
JP2022171345A (ja) 医用画像処理装置、医用画像処理方法及びプログラム
WO2023178252A1 (fr) Modélisation de paramètres cardiaques basée sur la physique de l'intelligence artificielle
CN116779135A (zh) 用于计算血液储备分数的方法、装置、计算设备及介质
KR20240009348A (ko) 심전도 판독에 기반한 시각화 콘텐츠를 제공하는 방법,프로그램 및 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22834787

Country of ref document: EP

Kind code of ref document: A1