WO2011116340A2 - Cadre de gestion de contexte pour télémédecine - Google Patents
Cadre de gestion de contexte pour télémédecine (Context management framework for telemedicine)
- Publication number
- WO2011116340A2, PCT/US2011/029077
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- sensor data
- action
- user
- sensor
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
Definitions
- This application relates to the field of health care, and in particular, to devices and techniques for monitoring and recording patient information.
- Patient monitoring systems are used extensively to monitor vital signs of patients in multiple medical settings, including out-of-hospital care by emergency medical technicians and paramedics, out-patient settings such as urgent care clinics and freestanding surgical centers, intermediate care facilities such as nursing homes, and hospital facilities such as emergency departments, intensive care units, and hospital "step-down units" providing a transition from the ICU to general medical and surgical units.
- Such systems acquire data from medical sensors to measure patient vital signs such as cardiac activity, blood-oxygenation, and blood pressure.
- False alarms may distract medical personnel from providing patient care, forcing the personnel to pay more attention to the patient monitoring system instead of the patient herself. Eventually, false alarms may lead to alarm fatigue, a situation in which subsequent alarms are sometimes ignored, at times in critical cases.
- a remote physician or other caregiver is often dependent on a local caregiver to act as a proxy, interacting with the patient and relaying physiological information, patient history, and procedures.
- In emergency medical services (EMS), paramedics may treat many routine patients independently; for more complex cases, they may call upon a remote physician or other base station clinician for advice.
- physiological information is captured on electronic monitoring devices, while patient history, procedures performed, and medications administered are often captured on paper, or even on a caregiver's hand or glove.
- FIG. 1 is a block diagram illustrating relationships between components of a medical alarm control system in accordance with various embodiments of the present disclosure
- Figures 2 and 3 are example screen shots of interfaces presented to a user of a medical communication system in accordance with various embodiments
- Figure 4 is a block diagram illustrating relationships between components of a coordinated medical visualization system in accordance with various embodiments of the present disclosure
- Figure 5 is an example screen shot of an interface presented to a user for display of medical data in accordance with various embodiments
- Figure 6 is a block diagram illustrating relationships between components of a medical voice control system in accordance with various embodiments of the present disclosure
- FIGS 7-9 are diagrams of systems employing medical control and communication systems and techniques in accordance with various embodiments.
- Figure 10 is an example computing environment in accordance with various embodiments.
- Illustrative embodiments of systems and techniques of the present disclosure include, but are not limited to, systems and methods for efficient collection and delivery of patient information.
- the systems include automated field data collection and transmission to supporting clinicians at remote locations for real-time monitoring.
- Embodiments of the present disclosure allow physiological vital signs, history, procedures performed, drugs administered, and physical assessments to be combined into a single chronological record of care. This record can be referenced in the field and delivered in real time to a supporting physician or receiving hospital.
- aspects of the present disclosure include real-time delivery of contextual information streams, hands-free acquisition of patient information, and a rule-based smart alarm system that avoids alarm fatigue.
- on-body sensors monitor and record physiological waveforms, which are aggregated on a mobile device and transmitted either in real time or periodically to a remote physician. This provides for a flexible and scalable platform for medical applications, including in telemedicine.
- data may be correlated from multiple sensor streams to validate measurements made by the sensors.
- systems and techniques may use prior- determined probabilities based on validated clinical data, patient history, and treatments administered to determine subsequent actions after an alarm is triggered. In various embodiments, these actions may include, but are not limited to, when to alarm, when to acquire more data, and one or more protocols to recommend to clinicians. Additionally, in various embodiments, systems and techniques may archive user responses, triggered alarms and related data so that the decision-making process surrounding alarm triggering may be refined during post-analysis.
- techniques and systems may enable coordinated representation of a set of medical data by visually presenting data related to the events using two or more correlated, i.e., linked, representations.
- the data may include multiple continuous streams and/or discrete events from multiple sources and/or include tracking of changes of data read.
- multiple visualizations may be displayed, controlled, and/or reviewed with respect to a common time axis.
- such a display will allow for time-consistent review, such that moving forward in time in one of the visualizations will scroll others to a similar time interval, providing clearer correlation of events to medical personnel.
- systems and techniques may enable a user, such as medical field personnel, to use vocal expressions to input data to the system to capture and/or categorize the nature of the care being provided (such as, for example, drugs administered, procedures being performed, physical observations, patient history, etc.).
- the captured information may be further used to annotate physiological patient data streaming from patient sensors.
- a remote clinician may be provided contemporaneous and consolidated updates on patients' states.
- medical monitoring alarms may be controlled so as to reduce the number of false alarms.
- the control of alarms may include the use of one or more of data correlation, prior probability review, and archival capability.
- FIG. 1 is a block diagram that illustrates various aspects of medical alarm control according to various embodiments.
- sensor data may flow from sensors at block 3 and from profile data at block 2 to a computing device at block 1.
- profile data such as that in block 2
- the computing device at block 1 may be a decision engine and may apply rules, which may be pre-defined by medical personnel, to determine an action to take based on the sensor data.
- the action determined by the rule may be to identify and/or validate new facts and store them in the profile data at block 2, to determine additional sensor measurements to take which may be taken at block 6, and/or to determine alarms that may be issued.
- a selector (such as at block 7) may determine which actuation mechanism (illustrated at block 10) will be most appropriate and effective.
- the user may then respond to alarms at block 8.
- actions that have occurred such as, for example, the sensing of data, the making of decisions, initiated alarms, and user response may be stored at block 9.
- Prior probabilities as will be described below may be injected into the decision engine at block 5 based on profile data from block 2.
- software upgrades may be applied via block 11 to the prior probabilities and the decision engine rules.
- Various aspects of the actions described above are described below with reference to data correlation, prior probability review, and archiving.
- systems and techniques described herein utilize correlated data from multiple sensor inputs to determine an action to take, such as (a) when to alarm, and (b) when to acquire more data.
- This data correlation may be directed through the use of one or more pre-determined rules.
- the rule may correlate sensor data from a first sensor measuring a first parameter of the patient with sensor data from a second sensor measuring a second parameter of the patient, where the first parameter is different from the second parameter. That is, the first sensor and the second sensor may be different types of sensors that measure a different characteristic of the patient. For example, if medical personnel seek to validate that a complete set of ECG leads have been properly placed on a patient, sensor data from the ECG monitor and plethysmography sensors may be correlated using the following rule, presented here in vernacular language:
- Compute the heart rate based on the plethysmography waveform. If the heart rate based on plethysmography is normal and O2 sat is >92%, do not alarm. But if the plethysmography signal is clear and the heart rate based on plethysmography does not match the heart rate from the ECG, signal that there is an artifact so paramedics can adjust the leads.
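- As a minimal illustration only, the rule above could be expressed in code roughly as follows; the function name, field names, and numeric thresholds are assumptions for this sketch and are not taken from the specification.

```python
# Hypothetical sketch of the ECG/plethysmography correlation rule described above.
# Thresholds, names, and the 10-bpm mismatch tolerance are illustrative assumptions.

def correlate_ecg_and_pleth(hr_ecg, hr_pleth, spo2, pleth_signal_clear,
                            hr_normal_range=(60, 100), spo2_floor=92):
    """Cross-check heart rate from two independent sensors before alarming."""
    pleth_hr_normal = hr_normal_range[0] <= hr_pleth <= hr_normal_range[1]

    if pleth_hr_normal and spo2 > spo2_floor:
        return "no_alarm"            # both sensors agree the patient is stable
    if pleth_signal_clear and abs(hr_ecg - hr_pleth) > 10:
        return "signal_artifact"     # prompt paramedics to adjust the ECG leads
    return "alarm"                   # unexplained disagreement or low oxygen saturation

# Example: a clear pleth signal with disagreeing rates flags a lead artifact, not an alarm.
print(correlate_ecg_and_pleth(hr_ecg=35, hr_pleth=78, spo2=97, pleth_signal_clear=True))
```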
- rules may have different outcomes.
- the rule recited above both determines whether or not to trigger an alarm and records validated sensor data.
- the systems and techniques may also leverage prior probabilities, such as at block 5, to determine an action to take, such as (a) when to alarm, (b) when to acquire more data, and (c) what protocol to suggest to the clinician.
- This provides several advantages over systems that treat facts as being either true or false, e.g., binary triggering. This binary triggering may produce an incorrect assessment of facts that may contribute to false alarms.
- prior-determined probabilities such as those based on profile data related to the patient, such as validated clinical data, history, and treatments administered, may be assigned to trigger conditions and may be revised as new profile data and/or sensor data emerge. Using these probabilities may make the system adaptable to the changing conditions. For example:
- recent history of one or more alarm conditions which have occurred may be used to alter the assumed probability that an alarm condition from another sensor is valid and clinically important. For example, if a patient's blood pressure is low, a system may set a higher probability that an abnormal heart rate or a low SP02 is real and could be more likely to alarm when a low SP02 is detected.
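- One way such a probability adjustment might look in code is sketched below; the base probability, increments, and decision threshold are invented for illustration and are not values from the disclosure.

```python
# Hypothetical sketch: recent alarm conditions raise the assumed probability that a
# new alarm condition from another sensor is valid. All numbers are assumptions.

def spo2_alarm_probability(base_prob, recent_conditions):
    """Raise the prior that a low-SpO2 reading is real when related conditions are present."""
    prob = base_prob
    if "low_blood_pressure" in recent_conditions:
        prob = min(1.0, prob + 0.3)   # hypotension makes a real desaturation more plausible
    if "abnormal_heart_rate" in recent_conditions:
        prob = min(1.0, prob + 0.2)
    return prob

def decide_low_spo2_alarm(spo2, recent_conditions, threshold=0.5):
    if spo2 >= 92:
        return False
    return spo2_alarm_probability(0.4, recent_conditions) >= threshold

print(decide_low_spo2_alarm(88, {"low_blood_pressure"}))  # True: more likely to alarm
print(decide_low_spo2_alarm(88, set()))                   # False: below the decision threshold
```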
- prior-determined probabilities may also be used to determine when to acquire additional data. For example:
- an abnormal heart rate or cardiac rhythm may trigger an automatic 12-lead ECG.
- prior-determined probabilities may be used to determine recommended medical protocols. For example, if an observed cardiac rhythm is ventricular tachycardia, the clinician may be recommended to check blood pressure, PPG wave form, and SP02. Then, if BP is normal, the PPG wave form is normal, and SP02 is within acceptable range, the system may alarm and recommend drug treatment. Alternatively, if BP is dangerously low, the PPG wave form is abnormal, or SP02 is low, the system may charge an associated defibrillator.
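- The branching described in this example can be summarized in a short sketch; the vital-sign cutoffs and recommendation strings below are assumptions, not values taken from the specification.

```python
# Hypothetical sketch of the protocol-recommendation logic for ventricular tachycardia.
# The cutoff values (systolic 90 mmHg, SpO2 92%) are illustrative assumptions.

def recommend_for_vtach(systolic_bp, ppg_normal, spo2):
    """Suggest next steps once ventricular tachycardia has been observed."""
    if systolic_bp >= 90 and ppg_normal and spo2 >= 92:
        return {"alarm": True, "recommendation": "drug treatment"}
    # Dangerously low BP, abnormal PPG waveform, or low SpO2: prepare the defibrillator.
    return {"alarm": True, "recommendation": "charge defibrillator",
            "charge_defibrillator": True}

print(recommend_for_vtach(systolic_bp=70, ppg_normal=False, spo2=85))
```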
- data related to triggered alarms, facts and conditions which caused the alarms to trigger, rules in the computing device that were used during decision-making, and/or user response/feedback may be archived, i.e., stored, in a persistent store in the system, such as at block 9 in Figure 1.
- user response(s) may cause decision rules and prior probabilities to be adjusted. This adjustment, according to various embodiments, may either be in real time by the system or at a later time for a next firmware or software release. For example, if a specific alarm has triggered multiple times, and each time a user of the system responds by silencing the alarm, the system may choose to archive the user response. Using the archived responses, a manufacturer or other system maintainer may look for conditions that caused the false alarm to trigger and modify firmware or software to minimize future occurrences.
- upgrading of firmware or software is illustrated at block 11 of Figure 1.
- the system may automatically modify the rules and/or prior probability data based on the user response data.
- Figure 1 illustrates a decision engine, at block 1, which may be a computing device.
- the decision engine contains rules which may be used to determine actions to be taken. For example, the rule may evaluate alarm trigger conditions.
- the engine may make decisions by evaluating input from sources such as: sensor data from one or more sensors (illustrated at block 3), profile data, such as patient information and patient history (illustrated at block 2), threshold settings for sensor data (illustrated at block 4), and prior- determined probabilities (illustrated at block 5).
- the decision engine may compare sensor data against a set of rules, which comprise embedded medical knowledge, so that the decision engine may determine when to alarm. Prior probabilities are used by these rules to determine whether and what severity of alarm or other action is required.
- actions determined from the decision engine may include: alarms triggered from evaluating one or more inputs, acquiring additional data such as retaking a blood pressure measurement if it is low, and feedback related to new facts identified from validated sensor data.
- the decisions made may be represented as decision-making code.
- for example, a rule determining alarms related to low blood pressure, or a rule relating to alarms for heart rate, may comprise logic such as that sketched below:
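- The actual rule listings are not reproduced in this text; the sketch below shows, under stated assumptions, what such rules might look like. The thresholds, retake logic, and return values are hypothetical.

```python
# Hypothetical sketches of the two rules referenced above (the original listings are
# not reproduced here). Thresholds and field names are illustrative assumptions.

def low_bp_rule(systolic_bp, retake_count):
    """If blood pressure is low, retake the measurement once before alarming."""
    if systolic_bp >= 90:
        return "no_alarm"
    if retake_count == 0:
        return "retake_bp"            # acquire more data before alarming
    return "alarm_low_bp"

def heart_rate_rule(heart_rate, low=50, high=120):
    """Alarm when the heart rate falls outside the configured thresholds."""
    return "no_alarm" if low <= heart_rate <= high else "alarm_heart_rate"
```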
- Figure 1 also illustrates profile data related to the patient at block 2.
- the profile data may be fed into the decision engine of block 1 as a part of the decision making process.
- Prior probabilities such as at block 5, may be continuously updated as new profile data emerge.
- profile data may include information such as patient demographics (such as gender, age, chief complaint, patient history, etc.), medical protocols (for example suggested procedures to be followed to treat a patient), drugs administered to the patient during treatment, procedures administered to the patient, (for example CPR or defibrillation), and validated sensor data.
- the validated sensor data may be distinct from raw sensor data received by the system. For example, plethysmography data read from a sensor is raw data representing the SP02 waveform. This waveform may be analyzed to derive a pulse rate and this pulse rate may be validated against an actual, observed pulse rate. This validated data may subsequently be used by the decision engine for future decisions.
- the data may be validated through correlation of data from multiple sensors, as described herein.
- Figure 1 also illustrates sensors at block 3.
- the sensors may be medical sensors that are operatively coupled to a patient to monitor the patient. Examples include ECG sensors to monitor the heart, SP02 sensors to monitor blood oxygenation, blood pressure sensors, ETC02 sensors to monitor breathing, etc.
- sensors may be wired or wireless and the data coming in may be scalar (e.g., blood pressure) or streaming (e.g. cardiac rhythm).
- sensor data may be input to the computing device, e.g., decision engine.
- FIG. 1 illustrates thresholds at block 4.
- these represent threshold settings for actions such as alarm triggers.
- Various settings may be pre-programmed or defined by a user. For example, an alarm may trigger if an observed actual blood pressure reading is below or above determined low/high thresholds for BP.
- normal values for sensor data may vary by age. Threshold settings may thus be adjusted according to profile data, such as age of the patient.
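- A small sketch of adjusting threshold settings from profile data such as age follows; the age bands and limits are assumptions made only for illustration.

```python
# Hypothetical age-banded threshold table for heart rate alarms (block 4 style settings).
# The age bands and limits are illustrative assumptions.

HEART_RATE_THRESHOLDS = [
    (1,   (100, 160)),   # infants under 1 year
    (12,  (70, 120)),    # children under 12
    (200, (50, 110)),    # adolescents and adults
]

def heart_rate_limits(age_years):
    """Return (low, high) alarm thresholds appropriate for the patient's age."""
    for max_age, limits in HEART_RATE_THRESHOLDS:
        if age_years < max_age:
            return limits
    return HEART_RATE_THRESHOLDS[-1][1]

print(heart_rate_limits(8))   # (70, 120)
```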
- Figure 1 illustrates prior-determined probabilities at block 5.
- this illustrated component may use prior-determined probabilities to assess conditions that could lead to a triggering of an alarm.
- the prior- determined probabilities may be adjusted to improve the decision making process. These probabilities may be fed into the decision engine as inputs to the rules.
- Figure 1 illustrates "actions" at block 6.
- this component may effectuate some or all actions triggered by the decision engine. For example, if the decision engine comprises a rule to take a 12-lead ECG reading if blood pressure is low, the action component may send a command to the ECG sensor to take the ECG reading.
- Figure 1 illustrates a selector at block 7.
- when alarms are triggered by the decision engine, a user may be notified of an alarm in various ways. For example, embodiments may alarm visually by flashing readings on the display, embodiments may play audible beeps or alarm voice playback, etc.
- the selector component may be used to select the appropriate method of actuating alarms triggered, such as at block 10, by the decision engine. In various embodiments, the selector may also use knowledge of user response, such as at block 8, to decide on an actuation method.
- Figure 1 illustrates a user response component at block 8.
- this component may accept, record, and act on one or more user responses to alarms.
- Such responses may include a user acknowledging alarms, silencing specific alarms, re-activating alarms, etc.
- User response may also include feedback from a user to let the system know the user's impression of the alarm.
- Such feedback may include indications such as: this alarm is real, this is a false alarm, do not trigger, and/or, this is a critical alarm, always trigger.
- user response may feed into the selector component, such as at block 7, to select an appropriate way to notify a user. For instance, if a user has silenced the BP alarm, the selector would not use audio to notify the user of a BP alarm. User feedback about false alarms, for example, may also be sent to the decision engine to modify its decision making process appropriately. Finally, user feedback may be recorded in a persistent store, such as at block 9, to enable future adjustment of processes and techniques as described herein.
- Figure 1 illustrates a persistent store at block 9.
- this component may store a record of alarms which have been triggered, user response to each of the triggered alarms, when applicable, and the actuating mechanism which was used to notify the user for each alarm triggered.
- Other relevant information may also be archived. For example, a snapshot of sensor data at a time an alarm was triggered may be archived; the archived data may include a determined amount of time before/after the moment the alarm was triggered.
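- One possible shape for such an archived record is sketched below; the field names and the 60-second window are assumptions, and a real implementation would write the record only after the post-alarm portion of the window has elapsed.

```python
# Hypothetical sketch of archiving an alarm together with a window of sensor data.
# Field names and the 60-second window are illustrative assumptions.

from datetime import datetime

def archive_alarm(store, alarm_name, rule_id, sensor_log, window_s=60):
    """Persist the alarm, the rule that fired, and sensor samples around the trigger time."""
    triggered_at = datetime.utcnow()
    snapshot = [s for s in sensor_log
                if abs((s["time"] - triggered_at).total_seconds()) <= window_s]
    store.append({
        "alarm": alarm_name,
        "rule": rule_id,
        "triggered_at": triggered_at,
        "snapshot": snapshot,
        "user_response": None,   # filled in later when the user responds to the alarm
    })
```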
- embodiments may archive rules and/or decisions that caused an alarm to trigger and/or user response and facts known at the time of the triggering.
- the records stored may be used by a manufacturer or other maintainer, for example to improve a rule base in the decision engine to make better decisions in future revisions of the software or firmware.
- Figure 1 also illustrates actuators at block 10.
- In various aspects, these illustrated actuators represent different mechanisms by which alarms may be effected.
- alarms may be rendered, according to various embodiments, on the display through flashing icons, by displaying sensor readings in different colors, by using audible beeps or a voice playback to notify the user about a specific alarm, and/or by other techniques such as flashing LEDs on the display.
- Figure 1 also illustrates a software upgrade block at block 11.
- this component may be used by a device manufacturer to upgrade software and/or firmware to include refinements and/or improvements to the decision-making process. Upgrades may include, for example, revisions to rules in the decision engine and/or revisions to prior-determined probability tables.
- Figures 2 and 3 show example screenshots illustrating one implementation of a graphical application operating according to the techniques and systems described herein.
- Figure 2 illustrates a screenshot from a Current Vitals tab 202 in an application.
- Figure 2 shows vital signs as an ECG waveform 204 and an SP02 waveform 206, and as scalar data for cardiac rhythm 208, SP02 210, and ETC02 212.
- Alarms 214 and 216 have been triggered for low SP02 and pulse readings, respectively.
- the illustrated screen shot of Figure 2 is divided into three regions: a top region 218, a middle region 220, and a bottom region 222.
- the top region 218 and bottom region 222 may be consistent and may be kept visible.
- the top region 218, as illustrated, contains critical patient information, including patient identifiers and chief complaints (which are often used to identify patients, e.g., "65 year old man with chest pain"), current vital sign information, and alarm status.
- the middle region may contain navigation tabs.
- the first three tabs— Current Vitals 202, Historical Data 224, and 12-lead ECG 226— may display physiological sensor data.
- the bottom region 222 may provide visual feedback for the audio context system, as described herein, and may include the current voice-to-text translation and system state.
- Figure 3 illustrates a screenshot from an Alarms tab 336 in an application. As illustrated, Figure 3 shows that a user may be allowed to set alarm thresholds 340, such as, for example, thresholds for high and low blood pressure 342 and end tidal C02 344. Additionally, in the illustrated implementation, an alarm history 346 of activation and deactivation of various triggered alarms is also shown.
- the systems and techniques described herein may provide a facility for visualizing medical data from multiple sources in a coordinated representation.
- data is correlated and represented using time as a common axis. This may include continuous data (from multiple sources) and discrete events which occur at particular points in time.
- data may be acquired in various formats.
- data may include:
- data may be presented as a continuous plot versus time, such as ECG, SP02 waveform, etc.
- data may represent non-medical measurements, such as, for example, the speed of a flying object or bandwidth usage on a network link.
- Such data may be, in one embodiment, displayed using a ticker with changing text.
- Examples of data sampled on a non-continuous basis include stock prices, gas counters at gas pumps, and/or news items.
- These data may also be grouped in a visualization by virtue of being human-generated, thus giving a consistent record of human-produced activities. Examples of such data include a person logging into a machine, or someone activating a feature using voice.
- the techniques may visualize events that are initiated by the execution of a process. These events may read data that falls under various event categories discussed herein, including, in some embodiments, other process-generated data, and produce log events. For example, a system may include a security flag to be logged if a person tries to log in 20 times with a wrong password on multiple machines inside a company within 10 minutes, or may record an alarm generated due to a patient's heart rate suddenly dropping sharply.
- FIG. 4 illustrates a block diagram of one implementation of a system 400 for coordinated visualization of data from multiple sources.
- sensor data 402 may be obtained from a plurality of sensors, as described above.
- the sensor data 402, in whole or in part, may be stored on a database or server 404, as illustrated.
- the system 400 may attach a time stamp to the sensor data 402.
- when visualized (e.g., by visual representations 406, 408, 410, and/or 412), the data may not be currently-observed data, but may instead be historical data which has been saved on the database or server.
- the system may associate a portion of the data having time stamps within a time interval as treatment session data to be played back later.
- a user of these techniques may play back a non-current session, including respective timestamps, using data fed from the database 404 instead of the sensors. Additionally, some of the visualized data may come from one or more processes 414 which may operate on the sensor data 402 before generating data to visualize, as illustrated.
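- A sketch of grouping stored, time-stamped samples into a session and replaying them is shown below; the row format and the playback-speed parameter are assumptions.

```python
# Hypothetical sketch: select stored samples whose timestamps fall within a session
# interval and replay them in order, preserving the original relative timing.

import time

def replay_session(rows, session_start, session_end, speed=1.0):
    """rows: iterable of dicts with 'timestamp' (seconds) and 'value' keys."""
    session = sorted((r for r in rows
                      if session_start <= r["timestamp"] <= session_end),
                     key=lambda r: r["timestamp"])
    previous = None
    for row in session:
        if previous is not None:
            time.sleep((row["timestamp"] - previous) / speed)
        previous = row["timestamp"]
        yield row   # hand the sample to the visualizations as if it were live
```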
- the systems and techniques described herein may be implemented in an emergency medicine usage model.
- This implementation may present data coming from different sources and showing the data to a user in one combined view using a multitude of tools, tying them together using time as a common axis.
- Figure 5 illustrates an example screenshot 500 of such an implementation.
- data may be visualized according to different visualization techniques.
- visualization techniques may include continuous data plots 502, one or more visual timelines 504 representing a treatment session, in whole or in part, one or more textual charts 506, and one or more text tickers 508 with changing text
- the system may present data to the user using a plurality of visualization techniques:
- coordination may start by capturing continuous data streams from physiological sensors, including ECG, ETC02, and SP02, and presenting them to the user as plots and/or waveforms.
- the data may also include discrete values from other sensors such as, for example, blood pressure. These are represented in the example of Figure 5 as the continuous graphs 502 in the mid-left area of the screenshot, as well as the ticker 508 which runs near the bottom of the screen shot, to the left of the illustrated buttons.
- annotations correspond to events such as voice inputs (for example as described below), or alarms created by the values of physiological data.
- the annotations may also include events, for example taking specific measurements such as blood pressure.
- the annotations are shown in text chart 506 on the right side of the screenshot, where time, category, and detail are shown for each annotation.
- data may be presented as a timeline.
- the left hand side of the timeline represents the start time of monitoring and the right hand side represents a most recent time.
- this most recent time comprises a current time if the session is still active; in another, if the data- gathering session has ended, the most recent time comprises an end time.
- colored markers may be added to the timeline as visual clues to indicate major events, such as, for example, alarms or voice inputs.
- the timeline may thus represent a snapshot of a data stream.
- timeline 504 is shown near the bottom of the screen, representing a timeline between 13:37:59 and 13:46:11. Additionally, in the illustrated implementation, the screenshot includes a button 510 for bringing the system up to a live timepoint.
- all three of the visualization techniques described above may be linked together using time, e.g., may have a common time axis.
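- A minimal sketch of keeping several views on a common time axis is given below; the class and method names are assumptions, not part of the disclosed implementation.

```python
# Hypothetical sketch of linked visualizations sharing a common time axis: scrolling
# one view moves every registered view to the same time window.

class TimeAxisCoordinator:
    def __init__(self):
        self.views = []

    def register(self, view):
        self.views.append(view)

    def scroll_to(self, start_time, end_time):
        for view in self.views:
            view.show_interval(start_time, end_time)   # each view redraws the same window

class WaveformView:
    def __init__(self, name):
        self.name = name

    def show_interval(self, start, end):
        print(f"{self.name}: displaying {start}..{end}")

coordinator = TimeAxisCoordinator()
coordinator.register(WaveformView("ECG"))
coordinator.register(WaveformView("SpO2 waveform"))
coordinator.scroll_to("13:37:59", "13:46:11")
```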
- systems and techniques may provide for user input to capture and categorize medical information and direct care, and in particular to be used in telemedicine.
- embodiments may be directed to the creation and processing of voice annotations, which record medical information, such as decisions, drugs, procedures and the like, or which cause events to occur.
- embodiments may also provide for text annotations.
- some embodiments may convert the voice annotations to text.
- FIG. 6 is a block diagram which illustrates embodiments of a voice control system 600 for integrating voice control into medical decision-making. Aspects of the figure, systems, and techniques will be described with reference to five elements:
- voice activation, annotation categorization, audio capture and speech recognition, annotation storage and understanding, and a feedback phase of the system (the feedback phase is shown as functional block 628 in Figure 6).
- the systems and techniques described herein compose these elements into a mobile platform for use by a health practitioner in the field.
- embodiments may enable context aspects sought to be explained during treatment, known colloquially as "who, what, when, how, and why" to be more easily captured in the field.
- Various embodiments of the system 600 include aspects relating to determining when a speech recognizer may start listening for annotations during treatment. In various embodiments, this determination may be made according to different mechanisms. For example, such mechanisms may include:
- a voice activation command: a speech recognition engine may recognize this command and cause the system to enter a listening mode.
- the system may turn off the listening mode.
- a hardware push-button accessible to the user.
- this push-button may be located on a device implementing the techniques discussed herein, in some embodiments the push-button may be worn by a user, such as, for example, on a ring or a wrist band. In various embodiments, the push-button may be wirelessly connected to the system and, upon activation, cause the system to toggle between start/stop listening modes.
- a software push-button such as at block 604.
- the systems and techniques may display a software-based push-button on an associated display.
- this push-button When this push-button is activated, the system may toggle between start/stop listening modes.
- these components may determine one or more categories of an annotation to be captured, thereby capturing the "what" aspect of a treatment being provided.
- these components may operate through the use of pre-defined categories.
- the categories may be stored at block 630 in Figure 6. For example, one or more of the following five categories may be defined through which annotations may be categorized.
- DEMO for patient demographics (age, gender, chief complaint), DRUG for drugs administered, PROC for procedures performed, PHYS for physical observations, and HIST for patient history.
- the categories are oriented toward emergency medicine, and may be extended to telemedicine; however, in other contexts, other categories may be utilized. In some embodiments, in addition to the categories described above, other categories may be used. For example, a system may use a MEMO category for recording audio memos and/or an EVTS category, such as for voice commands as will be described below.
- a user may say a category-based word, such as "Drug,” before recording any drug-related annotations.
- one or more push-buttons may be used, and the combination of their activation determines an intended annotation category.
- these push-buttons may be located on a system display and there may be a separate one for each pre-defined category.
- voice activation and deactivation may be combined with categorization, to simplify the experience for the user.
- activation and categorization may be voice-based. For example, a user may say "Enter Drug." If the system recognizes this command, it may enter the listening mode with the DRUG category selected.
- Figure 6 also illustrates, at block 624, components directed to capturing raw audio annotations by recording audio between, for example, voice activation and deactivation.
- This recording may be deactivated, according to various embodiments, through a deactivation mechanism as described above, or via a timeout mechanism, such as at block 612.
- this raw audio may be further analyzed by a speech recognition (SR) engine to convert the annotation to text.
- conversion of the audio annotation to text may be performed on a database server, such as is illustrated in Figure 6, particularly if system performance is an issue.
- time stamps for the annotations are also attached to, i.e., associated with, the annotations, such as at block 608.
- speech recognition accuracy may be increased and latency decreased by defining one or more category-specific limited grammars or vocabularies (such as block 632 in Figure 6) that are activated based upon a category identified during a categorization stage.
- a limited vocabulary may be defined, such as for categories like DRUG, PROC, PHYS, and DEMO discussed above, based on the feedback from actual users, such as paramedics and doctors involved in emergency medicine. While such a vocabulary may cover only specific medical scenarios under which it was created, such a vocabulary can be easily extended.
- a category such as HIST discussed above may take the form of a large dictation vocabulary because of the open-ended nature of the annotations in such a category.
- Examples of the limited grammar/vocabulary include:
- PROC category (procedures performed)
- DEMO category (patient information)
- HIST category (patient history)
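- A hedged sketch of how such category-specific vocabularies could be represented and selected for the speech recognizer is shown below; the word lists are invented examples, not the vocabularies actually used.

```python
# Hypothetical category-specific vocabularies; the entries are invented examples.
# A real deployment would build these lists from clinician feedback, as described above.

VOCABULARIES = {
    "DRUG": ["epinephrine", "aspirin", "nitroglycerin", "milligrams"],
    "PROC": ["intubated", "CPR", "defibrillation", "IV started"],
    "PHYS": ["unresponsive", "diaphoretic", "chest pain", "breath sounds clear"],
    "DEMO": ["male", "female", "years old", "chief complaint"],
}

def active_vocabulary(category):
    """Select the limited vocabulary for a category; HIST falls back to open dictation."""
    return VOCABULARIES.get(category)   # None means: use a large dictation vocabulary

print(active_vocabulary("DRUG"))
```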
- annotations may capture different aspects of care being provided.
- a "what" context may be provided from the DRUG/PROC annotations, a "why" context from the PHYS/HIST annotations, a "how” context from the DRUG/PROC annotation, a "when” context from the recorded timestamp of the annotation, and a "who" from logged user information, such as at block 601.
- the grammar described above may be extended to cover situations in which annotations are recorded later than the actual time care was given.
- the grammar may be extended to accept an annotation of "2 minutes ago" for a procedure.
- extensions may allow for a scenario when a secondary user who is different from the person doing annotations, such as a secondary field personnel, is actually carrying out the care.
- such an extension may take the form of <what> ["at" <when>] ["by" <who>].
- an annotation may take the form of "[PROC] intubated at 3:15 by Joe."
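- Purely as an illustration, such annotations could be parsed with a pattern like the one below; the regular expression is an assumption about how the grammar might be interpreted, not the grammar actually used.

```python
# Hypothetical parser for annotations of the form:  <what> ["at" <when>] ["by" <who>]
import re

ANNOTATION_PATTERN = re.compile(
    r"^(?P<what>.+?)(?:\s+at\s+(?P<when>\S+))?(?:\s+by\s+(?P<who>\S+))?$",
    re.IGNORECASE,
)

def parse_annotation(text):
    match = ANNOTATION_PATTERN.match(text.strip())
    return match.groupdict() if match else None

print(parse_annotation("intubated at 3:15 by Joe"))
# {'what': 'intubated', 'when': '3:15', 'who': 'Joe'}
```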
- embodiments of the system 600 and techniques may include a block 626 with components (such as block 613) that combine information, such as what, when, how, why, who, of a captured annotation and store it into a database 634.
- the text annotations may be further analyzed by one or more understanding components, such as at block 615, which may enter information into a report and/or provide context to other system components.
- the understanding component may cause a certain action (block 614) to be performed such as parse patient information (e.g., DEMO) to fill in age, gender, and/or complaint fields of a report and also to display such information on a display 636 for a user.
- an understanding component may recognize a CPR procedure annotation "start CPR / stop CPR" such that the component may start or stop a CPR timer.
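- The sketch below illustrates, under assumptions, how an understanding component might map recognized CPR annotations onto a timer; the class and phrasing checks are hypothetical.

```python
# Hypothetical understanding-component hook: "start CPR" / "stop CPR" control a CPR timer.
import time

class CprTimer:
    def __init__(self):
        self.started_at = None

    def handle_annotation(self, text):
        phrase = text.lower()
        if "start cpr" in phrase:
            self.started_at = time.monotonic()
        elif "stop cpr" in phrase and self.started_at is not None:
            elapsed = time.monotonic() - self.started_at
            self.started_at = None
            return elapsed   # seconds of CPR, e.g. for the report and the display
        return None
```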
- various embodiments may enable usage of audio to give commands to the system to perform certain actions to control the system.
- the system may recognize commands to navigate to certain GUI screens, to control alarm activation and/or silencing, and to initiate pre-defined sensor readings.
- a user may say "take BP" to take a blood pressure reading or "take 12 lead” to take an ECG reading.
- this mode may utilize a pre-defined command category, such as that discussed above.
- the database 634 of the system may also contain time-stamped sensor data from physiological sensors that have been connected to monitor a patient. This sensor data may also account partly for the "why" aspect of the patient treatment.
- a remote system, for example at a consulting clinician's facility, may pull data, along with the annotations, and present consolidated and contemporaneous information on a particular patient treatment. The consulting clinician can thus provide more effective guidance to the local medical personnel without the need for lengthy and potentially incomplete conversations with the local personnel.
Examples of Annotation State Feedback
- the system 600 and techniques may utilize a feedback component 628 which makes a user aware of a state of the system.
- Such feedback may include, for example, whether voice control/input has been activated (block 609) and categorization has been successful, whether audio is being recorded and audio annotation capture is in progress, when speech-to-text conversion for text annotation is successful (block 610), and when audio recording is complete (block 611).
- this type of feedback is useful in order for the user to know whether they need to repeat any processing stage.
- different feedback mechanisms may be used. Two examples amenable to mobile telemedicine and emergency medicine scenarios are described herein. Either of these mechanisms may make it possible for feedback to be part of the user's environment and enable the user to be aware of an annotation state while allowing the user to continue to perform other activities.
- a first mechanism 638 includes feedback using a plurality of differently colored LEDs mounted on a mobile device implementing the techniques or on safety glasses of a medical care-giver. Any combination of colors of LEDs and associated meanings may be used. For example, in one implementation, four colored LEDs may be used - blue, yellow, green, and red - each with an associated meaning.
- a second mechanism 640 utilizes audio feedback with distinct sounds to identify a state. For example, the system may use one distinct sound to indicate that voice activation and categorization has succeeded and a second distinct sound to indicate that the voice recognition has been deactivated.
Examples of Use
- In example simulation sessions, one paramedic used an audio headset to create audio annotations. Prior to the sessions, the paramedics underwent an hour-long voice training session. The training created a voice model for the speaker, and provided hands-on experience for the user in system usage and grammar/vocabulary. During each simulation session, paramedics created audio annotations, which were recorded in the database along with the textual translations. The sessions were also video-taped for additional ground truth.
- the systems and techniques described herein may be focused around an aggregator device that is associated with a given person and collects sensor data from various sensors associated with that person.
- the aggregator device may be a computing device, and may allow the sensor data to be processed, stored, and displayed locally, as well as delivered to remote telemedicine clients.
- Figure 7 presents an example of a high-level system architecture 700 according to the techniques described herein.
- the architecture 700 is partitioned into three parts: one or more sensors (e.g., sensors 702, 704, 706, and 708), an aggregator 710, and a back end 712.
- the back end 712 may include a computer 714 and/or a database server 716.
- the aggregator 710 may be a decision engine as described above.
- the aggregator 710 may be implemented on a mobile platform, such as a PDA, smart phone, or tablet.
- the aggregator 710 may be paired with on-body or environmental sensors, from which it receives data via a Wireless Personal Area Network (WPAN).
- the aggregator 710 and sensors 702, 704, 706, and 708 may communicate via short-range wireless protocols such as, for example, Bluetooth.
- the aggregator 710 may control the sensors 702, 704, 706, and 708 and receive, process, store, and/or display sensor data. Sensor data may then be replicated to back end telemedicine clients through a wide area wireless network.
- the system 700 may also have associated one or more PC platforms 714 which receive, store, analyze and/or display data from each connected aggregator 710. Communication between each aggregator 710 and the back end platform 712 may take place through a storage subsystem, such as database server 716, using a database replication and data streaming architecture to synchronize data across the platforms.
- FIG. 8 illustrates a high-level end-to-end view of an example software architecture 800 implementing embodiments described herein.
- the architecture may be symmetric between the aggregator 810 and the back end 812.
- the aggregator 810 may include a plurality of platforms, including a data acquisition block 820, a user interface block 822, and a storage subsystems block 824.
- the back end 812 may include a data acquisition block 826, a user interface block 828, and a storage subsystems block 830.
- the systems of the aggregator 810 and back end 812 may be encapsulated in an application object 832 and 834, respectively.
- the application object 832 and 834 manages core components, reducing the overhead if the system is to be customized to a specific application. Accordingly, for a specific application, a system designer may only need to create and manage application-specific data analysis components and a custom user interface.
- the application object 832 may manage communication with wireless sensors, such as BluetoothTM sensors. Acquired data may be stored via the storage subsystem 824 as well as streamed to the analysis components 836 and to the user interface 822.
- the storage subsystem 824 may include a replication capability that is a communication mechanism between the aggregator 810 and the back end 812.
- the application object 832 receives data via the storage subsystem 824 and may stream it to the analysis components 838 and the back end user interface application 828.
- the user interface 822 and 828 may receive real-time data from the application object 832 and 834, respectively, via a streaming interface. It may also access historical data from the storage subsystem 824 and 830, respectively, via a data access layer (DAL).
- the application object 932 may provide functions of data acquisition. Such functions may include: 1) a publish/subscribe based reconfigurable data path 940 that may be configurable at system integration time, 2) an extensible sensing component 942 that may interface with sensors, 3) analysis components 944 for real-time data analysis, and 4) a database bridge 946 (DB) component to record incoming data streams in a local database.
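- A minimal publish/subscribe sketch of the kind of reconfigurable data path described above is shown below; the topic names and callback wiring are assumptions for illustration.

```python
# Hypothetical publish/subscribe data path: a sensing component publishes samples,
# while analysis and database-bridge components subscribe to the topics they need.

from collections import defaultdict

class DataPath:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, sample):
        for callback in self.subscribers[topic]:
            callback(sample)

path = DataPath()
path.subscribe("ecg", lambda s: print("analysis received", s))   # analysis component
path.subscribe("ecg", lambda s: print("db bridge stored", s))    # database bridge
path.publish("ecg", {"t": 0.0, "mv": 0.42})
```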
- Figure 10 illustrates, for one embodiment, an example system 1000 comprising one or more processor(s) 1004, system control logic 1008 coupled to at least one of the processor(s) 1004, system memory 1012 coupled to system control logic 1008, nonvolatile memory (NVM)/storage 1016 coupled to system control logic 1008, and one or more communications interface(s) 1020 coupled to system control logic 1008.
- System control logic 1008 may include any suitable interface controller to provide for any suitable interface to at least one of the processor(s) 1004 and/or to any suitable device or component in communication with system control logic 1008.
- System control logic 1008 for one embodiment may include one or more memory controller(s) to provide an interface to system memory 1012.
- System memory 1012 may be used to load and store data and/or instructions, for example, for system 1000.
- System memory 1012 for one embodiment may include any suitable volatile memory, such as suitable dynamic random access memory (DRAM), for example.
- System control logic 1008 for one embodiment may include one or more input/output (I/O) controller(s) to provide an interface to NVM/storage 1016 and communications interface(s) 1020.
- NVM/storage 1016 may be used to store data and/or instructions, for example.
- NVM/storage 1016 may include any suitable non-volatile memory, such as flash memory, for example, and/or may include any suitable non-volatile storage device(s), such as one or more hard disk drive(s) (HDD(s)), one or more solid-state drive(s), one or more compact disc (CD) drive(s), and/or one or more digital versatile disc (DVD) drive(s) for example.
- the NVM/storage 1016 may include a storage resource physically part of a device on which the system 1000 is installed or it may be accessible by, but not necessarily a part of, the device.
- the NVM/storage 1016 may be accessed over a network via the communications interface(s) 1020.
- System memory 1012 and NVM/storage 1016 may include, in particular, temporal and persistent copies of medical communication logic 1024, respectively.
- the medical communication logic 1024 may include instructions that when executed by at least one of the processor(s) 1004 result in the system 1000 performing the medical communication and monitoring operations described herein.
- the medical communication logic 1024 may additionally/alternatively be located in the system control logic 1008.
- Communications interface(s) 1020 may provide an interface for system 1000 to communicate over one or more network(s) and/or with any other suitable device.
- Communications interface(s) 1020 may include any suitable hardware and/or firmware.
- Communications interface(s) 1020 for one embodiment may include, for example, a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
- communications interface(s) 1020 for one embodiment may use one or more antenna(s).
- At least one of the processor(s) 1004 may be packaged together with logic for one or more controller(s) of system control logic 1008.
- at least one of the processor(s) 1004 may be packaged together with logic for one or more controllers of system control logic 1008 to form a System in Package (SiP).
- at least one of the processor(s) 1004 may be integrated on the same die with logic for one or more controller(s) of system control logic 1008.
- at least one of the processor(s) 1004 may be integrated on the same die with logic for one or more controller(s) of system control logic 1008 to form a System on Chip (SoC).
- system 1000 may have more or fewer components, and/or different architectures.
- references throughout this specification to "one embodiment” or “an embodiment” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation encompassed within the present invention. Thus, appearances of the phrase “one embodiment” or “in an embodiment” are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be instituted in other suitable forms other than the particular embodiment(s) illustrated.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- Primary Health Care (AREA)
- Entrepreneurship & Innovation (AREA)
- General Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Game Theory and Decision Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Tourism & Hospitality (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Marketing (AREA)
- Educational Administration (AREA)
- Development Economics (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
The present invention relates to a rule-based smart alarm system for a medical monitoring system that avoids alarm fatigue. The invention further relates to the delivery of coordinated medical data from multiple sources, and to methods for facilitating data entry by a caregiver in the field. In various embodiments, data may be correlated from multiple sensor streams to allow validation of the measurements made by sensors. Additionally, in various embodiments, systems and techniques may use prior-determined probabilities based on validated clinical data, a patient's medical history, and treatments administered to help determine subsequent actions after an alarm is triggered. In various embodiments, these actions may include, but are not limited to: when to trigger the alarm; when to acquire more data; and one or more protocols to recommend to clinicians. Additionally, in various embodiments, systems and techniques may archive user responses, triggered alarms, and related data so that the decision-making process surrounding alarm triggering may be refined during post-analysis. Additionally, various embodiments, techniques, and systems may enable a coordinated representation of a set of medical data by visually presenting data related to the events using two or more correlated representations.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US31535810P | 2010-03-18 | 2010-03-18 | |
US61/315,358 | 2010-03-18 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2011116340A2 true WO2011116340A2 (fr) | 2011-09-22 |
WO2011116340A3 WO2011116340A3 (fr) | 2011-11-17 |
Family
ID=44649856
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/029077 WO2011116340A2 (fr) | 2010-03-18 | 2011-03-18 | Cadre de gestion de contexte pour télémédecine |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2011116340A2 (fr) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014110280A2 (fr) | 2013-01-11 | 2014-07-17 | Zoll Medical Corporation | Interface de support de décision basé sur un sem, histoire d'événements et outils associés |
US9007207B2 (en) | 2013-01-22 | 2015-04-14 | General Electric Company | Dynamic alarm system for operating a power plant and method of responding to same |
CN105224383A (zh) * | 2015-08-21 | 2016-01-06 | 上海理工大学 | 心肺复苏术模拟系统 |
CN105224383B (zh) * | 2015-08-21 | 2018-08-31 | 上海理工大学 | 心肺复苏术模拟系统 |
JP2019523926A (ja) * | 2016-05-24 | 2019-08-29 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 患者モニタのためのカスタマイズされた設定を提供するための方法及びシステム |
US10504036B2 (en) | 2016-01-06 | 2019-12-10 | International Business Machines Corporation | Optimizing performance of event detection by sensor data analytics |
US10720240B2 (en) | 2015-02-26 | 2020-07-21 | Koninklijke Philips N.V. | Context detection for medical monitoring |
CN114842935A (zh) * | 2022-04-29 | 2022-08-02 | 中国人民解放军总医院第六医学中心 | 一种用于医院夜间查房的智能检测方法及系统 |
CN116453637A (zh) * | 2023-03-20 | 2023-07-18 | 杭州市卫生健康事业发展中心 | 一种基于区域大数据的健康数据治理方法和系统 |
US11925439B2 (en) | 2018-10-23 | 2024-03-12 | Zoll Medical Corporation | Data playback interface for a medical device |
US12073928B2 (en) | 2019-03-22 | 2024-08-27 | Zoll Medical Corporation | Handling of age transmitted data in medical device system |
US12080391B2 (en) | 2020-08-07 | 2024-09-03 | Zoll Medical Corporation | Automated electronic patient care record data capture |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107451401A (zh) * | 2017-07-11 | 2017-12-08 | 武汉金豆医疗数据科技有限公司 | 一种医保智能审核方法和系统 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080015903A1 (en) * | 2005-12-09 | 2008-01-17 | Valence Broadband, Inc. | Methods for refining patient, staff and visitor profiles used in monitoring quality and performance at a healthcare facility |
US20080214904A1 (en) * | 2005-06-22 | 2008-09-04 | Koninklijke Philips Electronics N. V. | Apparatus To Measure The Instantaneous Patients' Acuity Value |
US20090036757A1 (en) * | 2004-07-12 | 2009-02-05 | Cardiac Pacemakers, Inc. | Expert system for patient medical information analysis |
US7552101B2 (en) * | 2003-10-31 | 2009-06-23 | Vigimedia S.A.S. | Health monitoring system implementing medical diagnosis |
- 2011-03-18: WO PCT/US2011/029077 patent/WO2011116340A2/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7552101B2 (en) * | 2003-10-31 | 2009-06-23 | Vigimedia S.A.S. | Health monitoring system implementing medical diagnosis |
US20090036757A1 (en) * | 2004-07-12 | 2009-02-05 | Cardiac Pacemakers, Inc. | Expert system for patient medical information analysis |
US20080214904A1 (en) * | 2005-06-22 | 2008-09-04 | Koninklijke Philips Electronics N. V. | Apparatus To Measure The Instantaneous Patients' Acuity Value |
US20080015903A1 (en) * | 2005-12-09 | 2008-01-17 | Valence Broadband, Inc. | Methods for refining patient, staff and visitor profiles used in monitoring quality and performance at a healthcare facility |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110277160B (zh) * | 2013-01-11 | 2023-07-28 | 卓尔医学产品公司 | 代码查看医疗事件的系统和决策支持方法 |
EP2943926A4 (fr) * | 2013-01-11 | 2018-05-23 | Zoll Medical Corporation | Interface de support de décision basé sur un sem, histoire d'événements et outils associés |
WO2014110280A2 (fr) | 2013-01-11 | 2014-07-17 | Zoll Medical Corporation | Interface de support de décision basé sur un sem, histoire d'événements et outils associés |
CN110277160A (zh) * | 2013-01-11 | 2019-09-24 | 卓尔医学产品公司 | 代码查看医疗事件的系统和决策支持方法 |
US11816322B2 (en) | 2013-01-11 | 2023-11-14 | Zoll Medical Corporation | EMS decision support interface, event history, and related tools |
US10976908B2 (en) | 2013-01-11 | 2021-04-13 | Zoll Medical Corporation | EMS decision support interface, event history, and related tools |
US9007207B2 (en) | 2013-01-22 | 2015-04-14 | General Electric Company | Dynamic alarm system for operating a power plant and method of responding to same |
US10720240B2 (en) | 2015-02-26 | 2020-07-21 | Koninklijke Philips N.V. | Context detection for medical monitoring |
CN105224383A (zh) * | 2015-08-21 | 2016-01-06 | 上海理工大学 | 心肺复苏术模拟系统 |
CN105224383B (zh) * | 2015-08-21 | 2018-08-31 | 上海理工大学 | 心肺复苏术模拟系统 |
US10504036B2 (en) | 2016-01-06 | 2019-12-10 | International Business Machines Corporation | Optimizing performance of event detection by sensor data analytics |
JP2019523926A (ja) * | 2016-05-24 | 2019-08-29 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 患者モニタのためのカスタマイズされた設定を提供するための方法及びシステム |
US11626207B2 (en) * | 2016-05-24 | 2023-04-11 | Koninklijke Philips N.V. | Methods and systems for providing customized settings for patient monitors |
US20190295696A1 (en) * | 2016-05-24 | 2019-09-26 | Koninklijke Philips N.V. | Methods and systems for providing customized settings for patient monitors |
US11925439B2 (en) | 2018-10-23 | 2024-03-12 | Zoll Medical Corporation | Data playback interface for a medical device |
US12073928B2 (en) | 2019-03-22 | 2024-08-27 | Zoll Medical Corporation | Handling of age transmitted data in medical device system |
US12080391B2 (en) | 2020-08-07 | 2024-09-03 | Zoll Medical Corporation | Automated electronic patient care record data capture |
CN114842935A (zh) * | 2022-04-29 | 2022-08-02 | 中国人民解放军总医院第六医学中心 | 一种用于医院夜间查房的智能检测方法及系统 |
CN114842935B (zh) * | 2022-04-29 | 2024-01-23 | 中国人民解放军总医院第六医学中心 | 一种用于医院夜间查房的智能检测方法及系统 |
CN116453637A (zh) * | 2023-03-20 | 2023-07-18 | 杭州市卫生健康事业发展中心 | 一种基于区域大数据的健康数据治理方法和系统 |
CN116453637B (zh) * | 2023-03-20 | 2023-11-07 | 杭州市卫生健康事业发展中心 | 一种基于区域大数据的健康数据治理方法和系统 |
Also Published As
Publication number | Publication date |
---|---|
WO2011116340A3 (fr) | 2011-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011116340A2 (fr) | Cadre de gestion de contexte pour télémédecine | |
US11776669B2 (en) | System and method for synthetic interaction with user and devices | |
US20220037038A1 (en) | Operating room checklist system | |
US11301680B2 (en) | Computing device for enhancing communications | |
US11963924B2 (en) | Tools for case review performance analysis and trending of treatment metrics | |
US10698983B2 (en) | Wireless earpiece with a medical engine | |
TW201327460A (zh) | 用於語音輔助醫療診斷的裝置與方法 | |
CN107910073A (zh) | 一种急诊预检分诊方法及装置 | |
WO2015145424A1 (fr) | Système de réalisation d'un examen physique à distance | |
US20210298711A1 (en) | Audio biomarker for virtual lung function assessment and auscultation | |
WO2014042878A1 (fr) | Procédé, système et appareil pour traiter un problème de communication | |
US20220059238A1 (en) | Systems and methods for generating data quality indices for patients | |
Pereira et al. | DigiScope—Unobtrusive collection and annotating of auscultations in real hospital environments | |
JP2023539874A (ja) | 呼吸器疾患モニタリングおよびケア用コンピュータ化意思決定支援ツールおよび医療機器 | |
WO2024015885A1 (fr) | Systèmes et procédés destinés à fournir un guidage contextuel pour le traitement médical d'un patient | |
Dhakal et al. | IVACS: I ntelligent v oice A ssistant for C oronavirus Disease (COVID-19) S elf-Assessment | |
JP2005512608A (ja) | 患者監視におけるセンサ信号と主観的情報の相関 | |
Rabbani et al. | Towards developing a voice-activated self-monitoring application (VoiS) for adults with diabetes and hypertension | |
US20170354383A1 (en) | System to determine the accuracy of a medical sensor evaluation | |
US11501879B2 (en) | Voice control for remote monitoring | |
WO2024038439A1 (fr) | Système et procédé d'évaluation d'un état cognitif et physiologique d'un sujet | |
Wouhaybi et al. | A context-management framework for telemedicine: an emergency medicine case study | |
KR20130115706A (ko) | 스마트 기기를 이용한 사용자 목소리에 기초한 건강 상태 정보 서비스 제공 방법 | |
Wouhaybi et al. | Experiences with context management in emergency medicine | |
US20220392593A1 (en) | Medical Surgery Recording, Processing and Reporting System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11757096 Country of ref document: EP Kind code of ref document: A2 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11757096 Country of ref document: EP Kind code of ref document: A2 |