WO2016001647A1 - Clinical data capture with machine learning - Google Patents


Info

Publication number
WO2016001647A1
WO2016001647A1 (PCT/GB2015/051903)
Authority
WO
WIPO (PCT)
Prior art keywords
user
input
user interface
inputs
weighted link
Prior art date
Application number
PCT/GB2015/051903
Other languages
English (en)
Inventor
Brian James
Alexandros MARINOS
Original Assignee
Digital Clipboard Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (source: Darts-ip global patent litigation dataset, CC BY 4.0)
Application filed by Digital Clipboard Limited
Publication of WO2016001647A1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/20 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present invention relates to a user interface having content determined by machine learning.
  • Electronic data capture is becoming more widespread as it replaces the manual recording of information, for example in the form of handwritten notes.
  • the ability to capture data electronically is becoming more important in a wide variety of fields. Not only can information be input electronically, but the submission of such information is nowadays increasingly performed electronically. It is important that the input and submission of such data are performed in a consistent and reliable manner.
  • Electronic data capture is of particular interest in the field of clinical notetaking.
  • Clinical notetaking is a distinct and complex discipline that requires considerable training and skill. It is a multi-stage process that combines the ability to direct a patient to provide pertinent information, record findings of an examination and engage in the thought processes of clinical reasoning to form an effective treatment plan. Throughout the process the clinician is also under critical pressure to ensure notes are clear and accurate and completed quickly enough so as not to interrupt the consultation.
  • a first aspect of the invention provides a computer-implemented method of determining the content of a user interface to be displayed as part of a clinical data- capture application, the method comprising detecting a first user input and a second user input at a user interface, analysing the first and second user inputs at a machine learning module to determine a weighted link between respective attributes of the first and second user inputs, and using the weighted link between the respective attributes of the first and second user inputs to populate the application user interface with updated content based on the analysis.
  • the first user input may relate to a first parameter and the second user input may relate to a second parameter, and the weighted link may be between the first parameter and the second parameter.
  • Using the weighted link to populate the application user interface with content based on the analysis may comprise displaying a plurality of selectable options determined to be relevant to the user.
  • the first and second inputs may be in different input fields.
  • the method may further comprise determining a weighted link for each of a plurality of pairs of parameters.
  • the method may further comprise ranking suggested inputs in accordance with the weighted link between each suggested input and another input.
  • Populating the application user interface with content based on the analysis may comprise presenting a navigation menu containing a plurality of workflow sections or subsections in an order determined to be relevant to the user.
  • the first user input may be a user selection of a first workflow section and the second user input may be a user selection of a second workflow section, and the weighted link may be between the first workflow section and the second workflow section.
  • the method may further comprise determining a second weighted link between a user input of an input parameter and a workflow order selected by the user and using the second weighted link between the user input of an input parameter and the workflow order selected by the user to populate the navigation menu with updated content.
  • the first user input may be a selection of a body part displayed as part of a body map.
  • the first user input may correspond to a user manipulation of a chart line.
  • the machine learning module may update the user interface to be displayed
  • the user interface may be displayed on a touch-sensitive display of a tablet computer.
  • the machine learning module may draw on inputs from a plurality of users over a network.
  • a second aspect of the invention provides a computer program comprising computer- readable instructions that, when executed by a processor, cause the processor to perform the method.
  • a third aspect of the invention provides an apparatus comprising at least one processor and at least one memory having computer-readable code stored thereon which, when executed, controls the at least one processor to determine the content of a user interface to be displayed as part of a clinical data-capture application by detecting a first user input and a second user input at a user interface, analysing the first and second user inputs at a machine learning module to determine a weighted link between respective attributes of the first and second user inputs, and using the weighted link between the respective attributes of the first and second user inputs to cause the application user interface to be populated with updated content based on the analysis.
  • a fourth aspect of the invention provides a non-transitory computer-readable storage medium having stored thereon computer-readable code which, when executed by computing apparatus, causes the computing apparatus to determine the content of a user interface to be displayed as part of a clinical data-capture application by detecting a first user input and a second user input at a user interface, analysing the first and second user inputs at a machine learning module to determine a weighted link between respective attributes of the first and second user inputs, and using the weighted link between the respective attributes of the first and second user inputs to cause the application user interface to be populated with updated content based on the analysis.
  • a fifth aspect of the invention provides an apparatus for determining the content of a user interface to be displayed as part of a clinical data-capture application, the apparatus comprising means for detecting a first user input and a second user input at a user interface, means for analysing the first and second user inputs at a machine learning module to determine a weighted link between respective attributes of the first and second user inputs, and means for using the weighted link between the respective attributes of the first and second user inputs to populate the application user interface with updated content based on the analysis.
  • Figure 1 shows a system in accordance with an embodiment of the invention.
  • Figures 2-5 show screenshots of embodiments of the invention.
  • Figures 6 and 7 are flow diagrams illustrating embodiments of the invention.
  • Figure 8 is a weighted graph illustrating embodiments of the invention.
  • Figure 9 is a schematic diagram of a user interface according to embodiments of the invention.
  • Figures 10 and 11 show screenshots of embodiments of the invention.
  • Embodiments of the invention are described herein which deal with the difficulty of clinical data entry on electronic devices, where the scope of potential inputs is large, the workflow is varied and the speed of data capture is important. It should however be borne in mind that the invention is not limited to clinical notetaking and that alternative embodiments can relate to any field where reliable, consistent and timely data capture is desirable.
  • Figure 1 shows a system 100.
  • the system 100 comprises at least one user device 101.
  • the user device 101 may be a personal computer, tablet device, smartphone or any other computing device.
  • the user device 101 may be provided with a processor, a memory, input hardware, output hardware and a network interface.
  • the memory of the user device 101 may have an operating system, such as a Windows, Apple or Linux operating system, stored therein. Also stored in the memory may be applications such as a web browser that allows access to a network 102, such as the internet.
  • Input hardware may take any suitable form and may comprise at least one of a touchscreen display, a keyboard, a mouse, an optical tracking ball, a touchpad and so forth to allow a user to input information to the user device 101.
  • the output hardware may comprise a display screen, speakers and so forth so that information may be output to a user.
  • the input hardware and output hardware may be integrated and may take the form of a touch sensitive display.
  • the term 'user' as used herein signifies a person using the user device 101 to enter information.
  • the network interface allows the user device 101 to access the network 102, for example the internet or a local area network.
  • the network interface may comprise a modem to allow access to the internet.
  • Connection to the network 102 may be wireless or through a wired connection.
  • a wireless network interface card may be provided.
  • the user device 101 may be provided with a transceiver to allow access to a wireless router having a connection to the network 102 or to other devices in a local area network.
  • the processor is connected to and controls operation of the other components of the user device 101.
  • the processor may execute software stored in the memory and/or control the user device 101 to execute instructions received over the network 102.
  • the network 102 may be the internet.
  • the reference numeral 102 should be understood as including elements such as routers and servers conventionally found in an internet architecture.
  • the system 100 also comprises an application server 103.
  • the application server 103 comprises hardware and software components to allow the user device 101 to access the application.
  • the application server 103 may comprise a memory for storing application software and at least one processor for controlling the server.
  • the application server 103 may be a single entity or it may comprise several entities that interact to perform the role of a server.
  • the application server 103 has an application module 104 and a machine learning module 105 stored in the memory of the application server 103.
  • the machine learning module is a software module containing one or more machine learning algorithms.
  • An exemplary algorithm is a Bayesian machine learning algorithm. However, it should be borne in mind that other machine learning algorithms could be used.
  • the application software module stored in the application server 103 may provide the logic and routines that enables the application server 103 to perform the functionality described below.
  • the application software may be pre-programmed into the application server 103. Alternatively, it may arrive at the application server 103 via an electromagnetic carrier signal or be copied from a physical entity such as a computer program product, a non-volatile electronic memory device (e.g. flash memory) or a record medium such as a CD-ROM or DVD. It may, for instance, be downloaded to the application server 103 from a server.
  • the processor may be any type of processing circuitry.
  • the processing circuitry may be a programmable processor that interprets computer program instructions and processes data.
  • the processing circuitry may include plural programmable processors.
  • the processing circuitry may be, for example, programmable hardware with embedded firmware.
  • the processing circuitry or processor may be termed processing means.
  • the term 'memory' when used in this specification is intended to relate primarily to memory comprising both non-volatile memory and volatile memory unless the context implies otherwise, although the term may also cover one or more volatile memories only, one or more non-volatile memories only, or one or more volatile memories and one or more non-volatile memories. Examples of volatile memory include RAM, DRAM, SDRAM etc. Examples of non-volatile memory include ROM, PROM, EEPROM (electrically erasable programmable read-only memory), flash memory, optical storage, magnetic storage etc.
  • the application may be a web-based application that is accessible to the user device 101 using the hypertext transfer protocol (HTTP).
  • Application content is thereby displayed by the user device 101, on a web browser, as a web page in hypertext mark-up language (HTML) or any other suitable language.
  • the web-based application takes the form of a series of pages displayed to the user on a user device.
  • the application is displayed to the user as a series of pages that the user can navigate by selecting various navigation options. As such, the user can navigate between different sections of the workflow.
  • the user can register themselves to the application using a registration page so that they can subsequently log in.
  • the registration page contains several input fields where the user can input information about themselves. For example, the user can state their profession, such as osteopath, paediatrician or psychiatrist, and enter personal details. Once the user has logged in to the application they can access and alter their preferences and settings.
  • the user can log in to their account and alter the workflow in accordance with their preferences. These preferences are stored at the application server.
  • the application server passes these preferences on to the machine learning module 105.
  • the consultation workflow can comprise various sections.
  • Example sections include patient details, symptoms, medical history, patient exam, treatment and diagnosis. It is possible to navigate between these sections by selecting the target section from a navigation menu 201 (see Figure 2) that is provided as part of the user interface. As will be described in more detail below, the order of these sections is customisable depending on variables such as profession or personal preference.
  • Each section of the workflow may be divided into subsections.
  • the "Symptoms" section may be divided into subsections having titles such as "Site of Pain", "Daily Pattern", "Onset & Progression" and "Factors & Symptoms". A user can navigate between these subsections by selecting a different subsection title.
  • the various fields that are completed during the consultation can include text fields and selectable drop-down menus that allow the user to input clinical inputs. Diagrams can also be displayed which a user can then annotate. Furthermore, the user interface can display several selectable options for a particular clinical input field. The options that are displayed in this way are determined by the application using the machine learning module. The options may be ranked so that the most likely option, as determined by the machine learning module, is displayed first in a list, with less likely options appearing below. This is shown in Figure 3 where, in an "Aggravating Factors" input field, two selectable options are displayed. The most likely predicted option 301 is displayed before the second most likely predicted option 302.
  • If the user wishes to enter an input that is not one of the displayed options, they can type the input instead, as shown in Figure 4.
  • a list 401 of predictive options is displayed with the most likely options listed above those considered less likely. The user may then select one of the displayed options. If the user does not wish to select one of the displayed options, they can continue to enter their input by typing it into the input field.
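As an illustration of the ranked-options behaviour described above, the sketch below filters a set of selectable options by the text typed so far and orders them by a learned weight. The field name and weight values are hypothetical; this is an illustrative sketch, not the patent's implementation.

```python
def suggest(options, typed_prefix="", limit=5):
    """Return predicted options ranked by weight, filtered by what the
    user has typed so far in the input field."""
    matches = [(opt, w) for opt, w in options.items()
               if opt.lower().startswith(typed_prefix.lower())]
    # Most likely option first, less likely options below.
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [opt for opt, _ in matches[:limit]]

# Hypothetical weights learned from previous consultations.
aggravating_factors = {"Sitting": 12, "Standing": 9, "Lifting": 4}
print(suggest(aggravating_factors))       # all options, most likely first
print(suggest(aggravating_factors, "s"))  # narrowed as the user types
```

If no option matches the typed prefix, the returned list is empty and the application would simply accept the free-form text, mirroring the text-box fallback described above.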
  • the application creates a summary report 501 of the patient visit as shown in Figure 5.
  • This report summarises the findings of the visit into a single report.
  • the report can include text data, charts, diagrams and annotations, as well as pictures and videos taken during the consultation, exercise recommendations and other information deemed relevant for the patient by the clinician.
  • the application can store the report in an electronic medical record system.
  • This system can allow stored records to be searchable, can provide permanent storage, synchronisation with cloud based storage systems and can allow accessibility in multiple locations and by multiple people.
  • the user may be a professional clinician.
  • the application is a notetaking platform that provides a flexible and customisable structure for workflows to be adapted to the particular profession or style of the user.
  • workflow refers to the order in which information is inputted during a consultation. It should be appreciated that the order in which a clinician inputs information varies considerably.
  • Some examples of clinical notetaking formats include the SOAP note, where the note is organized into Subjective, Objective, Assessment and Plan sections.
  • Another example is the DART system which is organised into Description, Assessment, Response and Treatment.
  • the workflow may also be varied by the user during a session.
  • the workflow may be varied by the clinician using the application depending on their clinical reasoning during the patient assessment. For instance, if the clinician observes factors which indicate that the patient may have a genetic or systemic disease, for example, the clinician may navigate through different sections and subsections of the workflow so that this finding can be examined in greater detail.
  • Machine learning for predicting clinical inputs deals with the difficulty of clinical data entry on electronic devices, where the scope of potential inputs is large and the speed of data capture is important.
  • the machine learning aspects of the application can provide a method for predicting clinical inputs that uses contextual information and a machine- learning module to make patient specific inferences and predict input options available to the user.
  • the method is applicable for an electronic medical recording system in which a user records primary clinical data from a patient interaction into their medical record.
  • This method is also applicable to a number of different types of digitized health records, such as personal health records (PHR), which contain health-related documentation maintained by the individual to whom it pertains, and electronic health records (EHR), which are official health records for an individual that are shared among multiple facilities and agencies.
  • the method is applicable to the numerous workflows and formats of clinical note taking processes found within the healthcare environment. It can also be applied in fields outside the medical field where information needs to be input quickly, yet in a consistent manner.
  • the process begins at step 601.
  • the user makes a first input of data into the electronic medical recording system through the user interface at step 602.
  • This input may be in relation to a first parameter.
  • the user may enter the profession of the patient as a 'teacher'.
  • the user can then move on through the consultation workflow to make a second input in relation to a second parameter.
  • the user may navigate to a screen where they can enter a patient's medical condition.
  • the application 104 retrieves predictions from the machine learning module 105 in relation to the second parameter (in this example, a medical condition). These predictions are derived from analysis performed by the machine learning module 105 of previous inputs made by the user and other users of the application. Since the application may be run over a network and can service multiple users, the machine learning analysis can also be based on inputs from other users as well as the previous inputs of the particular user.
  • the input field may be a text box in which the user can type an input relating to the second parameter.
  • predicted options may be displayed which a user can select by tapping a touchscreen or clicking a mouse.
  • the options may be ranked in accordance with the analysis performed at the machine learning module so that the most likely suggestion is listed first.
  • By displaying a text box a user can enter an input even if none of the predicted options is appropriate.
  • inputs can be made during the earlier part of the application's lifetime, i.e. before significant machine learning has occurred, by typing the input. This second input is recorded by the machine learning module at step 604. These inputs can then inform the machine learning module so that in subsequent instances predicted options may be displayed.
  • the inputted data is used by the application to inform a machine-learning module, which maps the relationship between parameters using machine learning methods, for example Bayesian models. By mapping the relationship between parameters, a weighted link is formed between those parameters. This link is used to populate the user interface when the application is used subsequently. Patient specific inferences and predictive input options can thus be provided.
  • Each input field is modelled as a set of input options.
  • an open-ended input field such as first name
  • the machine learning module stores a set of weighted links between each option of each input field and each option of every other input field. Options of the same input field do not have links between them.
  • When two options are inputted during the same consultation, the weight value of their link entry is increased by a fixed amount. As such, the weighted link between the two parameters can be updated.
  • weighted links can then be used to generate predictive input suggestions for subsequent patient examinations. For instance, if the profession of a patient is entered as a 'teacher' in the "Patient's Details" section and, amongst consultations involving patients who are teachers, 'shoulder pain' is the most common inputted ailment, then during the next consultation involving a patient who is a teacher, 'shoulder pain' will be presented as an option to the user. All inputs during a previous examination thus affect the predictive input options offered during subsequent examinations.
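A minimal sketch of such a link table follows, using the 'teacher'/'shoulder pain' example above. The field names and the fixed increment of 1.0 are illustrative assumptions, not details taken from the patent.

```python
from collections import defaultdict
from itertools import combinations

INCREMENT = 1.0  # assumed fixed amount by which a link weight is increased

# links[(field_a, option_a)][(field_b, option_b)] -> accumulated weight
links = defaultdict(lambda: defaultdict(float))

def record_consultation(inputs):
    """inputs: dict mapping input field -> selected option.
    Every pair of options from different fields has its link weight
    increased; options of the same field are never linked together."""
    for (fa, oa), (fb, ob) in combinations(sorted(inputs.items()), 2):
        links[(fa, oa)][(fb, ob)] += INCREMENT
        links[(fb, ob)][(fa, oa)] += INCREMENT

def suggest_for(field, known_field, known_option, limit=3):
    """Rank options of `field` by the weight of their link to an
    already-entered option, e.g. ailments linked to 'teacher'."""
    weights = links[(known_field, known_option)]
    ranked = sorted(((opt, w) for (f, opt), w in weights.items() if f == field),
                    key=lambda pair: pair[1], reverse=True)
    return [opt for opt, _ in ranked[:limit]]

record_consultation({"profession": "teacher", "ailment": "shoulder pain"})
record_consultation({"profession": "teacher", "ailment": "shoulder pain"})
record_consultation({"profession": "teacher", "ailment": "back pain"})
print(suggest_for("ailment", "profession", "teacher"))
# 'shoulder pain' ranks first, matching the example above
```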
  • the interface can take any number of formats including laptop, desktop computer, mobile devices or tablet.
  • the ability to move easily between stages of the workflow is particularly advantageous on a tablet computer having a touch sensitive display.
  • the system shown in Figure 1 involves multiple users.
  • the predictive capability of the machine-learning module increases the more data it has access to, and therefore is more effective if informed by multiple users through a networked format.
  • the user can change between workflow sections using the navigation bar.
  • the order in which the user selects the different workflow sections can be relayed to the machine learning module 105. Future consultations can then take this selected order into account to suggest a workflow order for the same or different users (preferably who work in the same field) during subsequent consultations.
  • Embodiments of the invention also deal with the difficulty of creating electronic medical recording systems that provide a free-form customized workflow process that varies according to the user and the particular case.
  • Embodiments of the invention provide a method for optimizing the clinical workflow to the particular clinical decision making process of the clinician using predictions from the machine learning module to present the user with the next likely steps in the clinical workflow.
  • the electronic medical recording system provides an interface for the user.
  • the user inputs data in the first step of the clinical process and is provided options as to the next possible steps in the clinical process.
  • This clinical process can be non-sequential; at each step there are multiple options for the user.
  • the user selection of the next step in the clinical process informs the machine-learning module.
  • the user navigates to a first workflow section at step 702. This navigation is recorded by the machine learning module at step 703.
  • the user navigates to a second workflow section at step 704.
  • This second selection is recorded by the machine learning module at step 705.
  • a weighted link between the first section and the second section is thereby determined by the machine learning module at step 706.
  • the weighted links are considered different depending on whether they link section A to section B, or section B to section A.
  • the various sections of the user interface may be considered a simple directed weighted graph as shown in Figure 8, with each choice by each user increasing the weight of each corresponding link.
  • the weights of the links are then used to calculate the options offered to the user as well as to other users of the application. These predictions can be used as a component input to the user interface algorithm, along with more heavily weighted input from experts in the field and other external references. On such occasions, if a user is designated an "expert" by the machine learning module, their choices during the clinical workflow can be given greater weight than those of standard users. For example, a user may be considered an expert if they enter their profession as an osteopath and then input information from an osteopathy consultation. As such, their choices can be given greater prevalence in the machine learning module's future predictions.
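The directed weighted graph and expert weighting described above can be sketched as follows. The expert multiplier of 3.0 is an assumed value; the patent only states that expert choices are given greater weight.

```python
from collections import defaultdict

EXPERT_MULTIPLIER = 3.0  # assumed factor for expert users

# graph[from_section][to_section] -> accumulated edge weight
graph = defaultdict(lambda: defaultdict(float))

def record_transition(from_section, to_section, is_expert=False):
    """Each navigation A -> B increases the weight of the directed
    edge A -> B; the edge B -> A is counted separately."""
    graph[from_section][to_section] += EXPERT_MULTIPLIER if is_expert else 1.0

def next_section_options(current_section):
    """Candidate next sections, ranked by outgoing edge weight."""
    edges = graph[current_section].items()
    return [s for s, _ in sorted(edges, key=lambda e: e[1], reverse=True)]

record_transition("Symptoms", "Medical History")
record_transition("Symptoms", "Patient Exam", is_expert=True)
print(next_section_options("Symptoms"))
# "Patient Exam" ranks first because of the expert-weighted edge
```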
  • the application requests predictions from the machine-learning module as to the next likely step of the clinical process. These predictions may be formed from previous choices made by the user, choices made by other users, and contextual data.
  • Predictions may be based on the navigation history between different sections or subsections of the workflow. Additionally, a weighted link may be formed between an input parameter and a user selection of a section or subsection of the workflow. For example, if a user inputs their profession as an osteopath and then navigates between workflow sections and subsections in a particular order, a weighted link may be formed between that input parameter and the workflow order. If another osteopath subsequently uses the application, the workflow may be presented to them in a similar order depending on the strength of the weighted link.
  • the predicted options are presented to the user through navigation options on the user interface as shown in Figure 9.
  • the machine learning module provides the most likely order of steps in the clinical consultation process. When the user or another user uses the application the sections and subsections may be presented to the user in an order which the machine learning module predicts to be the most likely.
  • the clinical process 06 shown in Figure 9 is given a score of 25% which is greater than the score of any of the other clinical processes.
  • Clinical process 06 is thus displayed first.
  • clinical process 10 has the lowest score of 10% and is displayed last.
  • Clinical processes 05, 01 and 09, having intermediate scores are listed between process 06 and process 10 in order.
  • the navigation options are displayed in an order that corresponds to the scores attributed to them.
  • the prediction of the workflow order may be based on directed weighted graphs formed by the machine learning module based on previous user selections of the workflow sections and subsections. Additionally, the prediction of workflow order can use Bayesian models to draw on weighted links between an input parameter and a particular workflow order to present a predicted workflow. For example, the workflow shown in Figure 3 shows the subsections "Site of Pain", "Daily Pattern", "Onset & Progression" and "Factors & Symptoms" listed from left to right. This particular order of the subsections may be determined based on previous user selections of workflow subsections during previous consultations.
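The score-based ordering of Figure 9 can be illustrated with a short sketch. Only the 25% score for process 06 and the 10% score for process 10 are stated above; the intermediate values are assumptions chosen to match the described ordering.

```python
# Hypothetical prediction scores for the clinical processes of Figure 9.
scores = {"06": 0.25, "05": 0.20, "01": 0.18, "09": 0.12, "10": 0.10}

def navigation_order(scores):
    """Clinical processes displayed from the highest score to the lowest,
    so the most likely next step appears first in the navigation menu."""
    return sorted(scores, key=scores.get, reverse=True)

print(navigation_order(scores))  # process 06 first, process 10 last
```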
  • Figure 10 shows a page 1000 that may be displayed to the user during the consultation.
  • a body map 1001 is displayed on screen.
  • the user selects a part of the body where symptoms are being experienced.
  • the user selects the right knee since this is the site of a symptom.
  • the application 104 records the selected body part and records this as an input.
  • the machine learning module 105 can form a weighted link between the selection of a particular body part and a subsequent input field. For example, if a large number of users select the knee on the body map and then, on a subsequent page select "running" as part of an exercise regime, then the machine learning module 105 may form a weighted link between knee pain and running.
  • a weighted link may be formed between a body part selection and a particular workflow order. For example, a selection of the knee as the site of pain may result in a workflow order consistent with the clinical workflow of a specialist in knee surgery.
  • Figure 11 shows a page 1100 that may be displayed to the user during the consultation.
  • An adjustable chart 1101 is displayed on screen.
  • a user can drag the line of the chart to adjust the input values.
  • the chart 1101 shown in Figure 11 plots severity against time of day. The user can drag the line of the chart to reflect the input they wish to make, which is particularly advantageous on a touch-screen device such as a tablet computer.
  • the application 104 records this as an input.
  • the application 104 can form a weighted link between the chart input and a subsequent input field.
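A dragged severity-versus-time curve might be recorded as a structured input along these lines. The representation is an assumption for illustration: the `SeverityChartInput` name, hourly sampling, and the 0-10 severity scale are not specified in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SeverityChartInput:
    """Severity-vs-time-of-day curve captured from the adjustable chart.

    Each sample is (hour_of_day, severity); dragging the line updates
    the sample for the nearest hour.
    """
    samples: List[Tuple[int, float]] = field(default_factory=list)

    def drag(self, hour, severity):
        # Replace any existing sample for this hour, clamping severity
        # to the assumed 0-10 chart range.
        severity = max(0.0, min(10.0, severity))
        self.samples = [(h, s) for h, s in self.samples if h != hour]
        self.samples.append((hour, severity))

    def peak_hour(self):
        # The hour of greatest severity is a natural attribute for the
        # application to link with subsequent input fields.
        return max(self.samples, key=lambda hs: hs[1])[0]
```

An attribute such as `peak_hour()` gives the machine learning module a discrete value from the chart input that can participate in weighted links with subsequent input fields.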
  • embodiments of the invention introduce a method for predicting clinical inputs that uses contextual information and a machine-learning module to make patient specific inferences and predictive input options available to the user.
  • embodiments provide a method for optimizing the workflow of an electronic medical recording system that uses contextual information and a machine-learning module to provide the user with the next likely steps in the clinical workflow.
  • Clinical notes are distinct from standard medical records in that they form a legal document that records exactly what occurred during a consultation. Not only does this protect the rights of both the patient and the clinician, but it also allows regulatory bodies to audit their members to ensure standards are maintained and to gather essential information about the profession for the purposes of research.
  • Each healthcare profession has its own distinct requirements in note taking, with specialist shorthand and terminology that have been developed to mirror the methodology of their work.
  • clinicians are taught to use a paper form that contains fields for relevant information for each stage of the consultation.
  • clinicians use shorthand and technical abbreviations commonly recognised by their profession as well as sketches and diagrams. While the forms are designed to mirror the "flow" of a consultation through each stage, clinicians are often required to move back and forth between the fields, as further information is determined in the course of the consultation.
  • Embodiments of the present invention therefore provide a solution by introducing a system built on a flexible, customisable non-sequential clinical workflow which moulds around the particular clinical decision making process of the clinician and the particular consultation at hand.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a user interface with content determined by machine learning. Disclosed are a method and apparatus for determining user interface content to be displayed as part of a clinical data capture application, the method comprising detecting a first user input and a second user input at a user interface, analysing the first and second user inputs at a machine learning module to determine a weighted link between respective attributes of the first and second user inputs, and using the weighted link between the respective attributes of the first and second user inputs to populate the application user interface with updated content based on the analysis.
PCT/GB2015/051903 2014-06-30 2015-06-30 Capture de données cliniques avec apprentissage automatique WO2016001647A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1411565.3A GB201411565D0 (en) 2014-06-30 2014-06-30 Clinical data capture
GB1411565.3 2014-06-30

Publications (1)

Publication Number Publication Date
WO2016001647A1 true WO2016001647A1 (fr) 2016-01-07

Family

ID=51410324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2015/051903 WO2016001647A1 (fr) 2014-06-30 2015-06-30 Capture de données cliniques avec apprentissage automatique

Country Status (2)

Country Link
GB (1) GB201411565D0 (fr)
WO (1) WO2016001647A1 (fr)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060036471A1 (en) * 2004-04-22 2006-02-16 Penguin Medical Systems, Inc. Computerized automation of physician-patient interaction for streamlined physician workflow
US20100127981A1 (en) * 2007-07-24 2010-05-27 Brandt Alexander U Method for the situation-adapted documentation of structured data
US20140136230A1 (en) * 2010-08-06 2014-05-15 Sunjay Berdia System and methods for an intelligent medical practice system employing a learning knowledge base

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WARREN J R ET AL: "Mediface: anticipative data entry interface for general practitioners", COMPUTER HUMAN INTERACTION CONFERENCE, 1998. PROCEEDINGS. 1998 AUSTRAL ASIAN ADELAIDE, SA, AUSTRALIA 30 NOV.-4 DEC. 1998, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 30 November 1998 (1998-11-30), pages 192 - 199, XP010313651, ISBN: 978-0-8186-9206-2, DOI: 10.1109/OZCHI.1998.732214 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019197222A1 (fr) * 2018-04-11 2019-10-17 Siemens Healthcare Gmbh Procédé pour commander le fonctionnement d'un dispositif médical, unité de commande, système de commande, dispositif médical, programme informatique et support de données à lecture électronique
CN111954908A (zh) * 2018-04-11 2020-11-17 西门子医疗有限公司 用于控制医学技术装置的运行的方法、操作设备、操作系统、医学技术装置、计算机程序和电子可读的数据载体
US11298099B2 (en) 2018-04-11 2022-04-12 Siemens Healthcare Gmbh Method for controlling the operation of a medical apparatus, operating device, operating system, medical apparatus, computer program and electronically readable data carrier

Also Published As

Publication number Publication date
GB201411565D0 (en) 2014-08-13

Similar Documents

Publication Publication Date Title
Shah et al. Making machine learning models clinically useful
US20200111578A1 (en) Methods and systems for software clinical guidance
US20190005200A1 (en) Methods and systems for generating a patient digital twin
US8448077B2 (en) Decision support systems for guideline and knowledge navigation over different levels of abstraction of the guidelines
US20230010216A1 (en) Diagnostic Effectiveness Tool
US20090070138A1 (en) Integrated clinical risk assessment system
US20090150183A1 (en) Linking to clinical decision support
JP2017174407A (ja) 患者の診断を支援するシステムおよび方法
Abbasi Getting pharmacogenomics into the clinic
US20080243547A1 (en) Creating computer aided medical recommendations
JP2017519303A (ja) 患者及び臨床医を支援するために、共有される、患者中心の意思決定サポートツールを用いるシステム及び方法
WO2021087317A1 (fr) Réalisation d'opérations de mise en correspondance pour effectuer une intervention
US20150100344A1 (en) Patient health information analysis system
Darlington Designing for explanation in health care applications of expert systems
WO2014147067A1 (fr) Système de médecine personnalisé affichant un calendrier d'informations de patient clinique
Kuehn Clinics aim to improve post-ICU recovery
US20150081328A1 (en) System for hospital adaptive readmission prediction and management
US20150339451A1 (en) System and Method for Providing Mobile Electronic Assistance in Diagnostic and Therapeutic Medical Decisions and Documentation
Ogundipe et al. Health information communication technology evaluation frameworks for pharmacist prescribing: a systematic scoping review
US20140172437A1 (en) Visualization for health education to facilitate planning for intervention, adaptation and adherence
JP7238705B2 (ja) 診療支援方法、診療支援システム、学習モデルの生成方法、および、診療支援プログラム
Alpern et al. Trends in pricing and out-of-pocket spending on entecavir among commercially insured patients, 2014-2018
WO2016001647A1 (fr) Capture de données cliniques avec apprentissage automatique
JP2014098946A (ja) 診療業務支援システムおよびプログラム
JP2014119881A (ja) 情報処理装置、ラベル選択方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15747177

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15747177

Country of ref document: EP

Kind code of ref document: A1