WO2013166146A1 - Neuro-cognitive diagnosis and therapy decision support system - Google Patents

Info

Publication number
WO2013166146A1
WO2013166146A1 · PCT/US2013/039063
Authority
WO
WIPO (PCT)
Prior art keywords
patient
knowledge base
conversation
practitioner
extension system
Prior art date
Application number
PCT/US2013/039063
Other languages
French (fr)
Inventor
John Pestian
Tracy Glauser
Stephen PESTIAN
Original Assignee
Cincinnati Children's Hospital Medical Center
Priority date
Filing date
Publication date
Application filed by Cincinnati Children's Hospital Medical Center
Publication of WO2013166146A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/10: Office automation; Time management
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00: ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/20: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 70/00: ICT specially adapted for the handling or processing of medical references
    • G16H 70/20: ICT specially adapted for the handling or processing of medical references relating to practices or guidelines
    • G16H 80/00: ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Definitions

  • the initial patient questions are open-ended questions.
  • the information displayed to the practitioner by the practitioner interface includes patient observations, recommended treatment plans, pharmaceutical information, optional treatments and/or ranges of success.
  • the practitioner interface application may be further configured to receive patient treatment instructions from the practitioner utilizing the practitioner interface and to initiate communication of the patient treatment instructions to a patient record.
  • FIG. 1 is a block-diagram representation of an exemplary physician extension system according to the current disclosure.
  • FIG. 2 is a block-diagram representation of an exemplary computer device and/or computer system according to exemplary embodiments of the current disclosure.
  • the present disclosure relates to a system and method for extending captured clinical and scientific knowledge toward practical and useful applications.
  • the systems and methods may be embodied in a system 10 that allows a clinician to gather clinical information from a patient using conversational interaction, where the patient's interactive interface 12 captures clinical data from the patient's responses (and possibly from other cues) using machine learning and natural language processing, via a conversation driver knowledge base 28 and patient user-interface engine, for example, and where the interactive interface 12 may even drive the conversation with the patient based upon the responsive data from the patient.
  • the system/method 10 can communicate a prognosis and/or a risk for co-morbidities to the physician and/or clinician, via a physician interface 14, so that such prognosis and/or co-morbidities risks can be acted upon.
  • the system may provide two separate types of interfaces: an interactive interface for the patient 12, and a more detailed and customized interface 14 for the physician and/or clinician.
  • the patient interface may be provided on a computing device 16 (such as a tablet or notebook computer, a hand-held computing device, and/or any other type of device providing a suitable user interface).
  • the patient interface communicates to the patient utilizing a video image or avatar 18 (which may be in the form of a person, an animal or some other object) and computer generated speech (or recorded speech) that converses with the patient (utilizing a question set that may be dynamic and flexible depending upon the answers and other data received).
  • the pediatric patient would, hopefully, freely converse with the interface and answer the questions posed by the patient interface.
  • the device 16 and/or patient user-interface engine 40 can extract critical diagnostic and other useful information from the words of the patient received and/or recorded by the device 16.
  • the device 16 and/or patient user-interface engine 40 may also be able to detect other vocal cues, such as voice inflections, or signs of fear, anger or stress, for example. Further, by using the computing device's camera 22, the device 16 and/or patient user-interface engine 40 can sense visual cues expressed by the patient, such as shrugs, head-nods, head-shakes and the like. Coordinating these responses and other sensed cues from the patient, the conversation driver knowledge base 28 may be consulted for follow-up questions and/or alternate discussion trees.
  • the physician interface 14 can be any type of interface designed to pass along the prognosis and/or co-morbidities risk information developed based upon the data gathered by the patient interface (and applied, utilizing decision support algorithms 30, against the therapy knowledge base 32, prognosis knowledge base 34 and/or a co-morbidity knowledge base 36).
  • the knowledge base(s) are dynamic knowledge bases that adjust using machine learning (such as active learning and/or supervised learning) that may be operating along with the patient and/or physician interface.
  • a "Virtual Human" 18 is rendered on the patient interface 12, and "Virtual Physician Knowledge" is provided as part of a greater "Knowledge project."
  • Dynamic knowledge bases and interactive technology combine clinical decision support and universal health records (e.g., from electronic patient records database 38) to form a corpus of diverse knowledge, including a therapy knowledge base 32, a prognosis knowledge base 34, and a co-morbidity knowledge base 36.
  • the system 10 integrates decision support with active learning and supervised learning.
  • This system 10 may actually teach the machine while pursuing the meanings of a variable. If a variable is unknown, the machine may send the patient's full response directly to the physician or an alternate researcher.
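The escalation rule just described (an unknown variable causes the full response to be forwarded to the physician or researcher) could look roughly like the following; the variable names and queue are hypothetical:

```python
# Sketch of the escalation rule: responses that resolve to a known
# variable are handled locally; anything else is passed along verbatim
# to the physician/researcher queue rather than being discarded.

KNOWN_VARIABLES = {"seizure_frequency", "sleep_quality", "medication_adherence"}


def route_response(variable: str, full_response: str,
                   physician_queue: list) -> bool:
    """Return True if handled locally; otherwise escalate and return False."""
    if variable in KNOWN_VARIABLES:
        return True
    physician_queue.append(full_response)  # forward the raw patient response
    return False
```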
  • the therapy knowledge base 32 may likewise be, for example, an epilepsy knowledge base.
  • a prognosis and risk for co-morbidity is provided (e.g. risk for depression, percentile changes).
  • FIG. 2 illustrates an exemplary environment 1600 for implementing and/or controlling various components (such as devices 16 or 46, or components of central system 26) of an example embodiment that includes a computer 1602, the computer 1602 including a processing unit 1604, a system memory 1606 and a system bus 1608.
  • the system bus 1608 couples system components including, but not limited to, the system memory 1606 to the processing unit 1604.
  • the processing unit 1604 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 1604.
  • the system bus 1608 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • the system memory 1606 includes read only memory (ROM) 1610 and random access memory (RAM) 1612.
  • a basic input/output system (BIOS) is stored in a non-volatile memory 1610 such as ROM, EPROM, EEPROM, which BIOS contains the basic routines that help to transfer information between elements within the computer 1602, such as during start-up.
  • the RAM 1612 can also include a high-speed RAM such as static RAM for caching data.
  • the computer 1602 further includes an internal hard disk drive (HDD) 1614 (e.g., EIDE, SATA; or, alternatively, suitable solid-state drive(s) SSDs), which internal hard disk drive 1614 may also be configured for external use in a suitable chassis (not shown), a magnetic floppy disk drive (FDD) 1616, (e.g., to read from or write to a removable diskette 1618) and an optical disk drive 1620, (e.g., reading a CD-ROM disk 1622 or, to read from or write to other high capacity optical media such as the DVD).
  • the hard disk drive 1614, magnetic disk drive 1616 and optical disk drive 1620 can be connected to the system bus 1608 by a hard disk drive interface 1624, a magnetic disk drive interface 1626 and an optical drive interface 1628, respectively.
  • the interface 1624 for external drive implementations includes at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
  • the drives and media accommodate the storage of any data in a suitable digital format.
  • computer-readable media refers to a HDD, a removable magnetic diskette, and a removable optical media such as a CD or DVD, it should be appreciated by those skilled in the art that other types of media which are readable by a computer, such as zip drives, solid-state drives, magnetic cassettes, flash memory cards, cartridges, and the like, may also be used in the exemplary operating environment, and further, that any such media may contain computer-executable instructions for performing the methods of an example system.
  • a number of program modules can be stored in the drives and RAM 1612, including an operating system 1630, one or more application programs 1632, other program modules 1634 and program data 1636. All or portions of the operating system, applications, modules, and/or data can also be cached in the RAM 1612. It is appreciated that an example system can be implemented with various commercially available operating systems or combinations of operating systems.
  • a user can enter commands and information into the computer 1602 through one or more wired/wireless input devices, e.g., a keyboard 1638 and a pointing device, such as a mouse 1640.
  • Other input devices may include a microphone, an IR remote control, a joystick, a game pad, a stylus pen, touch screen, or the like.
  • These and other input devices are often connected to the processing unit 1604 through an input device interface 1642 that is coupled to the system bus 1608, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an IR interface, etc.
  • a monitor 1644 or other type of display device is also connected to the system bus 1608 via an interface, such as a video adapter 1646.
  • a computer typically includes other peripheral output devices (not shown), such as speakers, printers, etc.
  • the computer 1602 may operate in a networked environment using logical connections via wired and/or wireless communications to one or more remote computers, such as a remote computer(s) 1648.
  • the remote computer(s) 1648 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1602, although, for purposes of brevity, only a memory storage device 1650 is illustrated.
  • the logical connections depicted include wired/wireless connectivity to a local area network (LAN) 1652 and/or larger networks, e.g., a wide area network (WAN) 1654.
  • LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communication network, e.g., the Internet.
  • When used in a LAN networking environment, the computer 1602 is connected to the local network 1652 through a wired and/or wireless communication network interface or adapter 1656.
  • the adaptor 1656 may facilitate wired or wireless communication to the LAN 1652, which may also include a wireless access point disposed thereon for communicating with the wireless adaptor 1656.
  • the computer 1602 can include a modem 1658, or is connected to a communications server on the WAN 1654, or has other means for establishing communications over the WAN 1654, such as by way of the Internet.
  • the modem 1658, which can be internal or external and a wired or wireless device, is connected to the system bus 1608 via the serial port interface 1642.
  • program modules depicted relative to the computer 1602, or portions thereof can be stored in the remote memory/storage device 1650. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
  • the computer 1602 is operable to communicate with any wireless devices or entities operatively disposed in wireless communication, e.g., a printer, scanner, desktop and/or portable computer, portable data assistant, communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone.
  • the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
  • Wi-Fi, or Wireless Fidelity, allows connection to the Internet from a couch at home, a bed in a hotel room, or a conference room at work, without wires.
  • Wi-Fi is a wireless technology similar to that used in a cell phone that enables such devices, e.g., computers, to send and receive data indoors and out; anywhere within the range of a base station.
  • Wi-Fi networks use radio technologies called IEEE 802.11 (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity.
  • a Wi-Fi network can be used to connect computers to each other, to the Internet, and to wired networks (which use IEEE 802.3 or Ethernet).
  • Wi-Fi networks operate in the unlicensed 2.4 and 5 GHz radio bands, at an 11 Mbps (802.11b) or 54 Mbps (802.11a) data rate, for example.
  • the first step is an information gathering phase.
  • a pediatric patient operates a device 16, such as a wireless tablet computer, provided by a technician.
  • the patient initiates an application program (or multiple programs) on the device 16 and/or on a patient interface engine 40 (residing on a central system 26, for example) communicatively coupled to the device 16, such as an application program in the usual means of operation for the particular tablet.
  • the Virtual Physician 10, in this embodiment, provides an avatar 18 resembling a human female, given the name "Christine."
  • the Virtual Physician 10 is able to gather information (input) from the patient using the tablet camera 22 and microphone 20, and is able to prompt the patient for further interaction (input) by using speakers 24.
  • the Virtual Physician 10 may be able to provide real time (or near real-time) questions and prompts to the patient that may be determined by the input and feedback it receives from the patient by comparing the input against a conversation driver knowledge base 28.
  • the input may be transmitted to a central system 26 other than the tablet, where the knowledge database 28 may be accessed or may reside.
  • a technician may pre-program the Virtual Physician 10 for the proper patient, to access the patient's files and history via an appropriate electronic patient records database 38, for example.
  • the Virtual Physician 10 may be set to ask the patient about a certain condition, for example, epilepsy.
  • the Virtual Physician may greet the patient by name to establish rapport. It may then observe the mood of the patient by comparing the optical cues of the patient obtained by the camera 22 sensor to its conversation driver knowledge base 28.
  • the Virtual Physician 10 may prompt the patient to provide input with an open-ended question (not just a yes-or-no question).
  • the Virtual Physician 10 may be programmed to appear to be concerned about the patient.
  • the patient responds to the question (prompts) with conversational sentences, for example, about how he/she feels.
  • the input is gathered through the microphone 20 and transmitted to a processor operating the patient user-interface engine 40, for example, where it is compared to the conversation driver knowledge database 28.
  • a response, and further prompts based on the previous patient input, are selected to be communicated to the patient through the Virtual Physician using the avatar's 18 movements and the speakers 24.
  • the responses of the Virtual Physician 10 are based on the input of the patient, and further prompts are likewise based on the input of the patient in order to obtain valuable diagnosis information.
  • the Virtual Physician 10 may change the subject and start new prompt chains to obtain other information from the patient.
  • the system is capable of processing open-ended conversation questions, and asks "anything else" to capture information which might not have been covered.
  • Responses not in the conversation driver knowledge base may be passed directly on to a real provider or physician's interface 14.
  • the Virtual Physician 10 is also able to issue questionnaires, and is able to educate the patient on the treatment(s) and/or procedure(s) involved in treatment, where such educational scripts may also be stored on the conversation driver knowledge base 28, for example.
  • a central decision support engine 30 may store the input provided by the patient in the first phase.
  • the central decision support engine 30 may also compare the various inputs collected (questions, questionnaire, observations) data gathered, inputted and stored in its continually growing knowledge database(s) (such as the therapy knowledge base, the prognosis knowledge base and/or the co-morbidity knowledge bases 32, 34 and 36).
  • the central decision support engine 30, for example, by comparing the patient input to the knowledge database(s) 32, 34 and 36, may ascribe success rate percentage ranges for the particular patient with each treatment plan. It can weigh the treatments against each other for this particular patient and recommend the viable medical treatment options.
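The weighing of treatment plans with success-rate percentage ranges might be sketched as follows; the knowledge-base entries and the penalty model are fabricated for illustration only and do not come from the disclosure:

```python
# Toy decision-support step: each candidate treatment carries a baseline
# success-rate range, adjusted down per patient risk factor, then the
# plans are ranked by the midpoint of their adjusted range.

TREATMENT_KB = {
    # treatment -> (base_low, base_high, penalty per risk factor)
    "drug_a": (0.60, 0.80, 0.10),
    "drug_b": (0.50, 0.75, 0.05),
    "surgery": (0.70, 0.90, 0.20),
}


def rank_treatments(risk_factors: int) -> list:
    """Return treatments as (name, low, high), best expected range first."""
    ranked = []
    for name, (low, high, penalty) in TREATMENT_KB.items():
        adj = penalty * risk_factors
        ranked.append((name, max(0.0, low - adj), max(0.0, high - adj)))
    ranked.sort(key=lambda t: (t[1] + t[2]) / 2, reverse=True)
    return ranked
```

With no risk factors the highest-baseline plan leads; as risk factors accumulate, plans with lower penalties can overtake it, which is one simple way the engine could "weigh the treatments against each other for this particular patient."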
  • a medical provider may access the central system 26, via a physician user-interface engine 44, with a computing device 46, such as a wireless tablet.
  • the physician user-interface engine 44 may include one or more
  • the Virtual Physician 10 is able to determine the medical provider's credentials.
  • the medical provider is able to inform the Virtual Physician 10, via the physician interface 14, of his/her observations of the patient, and to request a recommended treatment plan with available parameters (for example, a provider may indicate the maximum dosage for a particular medicine, and it is up to the processor to determine whether this medicine is a viable option).
  • the Virtual Physician 10 may then provide several options of treatment plans, including the reasoning behind the particular course of action. If medicine is involved, the physician interface 14 may display the
  • the system 10, through the Virtual Physician application(s), may also provide alternate options, including surgery, and the success rate of such a surgery for this particular patient (for example, taking into account the severity of symptoms of this particular patient, and comparing the input to the database of similar patients).
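Comparing a patient against a database of similar patients to estimate surgical success could, in toy form, be a nearest-neighbour average over recorded outcomes. All patient records and features below are fabricated for illustration:

```python
# Hypothetical similar-patient lookup: find the k past patients closest
# to this one (by simple feature distance) and report the fraction whose
# surgery succeeded as an estimated success rate.

PAST_PATIENTS = [
    # (symptom_severity 0-10, seizure_years, surgery_succeeded)
    (3, 1, True), (4, 2, True), (8, 9, False), (7, 6, False), (5, 3, True),
]


def estimated_success(severity: int, years: int, k: int = 3) -> float:
    """Fraction of the k most similar past patients with a good outcome."""
    nearest = sorted(
        PAST_PATIENTS,
        key=lambda p: abs(p[0] - severity) + abs(p[1] - years),
    )[:k]
    return sum(p[2] for p in nearest) / k
```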
  • the provider may also ask questions, via the interface 14, beyond treatment plans and options, including what the patient's chances are to be cured.
  • the Virtual Physician may be able to provide percentage ranges of success through access to the various knowledge bases 32-36.
  • the provider may also ask open-ended questions for medical subjects other than epilepsy, which is merely the primary medical issue in this example.
  • the Virtual Physician 10 can relate to the provider what the patient has said, as well as relay other observations about the patient.
  • the provider can respond with instructions to the Virtual Physician. For example, the provider may ask the Virtual Physician to perform related tasks, such as initiating information videos on the patient's device 16. It can be seen that the Virtual Physician 10 is interactive with the patient and the provider, can offer comprehensive medical diagnosis and treatment, and can even go beyond the medical issue to help enhance the patient's well-being.
  • an exemplary virtual physician extension system 10 may include: (a) a patient conversation-driver knowledge base 28, including conversation trees and decision trees designed for providing initial patient questions, and for providing follow-up questions based upon patient responses to initial patient questions; (b) a therapy knowledge base 32, a prognosis knowledge base 34 and/or a co-morbidity knowledge base 36; (c) a first computing device 16 providing a computerized patient interface 12, the first computing device 16 including a display, speaker, a mic 20, a camera 22, random access memory, persistent memory, patient interface application software resident (at least in part) on the persistent memory, an external data link, and processing circuitry, having access to the patient conversation-driver knowledge base 28 and operatively coupled to the display, the speaker 24, the mic 20, the camera 22, the random access memory, the external data link, and the persistent memory to operate the patient interface application software that is configured to, (i) provide an avatar 18 on the display and to elicit audible questions appearing to come from the avatar 18, via the speaker 24,

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Biomedical Technology (AREA)
  • Human Resources & Organizations (AREA)
  • Data Mining & Analysis (AREA)
  • Pathology (AREA)
  • Operations Research (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Bioethics (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A virtual physician extension system may include: (a) a patient conversation-driver knowledge base, including conversation trees and decision trees designed for providing initial patient questions, and for providing follow-up questions based upon patient responses to initial patient questions; (b) a therapy knowledge base, a prognosis knowledge base and/or a co-morbidity knowledge base; (c) a first computing device providing a computerized patient interface and processing circuitry, having access to the patient conversation-driver knowledge base to operate the patient interface application software that is configured to, (i) provide an avatar on the display and to elicit audible questions appearing to come from the avatar, via the speaker, to a patient based, at least in part, upon conversation trees provided by the patient conversation-driver knowledge base, and (ii) collect response data from audible responses received from the patient via the mic and provide the response data to the patient conversation-driver knowledge base, and (d) a second computing device providing a computerized practitioner interface and processing circuitry, having access to the therapy knowledge base, prognosis knowledge base and/or co-morbidity knowledge base to operate the practitioner interface application software that is configured to display information to a practitioner based upon response data as applied with the therapy knowledge base, prognosis knowledge base and/or co-morbidity knowledge base.

Description

TITLE: NEURO-COGNITIVE DIAGNOSIS AND THERAPY DECISION SUPPORT SYSTEM
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The current application claims the benefit of U.S. provisional application, Ser. No. 61/641,275, filed May 1, 2012, the disclosure of which is incorporated herein by reference.
FIELD OF THE INVENTION
[0002] The present disclosure relates to communicating and diagnosing patients in a clinical setting.
SUMMARY
[0003] An exemplary virtual physician extension system according to the current disclosure may include: (a) a patient conversation-driver knowledge base, including conversation trees and decision trees designed for providing initial patient questions, and for providing follow-up questions based upon patient responses to initial patient questions; (b) a therapy knowledge base, a prognosis knowledge base and/or a comorbidity knowledge base; (c) a first computing device providing a computerized patient interface, the first computing device including a display, speaker, a mic, a camera, random access memory, persistent memory, patient interface application software resident on the persistent memory, an external data link, and processing circuitry, having access to the patient conversation-driver knowledge base and operatively coupled to the display, the speaker, the mic, the camera, the random access memory, the external data link, and the persistent memory to operate the patient interface application software that is configured to, (i) provide an avatar on the display and to elicit audible questions appearing to come from the avatar, via the speaker, to a patient based, at least in part, upon conversation trees provided by the patient conversation-driver knowledge base, and (ii) collect response data from audible responses received from the patient via the mic and provide the response data to the patient conversation-driver knowledge base, and (d) a second computing device providing a computerized practitioner interface, the second computing device including a display, random access memory, persistent memory, practitioner interface application software resident on the persistent memory, an external data link, and processing circuitry, having access to therapy knowledge base, prognosis knowledge base and/or co-morbidity knowledge base, and operatively coupled to the display, the random access memory, the external data link, and the persistent memory to operate the 
practitioner interface application software that is configured to display information to a practitioner based upon response data as applied with the therapy knowledge base, prognosis knowledge base and/or co-morbidity knowledge base. In a more detailed embodiment, the avatar provided by the patient interface application software may be a human avatar or an animal avatar.
[0004] In a more detailed embodiment, the practitioner interface application software may be further configured to display prognosis information and/or co-morbidities risk information to the practitioner based upon response data as applied with the prognosis knowledge base and/or co-morbidity knowledge base. In a further detailed embodiment, the system may further include (e) clinical decision support algorithms resident on the second computing device or a server external to the second computing device configured to apply the response data against the therapy knowledge base, prognosis knowledge base and/or co-morbidity knowledge base in the generation of the prognosis information and/or co-morbidities risk information. In a further detailed embodiment, the clinical decision support algorithms may access universal patient health records in the generation of the prognosis information and/or co-morbidities risk information.
[0005] In a more detailed embodiment the patient interface application software may be further configured to collect response data from visual responses received from the patient via the camera. Alternatively, or in addition, the response data may be based upon a combination of language and voice inflections received from the patient via the mic. Alternatively, or in addition, the patient conversation-driver knowledge base may be a dynamic knowledge base configured to adjust using machine learning. Alternatively, or in addition, the first and/or second computing devices may be a tablet computer, notebook computer, or a hand-held computing device. Alternatively, or in addition, the patient interface application software may utilize natural language processing algorithms and processes as part of collecting response data. [0006] Alternatively, or in addition, the system may further include patient conversation support algorithms resident on the first computing device or a server external to the first computing device configured to apply the response data against the patient conversation-driver knowledge base and to elicit follow-up questions. In a more detailed embodiment, the patient conversation support algorithms may be configured to pass along patient responses to the practitioner interface falling outside of the conversation trees.
[0007] Alternatively, or in addition, the initial patient questions are open-ended questions.
[0008] Alternatively, or in addition, the information displayed to the practitioner by the practitioner interface includes patient observations, recommended treatment plans, pharmaceutical information, optional treatments and/or ranges of success. Alternatively, or in addition, the practitioner interface application may be further configured to receive patient treatment instructions from the practitioner utilizing the practitioner interface and to initiate communication of the patient treatment instructions to a patient record.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] Fig. 1 is a block-diagram representation of an exemplary physician extension system according to the current disclosure; and
[0010] Fig. 2 is a block-diagram representation of an exemplary computer device and/or computer system according to exemplary embodiments of the current disclosure.
DETAILED DESCRIPTION
[0011] The present disclosure relates to a system and method for extending captured clinical and scientific knowledge towards practical and useful applications. For example, as shown in FIG. 1, the systems and methods may be embodied in a system 10 that allows a clinician to gather clinical information from a patient using conversational interaction, where the patient's interactive interface 12 captures clinical data from the patient's responses (and possibly from other cues) using machine learning and natural language processing, via a conversation driver knowledge base 28 and patient user-interface engine, for example, and where the interactive interface 12 may even drive the conversation with the patient based upon the responsive data from the patient. Then, utilizing clinical decision support algorithms 30 applied against a therapy knowledge base 32, a prognosis knowledge base 34 and/or a co-morbidity knowledge base 36 (a dynamic corpus of disease knowledge), the system/method 10 can communicate a prognosis and/or a risk for co-morbidities to the physician and/or clinician, via a physician interface 14, so that such prognosis and/or co-morbidities risks can be acted upon.
[0012] In an embodiment as shown in FIG. 1, the system may provide two separate types of interfaces: an interactive interface for the patient 12, and a more detailed and customized interface 14 for the physician and/or clinician. The patient interface may be provided on a computing device (such as a tablet or notebook computer, a hand-held computing device and/or any other type of device providing a
computerized interface) 16 where the patient interface communicates to the patient utilizing a video image or avatar 18 (which may be in the form of a person, an animal or some other object) and computer-generated speech (or recorded speech) that converses with the patient (utilizing a question set that may be dynamic and flexible depending upon the answers and other data received). A pediatric patient, perhaps more comfortable conversing with a virtual being, may more freely converse with the interface and answer the questions posed by the patient interface. By using natural language processing (and, possibly, machine learning) the device 16 and/or patient user-interface engine 40 can extract critical diagnostic and other useful information from the words of the patient received and/or recorded by the device 16. Further, by using the mic 20 or other audio sensing capabilities, the device 16 and/or patient user-interface engine 40 may also be able to detect other vocal cues, such as voice inflections, or signs of fear, anger or stress, for example. Further, by using the computing device's camera 22, the device 16 and/or patient user-interface engine 40 can sense visual cues expressed by the patient, such as shrugs, head-nods, head-shakes and the like. Coordinating these responses and other sensed cues from the patient, the conversation driver knowledge base 28 may be consulted for follow-up questions and/or alternate discussion trees. [0013] The physician interface 14 can be any type of interface designed to pass along the prognosis and/or co-morbidities risk information developed based upon the data gathered by the patient interface (and applied, utilizing decision support algorithms 30, against the therapy knowledge base 32, prognosis knowledge base 34 and/or a co-morbidity knowledge base 36).
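The consultation of the conversation driver knowledge base for follow-up questions, coordinated with sensed cues, can be illustrated with a brief sketch. The disclosure does not specify an implementation; the following Python, in which all class names, fields, keywords, and cue labels are hypothetical, shows one minimal way such a follow-up selection could work:

```python
# Minimal sketch of conversation-tree traversal driven by patient
# responses and sensed cues.  All names are hypothetical illustrations
# of the structures described above, not an actual implementation.

def normalize(text):
    """Lower-case a free-text response for simple keyword matching."""
    return text.lower()

class ConversationNode:
    def __init__(self, question, followups=None):
        self.question = question
        # Maps a keyword found in the response to the next node.
        self.followups = followups or {}

    def next_node(self, response_text, cues):
        """Pick a follow-up based on the response text and any sensed
        cues (e.g. 'stress' from voice inflection, 'head-shake' from
        the camera)."""
        text = normalize(response_text)
        for keyword, node in self.followups.items():
            if keyword in text:
                return node
        # A distress cue can redirect the conversation even when no
        # keyword in the response matched a branch.
        if "stress" in cues or "fear" in cues:
            return ConversationNode(
                "That sounds difficult. Can you tell me more?")
        return None  # end of this conversation tree

# Example tree: an opening question with one keyword-driven branch.
sleep_node = ConversationNode("How have you been sleeping?")
root = ConversationNode(
    "How have you been feeling since your last visit?",
    followups={"tired": sleep_node},
)

nxt = root.next_node("I've been really tired lately", cues=set())
```

In this sketch the keyword "tired" routes the conversation to the sleep follow-up; a real knowledge base would of course use richer natural language processing than keyword matching.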
[0014] In an embodiment, the knowledge base(s) are dynamic knowledge bases that adjust using machine learning (such as active learning and/or supervised learning) that may be operating along with the patient and/or physician interface.
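The "dynamic" adjustment of a knowledge base through supervised learning might be pictured as follows. This is only an illustrative stand-in: a real system would train an NLP model, whereas here a simple keyword-weight table (all names hypothetical) absorbs clinician-labeled feedback:

```python
# Illustrative sketch of a dynamic knowledge base that adjusts with
# supervised feedback, as described above.  A keyword-weight table
# stands in for a trained model; every name here is hypothetical.
from collections import defaultdict

class DynamicKnowledgeBase:
    def __init__(self):
        self.weights = defaultdict(float)  # keyword -> relevance weight

    def score(self, response_text):
        """Sum the learned weights of known keywords in a response."""
        words = response_text.lower().split()
        # Membership check avoids creating new zero-weight entries.
        return sum(self.weights[w] for w in words if w in self.weights)

    def learn(self, response_text, relevant, rate=1.0):
        """Supervised update: a clinician labels a response as relevant
        (or not) to the condition, shifting keyword weights accordingly."""
        for w in set(response_text.lower().split()):
            self.weights[w] += rate if relevant else -rate

kb = DynamicKnowledgeBase()
kb.learn("I had a seizure at school", relevant=True)
kb.learn("my favorite color is blue", relevant=False)
```

After these two labeled examples, responses mentioning "seizure" score positively and ones mentioning only "blue" score negatively, so the base's behavior has adjusted without reprogramming.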
[0015] By linking multiple medical centers that use the disclosed embodiments together, the collective power of the disclosed embodiments would increase dramatically.
[0016] In an embodiment, a "Virtual Human" 18 is rendered on the patient interface 12, and "Virtual Physician Knowledge" is provided as part of a greater "Knowledge project."
[0017] This system 10 goes beyond relational databases and incorporates the use of knowledge bases. The computational algorithms apply machine-learned wisdom when interpreting underlying principles. Dynamic knowledge bases and interactive technology combine clinical decision support and universal health records (e.g., from electronic patient records database 38) to form a corpus of diverse knowledge, including a therapy knowledge base 32, a prognosis knowledge base 34, and a co-morbidity knowledge base 36.
[0018] In this way, the system 10 integrates decision support with active learning and supervised learning. The system 10 may actually teach the machine while pursuing the meaning of a variable. If the variable is unknown, the machine may send the patient's full response directly to the physician or alternate researcher. The therapy knowledge base 32 may be, for example, an epilepsy knowledge base. In this example, a prognosis and risk for co-morbidity is provided (e.g., risk for depression, percentile changes).
[0019] The system can also process data input in the form of visual cues captured by the computer, as well as vocal cues and phrases. The system is interactive with both the patient and the physician, with a selectable ability to interact. [0020] On the processing end, FIG. 2 illustrates an exemplary environment 1600 for implementing and/or controlling various components (such as devices 16 or 46, or components of central system 26) of an example embodiment that includes a computer 1602, the computer 1602 including a processing unit 1604, a system memory 1606 and a system bus 1608. The system bus 1608 couples system components including, but not limited to, the system memory 1606 to the processing unit 1604. The processing unit 1604 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 1604.
[0021] The system bus 1608 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. The system memory 1606 includes read only memory (ROM) 1610 and random access memory (RAM) 1612. A basic input/output system (BIOS) is stored in a non-volatile memory 1610 such as ROM, EPROM, EEPROM, which BIOS contains the basic routines that help to transfer information between elements within the computer 1602, such as during start-up. The RAM 1612 can also include a high-speed RAM such as static RAM for caching data.
[0022] The computer 1602 further includes an internal hard disk drive (HDD) 1614 (e.g., EIDE, SATA; or, alternatively, suitable solid-state drive(s) SSDs), which internal hard disk drive 1614 may also be configured for external use in a suitable chassis (not shown), a magnetic floppy disk drive (FDD) 1616 (e.g., to read from or write to a removable diskette 1618) and an optical disk drive 1620 (e.g., reading a CD-ROM disk 1622 or, to read from or write to other high capacity optical media such as the DVD). The hard disk drive 1614, magnetic disk drive 1616 and optical disk drive 1620 can be connected to the system bus 1608 by a hard disk drive interface 1624, a magnetic disk drive interface 1626 and an optical drive interface 1628, respectively. The interface 1624 for external drive implementations includes at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface
technologies.
[0023] The drives and their associated computer-readable media provide
nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For the computer 1602, the drives and media accommodate the storage of any data in a suitable digital format. Although the description of computer-readable media above refers to a HDD, a removable magnetic diskette, and a removable optical media such as a CD or DVD, it should be appreciated by those skilled in the art that other types of media which are readable by a computer, such as zip drives, solid-state drives, magnetic cassettes, flash memory cards, cartridges, and the like, may also be used in the exemplary operating environment, and further, that any such media may contain computer-executable instructions for performing the methods of an example system.
[0024] A number of program modules can be stored in the drives and RAM 1612, including an operating system 1630, one or more application programs 1632, other program modules 1634 and program data 1636. All or portions of the operating system, applications, modules, and/or data can also be cached in the RAM 1612. It is appreciated that an example system can be implemented with various
commercially available operating systems or combinations of operating systems.
[0025] A user can enter commands and information into the computer 1602 through one or more wired/wireless input devices, e.g., a keyboard 1638 and a pointing device, such as a mouse 1640. Other input devices (not shown) may include a microphone, an IR remote control, a joystick, a game pad, a stylus pen, touch screen, or the like. These and other input devices are often connected to the processing unit 1604 through an input device interface 1642 that is coupled to the system bus 1608, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an IR interface, etc.
[0026] A monitor 1644 or other type of display device is also connected to the system bus 1608 via an interface, such as a video adapter 1646. In addition to the monitor 1644, a computer typically includes other peripheral output devices (not shown), such as speakers, printers, etc.
[0027] The computer 1602 may operate in a networked environment using logical connections via wired and/or wireless communications to one or more remote computers, such as a remote computer(s) 1648. The remote computer(s) 1648 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1602, although, for purposes of brevity, only a memory storage device 1650 is illustrated. The logical connections depicted include wired/wireless connectivity to a local area network (LAN) 1652 and/or larger networks, e.g., a wide area network (WAN) 1654. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communication network, e.g., the Internet.
[0028] When used in a LAN networking environment, the computer 1602 is connected to the local network 1652 through a wired and/or wireless communication network interface or adapter 1656. The adaptor 1656 may facilitate wired or wireless communication to the LAN 1652, which may also include a wireless access point disposed thereon for communicating with the wireless adaptor 1656.
[0029] When used in a WAN networking environment, the computer 1602 can include a modem 1658, or is connected to a communications server on the WAN 1654, or has other means for establishing communications over the WAN 1654, such as by way of the Internet. The modem 1658, which can be internal or external and a wired or wireless device, is connected to the system bus 1608 via the serial port interface 1642. In a networked environment, program modules depicted relative to the computer 1602, or portions thereof, can be stored in the remote memory/storage device 1650. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
[0030] The computer 1602 is operable to communicate with any wireless devices or entities operatively disposed in wireless communication, e.g., a printer, scanner, desktop and/or portable computer, portable data assistant, communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone. This includes at least Wi-Fi and Bluetooth™ wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices. [0031] Wi-Fi, or Wireless Fidelity, allows connection to the Internet from a couch at home, a bed in a hotel room, or a conference room at work, without wires. Wi-Fi is a wireless technology similar to that used in a cell phone that enables such devices, e.g., computers, to send and receive data indoors and out, anywhere within the range of a base station. Wi-Fi networks use radio technologies called IEEE 802.11 (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wired networks (which use IEEE 802.3 or Ethernet). Wi-Fi networks operate in the unlicensed 2.4 and 5 GHz radio bands, at an 11 Mbps (802.11b) or 54 Mbps (802.11a) data rate, for example, or with products that contain both bands (dual band), so the networks can provide real-world performance similar to the basic 10BaseT wired Ethernet networks used in many offices.
[0032] Referring back to FIG. 1, in an example embodiment, the first step is an information gathering phase. A pediatric patient operates a device 16, such as a wireless tablet computer, provided by a technician. The patient initiates an application program (or multiple programs) on the device 16 and/or on a patient interface engine 40 (residing on a central system 26, for example) communicatively coupled to the device 16, such as an application program in the usual manner of operation for the particular tablet. In this exemplary embodiment, the entire system 10 will be referred to as "The Virtual Physician." The Virtual Physician 10 in this embodiment provides an avatar 18 resembling a human female, given the name "Christine." The Virtual Physician 10 is able to gather information (input) from the patient using the tablet camera 22 and microphone 20, and is able to prompt the patient for further interaction (input) by using speakers 24. The Virtual Physician 10 may be able to provide real-time (or near real-time) questions and prompts to the patient that may be determined by the input and feedback it receives from the patient by comparing the input against a conversation driver knowledge base 28. The input may be transmitted to a central system 26 other than the tablet, where the knowledge database 28 may be accessed or may reside.
[0033] A technician may pre-program the Virtual Physician 10 for the proper patient to access the patient's files and history, via an appropriate electronic patient records database 38, for example. Alternately, the Virtual Physician 10 may be set to ask the patient about a certain condition, for example, epilepsy. Upon being activated, for example, the Virtual Physician may greet the patient by name to establish rapport. It may then observe the mood of the patient by comparing the optical cues of the patient obtained by the camera 22 sensor to its conversation driver knowledge base 28. The Virtual Physician 10 may prompt the patient to provide input with an open-ended question (not just a yes-or-no question). The Virtual Physician 10 may be programmed to appear to be concerned about the patient. The patient responds to the question (prompts) with conversational sentences, for example, about how he/she feels. The input is gathered through the microphone 20 and transmitted to a processor operating the patient user-interface engine 40, for example, where it is compared to the conversation driver knowledge database 28. A response, and further prompts based on the previous patient input, are selected to be communicated to the patient through the Virtual Physician using the avatar's 18 movements and speakers 24.
[0034] The responses of the Virtual Physician 10 are based on the input of the patient, and further prompts are likewise based on the input of the patient in order to obtain valuable diagnosis information. Once the Virtual Physician 10 reaches the end of a conversation tree (receives the information it needs) it may change the subject and start new prompt chains to obtain other information from the patient. The system is capable of processing open-ended conversation questions, and asks "anything else" to capture information which might not have been covered.
Responses not in the conversation driver knowledge base (such as those not typically related to epilepsy, for example) may be passed directly on to a real provider or physician's interface 14. The Virtual Physician 10 is also able to issue questionnaires, and is able to educate the patient on the treatment(s) and/or procedure(s) involved in treatment, where such educational scripts may also be stored on the conversation driver knowledge base 28, for example.
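The routing rule just described, in which responses the conversation driver knowledge base cannot interpret are passed through to a real provider, can be sketched as follows. The topic set, function name, and queue are hypothetical illustrations, not part of the disclosure:

```python
# Sketch of the pass-through rule described above: responses outside
# the conversation driver knowledge base are forwarded verbatim to the
# practitioner interface.  All names here are hypothetical.

KNOWN_TOPICS = {"seizure", "medication", "sleep", "headache"}

def route_response(response_text, practitioner_queue):
    """Return True if the response is handled by the knowledge base;
    otherwise queue the full, unmodified response for the physician."""
    words = set(response_text.lower().split())
    if words & KNOWN_TOPICS:
        return True  # handled within the epilepsy conversation trees
    practitioner_queue.append(response_text)
    return False

queue = []
route_response("I missed my medication twice this week", queue)
route_response("My dog ran away yesterday", queue)
```

Here the first response matches a known topic and stays within the conversation trees, while the second, unrelated to epilepsy, is queued unchanged for the provider's interface 14.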
[0035] In a second phase, a central decision support engine 30 may store the input provided by the patient in the first phase. In this phase, the central decision support engine 30 may also compare the various inputs collected (questions, questionnaire responses, observations) against data gathered, input and stored in its continually growing knowledge database(s) (such as the therapy knowledge base, the prognosis knowledge base and/or the co-morbidity knowledge base 32, 34 and 36). Using algorithms of the central decision support engine 30, for example, the patient input, as compared to the knowledge database(s) 32, 34 and 36, may be used to ascribe success rate percentage ranges for the particular patient with each treatment plan. The engine can weigh the treatments against each other for this particular patient, and recommend the viable medical treatment options.
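One way the second-phase comparison could ascribe success rate ranges and weigh treatments is sketched below. The cohort records, field names, and ranking rule (midpoint of the range) are illustrative assumptions, not details from the disclosure:

```python
# Sketch of the second-phase comparison: patient input is matched
# against records of similar patients to ascribe a success-rate range
# to each candidate treatment plan.  Data and names are illustrative.

def success_range(treatment, similar_patients):
    """Derive a (low, high) success-rate range for a treatment from
    outcomes of similar patients in the knowledge base."""
    outcomes = [p["success"] for p in similar_patients
                if p["treatment"] == treatment]
    if not outcomes:
        return None
    return (min(outcomes), max(outcomes))

def rank_treatments(treatments, similar_patients):
    """Weigh treatments against each other by the midpoint of each
    treatment's success-rate range, highest first."""
    scored = []
    for t in treatments:
        rng = success_range(t, similar_patients)
        if rng is not None:
            scored.append((t, rng, (rng[0] + rng[1]) / 2))
    return sorted(scored, key=lambda item: item[2], reverse=True)

cohort = [
    {"treatment": "drug_a", "success": 0.62},
    {"treatment": "drug_a", "success": 0.74},
    {"treatment": "drug_b", "success": 0.55},
]
ranking = rank_treatments(["drug_a", "drug_b"], cohort)
```

With this toy cohort, drug_a is ranked first with a 62–74% success range, mirroring the "percentage ranges of success" the engine reports to the provider.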
[0036] In a third phase, a medical provider may access the central system 26, via a physician user-interface engine 44, with a computing device 46, such as a wireless tablet. The physician user-interface engine 44 may include one or more
applications that may exist on the central system 26 and/or the computing device 46. The Virtual Physician 10 is able to determine the medical provider's credentials. The medical provider is able to inform the Virtual Physician 10, via the physician interface 14, of his/her observations of the patient, and to request a recommended treatment plan, with available parameters (for example, a provider may indicate the maximum dosage for a particular medicine, and it is up to the processor to determine if this medicine is a viable option). The Virtual Physician 10 may then provide several options of treatment plans, including the reasoning behind the particular course of action. If medicine is involved, the physician interface 14 may display the prescriptions, corresponding strengths, and quantitative success rates. The system 10, through the Virtual Physician application(s), may also provide alternate options, including surgery, and the success rate of such a surgery for this particular patient (for example, taking in the severity of symptoms of this particular patient, and comparing the input to the database of similar patients). The provider may also ask questions, via the interface 14, beyond treatment plans and options, including what the patient's chances are of being cured. The Virtual Physician may be able to provide percentage ranges of success through access to the various knowledge bases 32-36. The provider may also ask open-ended questions for medical subjects other than epilepsy, which is merely the primary medical issue in this example. The Virtual Physician 10 can relate to the provider what the patient has said, as well as relay other observations about the patient. Using this information, the provider can respond with instructions to the Virtual Physician. For example, the provider may ask the Virtual Physician to perform related tasks, such as initiating information videos on the patient's device 16. It can be seen that the Virtual Physician 10 is interactive with the patient and the provider, can offer comprehensive medical diagnosis and treatment, and can even go beyond the medical issue to help enhance the patient's well-being.
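The provider-supplied parameter check described in this phase, where the practitioner states a maximum dosage and the processor determines whether a medicine remains a viable option, might look like the following. The formulary entries and dose values are invented for illustration:

```python
# Sketch of the provider-parameter check described above: the
# practitioner indicates a maximum dosage, and the system filters the
# viable medication options.  All names and values are illustrative.

def viable_options(medications, max_dosage_mg):
    """Keep only medications whose minimum effective dose fits within
    the practitioner's stated maximum dosage."""
    return [m for m in medications
            if m["min_effective_dose_mg"] <= max_dosage_mg]

formulary = [
    {"name": "med_x", "min_effective_dose_mg": 100},
    {"name": "med_y", "min_effective_dose_mg": 400},
]
options = viable_options(formulary, max_dosage_mg=250)
```

Under a 250 mg ceiling only med_x survives the check; med_y's minimum effective dose exceeds the provider's stated limit and is dropped from the recommended plans.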
[0037] In sum, an exemplary virtual physician extension system 10 according to the current disclosure may include: (a) a patient conversation-driver knowledge base 28, including conversation trees and decision trees designed for providing initial patient questions, and for providing follow-up questions based upon patient responses to initial patient questions; (b) a therapy knowledge base 32, a prognosis knowledge base 34 and/or a co-morbidity knowledge base 36; (c) a first computing device 16 providing a computerized patient interface 12, the first computing device 16 including a display, speaker, a mic 20, a camera 22, random access memory, persistent memory, patient interface application software resident (at least in part) on the persistent memory, an external data link, and processing circuitry, having access to the patient conversation-driver knowledge base 28 and operatively coupled to the display, the speaker 24, the mic 20, the camera 22, the random access memory, the external data link, and the persistent memory to operate the patient interface application software that is configured to, (i) provide an avatar 18 on the display and to elicit audible questions appearing to come from the avatar 18, via the speaker 24, to a patient based, at least in part, upon conversation trees provided by the patient conversation-driver knowledge base 28, and (ii) collect response data from audible responses received from the patient via the mic 20 and provide the response data to the patient conversation-driver knowledge base 28, and (d) a second computing device 46 providing a computerized practitioner interface 14, the second computing device 46 including a display, random access memory, persistent memory, practitioner interface application software resident (at least in part) on the persistent memory, an external data link, and processing circuitry, having access to the therapy knowledge base 32, prognosis knowledge base 34 and/or co-morbidity knowledge base 36, and
operatively coupled to the display, the random access memory, the external data link, and the persistent memory to operate the practitioner interface application software that is configured to display information to a practitioner based upon patient response data as applied with the therapy knowledge base 32, prognosis knowledge base 34 and/or co-morbidity knowledge base 36. [0038] While the systems and methods described herein constitute exemplary embodiments of the current disclosure, it is to be understood that the scope of the claims is not intended to be limited to the disclosed forms, and that changes may be made without departing from the scope of the claims as understood by those of ordinary skill in the art.
[0039] What is claimed is:

Claims

1. A virtual physician extension system, comprising: a patient conversation-driver knowledge base, including conversation trees and decision trees designed for providing initial patient questions, and for providing follow-up questions based upon patient responses to initial patient questions; at least one of a therapy knowledge base, a prognosis knowledge base and a co-morbidity knowledge base; a first computing device providing a computerized patient interface, the first computing device including a display, speaker, a mic, a camera, random access memory, persistent memory, patient interface application software resident, at least in part, on the persistent memory, an external data link, and processing circuitry, having access to the patient conversation-driver knowledge base and operatively coupled to the display, the speaker, the mic, the camera, the random access memory, the external data link, and the persistent memory to operate the patient interface application software that is configured to, provide an avatar on the display and to elicit audible questions appearing to come from the avatar, via the speaker, to a patient based, at least in part, upon conversation trees provided by the patient conversation-driver knowledge base, and collect response data from audible responses received from the patient via the mic and provide the response data to the patient conversation-driver knowledge base, and a second computing device providing a computerized practitioner interface, the second computing device including a display, random access memory, persistent memory, practitioner interface application software resident, at least in part, on the persistent memory, an external data link, and processing circuitry, having access to the at least one of the therapy knowledge base, prognosis knowledge base and co-morbidity knowledge base, and operatively coupled to the display, the random access memory, the external data link, and the persistent memory to operate the
practitioner interface application software that is configured to display information to a practitioner based upon response data as applied with the at least one therapy knowledge base, prognosis knowledge base and co-morbidity knowledge base.
2. The virtual physician extension system of claim 1 , wherein the avatar provided by the patient interface application software is at least one of a human avatar and an animal avatar.
3. The virtual physician extension system of claim 1 , wherein the practitioner interface application software is further configured to display at least one of prognosis information and co-morbidities risk information to the practitioner based upon response data as applied with the at least one therapy knowledge base, prognosis knowledge base and co-morbidity knowledge base.
4. The virtual physician extension system of claim 3, further comprising clinical decision support algorithms resident on at least one of the second computing device and a server external to the second computing device configured to apply the response data against the at least one of the therapy knowledge base, prognosis knowledge base and co-morbidity knowledge base in the generation of the at least one of prognosis information and co-morbidities risk information.
5. The virtual physician extension system of claim 4, wherein the clinical decision support algorithms access universal patient health records in the generation of the at least one of prognosis information and co-morbidities risk information.
6. The virtual physician extension system of claim 1, wherein the patient interface application software is further configured to collect response data from visual responses received from the patient via the camera.
7. The virtual physician extension system of claim 1, wherein the response data is based upon a combination of language and voice inflections received from the patient via the mic.
8. The virtual physician extension system of claim 1, wherein the patient conversation-driver knowledge base is a dynamic knowledge base configured to adjust using machine learning.
9. The virtual physician extension system of claim 1, wherein the first computing device is one of a tablet computer, notebook computer, and a hand-held computing device.
10. The virtual physician extension system of claim 1, wherein the patient interface application software utilizes natural language processing algorithms and processes as part of collecting response data.
11. The virtual physician extension system of claim 1, further comprising patient conversation support algorithms resident on at least one of the first computing device and a server external to the first computing device configured to apply the response data against the patient conversation-driver knowledge base and to elicit follow-up questions.
12. The virtual physician extension system of claim 11, wherein the patient conversation support algorithms are configured to pass along patient responses to the practitioner interface falling outside of the conversation trees.
13. The virtual physician extension system of claim 1, wherein the initial patient questions are open-ended questions.
14. The virtual physician extension system of claim 1, wherein the information displayed to the practitioner by the practitioner interface includes at least one of: patient observations; recommended treatment plan; pharmaceutical information; optional treatments; and ranges of success.
15. The virtual physician extension system of claim 1, wherein the practitioner interface application is further configured to receive patient treatment instructions from the practitioner utilizing the practitioner interface and to initiate communication of the patient treatment instructions to a patient record.
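As a rough illustration of the mechanism recited in claims 11-13 — follow-up questions driven by a conversation-driver knowledge base, with responses falling outside the conversation trees passed along to the practitioner interface (claim 12) — the flow might be sketched as follows. The class and function names, the tree structure, and the keyword matching (standing in for the natural language processing of claim 10) are all illustrative assumptions, not the claimed implementation.

```python
# Hypothetical sketch of the claimed conversation flow; names and
# matching logic are assumptions, not part of the patent disclosure.
from dataclasses import dataclass, field


@dataclass
class ConversationNode:
    question: str
    # Maps a keyword expected in the patient's answer to the follow-up node.
    branches: dict = field(default_factory=dict)


def drive_conversation(node, answers, escalations):
    """Walk the conversation tree, recording the questions asked; collect
    out-of-tree answers into `escalations` for practitioner review."""
    asked = []
    while node is not None:
        asked.append(node.question)
        answer = answers.pop(0) if answers else ""
        # A naive keyword match stands in for the NLP of claim 10.
        next_node = next(
            (child for kw, child in node.branches.items() if kw in answer.lower()),
            None,
        )
        if next_node is None and node.branches:
            # Response fell outside the conversation tree (claim 12):
            # pass it along rather than forcing a branch.
            escalations.append((node.question, answer))
        node = next_node
    return asked


tree = ConversationNode(
    "How have you been feeling since your last visit?",  # open-ended (claim 13)
    {
        "tired": ConversationNode("How many hours are you sleeping each night?"),
        "dizzy": ConversationNode("When do the dizzy spells tend to occur?"),
    },
)

escalations = []
asked = drive_conversation(
    tree, ["I've been very tired", "about five hours"], escalations
)
```

A dynamic knowledge base as in claim 8 would additionally update the tree (for example, adding or reweighting branches) from accumulated response data, which this static sketch omits.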
PCT/US2013/039063 2012-05-01 2013-05-01 Neuro-cognitive diagnosis and therapy decision support system WO2013166146A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261641275P 2012-05-01 2012-05-01
US61/641,275 2012-05-01

Publications (1)

Publication Number Publication Date
WO2013166146A1 true WO2013166146A1 (en) 2013-11-07

Family

ID=49514850

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/039063 WO2013166146A1 (en) 2012-05-01 2013-05-01 Neuro-cognitive diagnosis and therapy decision support system

Country Status (1)

Country Link
WO (1) WO2013166146A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160171971A1 (en) * 2014-12-16 2016-06-16 The Affinity Project, Inc. Guided personal companion
US20160171387A1 (en) * 2014-12-16 2016-06-16 The Affinity Project, Inc. Digital companions for human users
EP3153983A1 (en) * 2015-10-05 2017-04-12 Ricoh Company, Ltd. Advanced telemedicine system with virtual doctor
US9792825B1 (en) 2016-05-27 2017-10-17 The Affinity Project, Inc. Triggering a session with a virtual companion
US9802125B1 (en) 2016-05-27 2017-10-31 The Affinity Project, Inc. On demand guided virtual companion
EP3264301A1 (en) * 2016-07-01 2018-01-03 Panasonic Intellectual Property Management Co., Ltd. Information processing method and recording medium
US10140882B2 (en) 2016-05-27 2018-11-27 The Affinity Project, Inc. Configuring a virtual companion

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030146926A1 (en) * 2002-01-22 2003-08-07 Wesley Valdes Communication system
US20060010014A1 (en) * 1992-11-17 2006-01-12 Health Hero Network, Inc. Remote health monitoring and maintenance system
US20090055019A1 (en) * 2007-05-08 2009-02-26 Massachusetts Institute Of Technology Interactive systems employing robotic companions
US20090132275A1 (en) * 2007-11-19 2009-05-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Determining a demographic characteristic of a user based on computational user-health testing
US20100217619A1 (en) * 2009-02-26 2010-08-26 Aaron Roger Cox Methods for virtual world medical symptom identification
US20110077955A1 (en) * 2009-09-29 2011-03-31 Mckesson Financial Holdings Limited Methods, apparatuses, and computer program products for facilitating co-morbid care management


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10235620B2 (en) 2014-12-16 2019-03-19 The Affinity Project, Inc. Guided personal companion
US20160171387A1 (en) * 2014-12-16 2016-06-16 The Affinity Project, Inc. Digital companions for human users
WO2016099827A1 (en) * 2014-12-16 2016-06-23 The Affinity Project, Inc. Digital companions for human users
US9704103B2 (en) 2014-12-16 2017-07-11 The Affinity Project, Inc. Digital companions for human users
US9710613B2 (en) 2014-12-16 2017-07-18 The Affinity Project, Inc. Guided personal companion
US20170220922A1 (en) * 2014-12-16 2017-08-03 The Affinity Project, Inc. Guided personal companion
US20160171971A1 (en) * 2014-12-16 2016-06-16 The Affinity Project, Inc. Guided personal companion
EP3153983A1 (en) * 2015-10-05 2017-04-12 Ricoh Company, Ltd. Advanced telemedicine system with virtual doctor
US10572626B2 (en) 2015-10-05 2020-02-25 Ricoh Co., Ltd. Advanced telemedicine system with virtual doctor
US9802125B1 (en) 2016-05-27 2017-10-31 The Affinity Project, Inc. On demand guided virtual companion
US10140882B2 (en) 2016-05-27 2018-11-27 The Affinity Project, Inc. Configuring a virtual companion
US9792825B1 (en) 2016-05-27 2017-10-17 The Affinity Project, Inc. Triggering a session with a virtual companion
CN107562770A (en) * 2016-07-01 2018-01-09 松下知识产权经营株式会社 Information processing method and recording medium
EP3438988A1 (en) * 2016-07-01 2019-02-06 Panasonic Intellectual Property Management Co., Ltd. Information processing method and recording medium
EP3264301A1 (en) * 2016-07-01 2018-01-03 Panasonic Intellectual Property Management Co., Ltd. Information processing method and recording medium
US11055799B2 (en) 2016-07-01 2021-07-06 Panasonic Intellectual Property Management Co., Ltd. Information processing method and recording medium

Similar Documents

Publication Publication Date Title
CN110024038B (en) System and method for synthetic interaction with users and devices
WO2013166146A1 (en) Neuro-cognitive diagnosis and therapy decision support system
US20170011195A1 (en) System And Method Of User Identity Validation in a Telemedicine System
US20080242947A1 (en) Configuring software for effective health monitoring or the like
US20130226601A1 (en) Remote clinical care system
US20170011177A1 (en) Automated healthcare integration system
US20230035208A1 (en) Clinical trial/patient follow-up platform
CN112786219B (en) Medical care management method, system and device
CN102185882A (en) Method and device for remotely monitoring embedded physiological information
US20120130739A1 (en) Unsupervised Telemedical Office for Remote &/or Autonomous & Automated Medical Care of Patients
CN114974613A (en) Disease management method and device, computer storage medium and electronic equipment
US20230298710A1 (en) 2023-09-21 Systems and method for medical platform employing artificial intelligence and wearable devices
US20080126123A1 (en) Customizing healthcare information
JP2018533800A (en) Patient Outcome Tracking Platform
JP7099751B2 (en) Patient assessment support device, patient assessment support method, program
US20080243543A1 (en) Effective response protocols for health monitoring or the like
Albert et al. Telemedicine in heart failure during COVID-19: like it, love it or lose it?
US20220384002A1 (en) Correlating Health Conditions with Behaviors for Treatment Programs in Neurohumoral Behavioral Therapy
CN112669963A (en) Intelligent health machine, health data generation method and health data management system
US20080242948A1 (en) Effective low-profile health monitoring or the like
JP6885663B2 (en) Information processing equipment and methods, and programs
CN205451063U (en) Platform is synthesized to general information of medical treatment
US20230320643A1 (en) Vr/ar phobia training in a controlled environment with stress level sensors and management through scenarios control
US20240090855A1 (en) Virtual Healthcare Apparatus, Methods And Systems
US20240029888A1 (en) Generating and traversing data structures for automated classification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 13784949; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 13784949; Country of ref document: EP; Kind code of ref document: A1