US20220359050A1 - System and method for digital therapeutics implementing a digital deep layer patient profile - Google Patents

System and method for digital therapeutics implementing a digital deep layer patient profile Download PDF

Info

Publication number
US20220359050A1
US20220359050A1 (US 2022/0359050 A1); application US17/554,796
Authority
US
United States
Prior art keywords
patient
data
dlpp
treatment
deep layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/554,796
Inventor
Lynda Chin
Peter Bahrs
Raphael Chancey
Richard Lyle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apricity Health Inc
Original Assignee
Apricity Health LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apricity Health LLC filed Critical Apricity Health LLC
Priority to US17/554,796
Assigned to Apricity Health LLC reassignment Apricity Health LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHANCEY, RAPHAEL, CHIN, LYNDA, LYLE, RICHARD, BAHRS, PETER
Publication of US20220359050A1
Assigned to APRICITY HEALTH, INC. reassignment APRICITY HEALTH, INC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: Apricity Health, LLC
Legal status: Abandoned

Classifications

    • G16H 10/60: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G06F 40/30: Handling natural language data; Semantic analysis
    • G06N 20/00: Machine learning
    • G06N 5/02: Knowledge representation; Symbolic representation
    • G16H 10/20: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • G16H 10/40: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for data related to laboratory analysis, e.g. patient specimen analysis
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/50: ICT specially adapted for simulation or modelling of medical disorders
    • G16H 50/70: ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • G16H 70/20: ICT specially adapted for the handling or processing of medical references relating to practices or guidelines
    • G16H 70/40: ICT specially adapted for the handling or processing of medical references relating to drugs, e.g. their side effects or intended usage
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • the present invention relates to information handling systems. More specifically, embodiments of the invention relate to artificial intelligence systems for digital therapeutics that implement a digital deep layer patient profile.
  • Immunotherapy makes use of a body's natural defenses or immune system to fight disease.
  • An area of immunotherapy used to specifically treat cancer is known as immuno-oncology (IO).
  • Cancerous cells can thrive because of their ability to hide from immune systems.
  • Immunotherapies or IO can mark cancer cells to allow immune systems to find and destroy cancer cells.
  • Certain immunotherapies or IO can boost immune systems to better fight against cancer.
  • Part of IO therapy or treatment includes activating a body's immune system response to overcome cancerous tumor survival and growth.
  • the treatment can also cause adverse side effects, such as the immune system attacking healthy cells while attacking cancerous cells.
  • such adverse responses can be referred to as immune-mediated or immune-related adverse events (irAE).
  • irAE can affect any organ system in the body, including the gastrointestinal tract, skin, heart and lung, liver and kidneys, the nervous system or brain, endocrine organs such as thyroid, pancreas, and many others.
  • irAE symptoms can include joint pain, swelling, soreness, redness, skin itchiness, rashes, fever, chills, dizziness, nausea, vomiting, muscle/joint pain, fatigue, headaches, trouble breathing, low/high blood pressure, retaining fluid, heart palpitations, sinus congestion, diarrhea, infection, vision problems, etc.
  • Ongoing irAEs can lead to complications or end of life. It is understandable that individual patients will have different reactions to certain immunotherapies or IO based on age, gender, genetics, prior medical history, cancer type, mutation type, and other differentiators. It is also understandable that categorically similar patients, once appropriately defined, may have categorically similar reactions to certain immunotherapies or IO. Patients' irAEs will differ and can dynamically change over time. Effective management of patient treatment, assuring that more good is done than harm, requires accurately understanding the therapy and its effects for each individual patient.
  • a system, method, and computer-readable medium are disclosed for digital therapeutics directed to patient care specific to a disease, implementing a digital deep layer patient profile.
  • Patient related information is presented by receiving data that includes patient data, lab result data, machine learning calculation data related to the patient, and physician result data.
  • the data is mapped as to intensities, multiple dimensions and time.
  • the mapping is converted to create unstructured binary data with binary correlations as a digital deep layer patient profile.
  • the digital deep layer patient profile can be processed with machine learning and image processing algorithms.
  • FIG. 1 shows a general illustration of components of an information handling system as implemented in the system and method of the present invention
  • FIG. 2 shows a block diagram of a health AI environment
  • FIGS. 3A-3E show images representing a digital deep layer patient profile (DLPP);
  • FIG. 4 shows a deep layer patient profile (DLPP) multi-dimensional image having a patient-specific volume, density and shape
  • FIG. 5 shows a flowchart for patient conversation for digital therapeutics
  • FIG. 6 shows a flowchart for patient updates
  • FIG. 7 shows a flowchart for provider updates
  • FIG. 8 shows a flowchart for patient conversation for a patient to submit patient symptoms
  • FIG. 9 shows a flowchart as to when an electronic medical record (EMR) or lab notification is received
  • FIG. 10 shows a flowchart as to when a health care provider changes data
  • FIG. 11 shows a flow or care pathway for digital therapeutics
  • FIGS. 12A, 12B show a block diagram of digital therapeutics
  • FIG. 13 shows a flowchart for presenting patient related information implementing a digital deep layer patient profile
  • FIG. 14 shows a screen presentation of a health care provider user interface
  • FIG. 15 shows a screen presentation of a health care provider user interface
  • FIG. 16 shows a screen presentation of a health care provider user interface
  • FIG. 17 shows a screen presentation of a health care provider user interface
  • FIG. 18 shows a screen presentation of a health care provider user interface
  • FIG. 19 shows a screen presentation of a health care provider user interface
  • FIG. 20 shows a screen presentation of an authoring tool user interface
  • FIG. 21 shows a screen presentation of an authoring tool user interface
  • FIG. 22 shows a screen presentation of an authoring tool user interface
  • FIG. 23 shows a screen presentation of an authoring tool user interface
  • FIG. 24 shows a screen presentation of an authoring tool user interface
  • FIG. 25 shows a screen presentation of an authoring tool user interface.
  • Digital Therapeutics includes software programs that provide efficacious interventions to patients to prevent, manage, or treat a broad spectrum of physical, mental, and behavioral conditions.
  • DTx can capture scientific evidence and expert knowledge about a cancer type and its treatment with a specific drug or class of drugs.
  • DTx captures and integrates real time patient data and clinical information.
  • DTx can generate insights personalized to a specific patient.
  • An information handling system generally processes, compiles, stores, and/or communicates information or data for business, personal, or other purposes thereby allowing users to take advantage of the value of the information. Because technology and information handling needs and requirements vary between different users or applications, information handling systems may also vary regarding what information is handled, how the information is protected, how the information is handled, how much information is processed, stored, or communicated, and how quickly and efficiently the information may be processed, stored, or communicated. In addition, information handling systems may include a variety of hardware and software components that may be configured to process, store, and communicate information and may include one or more computer systems, data storage systems, and networking systems.
  • an information handling system may include any instrumentality or aggregate of instrumentalities operable to compute, classify, predict, protect, process, transmit, receive, retrieve, originate, switch, store, display, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes.
  • an information handling system may be a personal computer, a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price.
  • the information handling system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory.
  • Additional components of the information handling system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, a camera, a microphone, and a video display.
  • the information handling system may also include one or more buses operable to transmit communications between the various hardware components.
  • FIG. 1 is a generalized illustration of an information handling system 100 that can be used to implement the system and method of the present invention.
  • the information handling system 100 includes a processor (e.g., central processor unit or “CPU”) 102 , input/output (I/O) devices 104 , such as a display, a keyboard, a mouse, and associated controllers, a hard drive or disk storage 106 , and various other subsystems 108 .
  • the information handling system 100 also includes network port 110 operable to connect to a network 140 , which is likewise accessible by a service provider server 142 .
  • the information handling system 100 likewise includes system memory 112 , which is interconnected to the foregoing via one or more buses 114 .
  • System memory 112 further comprises operating system (OS) 116 and in various embodiments may also comprise a health artificial intelligence or AI system 118 .
  • the health AI system 118 includes multi-treatment digital therapy module 120 and deep layer patient profile (DLPP) module 122 .
  • the deep layer patient profile (DLPP) module 122 can store longitudinal data as a deep layer patient profile (DLPP).
  • the health AI system 118 can provide digital therapeutics or DTx on medical patients, delivering evidence-based therapeutics, which can be intervention of pending/planned treatment, to prevent, manage, or treat a medical disorder or disease, and in this example cancer.
  • DTx delivers evidence-based therapeutic interventions to patients using software programs to prevent, manage, or treat a medical disorder or disease.
  • DTx are used independently or together with medications, devices, or other therapies to optimize patient care and health outcomes.
  • DTx can be independently implemented, or can be used with medication, medical devices, and other therapies to optimize patient care and treatment results.
  • DTx is implemented to provide detection, evaluation, treatment, and management of treatment related adverse events.
  • Outcomes are specific to patients and can be variable.
  • various techniques can be used for detection, evaluation and treatment, including genomic sequencing of blood, tissue, or stool, blood or urine marker determination, etc.
  • phenomics data collected from patients, measured with connected devices, via mobile phone, web, or other connected means, in addition to clinical data from electronic health records (EHR), can be used to develop predictive models to guide precision management.
  • An integrated digital care pathway can be provided, that is informed by data and evidence, for management of treatment-related adverse events or Immune Related Adverse Events (irAE).
  • Scientific and expert evidence is captured about a disease and treatment with specific drug(s).
  • real-time patient data and clinical information is captured and integrated. Personalized/specific insight is generated for specific patients.
  • FIG. 2 is a block diagram of a health artificial intelligence (AI) environment 200 implemented in accordance with an embodiment of the invention.
  • the health AI environment 200 includes the information handling system 100 which includes the health AI system 118 .
  • the health AI system 118 includes the E-combination digital therapy module 120 and deep layer patient profile (DLPP) module 122 .
  • the information handling system 100 includes a data repository 202 which can store information, such as patient data processed by the information handling system 100 .
  • the data repository 202 can be configured to store patient related data, such as deep layer patient profiles (DLPP) 204 and data identifiers (IDs) 206 .
  • the deep layer patient profile (DLPP) module 122 can be used to generate deep layer patient profiles (DLPP) 204 .
  • Deep layer patient profiles (DLPP) 204 can be continuously updated as further described below. Deep layer patient profiles (DLPP) 204 can include a patient identifier (ID), recommendations, patient data entered through the health AI environment 200 , electronic health records (EHR), electronic medical records (EMR), etc.
  • context entered through the health AI environment 200 includes deep layer patient profiles (DLPP) 204 , where context can be a session with a deep layer patient profile (DLPP) with questions answered, and the type of data that is desired, such as time zone, language, required or optional data, longitudinal data (e.g., over a 14-day time period), etc.
  • Data IDs 206 can be used to identify data. For example, data is collected as to DTx knowledge models and symptoms as further described below. In certain implementations, data IDs 206 can be used to identify collected data, such as gathered answers to patient questions. In certain implementations, the data repository 202 includes a journal 208 , which for example, can store scores related to symptoms of different categories. In certain implementations, the data repository 202 includes DTx knowledge models 246 .
  • DTx knowledge models categorization:
    Domain: Oncology
    Program: Adverse Event Toxicity Management
    Questions: languages and genres
    Rules: pre- and post-conditionals and groups
    Actions: additional actions (dialog, rules, actions)
    Detect Models:
      Trigger: symptom that drives the model to be activated (e.g., rash for dermatitis, diarrhea for colitis)
      Primary Symptom: Priority, Weights, ID (e.g., headache)
      Associated Symptom: Priority, Weights, ID (e.g., location, severity, etc. of headache)
      Clinical Modifier: Priority, Weights, ID (e.g., had prior irAE)
    Diagnostic Models: clinical lab results drive the model to be activated (e.g. …)
  • DTx calculations provide for weight variables to be adjusted based on organization preference, expert opinion or AI/ML algorithms.
  • Weight variables can be applied to, and are not limited to the following types: questions, symptoms, recommendations, rules, conditions, and actions, etc. Weights can be used to emphasize results, remove results, adjust priorities, or affect calculations. Weights can be applied to a category, a list of categories, an item, a list of items, or a chain of any of the aforementioned.
  • Priority: for a selection, multiply the selection's priority by a weight.
  • Priority Inversion: for a selection, after a previous high priority selection is identified and processed, if the current priority selection is the same as the previous for N times, reduce the current priority so that the next selection with a lower priority can be processed. Derivatives: for values 0, 1, 2, . . . , N; if 0, then use the current value for any longitudinal selection as is (e.g. …).
  • Weighted Inversion: divide the selection priority by the weight. Chaining: when multiple selections are weighted resulting in the same evaluation, a chain identifies a list of selection types that are processed left to right for final arbitration (e.g., colitis, endocrinopathy, etc.). Environmental: reduce the impact of a selection weight by D % when the accompanying environmental condition is present (e.g., headache with "sun exposure"). Lab: modify the impact of a selection weight by M % based on a lab result. LikeMe: modify weights in the DLPP based on a function of weights resulting from P other patients or L other lab results; the function( ) can include replace, multiply, divide or any of the previous weights.
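  • As an illustration only (not the specific implementation of the disclosure), a few of the weight strategies above can be sketched in Python with hypothetical function names:
        def weighted_priority(priority, weight):
            # Priority: multiply the selection's priority by its weight.
            return priority * weight

        def weighted_inversion(priority, weight):
            # Weighted Inversion: divide the selection's priority by its weight.
            return priority / weight if weight else priority

        def priority_inversion(selection_id, history, n, priority, reduction=0.5):
            # Priority Inversion: if the same high-priority selection was processed
            # the last N times, reduce its priority so a lower-priority selection
            # can be processed next.
            if len(history) >= n and all(prev == selection_id for prev in history[-n:]):
                return priority * reduction
            return priority

        def environmental_weight(weight, condition_present, d_percent):
            # Environmental: reduce the impact of a selection weight by D% when the
            # accompanying environmental condition (e.g., "sun exposure") is present.
            return weight * (1 - d_percent / 100.0) if condition_present else weight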
  • the information handling system 100 can include a text classifier 208 .
  • Data or information can be received in a specific format as defined by a media type, or in certain instances a Multipurpose Internet Mail Extensions or mime type.
  • the text classifier 208 can be used to convert mime types to accessible text or data/information that is stored.
  • an answer type (e.g., text, speech, image, etc.) can indicate how the answer is converted and stored.
  • Standard text classification can be implemented, such as speech to text, text to speech, image to text, text to image, etc.
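  • a minimal sketch (hypothetical converter names; a real deployment would plug in actual speech or image services) of dispatching incoming answers by mime type to a converter that normalizes them to stored text:
        def text_passthrough(payload):
            return payload.decode("utf-8")

        def image_to_text(payload):
            # Placeholder: a real system could call an image-to-text model here.
            return "<image description>"

        def speech_to_text(payload):
            # Placeholder: a real system could call a speech-recognition service here.
            return "<transcribed speech>"

        CONVERTERS = {
            "text/plain": text_passthrough,
            "image/png": image_to_text,
            "audio/wav": speech_to_text,
        }

        def classify_answer(mime_type, payload):
            # Convert the incoming answer to accessible text for storage.
            return CONVERTERS.get(mime_type, text_passthrough)(payload)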
  • the health AI system 118 can include various components.
  • such components include, a dialog component 212 , a transform component 214 , a diagnose component 216 , a detect component 218 , and a recommend component 220 .
  • the components will be further discussed below.
  • the digital therapy module 120 can also include an authoring tool 222 .
  • the authoring tool 222 can be used by entities, such as medical experts, and implemented to author/provide: questions, answer IDs, priorities, conditions, languages, etc.
  • the authoring tool 222 further can deploy DTx knowledge models.
  • multiple domains, programs, diseases, symptoms, subject matter expert knowledge models for the dialog component 212 , a transform component 214 , a diagnose component 216 , a detect component 218 , weight, priorities and genres are supported.
  • the information handling system 100 can connect to network 140 .
  • Network 140 is representative of various networks, that include internal and external networks, secure and unsecure networks, various computing devices, such as servers, cloud computing networks and devices, etc.
  • the described entities of health AI environment 200 are provided limited or secure access to certain networks and computing devices of network 140 .
  • the health AI environment 200 includes patient(s) 224 who are monitored and provided treatment, such as immunotherapy or immuno-oncology (IO).
  • Patient(s) 224 can use patient devices 226 to provide information and to receive information.
  • patient data monitoring/sensing devices 228 can collect or gather information or data about patient(s) 224 .
  • patient data monitoring/sensing devices 228 can include personal wearable devices, heat or optical measurement devices that detect temperature and images of the patient(s) 224 .
  • the patient data monitoring/sensing devices 228 can provide information or data to devices 226 by wireless transmission, such as Bluetooth.
  • a device 226 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, a camera, a mirror, a robot or other device capable of communicating and processing data.
  • device 226 can present patient(s) 224 with patient related questions, as part of a conversation or dialog for digital therapeutics.
  • the conversation or dialog is performed through the dialog component 212 .
  • the dialog component 212 receives patient answers to health symptom dialog questions.
  • the dialog component 212 can further send questions to the patient for additional patient related information.
  • the patient answers are in response to modified dialog questions from detect and diagnose questions. Modified questions can result from reducing conflicting questions, changing questions based on a specific genre question syntax, re-categorizing and re-ordering questions based on priority resolution, and adding questions for subjectivity reduction.
  • the conversation or dialog from dialog component 212 can be performed with devices 226 through various entry points such as applications and web browsers running on the devices 226 .
  • Answers to the questions can be obtained by entering text or information directly on devices 226 or can be gathered through specific patient data monitoring/sensing devices 228 such as a smart mirror which can detect facial expressions (e.g., facial image data), a microphone which can detect voice inflections, fixed and moveable robotics, personal cameras, etc.
  • data related to answers can be obtained through artificial intelligence (AI) which can include sentiment analysis (e.g., elated, happy, sad, angry), voice changes (faster, slower, louder, quieter), image changes (facial lines, weight loss, growing rash), etc.
  • Patient answers can be presented through a user interface, audio input and output, camera input and output, and gesture input and output, etc.
  • the questions from the dialog component 212 can be structured to receive answers from the patient(s) 224 to be used by the information handling system 100 and specifically the E-Combination digital therapy module 120 and its components as described herein. The answers can also be used by other entities of health AI environment 200 as further described below.
  • the structured questions can be stored in a database or repository, such as data repository 202 , and be stored by category, such as disease category (e.g. cancer, diabetes), treatment category (e.g. immunotherapy, chemotherapy), etc. For each category, questions can be defined in order to acquire needed patient reported data.
  • questions can have pre-conditions that are met for a question to be selected for conversation with a patient. Questions can have an identifier or ID number and be given a priority number.
  • Questions can be categorized to align with a medical term or program (e.g., treatment program), where there can be multiple programs.
  • programs can include chemotherapy, radiation therapy, immunotherapy, surgery, targeted therapy, hormone therapy, etc.
  • a program category, for example immunotherapy or IO, can be defined by a DTx knowledge model which can include categories such as "primary symptoms", "secondary symptoms", "clinical modifiers", "laboratory tests", "imaging studies", etc.
  • DTx knowledge models can determine symptom scores and an overall score for the questions, and the DTx knowledge models can be used as criteria for selecting questions from a list of total questions.
  • medical or disease experts as further described below, can author, group, and provide attributes for questions.
  • a category of questions can be identified, and the dialog component 212 can modify or reduce the potential list of questions to actual questions that are presented to patient(s) 224 .
  • the recommend component can provide recommendations from the primary symptoms and laboratory results for addressing adverse events and related to the longitudinal data, deep layer patient profile (DLPP), and knowledge models. Recommendations can be derived from algorithms that use the patient answers, the laboratory results and the knowledge models. Furthermore, the recommendations can be derived by algorithms that use detect and diagnose results, and produce preliminary diagnoses, priorities and weights.
  • Reducing/refining the questions can proceed as follows (a minimal code sketch follows this list).
  • a number of questions are selected from a current category.
  • a number of questions are increased based on detected characteristics processed by artificial intelligence (AI) which can include sentiment analysis (elated, happy, sad, angry), voice changes (faster, slower, louder, quieter), image changes (facial lines, weight loss, growing rash), etc.
  • Questions can be increased by aligning AI characteristics to a question pool and retrieving additional questions. Questions can be reduced to only include questions for a current symptom (e.g., blood, diarrhea, headache, etc.).
  • Questions can be sorted in priority order. Questions can be presented in a priority order, such that certain questions can take precedence over other questions in diagnosing/evaluating patient(s) 224 .
  • the next question can be ensured to have a category related to the previous question, such that questions are asked in appropriate groups of symptoms.
  • a question can be removed, if the same question is asked within a time window of question attribute frequency.
  • a question can be included, if an “alert” indicates a question is to be repeated.
  • a question can be removed, if pre-conditions have not been met.
  • a question index number can be converted to match a request genre of questions, such as talking to a child versus talking to an elderly person, talking in a dialect versus talking in a standard language, talking with text versus pictures versus audio, etc.
  • a question index number can be used to retrieve a question in a correct language (e.g., English, Chinese, Spanish, etc.).
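  • a minimal sketch of the refinement steps above, assuming a hypothetical question schema with symptom, priority, pre-condition and frequency attributes:
        from datetime import datetime, timedelta

        def refine_questions(questions, current_symptom, asked_log, alerts, now=None):
            now = now or datetime.utcnow()
            refined = []
            for q in questions:
                if q["symptom"] != current_symptom:        # keep current symptom only
                    continue
                if not all(q.get("pre_conditions", [])):   # drop if pre-conditions unmet
                    continue
                last_asked = asked_log.get(q["id"])
                window = timedelta(days=q.get("frequency_days", 1))
                asked_recently = last_asked is not None and now - last_asked < window
                if asked_recently and q["id"] not in alerts:   # repeat only on "alert"
                    continue
                refined.append(q)
            # present in priority order, keeping questions grouped by category
            return sorted(refined, key=lambda q: (q.get("category", ""), q["priority"]))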
  • the health AI environment 200 can include developers 230 who provide and modify applications and software modules, such as components 212 , 214 , 216 , 218 and 220 . Developers 230 can create and update DTx knowledge models used in the digital therapeutics or DTx. Developers 230 can provide and update questions, rules and actions as to the DTx knowledge models. Developers 230 can connect to various entities of the health AI environment 200 and exchange information through devices 232 .
  • a device 232 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, or other device capable of communicating and processing data.
  • Devices 232 can be connected to network 140 , and in certain implementations are connected to secure networks and devices that are included in the network 140 .
  • the health AI environment 200 can include disease or medical experts 234 who provide input and updates in the development of applications and software modules, such as DTx knowledge models. Medical experts 234 can use the authoring tool 222 to author, group and provide attributes as to patient questions. In certain implementations, medical experts provide rules as used by the detect component 218 .
  • the detect component 218 can process and present primary symptoms with associated detect knowledge models as related to the longitudinal data and deep layer patient profile (DLPP). Medical experts 234 can connect to various entities of the health AI environment 200 and exchange information through devices 236 .
  • a device 236 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, or other device capable of communicating and processing data.
  • Devices 236 can be connected to network 140 , and in certain implementations are connected to secure networks and devices that are included in the network 140 .
  • the health AI environment 200 can include health care providers 238 who assist in treating patients 224 .
  • health care providers 238 can include physicians, nurses, laboratory/diagnostic facilities, entities providing treatment, etc.
  • health care providers 238 make use of digital therapeutics or DTx for patients 224 .
  • DTx can be independently implemented, or can be used with medication, medical devices, and other therapies to optimize care and treatment results for patients 224 .
  • Health care providers 238 can connect to various entities of the health AI environment 200 and exchange information through devices 240 .
  • Medical experts 234 and health care providers 238 can present physician results with associated diagnostic knowledge models as related to the longitudinal data and deep layer patient profile (DLPP).
  • a device 240 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, or other device capable of communicating and processing data.
  • Devices 240 can be connected to network 140 , and in certain implementations are connected to secure networks and devices that are included in the network 140 .
  • the health AI environment 200 includes a health data repository 242 , which can include various data stores, databases, etc.
  • the health data repository 242 and data and information stored therein can be made available to various entities of health AI environment 200 , including the information handling system 100 , and specifically the E-Combination digital therapy module 120 .
  • the health data repository 242 can be connected to the network 140 , and to secure and unsecure networks of network 140 .
  • Health data repository 242 can include electronic health records (EHR) 244 .
  • EHR 244 can include health information of patients 224 .
  • health information can include administrative information, progress data, demographics of the patient, medical history, previous diagnoses, medications of the patient, immunization records, allergies of the patient, radiology and laboratory information, test results, etc.
  • EHR 244 can include electronic medical records or EMR.
  • An EMR can be considered a subset of an EHR, where the EMR can include specific information for a patient from a particular physician or clinic.
  • An EMR can track data over time, identify particular patients for treatment, check patients' status based on certain parameters, such as blood pressure, etc.
  • the dialog component 212 , transform component 214 , diagnose component 216 , detect component 218 , and recommend component 220 are further described. It is to be understood that the invention, including the components, may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an embodiment combining software and hardware.
  • the dialog component 212 can present to patient(s) 224 , a refined list of questions, which are answered by the patient(s) 224 .
  • the questions can be stored in a database, such as data repository 202 , and can be stored by category (e.g., diabetes, immunotherapy, chemotherapy, or other disease category).
  • the question categories can align with a medical term or program, where multiple programs can be supported. For each category, the questions can be defined in order to get needed patient reported data. Questions can also have pre-conditions that are met in order for questions to be selected for conversation with a patient(s) 224 . Questions can have an identifier (ID) and a priority number.
  • a program category (e.g., immunotherapy) can have a defined DTx knowledge model that includes primary symptoms, secondary symptoms and clinical modifiers, etc.
  • the DTx knowledge model can determine symptom scores and an overall score.
  • the DTx knowledge model can be used as criteria for selecting questions out of the total list of questions.
  • the authoring tool 222 can be used by medical experts 234 to author, group and provide attributes for questions.
  • the dialog component 212 can interact with patient(s) through various entry points, such as a mobile phone application, tablet application, web browser, smart mirror, microphone, camera, fixed and moveable robotics, personalized cameras, human delegate, etc.
  • once the dialog component 212 and the recommend component 220 identify a category of questions, the dialog component 212 can intelligently modify/reduce a potential list of questions to actual questions. Refining the questions by the dialog component is described in detail above.
  • the transform component 214 can transform answers to questions into longitudinal data.
  • Longitudinal data is repeated observational data of the same variables over various times.
  • longitudinal data can be stored as a digital deep layer patient profile (DLPP) (e.g., deep layer patient profile (DLPP) 204 ).
  • a symptom has an identifier (ID) for data that is being collected (e.g., rash color)
  • the data can be added to longitudinal data for the identifier (ID) for the data with a timestamp.
  • the questions can have N answers, where each answer can be mapped to a different identifier (ID).
  • Each answer has an answer type (e.g., text, image, number, voice).
  • Primary symptoms can be produced from algorithms using patient reported symptoms and the subject matter expert knowledge models.
  • processing natural language patient reported symptoms, disambiguating patient reported outcomes with dialog techniques, and producing a numerical longitudinal mapping for structured and deep layer patient profile (DLPP) storage are provided.
  • the text classifier 210 can convert mime types to text or data type that are stored.
  • the type related to an answer can indicate how to store data.
  • the data can be a tuple (i.e., finite ordered list), such as value/duration, and includes a timestamp. For example, if blood is found in stool, the tuple is (blood, stool).
  • the detect component 218 can also generate longitudinal data, for example, for a value/duration tuple, the duration of how long the value has been ongoing.
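  • a minimal sketch, with assumed identifiers, of appending value/duration tuples with timestamps to longitudinal data keyed by data ID:
        from collections import defaultdict
        from datetime import datetime

        longitudinal_data = defaultdict(list)   # data ID -> list of (value, duration, timestamp)

        def record_observation(data_id, value, duration=None, timestamp=None):
            longitudinal_data[data_id].append((value, duration, timestamp or datetime.utcnow()))

        # e.g., blood found in stool, reported as ongoing for two days
        record_observation("stool", "blood", duration="2 days")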
  • the transform component 214 can update a deep layer patient profile (DLPP), which can be used to record patient responses, patient recorded outcomes, and other data.
  • a deep layer patient profile is a mechanism to turn related and structured patient data into a multidimensional unstructured data format that is processed with machine learning and image processing algorithms. Instead of selecting data, for example, by patient id, date, and data types needed, a deep layer patient profile (DLPP) is used for searching for intensity values that match thresholds.
  • the deep layer patient profile (DLPP) is also anonymized from the original data source and thus can be shared without privacy concerns.
  • the deep layer patient profile (DLPP) after anonymization can be manipulated with completely new algorithms focused on binary and intensity operations, such as image processing and deep learning.
  • Common Terminology Criteria for Adverse Events (CTCAE) mapping values can be represented by another dimension in the deep layer patient profile (DLPP).
  • the deep layer patient profile (DLPP) enables longitudinal data (all of the aforementioned mappings timestamped) by providing a time dimension.
  • FIG. 3A shows an example of a view 300 of structured patient profile symptoms related to conditions and then related to Score and how these combined values can be mapped to intensities in a deep layer patient profile (DLPP).
  • multiple symptoms, conditions and scores are given unique intensity mappings. Determining a unique value can be a 1 to 1 mapping between the sum of the available value ranges of structured items to an intensity value of 0 to the sum of ranges. More complex mappings are possible, as in the case of needing only red, amber or green values, or in the case of needing only gray scale values.
  • FIG. 3B shows an example of a longitudinal graph of symptoms that is mapped to a straight line with intensities.
  • when mapping longitudinal data 302 , the time dimension can be incremented, and additional intensities are added 304 .
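  • a minimal sketch, assuming symptom grades in a 0-5 range and example dimension names, of mapping structured values to intensities and incrementing the time dimension of a DLPP array:
        import numpy as np

        DIMENSIONS = ["rash_severity", "diarrhea_grade", "colitis_score"]   # example items
        dlpp = np.zeros((len(DIMENSIONS), 0), dtype=np.uint8)               # dimensions x time

        def to_intensity(value, value_min=0, value_max=5):
            # 1-to-1 mapping of the structured value range onto 0..255 intensities.
            return int(round(255 * (value - value_min) / (value_max - value_min)))

        def append_recording(values):
            # Increment the time dimension and add one intensity per dimension.
            global dlpp
            column = np.array([[to_intensity(values.get(d, 0))] for d in DIMENSIONS], dtype=np.uint8)
            dlpp = np.hstack([dlpp, column])

        append_recording({"rash_severity": 2, "diarrhea_grade": 3, "colitis_score": 4})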
  • FIG. 3C shows an example user interface view 306 of a patient profile as a deep layer patient profile (DLPP).
  • the patient signature can be "viewed" with multiple perspectives.
  • a common perspective is to view the patient profile in a user interface view on a computing screen 306 .
  • the user interface view 306 shows time on the bottom horizontal axis and various values from N other dimensions stacked on top.
  • the user interface view 306 becomes a patient's quick response (QR) code/ink blot identifier.
  • the deep layer patient profile (DLPP) can be optimally used for performing artificial intelligence/machine learning algorithms, such as with Python, OpenCV, Node, etc.
  • FIG. 3D shows an example process of development and use 308 of a deep layer patient profile (DLPP).
  • "A" 310 shows dimensions as used in a particular deployment of a deep layer patient profile (DLPP), and the values per recording of patient data are shown in the formulas or variables.
  • Conditions 1 through C can have a score value of the summation of the symptom scores. Additional values can include slope (i.e., change in intensity) and derivatives (i.e., change in change).
  • the rules indicated are related to patient reported data. Lab data and electronic health record data can also be used to update reported data.
  • an N dimensional region of the DLPP can be referred to as a “slice” and uses Python notation (e.g., range dimension 1, range dimension 2, . . . range dimension N).
  • “B” 312 shows a structured recording of patient profile over time t 1 , t 2 , . . . tn.
  • each value recorded that is part of a deep layer patient profile (DLPP) is mapped to an intensity value.
  • time t 3 (column t 3 ) shows raw intensity values.
  • Raw values such as 1 . . . 5, can be scaled to wider values, such as 1 . . . 255, to allow more visual distinction when using a user interface perspective. Additional values can be included such as summation, mean, standard deviation, mode, median, and outliers.
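  • for example, raw 1..5 grades can be scaled to 1..255 and summarized as sketched below (illustrative values only):
        import numpy as np

        raw = np.array([1, 3, 5, 2, 4])                    # raw recorded grades
        scaled = 1 + (raw - 1) * (255 - 1) // (5 - 1)      # 1 maps to 1, 5 maps to 255
        summary = {"sum": raw.sum(), "mean": raw.mean(),
                   "std": raw.std(), "median": np.median(raw)}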
  • "C" 314 shows the user interface view of a deep layer patient profile (DLPP) containing mapped data.
  • Other perspectives include, but are not limited to, binary arrays, gray images and animations of intensities over time.
  • the animation in particular, can be a standard graph animation, or a geolocation animation where the intensity type is animated over a 3D human body outline.
  • slicing and searching may be possible using ML and image processing algorithms.
  • with these algorithms, there may be no real indication that the included data is patient health data. Rather, algorithms such as movement detection, line detection, object detection, histogram analysis, rotation, translation, shear, thresholding, blurring, erosion, dilation, contouring, etc. can be used.
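  • a minimal sketch of applying such intensity-only operations to a DLPP array with OpenCV (random data stands in for a real profile):
        import numpy as np
        import cv2

        dlpp_image = np.random.randint(0, 256, size=(64, 120), dtype=np.uint8)  # dimensions x time

        # Thresholding: keep only regions whose intensity exceeds a search threshold.
        _, mask = cv2.threshold(dlpp_image, 200, 255, cv2.THRESH_BINARY)

        # Contouring: outline high-intensity regions (e.g., clusters of severe values).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

        # Blurring and dilation as further intensity operations.
        blurred = cv2.GaussianBlur(dlpp_image, (5, 5), 0)
        dilated = cv2.dilate(mask, np.ones((3, 3), dtype=np.uint8), iterations=1)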
  • the deep layer patient profile can be a gateway between structured and protected patient health data and intensity-based data upon which to perform high performance AI algorithms.
  • an N dimensional region of a deep layer patient profile (DLPP) can be compared against other regions, such as is the case when comparing different time regions.
  • an N dimensional region of a deep layer patient profile (DLPP) can be compared to a region of another patient, such as is the case when searching for "a patient like me."
  • the good data sets are those deep layer patient profiles (DLPP) where the outcome was successful, the best case being that the patient is cured of the disease. Having good data sets allows the detect and diagnose components to create inferred and probabilistic recommendations based on matching a current patient from time t 1 to tn to other patients who have survived from time t 1 to tn+K, where K is more time past where the current patient is. Times tn to tn+K are where the next irAEs, recommendations, lab tests or questions will 'probably' be needed or addressed by the current patient.
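  • a minimal sketch, assuming DLPPs stored as dimensions-by-time intensity arrays, of matching a current patient's t 1 ..tn region to successful patients and reading their tn..tn+K region:
        import numpy as np

        def like_me(current, successful_profiles, n, k):
            # current: dims x n array; successful_profiles: patient ID -> dims x (>= n + k) array.
            best_id, best_distance = None, float("inf")
            for patient_id, profile in successful_profiles.items():
                distance = np.linalg.norm(profile[:, :n].astype(float) - current.astype(float))
                if distance < best_distance:
                    best_id, best_distance = patient_id, distance
            # The matched patient's tn..tn+K region hints at the next likely irAEs,
            # recommendations, lab tests or questions for the current patient.
            return best_id, successful_profiles[best_id][:, n:n + k]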
  • FIG. 3E shows a goal outcome 316 .
  • Outcome 316 describes a set of longitudinal data, DLPP images, and recommendations/treatments that indicate adverse events have been removed.
  • outcome 316 shows a combination of longitudinal paths, plus other data, where the trends approach "0", indicating that the cancer has been removed. There can be various conditions that are considered good. With enough good paths captured, it may be practical to pattern match a current "patient A" to search for similar paths. It may be possible to show "patient A", with "confidence J", the next likely symptoms, conditions, trends, severity score, recommendations, etc.
  • this differs from CT scan/MRI 3D views that are sliced but where the data is homogeneous (i.e., all about the neck, in pixels) and geographically close. With such views, semantics may be known at the beginning, such as taking a scan of the neck to look for new data based on values scanned (i.e., a tumor in the neck).
  • DLPP data may be multi-type, not homogeneous, and geographically dispersed. For example, reported blood in the lower abdomen, fever in the head, and rash on the arm, etc. The semantics may not be known at the beginning. A determination may be made as to toxicity, where a patient has cancer and is under toxic treatment based on reported data, expert authored models, and treated under a DTx platform. However, previously known toxicity effects, patient reports, or where/when an irAE occurs are not known. Semantics can be derived, and then intensity values are derived, such as to display as pixels in a user interface.
  • DLPP can include primary data, such as patient reported, lab, EHR, social (e.g., family, wealth, etc.), environment (e.g., altitude, seasons, etc.), energy (movement, exercise, diet, etc.) and symptoms as represented by vector of change.
  • the DLPP can include secondary data, such as score, grading, counts, vector of change, determined conditions, score number, CTCAE score, etc.
  • DLPP can include tertiary data, such as using primary and secondary data with expert knowledge leading to recommendations.
  • the DLPP can use quaternary data, such as AI/ML algorithms, image processing and pattern matching across primary/secondary/tertiary data. Values can be replaced (i.e., what if the next 2 days were these results), and then a match is performed.
  • Raw data can be provided by a patient or device (e.g. Yes/No; blood pressure 150/99).
  • Information data may be context to the raw data (e.g. patient indicated a fever).
  • Knowledge data can add additional context and experience to information data (e.g. patient having AE to medicine; patient has flu).
  • Derived data can add analytics to knowledge, information and raw data (e.g. average fever using medicine, % chance fever reduces in N days).
  • Inferred data can add AI/ML to derived, knowledge, information and raw data to generate additional deterministic data (e.g. based on 1000 patients like current patient, reduce medicine 50%).
  • Probabilistic data can add AI/ML to the aforementioned data to generate additional probability confidence data (e.g. 80% probability patient will escalate AE if medicine continued at 100%; 90% confident that current medication with two other medications will not reduce disease).
  • DLPP can contain all data progression levels as longitudinally included.
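  • one way (purely illustrative, not the disclosed implementation) to carry the progression level alongside each longitudinal entry is sketched below:
        from dataclasses import dataclass
        from enum import Enum
        from typing import Any

        class DataLevel(Enum):
            RAW = 1            # e.g., blood pressure 150/99
            INFORMATION = 2    # e.g., patient indicated a fever
            KNOWLEDGE = 3      # e.g., patient having an AE to medicine
            DERIVED = 4        # e.g., % chance fever reduces in N days
            INFERRED = 5       # e.g., based on similar patients, reduce medicine 50%
            PROBABILISTIC = 6  # e.g., 80% probability AE escalates if medicine continued

        @dataclass
        class DlppEntry:
            data_id: str
            value: Any
            level: DataLevel

        entry = DlppEntry("fever", "patient indicated a fever", DataLevel.INFORMATION)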
  • the transform component 214 can include an ambiguity layer for uncertain, vague or subjective responses from patients 224 .
  • the ambiguity layer can translate subjective non-standardized responses into objective standardized responses.
  • for example, two patients 224 may indicate a headache level 4 ; however, the pain threshold may be different between the two patients 224 . Therefore, a follow up question can be related to negative effects of the headache, for example, did you go to work? If one patient says no, there can be some level of assurance that the headache is debilitating and prevented the patient from working.
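  • a minimal sketch of that disambiguation, with hypothetical thresholds:
        def standardize_headache(reported_level, went_to_work):
            # Translate a subjective 0-10 headache level into a standardized grade,
            # using the functional-impact follow-up as an objective anchor.
            if reported_level >= 4 and not went_to_work:
                return 3   # debilitating: prevented the patient from working
            if reported_level >= 4:
                return 2   # reported high, but daily activity not limited
            return 1 if reported_level > 0 else 0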
  • the diagnose component 216 can be invoked when deep layer patient profile (DLPP) updates are received from EMR, third party systems and lab results. Examples of such deep layer patient profile (DLPP) updates include lab results, x-ray results, blood work results, etc.
  • the diagnose component 216 can process and present laboratory results with associated diagnostic knowledge models as related to the longitudinal data and deep layer patient profile (DLPP).
  • the diagnose component 216 can then provide additional recommendations based on the detect component 218 recommendations.
  • the diagnose component 216 can confirm results, such as lab results (e.g., low blood count, colitis, etc.)
  • the diagnose component 216 can provide treatment recommendations. Receiving a lab result can trigger the diagnose component 216 .
  • a difference between the diagnose component 216 and the detect component 218 is that the source of data for the diagnose component 216 includes lab test results, imaging studies, or procedures (clinical data) versus the source of data for the detect component 218 is patient provided data/outcomes (non-clinical data).
  • the diagnose component 216 and the detect component 218 can use different or overlapping DTx knowledge models.
  • the detect component 218 can filter out symptoms and rule out patient reported outcomes (PRO).
  • the detect component 218 can be invoked by the transform component 214 after the dialog component 212 invokes the transform component 214 .
  • Deep layer patient profile 204 can be sent to the detect component 218 .
  • the detect component 218 can also access the data repository 202 for any other relevant information.
  • the detect component 218 can load primary symptoms with associated symptom knowledge models.
  • the detect component 218 can run detection rules, where such rules can be authored by medical experts 234 and provided in a table of rules.
  • the following tables are examples of such rules.
  • Pneumonitis; primary symptoms: shortOfBreath, coughSeverity, oxygen, oxygenWalking.
    CTCAE Grade: Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color: green | yellow | red
    Evaluation Priority: 0 | 1 | 2 | 3 | 4
    Primary Symptoms: 0 for all | 1 on any | 1 on any | 2 on any | 2 on any | 3 on any | 3 on any | 4 on any
    Associated Symptoms: 0-4 for all | 0-2 for all | 3-4 on any | 0-2 for all | 3-4 on any | 0-2 for all | 3-4 on any | 0-4 for all | 0-4 for all
    Associated Symptoms: 0-4 for all | 0-2 for all | 3-4 on any | 0-2 for all | 3-4 on any | 0-2 for all | 3-4 on any | 0-4 for all
  • Uveitis; primary symptoms: visionChanges, eyePain.
    CTCAE Grade: Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color: green | yellow | red
    Evaluation Priority: 0 | 1 | 2 | 3 | 4
    Primary Symptoms: 0 for all | 1 on any | 1 on any | 2 on any | 2 on any | 3 on any | 4 on any
    Associated Symptoms: 0-4 for all | 0-2 for all | 3 on any | 0-2 for all | 3 on any | 0-3 for all | 0-3 for all
  • In the tables, ALL means AND; ANY means OR of 1 or more.
  • Patients 224 can report symptoms in grades, for example 0 to 4 for severity of blood. Grading scores determine the level of a symptom. Symptoms can be evaluated in right to left order in the Table, as indicated by the evaluation number. If the primary symptom is met (i.e., ALL or ANY evaluates to True), then the associated symptoms are also evaluated; a grade is met when both the primary symptom and (associated symptoms OR clinical modifier) are True.
  • the detect component 218 further evaluates symptom scores for each category (e.g., Myositis, Dermatitis, Pneumonitis, etc.). Scores can be calculated from a SUM of (severity grade X weights). Patient 224 can define the grade; medical experts 234 , using the authoring tool 222 , can define the weights. Scores not within the Table guidelines can be filtered out. Filtered questions will not be used. Common Terminology Criteria for Adverse Events or CTCAE are a set of criteria for the standardized classification of adverse effects of drugs used in cancer therapy. CTCAE uses a range of grades from 1 to 5.
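  • a minimal sketch, assuming a simplified rule schema, of evaluating one grade rule (ALL/ANY over symptom grades) and computing the weighted score:
        def rule_met(rule, grades):
            # rule: {"symptoms": [...], "min": grade threshold, "mode": "all" or "any"}
            values = [grades.get(s, 0) for s in rule["symptoms"]]
            if rule["mode"] == "all":                       # ALL meaning AND
                return all(v >= rule["min"] for v in values)
            return any(v >= rule["min"] for v in values)    # ANY meaning OR of 1 or more

        def weighted_score(grades, weights):
            # Score = SUM(severity grade x expert-authored weight)
            return sum(grades.get(s, 0) * w for s, w in weights.items())

        pneumonitis_grade3_primary = {"symptoms": ["shortOfBreath", "coughSeverity", "oxygen"],
                                      "min": 3, "mode": "any"}
        grades = {"shortOfBreath": 3, "coughSeverity": 2}
        if rule_met(pneumonitis_grade3_primary, grades):
            score = weighted_score(grades, {"shortOfBreath": 2.0, "coughSeverity": 1.0})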
  • the scores can include CTCAE grades along with the scores determined by the digital therapy module 120 .
  • the CTCAE grade is per system area, while the scores determined by the digital therapy module 120 are a sum of all scores.
  • the detect component 218 can look at trends of the CTCAE grade or sum of scores over time (longitudinal) and determine if an "alert" should be set to drive further questions to patient 224 .
  • the detect component 218 can store scores in journal 208 (data repository 202 ). Authorized parties can view the journal 208 .
  • the detect component 218 can invoke the recommend component 220 .
  • the recommend component 220 provides a set of recommendations for treatment. When invoked by the detect component 218 , the recommend component 220 can retrieve context or deep layer patient profile (DLPP) 204 . The recommend component 220 can load actions from the data repository 202 for programs that patient 224 is associated with. The recommend component 220 can run each action and test for pre-conditions. If a pre-condition is true, then tasks associated with the action are executed. Tasks can notify provider 238 , add a recommendation, add longitudinal data, add follow up, notify patient 224 , order a medical test for patient 224 . The recommend component 220 can build a collection of action results. The recommend component 220 can run through the recommendations and relate each to a group ID (e.g.
  • the recommend component 220 can provide traceability of conflicting recommendations for health care providers 238 . Conditions that led to the recommendation can be attached on the recommendation.
  • the recommend component 220 can determine what knowledge model, and what level of the knowledge model, the recommendation came from. Recommendations can be weighted; for example, a weight of "1" can indicate the recommendation is required by the health care provider 238, while a weight of "0" may mean it is optional. A health care provider 238 may be required to indicate that they complied with the recommendation, or a check may be made that the recommendation was followed. Recommendations can be written to the data repository 202. Deep layer patient profiles (DLPP) 204 and context can be supplemented with recommendations, provided to the detect component 218, and used by the dialog component 212.
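  • A minimal sketch of running actions with pre-conditions and building weighted, traceable recommendations follows; the class names, fields, and the example pre-condition are illustrative assumptions rather than the actual module 120 interfaces.
      # Minimal sketch of action evaluation in a recommend step; names and fields are illustrative.
      from dataclasses import dataclass, field
      from typing import Callable, Dict, List

      @dataclass
      class Recommendation:
          text: str
          weight: int                    # 1 = required by provider, 0 = optional
          source_model: str              # knowledge model the recommendation came from
          conditions: List[str] = field(default_factory=list)   # traceability

      @dataclass
      class Action:
          pre_condition: Callable[[Dict], bool]   # evaluated against the DLPP/context
          tasks: List[Callable[[Dict, List[Recommendation]], None]]

      def run_actions(dlpp: Dict, actions: List[Action]) -> List[Recommendation]:
          results: List[Recommendation] = []
          for action in actions:
              if action.pre_condition(dlpp):       # only fire when the pre-condition holds
                  for task in action.tasks:
                      task(dlpp, results)
          return results

      # Example action: recommend further evaluation when an illustrative score is high.
      def add_hold_recommendation(dlpp, results):
          results.append(Recommendation(
              text="Consider holding IO therapy and ordering chest imaging",
              weight=1, source_model="pneumonitis/detect",
              conditions=[f"pneumonitis_score={dlpp['pneumonitis_score']}"]))

      actions = [Action(pre_condition=lambda d: d["pneumonitis_score"] >= 12,
                        tasks=[add_hold_recommendation])]
      print(run_actions({"pneumonitis_score": 14}, actions))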
  • one program can be IO treatment related adverse event management.
  • other examples of programs can include chemotherapy-related, radiation therapy-related, targeted therapy-related, hormone therapy-related adverse event management, etc.
  • a set of knowledge models defines a program.
  • a set of questions, rules and actions are provided for the program.
  • Certain knowledge models are applicable to certain components of the digital therapy module 120 .
  • the knowledge models are defined by variables, weights and values, and other factors.
  • a “trigger” is a symptom that drives the knowledge model to be activated (e.g., rash for dermatitis, diarrhea for colitis, elevated liver function test results for hepatitis).
  • a “primary symptom” is related to a priority, weights, and an ID (e.g., headache).
  • An "associated symptom(s)" is related to a priority, weights, and an ID (e.g., location, severity, etc. of the headache).
  • a “clinical modifier” is related to a priority, weights, and an ID (e.g., had a prior irAE).
  • a “primary symptom” is related to a priority, weights, and an ID (e.g., headache).
  • An "associated symptom(s)" is related to a priority, weights, and an ID (e.g., location, severity, etc. of the headache).
  • a “clinical modifier” is related to a priority, weights, and an ID (e.g., had a prior irAE).
  • a recommendation can be weighted by a health care provider 238 and identified.
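  • The following is an illustrative sketch of how the knowledge model elements described above (trigger, primary symptom, associated symptoms, clinical modifiers, and weighted recommendations) could be represented as data structures; the class and field names are assumptions that parallel the knowledge model categorization listed later in this description.
      # Illustrative data structures for knowledge-model elements; field names are assumptions.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class SymptomSpec:
          symptom_id: str     # e.g. "headache", or "location"/"severity" for associated symptoms
          priority: int
          weight: float

      @dataclass
      class WeightedRecommendation:
          rec_id: str
          text: str
          weight: int         # 1 = physician required, 0 = optional

      @dataclass
      class DetectModel:
          trigger: str                                   # e.g. "rash" activates the dermatitis model
          primary: SymptomSpec
          associated: List[SymptomSpec] = field(default_factory=list)
          clinical_modifiers: List[SymptomSpec] = field(default_factory=list)
          recommendations: List[WeightedRecommendation] = field(default_factory=list)

      dermatitis = DetectModel(
          trigger="rash",
          primary=SymptomSpec("rash", priority=1, weight=2.0),
          associated=[SymptomSpec("rashLocation", 2, 1.0), SymptomSpec("rashSeverity", 2, 1.5)],
          clinical_modifiers=[SymptomSpec("priorIrAE", 3, 1.0)])
      print(dermatitis.trigger, len(dermatitis.associated))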
  • FIG. 4 is a block diagram that shows a deep layer patient profile (DLPP) multi-dimensional image having a patient specific volume, density and shape.
  • the multi-dimensions can be understood as a multi-helix set of adverse events starting and stopping as time progresses as shown by events 400 .
  • additional dimensions of information can be sliced and can include, but are not limited to, images (squares), audio (ovals), body proximity area (x,y,z x2,y2,z2), recommendations, medicines per recommendation, etc.
  • Patients will generally have different volumes, density and shapes.
  • Within a disease there is a bounding of known differences, which aids in machine learning.
  • Across diseases there is an intersection of slices that are similar. For example, with cancer and heart disease, many of the symptoms, vitals and diseases are different. However, there is a higher correlation of patients that become cancer free and then develop heart disease.
  • DLPP can enable machine learning and image processing training and comparison within and across diseases.
  • Adverse event reports of rashes using numerical ranges 1 to 5 over a time period are less dense than image reporting or voice audio description reporting.
  • patient reports have been mapped to intensities.
  • a slice over the dense areas for rash at a period of time is extracted across many patients and used for machine learning training.
  • the machine learning can implement area intensities for images training, area intensities for audio training, and a combination for audio feature extraction (tense, happy, slurring) and image training. Given a set of extracted slices that have been processed for machine learning training, possible future contents can be produced. Future contents include rashes, headaches, mood, vitals, etc.
  • mapped data is sliced to extract intensities that relate to location on a patient's body, physically or emotionally, as shown in 402.
  • Real reported rash images, or artificial canonical images are projected onto an artificial body to create a “movie like” reporting (actual data) with various endings (possible futures) as shown in 402 .
  • the patient's "movie" can be carried across illnesses and treating physicians and includes non-disease specific data that provide non-intuitive visual insights, including height, weight, slouching, actual symptoms vs. reported symptoms, valid symptoms vs. reduced symptoms (real pain), cancer size, cancer consistency, etc.
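  • The following is a minimal sketch, under assumed array shapes and symptom names, of how reported symptom grades could be mapped to intensities in a DLPP-like array and how a slice over a dense region could be extracted for machine learning training.
      # Minimal sketch of mapping reported intensities into a DLPP-like array and slicing it.
      # The array shape and symptom names are illustrative assumptions.
      import numpy as np

      days, symptoms = 14, ["rash", "headache", "fatigue"]
      dlpp = np.zeros((days, len(symptoms)), dtype=np.float32)   # time x symptom intensities

      # Map patient reports (day, symptom, grade 0-4) to intensities in the profile.
      reports = [(0, "rash", 1), (3, "rash", 2), (7, "rash", 4), (7, "headache", 2)]
      for day, name, grade in reports:
          dlpp[day, symptoms.index(name)] = grade / 4.0          # normalize grade to [0, 1]

      # Extract a "slice" over the rash dimension for a time window, e.g. for ML training.
      rash_slice = dlpp[2:9, symptoms.index("rash")]
      print(rash_slice)   # intensities for days 2..8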
  • FIG. 5 is a generalized flowchart 500 for patient conversation for digital therapeutics.
  • the health AI system 118 is implemented.
  • the process 500 can be performed during interaction with dialog component 212 of the digital therapy module 120 .
  • the order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
  • the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • a patient checks in.
  • patient(s) 224 through devices 226 invoke an application on the devices 226 to connect with the information handling system 100, and specifically with the dialog component 212.
  • the patient is identified and logged in. For example, an ID related to the patient recognized by the dialog component 212 logs in the patient(s) 224 .
  • if the session is complete, then following the YES branch of block 506, the process 500 ends. Otherwise, following the NO branch of block 506, at step 510, conversation between the patient and the dialog component 212 begins or continues.
  • the conversation can include artificial intelligence (AI) generated questions from the dialog component 212 to the patient(s) 224.
  • Deep layer patient profile (DLPP) data is loaded.
  • Deep layer patient profile data can include patient record(s), longitudinal data, current recommendations, state of the question process to the patient, question preferences, genre, etc.
  • the data repository 202 provides a specific deep layer patient profile (DLPP) to deep layer patient profiles (DLPP) 204 .
  • the dialog component 212 then loads the specific deep layer patient profile (DLPP).
  • the classification is performed on the deep layer patient profile (DLPP). For example, classification can be directed to format of the data, such as mime type as discussed above.
  • the dialog component 212 can invoke text classifier 210 to perform the classification.
  • the deep layer patient profile (DLPP) is sent to the transform component 214 .
  • knowledge models and rules are sent to the transform component 214 .
  • the transform component 214 is used to create machine readable data from information such as patient answers to questions.
  • the transform component 214 can be implemented to provide data that is readable by the health AI system 118 , and specifically the digital therapy module 120 and its components.
  • the authoring tool 222 provides the DTx knowledge models and rules to the transform component 214 .
  • data is saved.
  • the transform component 214 saves data to the data repository 202.
  • the deep layer patient profile (DLPP) is sent to the detect component 218 .
  • the detect component 218 is used to filter out patient symptoms and rule out certain patient reported outcomes (PRO).
  • knowledge components and rules are loaded.
  • the authoring tool 222 provides or loads knowledge components and rules to the detect component 218 .
  • data is saved. In certain implementations, the data is saved from the detect component 218 to the data repository 202 .
  • the deep layer patient profile (DLPP) is sent.
  • the deep layer patient profile (DLPP) can be sent from the detect component 218 to the recommend component 220 .
  • actions are loaded. For example, the actions are loaded from the data repository 202 to the recommend component 220 .
  • the recommend component 220 can provide a set of recommendations for treatment of the patient.
  • recommend component 220 is invoked by the detect component 218 , and receives context or deep layer patient profile (DLPP), and loads and runs the actions for programs that the patient is associated with. At block 532 , recommendations are sent.
  • the recommend component 220 can send recommendations to be stored in the data repository 202 .
  • deep layer patient profiles are sent.
  • the recommend component 220 sends the deep layer patient profiles (DLPP) to the detect component 218 .
  • the detect component 218 sends the deep layer patient profile (DLPP) to the transform component 214 .
  • the transform component 214 sends the deep layer patient profile (DLPP) to the dialog component 212.
  • the data repository 202 can load questions to the dialog component 212 .
  • the questions are modified.
  • the dialog component can modify the questions.
  • an update is performed on the journal 208 .
  • an updated deep layer patient profile (DLPP) is saved.
  • the updated deep layer patient profile (DLPP) is saved from the dialog component 212 to the deep layer patient profiles (DLPP) 204 , and the deep layer patient profiles (DLPP) 204 save the deep layer patient profile (DLPP) to the data repository 202 .
  • the questions are delivered to the patient.
  • the dialog component 212 provides the questions to a device 226 of user 224 .
  • the questions can be delivered through an application on the device 226. If the patient decides to end the session, or if the questions are completed, or if the session times out, then following the YES branch of block 506, the process 500 ends. Otherwise, following the NO branch of block 506, the process 500 continues.
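  • The following is a high-level structural sketch of the flowchart 500 conversation loop; the component objects and method names (repo, dialog.ask, detect.run, etc.) are illustrative placeholders, not the actual interfaces of the digital therapy module 120.
      # Structural sketch of the flowchart 500 loop; component interfaces are placeholders.
      def conversation_session(patient_id, repo, dialog, transform, detect, recommend):
          dlpp = repo.load_dlpp(patient_id)                  # load deep layer patient profile
          while True:
              answers = dialog.ask(patient_id, dlpp)         # converse with the patient
              if answers is None:                            # session ended, timed out, or complete
                  return
              machine_data = transform.run(answers, repo.load_knowledge_models())
              repo.save(machine_data)                        # persist transformed data
              dlpp = detect.run(dlpp, machine_data)          # filter symptoms, score, set alerts
              dlpp = recommend.run(dlpp, repo.load_actions(patient_id))
              repo.save_recommendations(dlpp)                # store recommendations
              questions = repo.load_questions(patient_id)
              dialog.update_questions(questions, dlpp)       # modify questions for the next round
              repo.save_dlpp(dlpp)                           # journal and profile update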
  • FIG. 6 is a generalized flowchart 600 for patient updates.
  • the health AI system 118 is implemented.
  • the process 600 can be performed during interaction with diagnose component 216 of the digital therapy module 120 .
  • the order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
  • the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • the process 600 starts.
  • lab results are received.
  • lab results are made available at health data 242 , and in the form of electronic health records (EHR) 244 .
  • the lab results can be received by the data repository 202 .
  • the lab results are loaded.
  • the lab result can be loaded from the data repository 202 to the diagnose component 216 .
  • deep layer patient profile (DLPP) is loaded.
  • the deep layer patient profile (DLPP) can be loaded from deep layer patient profiles (DLPP) 204 to the diagnose component 216 .
  • recommendations are updated.
  • the diagnose component 216 can update the recommendations to the recommend component 220 .
  • recommendations are saved.
  • the recommend component 220 can save the recommendations to the data repository 202 . If the updates are complete, then following the YES branch of block 614 , the process 600 ends at block 616 . Otherwise, following the NO branch of block 614 , the process 600 continues.
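  • A corresponding sketch of the flowchart 600 update path (lab result to diagnose to recommend) follows; again, the objects and method names are illustrative placeholders rather than the actual component interfaces.
      # Structural sketch of the flowchart 600 update path; interfaces are placeholders.
      def on_lab_result(patient_id, repo, diagnose, recommend):
          labs = repo.load_lab_results(patient_id)           # received via health data / EHR feed
          dlpp = repo.load_dlpp(patient_id)                  # load deep layer patient profile
          dlpp = diagnose.run(dlpp, labs)                    # diagnose component updates findings
          recommendations = recommend.run(dlpp, repo.load_actions(patient_id))
          repo.save_recommendations(recommendations)         # persist updated recommendations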
  • FIG. 7 is a generalized flowchart 700 for provider updates.
  • the information handling system 100 is implemented.
  • health care providers 238 through devices 240 interact with deep layer patient profiles (DLPP) 204 of the information handling system 100 .
  • the order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
  • the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • the process 700 starts.
  • patient profile is loaded.
  • the deep layer patient profile (DLPP) is loaded from deep layer patient profiles (DLPP) 204 to a device(s) 240 of health care provider(s) 238 .
  • recommendations are updated.
  • the health care provider(s) 238 through device(s) 240 can update recommendations to the recommend component 220 .
  • deep layer patient profile (DLPP) is updated.
  • the health care provider(s) 238 through device(s) 240 can provide the updated deep layer patient profile (DLPP) to the deep layer patient profiles (DLPP) 204 of data repository 202. If updates are complete, following the YES branch of block 710, the process 700 ends. Otherwise, following the NO branch of block 710, step 704 is performed.
  • FIG. 8 is a generalized flowchart 800 for a patient conversation for a patient to submit patient symptoms.
  • the process 800 in general describes the interaction of the components of digital therapy module 120 when a patient 224 submits patient symptoms or patient reported outcomes (PRO).
  • DTx knowledge models, deep layer patient profiles (DLPP), questions, and rules can be implemented.
  • the process 800 starts.
  • a patient checks in.
  • the dialog component 212 is initiated.
  • the transform component 214 is initiated.
  • the detect component 218 is initiated.
  • the recommend component 220 is initiated.
  • FIG. 9 is a generalized flowchart 900 as to when an electronic medical record (EMR) or lab notification is received.
  • the process 900 in general describes the interaction of the components of digital therapy module 120 when an EMR or lab notification is received for a patient 224.
  • DTx knowledge models, deep layer patient profile (DLPP), questions, and rules can be implemented.
  • the process 900 starts.
  • a lab result or EMR is received.
  • the diagnose component 216 is initiated.
  • the recommend component 220 is initiated.
  • FIG. 10 is a generalized flowchart 1000 as to when a health care provider changes data.
  • the information handling system 100 is implemented.
  • the process 1000 in general describes the interaction of the components of digital therapy module 120 when a health care provider 238 changes/modifies/adds patient related data. Throughout the process 1000 , as described above, DTx knowledge models, deep layer patient profiles (DLPP), questions, and rules can be implemented.
  • the process 1000 starts.
  • a health care provider 238 makes a change to patient data.
  • the detect component 218 is initiated.
  • the recommend component 220 is initiated.
  • FIG. 11 is an example method, flow or care pathway. Although depicted linearly, it is to be understood that the blocks can be performed in parallel, and that feedback from succeeding blocks can be provided to preceding blocks.
  • a flow or care pathway 1100 can be implemented as part of the digital therapeutics or DTx.
  • Block 1102 performs patient monitoring.
  • the monitoring block 1102 can be performed by patients 224 or delegates of patients 224.
  • patient generated health data or PGHD 1104 is monitored.
  • the PGHD 1104 can be self-reported symptoms such as patient reported outcomes (PRO).
  • PGHD 1104 can be gathered through patient data monitoring/sensing devices 228 , such as Bluetooth enabled biometrics that measure patient vital signs, such as oxygen.
  • Block 1106 performs a remote triage.
  • the remote triage block 1106 can be video or voice, and be performed by care coordinators, such as nurses.
  • Block 1108 is a timing for the triage. Specifically, the timing is related to urgency for treatment. For example, an indication of "red" can be immediate, "yellow" can be for the next day, and "green" can be for notification only.
  • Block 1110 is directed to prioritization for an evaluation visit. For example, various levels can be implemented as to the priority, such as “P1” for emergent/immediate, “P2” for urgent/same day, “P3” for within 72 hours, “P4” for a watch list, and “P5” for a routine visit.
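  • A minimal sketch of how the timing of block 1108 and the evaluation priority of block 1110 could be represented follows; the specific mapping from alert color and grade to a priority level is an illustrative assumption, not the authored care pathway rules.
      # Illustrative mapping for triage timing (block 1108) and evaluation priority (block 1110).
      # The color/grade-to-priority assignment here is an assumption, not the authored rules.
      TRIAGE_TIMING = {"red": "immediate", "yellow": "next day", "green": "notification only"}

      EVALUATION_PRIORITY = {
          "P1": "emergent/immediate",
          "P2": "urgent/same day",
          "P3": "within 72 hours",
          "P4": "watch list",
          "P5": "routine visit",
      }

      def triage(alert_color, ctcae_grade):
          timing = TRIAGE_TIMING[alert_color]
          if ctcae_grade >= 4:
              level = "P1"
          elif ctcae_grade == 3:
              level = "P2"
          elif ctcae_grade == 2:
              level = "P3"
          elif alert_color == "yellow":
              level = "P4"
          else:
              level = "P5"
          return timing, EVALUATION_PRIORITY[level]

      print(triage("red", 3))  # -> ('immediate', 'urgent/same day')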
  • Block 1112 performs a diagnosis.
  • the diagnosis block 1112 can be performed by health care providers 238, such as physicians.
  • an application can be used by the health care providers 238.
  • Block 1114 is evaluation and management which can provide for a diagnostic workup, such as exams/lab tests, procedures and imaging, consultation, etc.
  • symptom management can be provided, such as care setting, supportive care, immunosuppressive prescriptions, etc.
  • Block 1116 performs treatment.
  • the treatment block 1116 can be performed by health care providers 238 , such as physicians.
  • an application can be used by the health care providers 238 .
  • Block 1118 is evaluation and management which can distinguish between inpatient versus outpatient evaluation and management, virtual versus in-person evaluation and management, provide for monitoring labs, provide IO therapy status, provide immunosuppressive prescriptions, etc.
  • FIG. 12A is a block diagram of digital therapeutics or DTx that can be used to implement the system and method of the present invention.
  • the information handling system 100 and the health AI system 118 as described can be used as the platform for DTx 1200.
  • the care pathway 1100 described above can receive input from, and provide output to, various entities. Included in the care pathway are DTx knowledge models 246 and deep layer patient profiles (DLPP) 204.
  • medical experts 234 which can include drug and disease experts, provide input as to DTx knowledge models 246 through authoring tool 222 .
  • the care pathway 1100 receives data from data component 1202 .
  • Data component 1202 can include patient generated health data 1204 , which can include biometrics, etc.
  • Data component 1202 can further include patient reported outcomes data 1206 , clinical data (electronic health records) 1208 , and environmental/social determinants data 1210 .
  • the data from data component 1202 can come from health data 242 and devices 228 described in FIG. 2 .
  • data from data component 1202 can be processed by a data cleansing component 1212.
  • Data from the data cleansing component 1212 can be received by the transform component 214.
  • Data from the transform component 214 can be received by the care pathway 1100.
  • authorizing parties 1214 can request treatment and use of the DTx 1200.
  • Authorizing parties 1214 can include patients 224 , which can include actual patients and delegates/caregivers of patients, etc.
  • Authorizing parties 1214 can be health care providers 238 , which can include primary prescribing physicians/staff, non-treating physicians/staff, etc.
  • Entities such as patients 224 and health care providers 238 can receive output, such as recommendations, from the care pathway 1100.
  • the output from care pathway 1100 can be processed by application program interfaces (API) 1218.
  • API 1218 can include a localization layer to interface or integrate with different data sources, or cloud environment for deployment.
  • FIG. 12B is a block diagram of another implementation of digital therapeutics or DTx that can be used to implement the system and method of the present invention.
  • a DTx "Studio" can provide permutations of customizable content including domains, programs, rules, actions, conditions, adverse events (see the DTx knowledge model above) and care pathways. In addition, patients, providers, and experts can be included.
  • the “Studio” can enable multiple deployments of topologies, geographies, and governing laws/regulations, etc. to direct customization and implementation.
  • the "Studio" can deploy entry points that can include (i) a hosted and managed environment where an organization and its patients use the product without customizations (e.g., a clinical trial at an existing cancer center with existing patients); (ii) a hosted and managed environment where an organization enhances customizable content; and (iii) an ability for a medical organization and providers to customize and manage their own DTx product.
  • Customizable content can include modified knowledge models, questions, rules, actions, AEs and the method for a care pathway.
  • Studio customization can include the ability to add additional integration of devices, conversation flows and genres.
  • Studio customization can include the ability to add additional integration of EHRs, third party systems and patient data sources.
  • Studio can include selectable and customizable components for modifying care pathway that includes dialog, detect, diagnose and recommend.
  • FIG. 13 is a generalized flowchart 1300 for presenting patient related information implementing a digital deep layer patient profile.
  • the health AI system 118 is implemented.
  • the order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
  • the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • the process 1300 starts.
  • receiving data that includes patient data, lab result data, machine learning calculation data related to the patient, and physician result data.
  • the health AI system 118 performs the step.
  • mapping the data as to intensities, multiple dimensions and time is performed.
  • the view 300 of structured patient profile symptoms, related to conditions and then related to a score, is provided, along with how these combined values can be mapped to intensities in a deep layer patient profile (DLPP).
  • when mapping longitudinal data 302, the time dimension can be incremented, and additional intensities are added 304.
  • the deep layer patient profile (DLPP) module 122 can store longitudinal data as a deep layer patient profile (DLPP).
  • the digital deep layer patient profile is a binary multiple dimension image deidentified as to patient, disease, time or entity.
  • the digital deep layer patient profile is multi-sliced into portions of the digital deep layer patient profile and is used to perform algorithms as to areas of interest.
  • the digital deep layer patient profile is searched as to edge detection, rate of change, contour identification, color enhancing, color reduction, dilation, moments and masking, to deliver non-intuitive trends and correlations in the data.
  • the digital deep layer patient profile is compared, using machine learning trained patient profile sets, by matching sliced and transformed portions of the digital deep layer patient profile to portions of other digital deep layer patient profiles to determine similar patient profiles.
  • the digital deep layer patient profile is implemented with machine learning algorithms to present a future view of the digital deep layer patient profile that shows predicted treatment changes, predicted medicine changes, adverse event expected results, and expected patient reactions.
  • the process 1300 ends.
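  • As an illustration of the kinds of operations listed above, the following sketch applies simple rate-of-change, edge-like, and masking operations to a small DLPP-style intensity array using numpy; the array contents and thresholds are illustrative assumptions only.
      # Sketch of simple image-processing style operations over a DLPP intensity array
      # (rate of change, a crude edge measure, and masking); values are illustrative.
      import numpy as np

      dlpp = np.array([[0.00, 0.25, 0.25],
                       [0.25, 0.25, 0.50],
                       [0.50, 0.50, 0.75],
                       [1.00, 0.50, 0.75]], dtype=np.float32)   # time x symptom intensities

      rate_of_change = np.diff(dlpp, axis=0)          # per-symptom change between reports
      edges = np.abs(rate_of_change) > 0.4            # "edge": abrupt jump in intensity
      masked = np.where(dlpp > 0.5, dlpp, 0.0)        # masking: keep only high-intensity regions

      print(rate_of_change[:, 0])   # trend for the first symptom column
      print(edges.any(axis=0))      # which symptoms show an abrupt change
      print(masked)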
  • In FIG. 14, an example screen presentation of a health care provider user interface 1400 is shown.
  • the user interface 1400 provides views into patient reports, status, recommendations, etc.
  • In FIG. 15, an example screen presentation of a health care provider user interface 1500 is shown.
  • the user interface 1500 provides views into patient reports, status, recommendations, etc.
  • In FIG. 16, an example screen presentation of a health care provider user interface 1600 is shown.
  • the user interface 1600 provides views into patient reports, status, recommendations, etc.
  • In FIG. 17, an example screen presentation of a health care provider user interface 1700 is shown.
  • the user interface 1700 provides views into patient reports, status, recommendations, etc.
  • In FIG. 18, an example screen presentation of a health care provider user interface 1800 is shown.
  • the user interface 1800 provides views into patient reports, status, recommendations, etc.
  • In FIG. 19, an example screen presentation of a health care provider user interface 1900 is shown.
  • the user interface 1900 provides views into patient reports, status, recommendations, etc.
  • In FIG. 20, an example screen presentation of an authoring tool user interface 2000 is shown.
  • the user interface 2000 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • In FIG. 21, an example screen presentation of an authoring tool user interface 2100 is shown.
  • the user interface 2100 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • In FIG. 22, an example screen presentation of an authoring tool user interface 2200 is shown.
  • the user interface 2200 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • In FIG. 23, an example screen presentation of an authoring tool user interface 2300 is shown.
  • the user interface 2300 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • In FIG. 24, an example screen presentation of an authoring tool user interface 2400 is shown.
  • the user interface 2400 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • In FIG. 25, an example screen presentation of an authoring tool user interface 2500 is shown.
  • the user interface 2500 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • the present invention may be embodied as a method, system, or computer program product. Accordingly, embodiments of the invention may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an embodiment combining software and hardware. These various embodiments may all generally be referred to herein as a “component,” “circuit,” “module,” or “system.” Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
  • the computer-usable or computer-readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, or a magnetic storage device.
  • a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Computer program code for carrying out operations of the present invention may be written in a programming language such as JavaScript, Python, C# or the like.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Embodiments of the invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medicinal Chemistry (AREA)
  • Pharmacology & Pharmacy (AREA)
  • Toxicology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Chemical & Material Sciences (AREA)
  • Bioethics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A system, method, and computer-readable medium are disclosed for digital therapeutics directed to patient care specific to a disease, implementing a digital deep layer patient profile. Patient related information is presented by receiving data that includes patient data, lab result data, machine learning calculation data related to the patient, and physician result data. The data is mapped as to intensities, multiple dimensions and time. The mapping is converted to create unstructured binary data with binary correlations as a digital deep layer patient profile. The digital deep layer patient profile can be processed with machine learning and image processing algorithms.

Description

    RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/700,312, filed on Dec. 2, 2019, which claims the benefit of priority of U.S. Provisional Patent Application Ser. No. 62/888,777, filed Aug. 19, 2019, incorporated herein by reference.
  • BACKGROUND OF THE INVENTION Field of the Invention
  • The present invention relates to information handling systems. More specifically, embodiments of the invention relate to artificial intelligence systems for digital therapeutics that implement a digital deep layer patient profile.
  • Description of the Related Art
  • In treating patients with cancers or serious diseases, several approaches can be taken, including radiation, chemical, surgical, and immunotherapy. Immunotherapy makes use of a body's natural defenses or immune system to fight disease. An area of immunotherapy used to specifically treat cancer is known as immuno-oncology (IO). Cancerous cells can thrive because of their ability to hide from immune systems. Immunotherapies or IO can mark cancer cells to allow immune systems to find and destroy cancer cells. Certain immunotherapies or IO can boost immune systems to better fight against cancer.
  • Part of IO therapy or treatment includes activating a body's immune system response to overcome cancerous tumor survival and growth. The treatment can also cause adverse side effects, such as attacks on healthy cells while attacking cancerous cells. Such adverse responses can be referred to as Immune-mediated or Immune Related Adverse Events or irAE. irAE can affect any organ system in the body, including the gastrointestinal tract, skin, heart and lung, liver and kidneys, the nervous system or brain, endocrine organs such as thyroid, pancreas, and many others. Examples of irAE symptoms can include joint pain, swelling, soreness, redness, skin itchiness, rashes, fever, chills, dizziness, nausea, vomiting, muscle/joint pain, fatigue, headaches, trouble breathing, low/high blood pressure, swelling, retaining fluid, heart palpitations, sinus congestion, diarrhea, infection, vision problems, etc.
  • Ongoing irAEs can lead to complications or end of life. It is understandable that individual patients will have different reactions to certain immunotherapies or IO based on age, gender, genetics, prior medical history, cancer type, mutation type, and other differentiators. It is also understandable that categorically similar patients, once appropriately defined, may have categorically similar reactions to certain immunotherapies or IO. Patients' irAEs will be different and can dynamically change over time. Effective management of patient treatment, assuring that more good is done than harm, requires accurately understanding the therapy and the effects of such therapy for each individual patient.
  • SUMMARY OF THE INVENTION
  • A system, method, and computer-readable medium are disclosed for digital therapeutics directed to patient care specific to a disease, implementing a digital deep layer patient profile. Patient related information is presented by receiving data that includes patient data, lab result data, machine learning calculation data related to the patient, and physician result data. The data is mapped as to intensities, multiple dimensions and time. The mapping is converted to create unstructured binary data with binary correlations as a digital deep layer patient profile. The digital deep layer patient profile can be processed with machine learning and image processing algorithms.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention may be better understood, and its numerous objects, features and advantages made apparent to those skilled in the art by referencing the accompanying drawings. The use of the same reference number throughout the several figures designates a like or similar element.
  • FIG. 1 shows a general illustration of components of an information handling system as implemented in the system and method of the present invention;
  • FIG. 2 shows a block diagram of a health AI environment;
  • FIGS. 3A-3E show images representing a digital deep layer patient profile (DLPP);
  • FIG. 4 shows a deep layer patient profile (DLPP) multi-dimensional image having a patient specific volume, density and shape;
  • FIG. 5 shows a flowchart for patient conversation for digital therapeutics;
  • FIG. 6 shows a flowchart for patient updates;
  • FIG. 7 shows a flowchart for provider updates;
  • FIG. 8 shows a flowchart for patient conversation for a patient to submit patient symptoms;
  • FIG. 9 shows a flowchart as to when an electronic medical record (EMR) or lab notification is received;
  • FIG. 10 shows a flowchart as to when a health care provider changes data;
  • FIG. 11 shows a flow or care pathway for digital therapeutics;
  • FIGS. 12A, 12B show a block diagram of digital therapeutics;
  • FIG. 13 shows a flowchart for presenting patient related information implementing a digital deep layer patient profile;
  • FIG. 14 shows a screen presentation of a health care provider user interface;
  • FIG. 15 shows a screen presentation of a health care provider user interface;
  • FIG. 16 shows a screen presentation of a health care provider user interface;
  • FIG. 17 shows a screen presentation of a health care provider user interface;
  • FIG. 18 shows a screen presentation of a health care provider user interface;
  • FIG. 19 shows a screen presentation of a health care provider user interface;
  • FIG. 20 shows a screen presentation of an authoring tool user interface;
  • FIG. 21 shows a screen presentation of an authoring tool user interface;
  • FIG. 22 shows a screen presentation of an authoring tool user interface;
  • FIG. 23 shows a screen presentation of an authoring tool user interface;
  • FIG. 24 shows a screen presentation of an authoring tool user interface; and
  • FIG. 25 shows a screen presentation of an authoring tool user interface.
  • DETAILED DESCRIPTION
  • Digital Therapeutics (DTx) includes software programs that provide efficacious interventions to patients to prevent, manage, or treat a broad spectrum of physical, mental, and behavioral conditions. DTx can capture scientific evidence and expert knowledge about a cancer type and its treatment with a specific drug or class of drugs. DTx captures and integrates real time patient data and clinical information. DTx can generate insights personalized to a specific patient. When prescribed together with an anti-cancer drug, DTX can incorporate real-time patient data in context of curated clinical knowledge to generate personalized insights in clinically relevant time, activating the appropriate healthcare providers at the right time to detect, diagnose and treat irAEs. This concept is expressed simply as DTx+drug therapy=precision and personalization.
  • As cancer patients are treated with immunotherapy, such as immuno-oncology (IO), positive effects towards treating the disease are expected; however, in certain instances during treatment, adverse effects from the treatment can be experienced. If the adverse effects continue, the patient can experience severe complications. Furthermore, if a costly drug treatment or clinical drug trial is involved and adverse effects are not properly identified, the investment may be compromised. Determining such adverse effects and their causes can be performed using intelligent information handling systems that gather and store information, such as patient symptoms during treatment.
  • An information handling system generally processes, compiles, stores, and/or communicates information or data for business, personal, or other purposes thereby allowing users to take advantage of the value of the information. Because technology and information handling needs and requirements vary between different users or applications, information handling systems may also vary regarding what information is handled, how the information is protected, how the information is handled, how much information is processed, stored, or communicated, and how quickly and efficiently the information may be processed, stored, or communicated. In addition, information handling systems may include a variety of hardware and software components that may be configured to process, store, and communicate information and may include one or more computer systems, data storage systems, and networking systems.
  • For purposes of this disclosure, an information handling system may include any instrumentality or aggregate of instrumentalities operable to compute, classify, predict, protect, process, transmit, receive, retrieve, originate, switch, store, display, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes. For example, an information handling system may be a personal computer, a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price. The information handling system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory. Additional components of the information handling system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, a camera, a microphone, and a video display. The information handling system may also include one or more buses operable to transmit communications between the various hardware components.
  • FIG. 1 is a generalized illustration of an information handling system 100 that can be used to implement the system and method of the present invention. The information handling system 100 includes a processor (e.g., central processor unit or “CPU”) 102, input/output (I/O) devices 104, such as a display, a keyboard, a mouse, and associated controllers, a hard drive or disk storage 106, and various other subsystems 108. In various embodiments, the information handling system 100 also includes network port 110 operable to connect to a network 140, which is likewise accessible by a service provider server 142. The information handling system 100 likewise includes system memory 112, which is interconnected to the foregoing via one or more buses 114. System memory 112 further comprises operating system (OS) 116 and in various embodiments may also comprise a health artificial intelligence or AI system 118.
  • In certain embodiments, the health AI system 118 includes multi-treatment digital therapy module 120 and deep layer patient profile (DLPP) module 122. The deep layer patient profile (DLPP) module 122 can store longitudinal data as a deep layer patient profile (DLPP). The health AI system 118 can provide digital therapeutics or DTx for medical patients, delivering evidence-based therapeutics, which can include intervention in pending/planned treatment, to prevent, manage, or treat a medical disorder or disease, in this example cancer. DTx delivers evidence-based therapeutic interventions to patients using software programs to prevent, manage, or treat a medical disorder or disease. DTx can be implemented independently, or used together with medications, medical devices, and other therapies, to optimize patient care, health outcomes, and treatment results.
  • DTx is implemented to provide detection, evaluation, treatment, and management of treatment related adverse events. Outcomes are specific to patients and can be variable. In certain implementations, various techniques can be used, including genomic sequencing of blood, tissue, or stool, and blood or urine marker determination, for detection, evaluation and treatment, etc. In certain implementations, phenomics data collected from patients, measured with connected devices, via mobile phone, web, or other connected means, in addition to clinical data from electronic health records (EHR), can be used to develop predictive models to guide precision management. An integrated digital care pathway can be provided, informed by data and evidence, for management of treatment-related adverse events or Immune Related Adverse Events (irAE). Scientific and expert evidence is captured about a disease and treatment with specific drug(s). In addition, real-time patient data and clinical information is captured and integrated. Personalized/specific insight is generated for specific patients.
  • FIG. 2 is a block diagram of a health artificial intelligence (AI) environment 200 implemented in accordance with an embodiment of the invention. The health AI environment 200 includes the information handling system 100 which includes the health AI system 118. The health AI system 118 includes the E-combination digital therapy module 120 and deep layer patient profile (DLPP) module 122. In certain implementations, the information handling system 100 includes a data repository 202 which can store information, such as patient data processed by the information handling system 100. In certain implementations, the data repository 202 can be configured to store patient related data, such as deep layer patient profiles (DLPP) 204 and data identifiers (IDs) 206. The deep layer patient profile (DLPP) module 122 can be used to generate deep layer patient profiles (DLPP) 204. Deep layer patient profiles (DLPP) 204 can be continuously updated as further described below. Deep layer patient profiles (DLPP) 204 can include a patient identifier (ID), recommendations, patient data entered through the health AI environment 200, electronic health records (EHR), electronic medical records (EMR), etc. In certain implementations, context entered through the health AI environment 200 includes deep layer patient profiles (DLPP) 204, where context can be a session with a deep layer patient profile (DLPP) with questions answered, and the type of data that is desired, such as time zone, language, required or optional data, longitudinal data (e.g., over a 14 day time period), etc.
  • Data IDs 206 can be used to identify data. For example, data is collected as to DTx knowledge models and symptoms as further described below. In certain implementations, data IDs 206 can be used to identify collected data, such as gathered answers to patient questions. In certain implementations, the data repository 202 includes a journal 208, which for example, can store scores related to symptoms of different categories. In certain implementations, the data repository 202 includes DTx knowledge models 246.
  • The following code describes example knowledge models:
  • DTx knowledge models categorization
     Domain: Oncology
      Program: Adverse Event Toxicity Management
       Questions - languages and genres
       Rules - pre and post conditionals and groups
       Actions - additional actions - dialog, rules, actions
       Detect Models
        Trigger - Symptom that drives model to be activated (e.g. rash for dermatitis,
     diarrhea for colitis)
        Primary Symptom - Priority, Weights, ID (e.g. headache)
        Associated Symptom - Priority, Weights, ID (e.g. location, severity, etc. of
     headache)
        Clinical Modifier - Priority, Weights, ID (e.g. had prior irAE)
       Diagnostic Models
        Clinical lab results drive model to be activated (e.g. confirmed rash for
     dermatitis)
        Primary Symptom - Priority, Weights, ID (e.g. headache)
        Associated Symptom - Priority, Weights, ID (e.g. location, severity, #)
        Clinical Modifier - Priority, Weights, ID (e.g. had prior irAE)
       Recommendation Models
        Pre-Conditions - Rule / Conditional / Time Trend
         Recommendation 1 - weight (1 physician required, 0 optional),ID (text)
         Recommendation ...N
      #expanded Programs
      Program: Chemotherapy
      Program: Radiation therapy
      Program: Immunotherapy
      Program: Surgery
      Program: Targeted therapy
      Program: Hormone therapy
     #expanded Domains
     Domain: Heart Disease, etc.
     Domain: Shared
      Program: Shared
       Medications
       Conditions
       Data types
  • In certain implementations, DTx calculations provide for weight variables to be adjusted based on organization preference, expert opinion or AI/ML algorithms. Weight variables can be applied to, and are not limited to the following types: questions, symptoms, recommendations, rules, conditions, and actions, etc. Weights can be used to emphasize results, remove results, adjust priorities, or affect calculations. Weights can be applied to a category, a list of categories, an item, a list of items, or a chain of any of the aforementioned.
  • When selecting or applying to a category, the following weight variables and/or operations can be included. Priority: for a selection, multiply the selection's priority by a weight. Priority Inversion: for a selection, after a previous high priority selection is identified and processed, if the current priority selection is the same as the previous for N times, reduce the current priority so that the next selection with a lower priority can be processed. Derivatives: for values 0, 1, 2, . . . , N; if 0, then use the current value for any longitudinal selection as is (e.g., blood pressure, or headache); if 1, then take the derivative (i.e., change) of the current and previous K values to determine the change in longitudinal items; a positive change (i.e., slope) can indicate an increasing value; if 2, then take the 2nd derivative (i.e., change in slope). Weighted Inversion: divide the selection priority by the weight. Chaining: when multiple selections are weighted resulting in the same evaluation, a chain identifies a list of selection types that are processed left to right for final arbitration (e.g., colitis, endocrinopathy, etc.). Environmental: reduce the impact of a selection weight by D % when an accompanying environmental condition is present (e.g., headache with "sun exposure"). Lab: modify the impact of a selection weight by M % based on a lab result. LikeMe: modify weights in the DLPP based on a function of weights resulting from P other patients or L other lab results; the function( ) can include replace, multiply, divide or any of the previous weights.
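  • The following is a minimal sketch of a few of the weight operations described above (Priority, Weighted Inversion, and Derivatives); the parameter names and example values are illustrative assumptions.
      # Sketch of a few weight operations described above; parameter names and values are illustrative.
      from typing import List

      def weighted_priority(priority: float, weight: float) -> float:
          """Priority: multiply the selection's priority by its weight."""
          return priority * weight

      def weighted_inversion(priority: float, weight: float) -> float:
          """Weighted Inversion: divide the selection's priority by its weight."""
          return priority / weight

      def derivative(values: List[float], order: int) -> List[float]:
          """Derivatives: order 0 returns the values as-is, 1 the change, 2 the change in slope."""
          result = list(values)
          for _ in range(order):
              result = [b - a for a, b in zip(result, result[1:])]
          return result

      bp = [120, 124, 131, 145]                  # longitudinal blood pressure readings
      print(derivative(bp, 1))                   # [4, 7, 14] -> positive slope, increasing value
      print(derivative(bp, 2))                   # [3, 7]     -> the slope itself is increasing
      print(weighted_priority(3, 1.5), weighted_inversion(3, 1.5))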
  • In certain implementations, the information handling system 100 can include a text classifier 210. Data or information can be received in a specific format as defined by a media type, or in certain instances a Multipurpose Internet Mail Extensions or MIME type. The text classifier 210 can be used to convert MIME types to accessible text or data/information that is stored. In certain implementations, an answer type (e.g., text, speech, image, etc.) will indicate how to store the data. Standard text classification can be implemented, such as speech to text, text to speech, image to text, text to image, etc.
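  • A minimal sketch of dispatching stored answers by media (MIME-like) type follows; the handler functions are placeholders for the speech-to-text and image-to-text conversions mentioned above, not an actual classifier implementation.
      # Sketch of dispatching patient answer payloads by MIME-like answer type; handlers are placeholders.
      def classify(answer_type: str, payload: bytes) -> str:
          handlers = {
              "text/plain": lambda p: p.decode("utf-8"),
              "audio/wav": lambda p: "<speech-to-text transcript>",   # placeholder conversion
              "image/png": lambda p: "<image-to-text description>",   # placeholder conversion
          }
          return handlers.get(answer_type, lambda p: "<unsupported type>")(payload)

      print(classify("text/plain", b"mild headache since yesterday"))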
  • In certain embodiments, the health AI system 118, and digital therapy module 120, can include various components. In certain implementations, such components include a dialog component 212, a transform component 214, a diagnose component 216, a detect component 218, and a recommend component 220. The components will be further discussed below. The digital therapy module 120 can also include an authoring tool 222. The authoring tool 222 can be used by entities, such as medical experts, and implemented to author/provide questions, answer IDs, priorities, conditions, languages, etc. The authoring tool 222 can further deploy DTx knowledge models.
  • In certain implementations, multiple domains, programs, diseases, symptoms, and subject matter expert knowledge models for the dialog component 212, the transform component 214, the diagnose component 216, and the detect component 218, as well as weights, priorities and genres, are supported.
  • The information handling system 100 can connect to network 140. Network 140 is representative of various networks that include internal and external networks, secure and unsecure networks, and various computing devices, such as servers, cloud computing networks and devices, etc. For certain implementations, the described entities of the health AI environment 200 are provided limited or secure access to certain networks and computing devices of network 140.
  • The health AI environment 200 includes patient(s) 224 who are monitored and provided treatment, such as immunotherapy or immuno-oncology (IO). Patient(s) 224 can use patient devices 226 to provide information and to receive information. In certain implementations, patient data monitoring/sensing devices 228 can collect or gather information or data about patient(s) 224. For example, patient data monitoring/sensing devices 228 can include personal wearable devices, heat or optical measurement devices that detect temperature and images of the patient(s) 224. In certain implementations, the patient data monitoring/sensing devices 228 can provide information or data to devices 226 by wireless transmission, such as Bluetooth.
  • As used herein, a device 226 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, a camera, a mirror, a robot or other device capable of communicating and processing data. In certain implementations, device 226 can present patient(s) 224 with patient related questions, as part of a conversation or dialog for digital therapeutics. In certain implementations, the conversation or dialog is performed through the dialog component 212. In particular, the dialog component 212 receives patient answers to health symptom dialog questions. The dialog component 212 can further send questions to the patient for additional patient related information. In certain implementations, the patient answers are in response to modified dialog questions from detect and diagnose questions. Modified questions can be the result of reducing conflicting questions, changing questions based on a specific genre question syntax, re-categorizing and re-ordering questions based on priority resolution, and adding questions for subjective reduction.
  • The conversation or dialog from dialog component 212, as to the questions can be performed with devices 226 through various entry points such as applications and web browsers running on the devices 226. Answers to the questions can be obtained by entering text or information directly on devices 226 or can be gathered through specific patient data monitoring/sensing devices 228 such as a smart mirror which can detect facial expressions (e.g., facial image data), a microphone which can detect voice inflections, fixed and moveable robotics, personal cameras, etc. For example, data related to answers can be obtained through artificial intelligence (AI) which can include sentiment analysis (e.g., elated, happy, sad, angry), voice changes (faster, slower, louder, quieter), image changes (facial lines, weight loss, growing rash), etc. Patient answers can be presented through a user interface, audio input and output, camera input and output, and gesture input and output, etc.
  • The questions from the dialog component 212 can be structured to receive answers from the patient(s) 224 to be used by the information handling system 100 and specifically the E-Combination digital therapy module 120 and its components as described herein. The answers can also be used by other entities of health AI environment 200 as further described below. In certain implementations, the structured questions can be stored in a database or repository, such as data repository 202, and be stored by category, such as disease category (e.g. cancer, diabetes), treatment category (e.g. immunotherapy, chemotherapy), etc. For each category, questions can be defined in order to acquire needed patient reported data. In certain implementations, questions can have pre-conditions that are met for a question to be selected for conversation with a patient. Questions can have an identifier or ID number and be given a priority number.
  • Questions can be categorized to align with a medical term or program (e.g., treatment program), where there can be multiple programs. Examples of programs can include chemotherapy, radiation therapy, immunotherapy, surgery, targeted therapy, hormone therapy, etc. In certain implementations, a program category, for example immunotherapy or IO, can be defined by a DTx knowledge model which can include categories such as “primary symptoms”, “secondary symptoms”, “clinical modifiers”, “laboratory tests”, “imaging studies”, etc. In certain implementations, such DTx knowledge models can determine symptom scores and an overall score for the questions, and the DTx knowledge models can be used as criteria for selecting questions from a list of total questions. In certain implementations, medical or disease experts, as further described below, can author, group, and provide attributes for questions.
  • In certain implementations, such as through the detect component 218 and recommend component 220 as described below, a category of questions can be identified, and the dialog component 212 can modify or reduce the potential list of questions to actual questions that are presented to patient(s) 224. The recommend component 220 can provide recommendations from the primary symptoms and laboratory results for addressing adverse events and related to the longitudinal data, deep layer patient profile (DLPP), and knowledge models. Recommendations can be derived from algorithms that use the patient answers, the laboratory results and the knowledge models. Furthermore, the recommendations can be derived by algorithms that use detect and diagnose results, and produce a preliminary diagnosis, priorities and weights.
  • Reducing/refining the questions can proceed as follows. A number of questions are selected from a current category. A number of questions are increased based on detected characteristics processed by artificial intelligence (AI), which can include sentiment analysis (elated, happy, sad, angry), voice changes (faster, slower, louder, quieter), image changes (facial lines, weight loss, growing rash), etc. Questions can be increased by aligning AI characteristics to a question pool and retrieving additional questions. Questions can be reduced to only include questions for a current symptom (e.g., blood, diarrhea, headache, etc.). Questions can be sorted in priority order. Questions can be presented in a priority order, such that certain questions can take precedence over other questions in diagnosing/evaluating patient(s) 224. The next question, if available, can be ensured to have a category related to the previous question, such that questions are asked in appropriate groups of symptoms. A question can be removed if the same question has been asked within the time window of the question's frequency attribute. A question can be included if an “alert” indicates the question is to be repeated. A question can be removed if pre-conditions have not been met. A question index number can be converted to match a requested genre of questions, such as talking to a child versus talking to an elderly person, talking in a dialect versus talking in a standard language, talking with text versus pictures versus audio, etc. A question index number can be used to retrieve a question in a correct language (e.g., English, Chinese, Spanish, etc.).
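  • For illustration only, the following is a minimal Python sketch of such a question-refinement pass; the field names (symptom, priority, category, preconditions, frequency_days) and data shapes are assumptions for this example, not the authored question schema.
    from datetime import datetime, timedelta

    def refine_questions(questions, current_symptom, answered_log, alerts, now=None):
        """Filter and order a question pool, assuming simple per-question attributes."""
        now = now or datetime.now()
        refined = []
        for q in questions:
            # Keep only questions for the current symptom (e.g., "diarrhea").
            if q["symptom"] != current_symptom:
                continue
            # Drop questions whose pre-conditions (already evaluated booleans) are not all met.
            if not all(q.get("preconditions", [])):
                continue
            # Drop questions already asked within their repeat-frequency window,
            # unless an "alert" indicates the question should be repeated.
            last_asked = answered_log.get(q["id"])
            window = timedelta(days=q.get("frequency_days", 1))
            if last_asked and (now - last_asked) < window and q["id"] not in alerts:
                continue
            refined.append(q)
        # Order by priority number and category so related symptom questions stay together.
        refined.sort(key=lambda q: (q["priority"], q["category"]))
        return refined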
  • The health AI environment 200 can include developers 230 who provide and modify applications and software modules, such as components 212, 214, 216, 218 and 220. Developers 230 can create and update DTx knowledge models used in the digital therapeutics or DTx. Developers 230 can provide and update questions, rules and actions as to the DTx knowledge models. Developers 230 can connect to various entities of the health AI environment 200 and exchange information through devices 232. As used herein, a device 232 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, or other device capable of communicating and processing data. Devices 232 can be connected to network 140, and in certain implementations are connected to secure networks and devices that are included in the network 140.
  • The health AI environment 200 can include disease or medical experts 234 who provide input and updates in the development of applications and software modules, such as DTx knowledge models. Medical experts 234 can use the authoring tool 222 to author, group and provide attributes as to patient questions. In certain implementations, medical experts provide rules as used by the detect component 218. The detect component 218 can process and present primary symptoms with associated detect knowledge models as related to the longitudinal data and deep layer patient profile (DLPP). Medical experts 234 can connect to various entities of the health AI environment 200 and exchange information through devices 236. As used herein, a device 236 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, or other device capable of communicating and processing data. Devices 236 can be connected to network 140, and in certain implementations are connected to secure networks and devices that are included in the network 140.
  • The health AI environment 200 can include health care providers 238 who assist in treating patients 224. Examples of health care providers 238 can include physicians, nurses, laboratory/diagnostic facilities, entities providing treatment, etc. In specific, health care providers 238 make use of digital therapeutics or DTx for patients 224. As discussed above, DTx can be independently implemented, or can be used with medication, medical devices, and other therapies to optimize care and treatment results for patients 224. Health care providers 238 can connect to various entities of the health AI environment 200 and exchange information through devices 240. Medical experts 234 and health care providers 238 can present physician results with associated diagnostic knowledge models as related to the longitudinal data and deep layer patient profile (DLPP). In certain implementations, health care providers 238 through devices 240 interact with diagnose component 216, detect component 218, and recommend component 220. As used herein, a device 240 refers to an information handling system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, or other device capable of communicating and processing data. Devices 240 can be connected to network 140, and in certain implementations are connected to secure networks and devices that are included in the network 140.
  • In certain implementations, the health AI environment 200 includes a health data repository 242, which can include various data stores, databases, etc. The health data repository 242 and data and information stored therein can be made available to various entities of health AI environment 200, including the information handling system 100, and specifically the E-Combination digital therapy module 120. The health data repository 242 can be connected to the network 140, and to secure and unsecure networks of network 140.
  • Health data repository 242 can include electronic health records (EHR) 244. EHR 244 can include health information of patients 224. For example, health information can include administrative information, progress data, demographics of the patient, medical history, prior diagnoses, medications of the patient, immunization record, allergies of the patient, radiology and laboratory information, test results, etc. In addition, EHR 244 can include electronic medical records or EMR. An EMR can be considered a subset of an EHR, where the EMR can include specific information for a patient from a particular physician or clinic. An EMR can track data over time, identify particular patients for treatment, and check patients' status based on certain parameters, such as blood pressure, etc.
  • The dialog component 212, transform component 214, diagnose component 216, detect component 218, and recommend component 220 are further described below. It is to be understood that implementation of the invention, including the components, may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an embodiment combining software and hardware.
  • The dialog component 212 can present to patient(s) 224 a refined list of questions, which are answered by the patient(s) 224. The questions can be stored in a database, such as data repository 202, and can be stored by category (e.g., diabetes, immunotherapy, chemotherapy, or other disease category). The question categories can align with a medical term or program, where multiple programs can be supported. For each category, the questions can be defined in order to get needed patient reported data. Questions can also have pre-conditions that are met in order for questions to be selected for conversation with a patient(s) 224. Questions can have an identifier (ID) and a priority number.
  • In certain implementations, a program category (e.g., immunotherapy) can have a defined DTx knowledge model that includes primary symptoms, secondary symptoms and clinical modifiers, etc. The DTx knowledge model can determine symptom scores and an overall score. The DTx knowledge model can be used as criteria for selecting questions out of the total list of questions. The authoring tool 222 can be used by medical experts 234 to author, group and provide attributes for questions.
  • As discussed above, the dialog component 212 can interact with patient(s) through various entry points, such as a mobile phone application, tablet application, web browser, smart mirror, microphone, camera, fixed and moveable robotics, personalized cameras, human delegate, etc. When the dialog component 212 and the recommend component 220 identify a category of questions, the dialog component 212 can intelligently modify/reduce a potential list of questions to actual questions. Refining the questions by the dialog component is described in detail above.
  • The transform component 214 can transform answers to questions into longitudinal data. Longitudinal data is repeated observational data for the same variables over various times. In certain implementations, such longitudinal data can be stored in a digital deep layer patient profile (DLPP) (e.g., deep layer patient profile (DLPP) 204). By implementing DTx knowledge models and medical systems, if a symptom has an identifier (ID) for data that is being collected (e.g., rash color), the data (e.g., rash color) is attached to a question or answer. When questions are answered, the data can be added to the longitudinal data for the identifier (ID) with a timestamp. The questions can have N answers, where each answer can be mapped to a different identifier (ID). Each answer has an answer type (e.g., text, image, number, voice). Primary symptoms can be produced from algorithms using patient reported symptoms and the subject matter expert knowledge models. In certain implementations, natural language patient reported symptoms, disambiguating patient reported outcomes with dialog techniques, and producing a numerical longitudinal mapping for structured and deep layer patient profile (DLPP) storage are provided.
  • The text classifier 210 can convert mime types to a text or data type that is stored. The type related to an answer can indicate how to store the data. The data can be a tuple (i.e., finite ordered list), such as value/duration, and includes a timestamp. For example, if blood is found in stool, the tuple is (blood, stool). The detect component 218 can also generate longitudinal data. For example, for a value/duration tuple, the duration indicates how long the value has been occurring.
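  • For illustration only, a minimal Python sketch of recording an answer as a timestamped value/duration tuple keyed by the data identifier (ID); the field names and structure are assumptions for this example.
    from datetime import datetime, timezone

    def record_answer(longitudinal, data_id, value, duration=None, answer_type="text"):
        """Append a timestamped (value, duration) tuple to the series for data_id."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "type": answer_type,         # e.g., text, image, number, voice
            "value": (value, duration),  # e.g., ("blood", "stool")
        }
        longitudinal.setdefault(data_id, []).append(entry)
        return entry

    # Example: a patient reports blood found in stool.
    series = {}
    record_answer(series, data_id="bloodAmount", value="blood", duration="stool")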
  • The transform component 214 can update a deep layer patient profile (DLPP), which can be used to record patient responses, patient recorded outcomes, and other data. In certain implementations, the deep layer patient profile (DLPP) can be used for artificial intelligence (AI) analytics and searching.
  • A deep layer patient profile (DLPP) is a mechanism to turn related and structured patient data into a multidimensional unstructured data format that is processed with machine learning and image processing algorithms. Instead of selecting data, for example, by patient id, date, and data types needed, a deep layer patient profile (DLPP) is used for searching for intensity values that match thresholds. In certain embodiments, the deep layer patient profile (DLPP) is also anonymized from the original data source and thus can be shared without privacy concerns. The deep layer patient profile (DLPP) after anonymization, can be manipulated with completely new algorithms focused on binary and intensity operations, such as image processing and deep learning.
  • In certain embodiments, the deep layer patient profile (DLPP) is created, updated and accessed using the following mappings: time—using 24 bits for the Julian date; patient symptom—assigned 1 to N in 24 bits for 16 M possible values, related to the patient response; condition—assigned 1 to N in 24 bits for 16 M possible values; severity score—assigned 1 to N in 24 bits for 16 M possible values, related to the patient response transformed to a business implementation representation, for a symptom (S), a condition (C), a specific collection of symptoms, or overall, and for heart rate and blood pressure (BP), which can also be grouped/mapped to scores (e.g., 90-95=>3); Common Terminology Criteria for Adverse Events (CTCAE) grade from 1 to 5 (in 24 bits), where 1—Mild, 2—Moderate, 3—Severe, 4—Life-threatening, 5—Death; priority—0 to 4 in 24 bits; recommendation—1 to N in 24 bits. Optionally, labs, medications, conditions (problems), observations, etc. can each be encoded 1 to N in 24 bits.
  • Slice and match to 24 bits can be shown as numeric, color pixel, intensity pixel. Each of these mapping values can be represented by another dimension in the deep layer patient profile (DLPP). The deep layer patient profile (DLPP) enables longitudinal data (all of the aforementioned mappings timestamped) by providing a time dimension.
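  • For illustration only, a minimal Python sketch of clamping a mapped value into a 24-bit field and viewing it as an RGB pixel; the encoding details are a simplified assumption based on the mappings above, not the production format.
    MASK_24 = (1 << 24) - 1

    def encode_24(value):
        """Clamp an integer mapping (1..N) into a 24-bit field (up to ~16 M values)."""
        return int(value) & MASK_24

    def as_rgb_pixel(value24):
        """View a 24-bit value as an (R, G, B) pixel for image-based processing."""
        return ((value24 >> 16) & 0xFF, (value24 >> 8) & 0xFF, value24 & 0xFF)

    # Example: a CTCAE grade of 3 (severe) and a Julian day number, each in 24 bits.
    grade = encode_24(3)
    julian_day = encode_24(2459580)
    print(as_rgb_pixel(julian_day))  # the three 8-bit channels of the 24-bit value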
  • FIG. 3A shows an example of a view 300 of structured patient profile symptoms related to conditions and then related to Score, and how these combined values can be mapped to intensities in a deep layer patient profile (DLPP). In this example, at time t3, multiple symptoms, conditions and scores are given unique intensity mappings. Determining a unique value can be a 1 to 1 mapping between the sum of the available value ranges of the structured items and an intensity value of 0 to the sum of ranges. More complex mappings are possible, as in the case of needing only red, amber or green values, or in the case of needing only gray scale values.
  • FIG. 3B shows an example of a longitudinal graph of symptoms that is mapped to a straight line with intensities. When mapping longitudinal data 302, the time dimension can be incremented, and additional intensities are added 304.
  • FIG. 3C shows an example user interface view 306 of a patient profile. As a deep layer patient profile (DLPP) is updated, it produces a patient signature. The patient signature can be “viewed” with multiple perspectives. A common perspective is to view the patient profile in a user interface view on a computing screen 306. The user interface view 306 shows time on the bottom horizontal axis and various values from N other dimensions stacked on top. In certain implementations, the user interface view 306 becomes a patient's quick response (QR) code/ink blot identifier. In certain implementations, the deep layer patient profile (DLPP) can be optimally used for performing artificial intelligence/machine learning algorithms, such as with Python, OpenCV, Node, etc.
  • FIG. 3D shows an example process of development and use 308 of a deep layer patient profile (DLPP). At “A” 310, dimensions are shown as used in a particular deployment of a deep layer patient profile (DLPP). The values per recording of patient data are shown in the formulas or variables. For example, Conditions 1 through C can have a score value of the summation of the symptom scores. Additionally, slope (i.e., change in intensity) or derivatives (i.e., change in change) can be calculated. The rules indicated are related to patient reported data. Lab data and electronic health record data can also be used to update reported data. For example, an N dimensional region of the DLPP can be referred to as a “slice” and uses Python notation (e.g., range dimension 1, range dimension 2, . . . range dimension N). The deep layer patient profile (DLPP) becomes a patient's longitudinal representation: DTx+drug therapy=precision and personalized care pathway.
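  • For illustration only, a minimal Python/NumPy sketch of holding a DLPP as an array with a time axis plus value dimensions and taking an N dimensional “slice” with Python range notation; the dimension names and sizes are assumptions for this example.
    import numpy as np

    # Assumed axes: time x condition x metric, holding intensity values.
    dlpp = np.zeros((365, 16, 8), dtype=np.uint8)

    # Record an intensity at time index 3 for condition 2, metric 0 (e.g., a score).
    dlpp[3, 2, 0] = 120

    # A "slice": the first 90 time steps, conditions 0-3, all metrics.
    region = dlpp[0:90, 0:4, :]

    # Slope (change in intensity) along the time dimension.
    slope = np.diff(region.astype(np.int16), axis=0)
    print(region.shape, slope.shape)  # (90, 4, 8) (89, 4, 8)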
  • “B” 312 shows a structured recording of the patient profile over time t1, t2, . . . tn. As previously mentioned, each value recorded that is part of a deep layer patient profile (DLPP) is mapped to an intensity value. In section “B”, time t3 (column t3) shows raw intensity values. Raw values, such as 1 . . . 5, can be scaled to wider values, such as 1 . . . 255, to allow more visual distinction when using a user interface perspective. Additional values can be included such as summation, mean, standard deviation, mode, median, and outliers.
  • “C” 314 shows the user interface view of a deep layer patient profile (DLPP) containing mapped data. As mentioned, a user interface is one perspective (or view) on a deep layer patient profile (DLPP). Other perspectives include, but are not limited to, binary arrays, gray images and animations of intensities over time. The animation, in particular, can be a standard graph animation, or a geolocation animation where the intensity type is animated over a 3D human body outline.
  • When using array views, slicing and searching may be possible using ML and image processing algorithms. When using these algorithms, there may be no real indication that the included data is patient health data. Rather, algorithms such as movement detection, line detection, object detection, histogram analysis, rotation, translation, shear, thresholding, blurring, erosion, dilation, contouring, etc. can be used.
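  • For illustration only, a minimal Python sketch applying a few of these generic image processing operations to a 2D intensity view of a DLPP, assuming OpenCV (cv2) and NumPy are available; the stand-in array is random data, not patient data.
    import numpy as np
    import cv2

    view = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in DLPP view

    _, binary = cv2.threshold(view, 128, 255, cv2.THRESH_BINARY)   # thresholding
    blurred = cv2.GaussianBlur(view, (5, 5), 0)                    # blurring
    kernel = np.ones((3, 3), np.uint8)
    eroded = cv2.erode(binary, kernel, iterations=1)               # erosion
    dilated = cv2.dilate(binary, kernel, iterations=1)             # dilation
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # contouring
    hist = cv2.calcHist([view], [0], None, [256], [0, 256])        # histogram analysis
    print(len(contours), hist.shape)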
  • The deep layer patient profile (DLPP) can be a gateway between structured and protected patient health data, and intensity-based data upon which to perform high performance AI algorithms. An N dimensional region of a deep layer patient profile (DLPP) can be compared against other regions, such as is the case when comparing different time regions. An N dimensional region of a deep layer patient profile (DLPP) can be compared to a region of another patient, such as is the case when searching for “a patient like me.”
  • When performing these AI/ML algorithms, it may be the case that a ‘good patient’ or ‘good data set’ is needed. The good data sets are those deep layer patient profiles (DLPP) where the outcome was successful—the best case being that the patient is cured of the disease. Having good data sets allows the detect and diagnose components to create inferred and probabilistic recommendations based on matching a current patient from time t1 to tn to other patients who have survived from time t1 to tn+K, where K is more time past where the current patient is. Time tn to tn+K is where next irAEs, recommendations, lab tests or questions will ‘probably’ be needed or addressed by the current patient.
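  • For illustration only, a minimal Python/NumPy sketch of matching a current patient's DLPP window (t1..tn) to the closest “good outcome” profile and reading that profile's tn..tn+K region as a probable next window; the distance measure and array layout are assumptions for this example.
    import numpy as np

    def best_match(current, good_profiles, n):
        """Return (index, distance) of the good profile closest to the current patient over the first n steps."""
        window = current[:n].astype(np.float32)
        distances = [np.linalg.norm(window - p[:n].astype(np.float32)) for p in good_profiles]
        idx = int(np.argmin(distances))
        return idx, distances[idx]

    def probable_future(good_profiles, idx, n, k):
        """The matched survivor's tn..tn+K region suggests likely upcoming values."""
        return good_profiles[idx][n:n + k]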
  • FIG. 3E shows a goal outcome 316. Outcome 316 describes a set of longitudinal data, DLPP images, and recommendations/treatments that indicate an adverse event has been removed. In specific, outcome 316 shows a combination of longitudinal paths, plus other data, where the trends approach “0”, indicating that the cancer has been removed. There can be various conditions that are considered as good. With enough good paths that are captured, it may be practical to pattern match a current “patient A” to search for similar paths. It may be possible to show, for “patient A” with “confidence J”, the next likely symptoms, conditions, trends, severity score, recommendations, etc.
  • Other approaches to encoding patient data can be CT scan/MRI 3D views that are sliced but where data is homogenous (i.e., all about the neck, in pixels) and geographically close. Semantics may be known at the beginning, such as taking a scan of the neck to look for new data based on values scanned (i.e., a tumor in the neck).
  • DLPP data may be multi-type, not homogenous, and geographically dispersed. For example, reported blood in the lower abdomen, fever in the head, and rash on the arm, etc. The semantics may not be known at the beginning. A determination may be made as to toxicity, where a patient has cancer and is under toxic treatment based on reported data, expert authored models, and treated under a DTx platform. However, previously known toxicity effects, patient reports, or where/when an irAE occurs are not known. Semantics can be derived, and then intensity values are derived, such as to display as pixels in a user interface.
  • DLPP can include primary data, such as patient reported, lab, EHR, social (e.g., family, wealth, etc.), environment (e.g., altitude, seasons, etc.), energy (movement, exercise, diet, etc.) and symptoms as represented by a vector of change. The DLPP can include secondary data, such as score, grading, counts, vector of change, determined conditions, score number, CTCAE score, etc. Furthermore, DLPP can include tertiary data, such as using primary and secondary data with expert knowledge leading to recommendations. The DLPP can use quaternary data, such as AI/ML algorithms, image processing and pattern matching across primary/secondary/tertiary data. Values can be replaced (i.e., what if the next 2 days were these results), and then a match is performed.
  • Data progression can occur as DTx AI/ML systems and experts process data. Raw data can be provided by a patient or device (e.g. Yes/No; blood pressure 150/99). Information data may be context to the raw data (e.g. patient indicated a fever). Knowledge data can add additional context and experience to information data (e.g. patient having AE to medicine; patient has flu). Derived data can add analytics to knowledge, information and raw data (e.g. average fever using medicine, % chance fever reduces in N days). Inferred data can add AI/ML to derived, knowledge, information and raw data to generate additional deterministic data (e.g. based on 1000 patients like current patient, reduce medicine 50%). Probabilistic data can add AI/ML to the aforementioned data to generate additional probability confidence data (e.g. 80% probability patient will escalate AE if medicine continued at 100%; 90% confident that current medication with two other medications will not reduce disease). DLPP can contain all data progression levels as longitudinally included.
  • Now referring back to FIG. 2, in certain implementations, the transform component 214 can include an ambiguity layer for uncertain, vague or subjective responses from patients 224. The ambiguity layer can translate subjective non-standardized responses into objective standardized responses. As an example, two patients 224 indicate a headache level 4; however, the pain threshold may be different between the two patients 224. Therefore, a follow up question can be related to negative effects of the headache. For example, did you go to work? If one patient says no, there can be some level of assurance that the headache is debilitating and prevented the patient from working.
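  • For illustration only, a minimal Python sketch of an ambiguity-layer style adjustment that combines a subjective self-reported level with an objective follow-up answer; the threshold and field names are assumptions for this example.
    def standardize_headache(reported_level, went_to_work):
        """Raise a subjective headache level when a follow-up answer shows functional impact."""
        severity = reported_level
        if not went_to_work and severity < 3:
            # Missing work suggests the headache is debilitating regardless of the
            # patient's personal pain scale, so raise the standardized severity.
            severity = 3
        return min(severity, 4)

    print(standardize_headache(reported_level=4, went_to_work=True))   # 4
    print(standardize_headache(reported_level=2, went_to_work=False))  # 3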
  • The diagnose component 216 can be invoked when deep layer patient profile (DLPP) updates are received from EMR, third party systems and lab results. Examples of such deep layer patient profile (DLPP) updates include lab results, x-ray results, blood work results, etc. The diagnose component 216 can process and present laboratory results with associated diagnostic knowledge models as related to the longitudinal data and deep layer patient profile (DLPP). The diagnose component 216 can then provide additional recommendations based on the detect component 218 recommendations. The diagnose component 216 can confirm results, such as lab results (e.g., low blood count, colitis, etc.). The diagnose component 216 can provide treatment recommendations. Receiving a lab result can trigger the diagnose component 216. A difference between the diagnose component 216 and the detect component 218 is that the source of data for the diagnose component 216 includes lab test results, imaging studies, or procedures (clinical data), whereas the source of data for the detect component 218 is patient provided data/outcomes (non-clinical data). In addition, the diagnose component 216 and the detect component 218 can use different or overlapping DTx knowledge models.
  • The detect component 218 can filter out symptoms and rule out patient reported outcomes (PRO). The detect component 218 can be invoked by the transform component 214 after the dialog component 212 invokes the transform component 214. Deep layer patient profile 204 can be sent to the detect component 218. The detect component 218 can also access the data repository 202 for any other relevant information.
  • The detect component 218 can load primary symptoms with associated symptom knowledge models. The detect component 218 can run detection rules, where such rules can be authored by medical experts 234 and provided in a table of rules. The following tables are examples of such rules.
  • Colitis; primary = bmFreq, bloodAmount
    CTCAE Grade:          Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color:          green | yellow | red
    Evaluation Priority:  0 | 1 | 2 | 3 | 4
    Primary Symptoms:     0 for all | 1 on any | 1 on any | 2 on any | 2 on any | 3 on any | 3 on any | 4 on any
    Associated Symptoms:  0-4 for all | 0-2 for all | 3 on any | 0-2 for all | 3 on any | 0-2 for all | 3 on any | 0-3 for all
  • Dermatitis; primary = rashCoverage
    CTCAE Grade:          Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color:          green | yellow | red
    Evaluation Priority:  0 | 1 | 2 | 3 | 4
    Primary Symptoms:     0 for all | 1 on any | 2 on any | 3 on any | 3 on any | 4 on any
    Associated Symptoms:  0-4 for all | 0-4 for all | 0-4 for all | 0-2 for all | 3-4 on any | 0-4 for all
  • Endocrine; primary = fatigueOrMalaise, headache, dizzyOrFainting
    CTCAE Grade:          Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color:          green | yellow | red
    Evaluation Priority:  0 | 1 | 2 | 3 | 4
    Primary Symptoms:     0 for all | 1 on any | 2 on any | 3 on any | 4 on any
    Associated Symptoms:  0-4 for all | 0-4 on any | 0-4 on any | 0-4 on any | 0-4 on any
  • Pneumonitis; primary = shortOfBreath, coughSeverity, oxygen, oxygenWalking
    CTCAE Grade:          Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color:          green | yellow | red
    Evaluation Priority:  0 | 1 | 2 | 3 | 4
    Primary Symptoms:     0 for all | 1 on any | 1 on any | 2 on any | 2 on any | 3 on any | 3 on any | 4 on any
    Associated Symptoms:  0-4 for all | 0-2 for all | 3-4 on any | 0-2 for all | 3-4 on any | 0-2 for all | 3-4 on any | 0-4 for all
  • Uveitis; primary = visionChanges, eyePain
    CTCAE Grade:          Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
    Alert Color:          green | yellow | red
    Evaluation Priority:  0 | 1 | 2 | 3 | 4
    Primary Symptoms:     0 for all | 1 on any | 1 on any | 2 on any | 2 on any | 3 on any | 4 on any
    Associated Symptoms:  0-4 for all | 0-2 for all | 3 on any | 0-2 for any | 3 on any | 0-3 for all | 0-3 for all
  • For example, for primary symptoms, ALL (meaning AND) or ANY (meaning OR of 1 or more) of the primary symptom grading scores must have been met. Patients 224 can report symptoms in grades, for example 0 to 4 for severity of blood. Grading scores determine the level of a symptom. Symptoms can be evaluated in right to left order in the table, as indicated by the evaluation number. If the primary symptom condition is met (i.e., ALL or ANY evaluates to true), then the associated symptoms are also evaluated; the rule is satisfied when both the primary symptoms and (the associated symptoms OR a clinical modifier) are true.
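  • For illustration only, a minimal Python sketch of evaluating one cell of such a rule table, e.g., a primary condition of “2 on any” combined with an associated condition of “0-2 for all”; the rule encoding is a simplified reading of the tables above, not the authored rule syntax.
    def meets(grades, rule):
        """rule examples: ("any", 2) -> any grade >= 2; ("all", (0, 2)) -> all grades within 0-2."""
        mode, spec = rule
        if mode == "any":
            return any(g >= spec for g in grades)
        low, high = spec
        return all(low <= g <= high for g in grades)

    def rule_met(primary_grades, associated_grades, primary_rule, associated_rule,
                 clinical_modifier=False):
        # The primary condition must hold, and then the associated symptoms
        # OR a clinical modifier must also hold.
        return meets(primary_grades, primary_rule) and (
            meets(associated_grades, associated_rule) or clinical_modifier)

    # Example: primary "2 on any" with associated "0-2 for all".
    print(rule_met([2, 1], [1, 0], ("any", 2), ("all", (0, 2))))  # True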
  • The detect component 218 further evaluates symptom scores for each category (e.g., Myositis, Dermatitis, Pneumonitis, etc.). Scores can be calculated as a SUM of (severity grade × weight). Patient 224 can define the grade; medical experts 234, using the authoring tool 222, can define the weights. Scores not within the table guidelines can be filtered out. Filtered questions will not be used. Common Terminology Criteria for Adverse Events or CTCAE are a set of criteria for the standardized classification of adverse effects of drugs used in cancer therapy. CTCAE uses a range of grades from 1 to 5. Specific conditions and symptoms may have values or a descriptive comment for each level, but the general guideline is “1” for mild, “2” for moderate, “3” for severe, “4” for life threatening, and “5” for death. The scores can include CTCAE grades along with the scores determined by the digital therapy module 120. The CTCAE grade is per system area, while the scores determined by the digital therapy module 120 are a sum of all scores.
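  • For illustration only, a minimal Python sketch of the SUM of (severity grade × weight) calculation; the symptom names and weights are hypothetical examples, not authored values.
    def category_score(reports, weights):
        """reports: {symptom: grade 0-4}; weights: {symptom: expert-authored weight}."""
        return sum(grade * weights.get(symptom, 1) for symptom, grade in reports.items())

    reports = {"bmFreq": 3, "bloodAmount": 2}
    weights = {"bmFreq": 2, "bloodAmount": 3}
    print(category_score(reports, weights))  # 3*2 + 2*3 = 12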
  • The detect component 218 can look at trends of the CTCAE grade or sum of scores over time (longitudinal) and determine if an “alert” should be set to trigger further questions to patient 224. The detect component 218 can store scores in journal 208 (data repository 202). Authorized parties can view the journal 208. The detect component 218 can invoke the recommend component 220.
  • The recommend component 220 provides a set of recommendations for treatment. When invoked by the detect component 218, the recommend component 220 can retrieve context or deep layer patient profile (DLPP) 204. The recommend component 220 can load actions from the data repository 202 for programs that patient 224 is associated with. The recommend component 220 can run each action and test for pre-conditions. If a pre-condition is true, then tasks associated with the action are executed. Tasks can notify provider 238, add a recommendation, add longitudinal data, add a follow up, notify patient 224, or order a medical test for patient 224. The recommend component 220 can build a collection of action results. The recommend component 220 can run through the recommendations, relate each to a group ID (e.g., group IO therapy), and may find a recommendation to continue and a recommendation to discontinue at the same time. A higher priority recommendation will override a lower priority recommendation. The recommend component 220 can provide traceability of conflicting recommendations for health care providers 238. Conditions that led to the recommendation can be attached to the recommendation.
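  • For illustration only, a minimal Python sketch of resolving conflicting recommendations within a group by priority while keeping the overridden recommendations for traceability; the field names are assumptions for this example.
    from collections import defaultdict

    def resolve(recommendations):
        """Keep the highest-priority recommendation per group; attach the rest as conflicts."""
        by_group = defaultdict(list)
        for rec in recommendations:
            by_group[rec["group_id"]].append(rec)
        selected = []
        for group, recs in by_group.items():
            recs.sort(key=lambda r: r["priority"], reverse=True)
            winner, overridden = recs[0], recs[1:]
            winner["conflicts"] = overridden  # traceability of overridden recommendations
            selected.append(winner)
        return selected

    recs = [
        {"group_id": "IO therapy", "action": "continue", "priority": 1},
        {"group_id": "IO therapy", "action": "discontinue", "priority": 3},
    ]
    print(resolve(recs)[0]["action"])  # discontinue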
  • The recommend component 220 can determine what knowledge model, and the level of the knowledge model, the recommendation came from. Recommendations can be weighted, for example a weight of “1” can indicate it is required by the health care provider 238, where a weight of “0” may mean it is optional. A health care provider 238 may be required to indicate that they complied with the recommendation or to check that the recommendation was followed. Recommendations can be written to the data repository 202. Deep layer patient profiles (DLPP) 204 and context can be supplemented with recommendations and provided to the detect component 218 and used by the dialog component 212.
  • To illustrate the use of knowledge models, the following example is described. In the domain of oncology, one program can be IO treatment related adverse event management. As discussed above, other examples of programs can include chemotherapy-related, radiation therapy-related, targeted therapy-related, hormone therapy-related adverse event management, etc. A set of knowledge models defines a program. A set of questions, rules and actions are provided for the program. Certain knowledge models are applicable to certain components of the digital therapy module 120. The knowledge models are defined by variables, weights and values, and other factors.
  • In this example, for knowledge models of the detect component 218, a “trigger” is a symptom that drives the knowledge model to be activated (e.g., rash for dermatitis, diarrhea for colitis, elevated liver function test results for hepatitis). A “primary symptom” is related to a priority, weights, and an ID (e.g., headache). An “associated symptom(s)” is related to a priority, weights, and an ID (e.g., location, severity, etc. of the headache). A “clinical modifier” is related to a priority, weights, and an ID (e.g., had a prior irAE). For knowledge models of the diagnose component 216, clinical lab results drive the knowledge model to be activated (e.g., configured rash for dermatitis). A “primary symptom” is related to a priority, weights, and an ID (e.g., headache). An “associated symptom(s)” is related to a priority, weights, and an ID (e.g., location, severity, etc. of the headache). A “clinical modifier” is related to a priority, weights, and an ID (e.g., had a prior irAE). For knowledge models of the recommend component 220, there can be pre-conditions based on rules, conditionals, and time trends. A recommendation can be weighted by a health care provider 238 and identified.
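  • For illustration only, one possible way to represent a detect-component knowledge model entry as data, consistent with the description above; the keys and example values are hypothetical, not an authored model.
    dermatitis_model = {
        "program": "IO adverse event management",
        "trigger": "rash",  # symptom that activates the knowledge model
        "primary_symptoms": [
            {"id": "rashCoverage", "priority": 1, "weight": 2},
        ],
        "associated_symptoms": [
            {"id": "rashLocation", "priority": 2, "weight": 1},
            {"id": "rashSeverity", "priority": 2, "weight": 1},
        ],
        "clinical_modifiers": [
            {"id": "priorIrAE", "priority": 3, "weight": 1},
        ],
    }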
  • FIG. 4 is a block diagram that shows a deep layer patient profile (DLPP) multi-dimensional image having a patient specific volume, density and shape. Logically, the multi-dimensions can be understood as a multi-helix set of adverse events starting and stopping as time progresses, as shown by events 400. At any point in time, additional dimensions of information can be sliced and can include, but are not limited to, images (squares), audio (ovals), body proximity area (x,y,z x2,y2,z2), recommendations, medicines per recommendation, etc. Patients will generally have different volumes, densities and shapes. Within a disease, there is a bounding of known differences, which aids in machine learning. Across diseases, there is an intersection of slices that are similar. For example, with cancer and heart disease, many of the symptoms, vitals and disease are different. However, there is a higher correlation of patients that become cancer free and then develop a heart disease. DLPP can enable machine learning and image processing training and comparison within and across diseases.
  • Adverse event reports of rashes using numerical ranges 1 to 5 over a time period are less dense than image reporting or voice audio description reporting. In all cases, patient reports have been mapped to intensities. In the case of images or audio, these can be mapped to areas of intensity. A slice over the dense areas for rash at a period of time is extracted across many patients and used for machine learning training. The machine learning can implement area intensities for image training, area intensities for audio training, and a combination of audio feature extraction (tense, happy, slurring) and image training. Given a set of extracted slices that have been processed for machine learning training, possible future contents can be produced. Future contents include rashes, headaches, mood, vitals, etc.
  • Since the treatment of adverse events for disease treatment occurs over time, longitudinal data is typically the dimension humans visualize most easily. However, body proximity, similar to geo-location, is another key dimension. In this dimension, mapped data is sliced to extract intensities that relate to location on a patient's body, physically or emotionally, as shown in 402. Real reported rash images, or artificial canonical images, are projected onto an artificial body to create a “movie like” reporting (actual data) with various endings (possible futures) as shown in 402. The patient's “movie” can be carried across illnesses and treating physicians and includes non-disease specific data that provides non-intuitive visual insights including height, weight, slouching, actual symptoms vs. reported symptoms, valid symptoms vs. reduced symptoms (real pain), cancer size, cancer consistency, etc.
  • FIG. 5 is a generalized flowchart 500 for patient conversation for digital therapeutics. In various embodiments, the health AI system 118 is implemented. In particular, the process 500 can be performed during interaction with the dialog component 212 of the digital therapy module 120. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or an alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • At block 502 the process 500 starts. At step 502, a patient checks in. For example, patient(s) 224 through devices 226 invokes an application on devices 226 to connect with the information handling system 100, and in specific to the dialog component 212. At step 504, the patient is identified and logged in. For example, an ID related to the patient is recognized by the dialog component 212, which logs in the patient(s) 224.
  • If the patient decides to end the session, or if questions are completed, or if a session times out, then following the YES branch of block 506, the process 500 ends. Otherwise, following the NO branch of block 506, at step 510 conversation between the patient and the dialog component 212 begins or continues. For example, the conversation can include artificial intelligence (AI) generated questions from the dialog component 212 to the patient(s) 224.
  • At step 512, deep layer patient profile (DLPP) data is loaded. Deep layer patient profile data can include patient record(s), longitudinal data, current recommendations, state of the question process to the patient, question preferences, genre, etc. In certain implementations, the data repository 202 provides a specific deep layer patient profile (DLPP) to deep layer patient profiles (DLPP) 204. The dialog component 212 then loads the specific deep layer patient profile (DLPP). At step 514, the classification is performed on the deep layer patient profile (DLPP). For example, classification can be directed to format of the data, such as mime type as discussed above. In certain implementations, the dialog component 212 can invoke text classifier 210 to perform the classification.
  • At step 516, the deep layer patient profile (DLPP) is sent to the transform component 214. At step 518, knowledge models and rules are sent to the transform component 214. In certain implementations, the transform component 214 is used to create machine readable data from information such as patient answers to questions. In general, the transform component 214 can be implemented to provide data that is readable by the health AI system 118, and specifically the digital therapy module 120 and its components. In certain implementations, the authoring tool 222 provides the DTx knowledge models and rules to the transform component 214.
  • At step 520, data is saved. For example, the transform component 214 saves data to the data repository 202. At step 522, the deep layer patient profile (DLPP) is sent to the detect component 218. In certain implementations, the detect component 218 is used to filter out patient symptoms and rule out certain patient reported outcomes (PRO).
  • At step 524, knowledge models and rules are loaded. For example, the authoring tool 222 provides or loads knowledge models and rules to the detect component 218. At step 526, data is saved. In certain implementations, the data is saved from the detect component 218 to the data repository 202. At step 528, the deep layer patient profile (DLPP) is sent. The deep layer patient profile (DLPP) can be sent from the detect component 218 to the recommend component 220. At block 530, actions are loaded. For example, the actions are loaded from the data repository 202 to the recommend component 220. The recommend component 220 can provide a set of recommendations for treatment of the patient. In certain implementations, the recommend component 220 is invoked by the detect component 218, receives context or the deep layer patient profile (DLPP), and loads and runs the actions for programs that the patient is associated with. At block 532, recommendations are sent. The recommend component 220 can send recommendations to be stored in the data repository 202.
  • At step 534, deep layer patient profiles (DLPP) are sent. In certain implementations, the recommend component 220 sends the deep layer patient profiles (DLPP) to the detect component 218. The detect component 218 sends the deep layer patient profile (DLPP) to the transform component 214. The transform component 214 sends the deep layer patient profile (DLPP) to the dialog component 212.
  • At step 536, questions are loaded. The data repository 202 can load questions to the dialog component 212. At step 538, the questions are modified. In certain implementations, the dialog component 212 can modify the questions. At step 540, an update is performed on the journal 208. At step 542, an updated deep layer patient profile (DLPP) is saved. In certain implementations, the updated deep layer patient profile (DLPP) is saved from the dialog component 212 to the deep layer patient profiles (DLPP) 204, and the deep layer patient profiles (DLPP) 204 save the deep layer patient profile (DLPP) to the data repository 202.
  • At step 544, the questions are delivered to the patient. In certain implementations, the dialog component 212 provides the questions to a device 226 of patient(s) 224. The questions can be delivered through an application on the device 226. If the patient decides to end the session, or if questions are completed, or if a session times out, then following the YES branch of block 506, the process 500 ends. Otherwise, following the NO branch of block 506, the process 500 continues.
  • FIG. 6 is a generalized flowchart 600 for patient updates. In various embodiments, the health AI system 118 is implemented. In particular, the process 600 can be performed during interaction with diagnose component 216 of the digital therapy module 120. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • At block 602 the process 600 starts. At step 604, lab results are received. In certain implementations, lab results are made available at the health data repository 242, and in the form of electronic health records (EHR) 244. The lab results can be received by the data repository 202. At step 606, the lab results are loaded. The lab results can be loaded from the data repository 202 to the diagnose component 216. At step 608, the deep layer patient profile (DLPP) is loaded. The deep layer patient profile (DLPP) can be loaded from deep layer patient profiles (DLPP) 204 to the diagnose component 216. At step 610, recommendations are updated. The diagnose component 216 can update the recommendations to the recommend component 220. At step 612, recommendations are saved. The recommend component 220 can save the recommendations to the data repository 202. If the updates are complete, then following the YES branch of block 614, the process 600 ends at block 616. Otherwise, following the NO branch of block 614, the process 600 continues.
  • FIG. 7 is a generalized flowchart 700 for provider updates. In various embodiments, the information handling system 100 is implemented. In particular, health care providers 238 through devices 240 interact with deep layer patient profiles (DLPP) 204 of the information handling system 100. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • At block 702 the process 700 starts. At step 704, the patient profile is loaded. In certain implementations, the deep layer patient profile (DLPP) is loaded from deep layer patient profiles (DLPP) 204 to a device(s) 240 of health care provider(s) 238. At step 706, recommendations are updated. The health care provider(s) 238 through device(s) 240 can update recommendations to the recommend component 220. At step 708, the deep layer patient profile (DLPP) is updated. The health care provider(s) 238 through device(s) 240 can provide the updated deep layer patient profile (DLPP) to the deep layer patient profiles (DLPP) 204 of data repository 202. If updates are complete, following the YES branch of block 710, the process 700 ends. Otherwise, following the NO branch of block 710, step 704 is performed.
  • FIG. 8 is a generalized flowchart 800 for a patient conversation for a patient to submit patient symptoms. In various embodiments, the information handling system 100 is implemented. The process 800 in general describes the interaction of the components of digital therapy module 120 when a patient 224 submits patient symptoms or patient reported outcomes (PRO). Throughout the process 800, as described above, DTx knowledge models, deep layer patient profiles (DLPP), questions, and rules can be implemented. At block 802, the process 800 starts. At step 804, a patient checks in. At step 806, the dialog component 212 is initiated. At step 808, the transform component 214 is initiated. At step 810, the detect component 218 is initiated. At step 812, the recommend component 220 is initiated. The process ends at block 814.
  • FIG. 9 is a generalized flowchart 900 as to when an electronic medical record (EMR) or lab notification is received. In various embodiments, the information handling system 100 is implemented. The process 900 in general describes the interaction of the components of digital therapy module 120 when a patient 224 receives an EMR or lab notification. Throughout the process 900, as described above, DTx knowledge models, deep layer patient profile (DLPP), questions, and rules can be implemented. At block 902, the process 900 starts. At step 904, a lab result or EMR is received. At step 906, the diagnose component 216 is initiated. At step 908, the recommend component 220 is initiated. The process ends at block 910.
  • FIG. 10 is a generalized flowchart 1000 as to when a health care provider changes data. In various embodiments, the information handling system 100 is implemented. The process 1000 in general describes the interaction of the components of digital therapy module 120 when a health care provider 238 changes/modifies/adds patient related data. Throughout the process 1000, as described above, DTx knowledge models, deep layer patient profiles (DLPP), questions, and rules can be implemented. At block 1002, the process 1000 starts. At step 1004, a health care provider 238 makes a change to patient data. At step 1006, the detect component 218 is initiated. At step 1008, the recommend component 220 is initiated. The process ends at block 1010.
  • FIG. 11 is an example method, flow or care pathway. Although depicted linearly, it is to be understood that the blocks can be performed in parallel, and that feedback from succeeding blocks can be provided to preceding blocks. A flow or care pathway 1100 can be implemented as part of the digital therapeutics or DTx. Block 1102 performs patient monitoring. The monitoring block 1102 can be performed by patients 224 or delegates of patients 224. In specific, patient generated health data or PGHD 1104 is monitored. The PGHD 1104 can be self-reported symptoms such as patient reported outcomes (PRO). In certain instances, PGHD 1104 can be gathered through patient data monitoring/sensing devices 228, such as Bluetooth enabled biometrics that measure patient vital signs, such as oxygen.
  • Block 1106 performs a remote triage. The remote triage block 1106 can be video or voice, and can be performed by care coordinators, such as nurses. Block 1108 is a timing for the triage. In specific, timing is related to urgency for treatment. For example, an indication of “red” can be immediate, “yellow” can be for the next day, and “green” can be for notification only. Block 1110 is directed to prioritization for an evaluation visit. For example, various levels can be implemented as to the priority, such as “P1” for emergent/immediate, “P2” for urgent/same day, “P3” for within 72 hours, “P4” for a watch list, and “P5” for a routine visit.
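  • For illustration only, a minimal Python sketch encoding the triage timing indications and the evaluation priority levels described above as simple lookup tables; no cross-mapping between colors and priority levels is implied.
    TRIAGE_TIMING = {
        "red": "immediate",
        "yellow": "next day",
        "green": "notification only",
    }

    EVALUATION_PRIORITY = {
        "P1": "emergent/immediate",
        "P2": "urgent/same day",
        "P3": "within 72 hours",
        "P4": "watch list",
        "P5": "routine visit",
    }

    print(TRIAGE_TIMING["yellow"], EVALUATION_PRIORITY["P2"])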
  • Block 1112 performs a diagnosis. The diagnosis block 1112 can be performed by health care providers 238, such as physicians. In certain implementations, an application can be used by the health care providers 238. Block 1114 is evaluation and management, which can provide for a diagnostic workup, such as exams/lab tests, procedures and imaging, consultation, etc. In addition, symptom management can be provided, such as care setting, supportive care, immunosuppressive prescriptions, etc.
  • Block 1116 performs treatment. The treatment block 1116 can be performed by health care providers 238, such as physicians. In certain implementations, an application can be used by the health care providers 238. Block 1118 is evaluation and management which can distinguish between inpatient versus outpatient evaluation and management, virtual versus in-person evaluation and management, provide for monitoring labs, provide IO therapy status, provide immunosuppressive prescriptions, etc.
  • FIG. 12A is a block diagram of digital therapeutics or DTx that can be used to implement the system and method of the present invention.
  • In certain implementations, the information handling system 100, and the health AI system 118 as described, can be used as the platform for DTx 1200. The care pathway 1100 described above can receive input and provide output by various entities. Included in the care pathway are DTx knowledge models 246 and deep layer patient profiles (DLPP) 204. In certain implementations, medical experts 234, which can include drug and disease experts, provide input as to DTx knowledge models 246 through authoring tool 222.
  • The care pathway 1100 receives data from data component 1202. Data component 1202 can include patient generated health data 1204, which can include biometrics, etc. Data component 1202 can further include patient reported outcomes data 1206, clinical data (electronic health records) 1208, and environmental/social determinants data 1210. In certain implementations, the data from data component 1202 can come from health data 242 and devices 228 described in FIG. 2. In certain implementations, data from data component 1202 can be processed by a data cleansing component 1212. Data from the data cleansing component 1212 can be received by the transform component 214. Data from the transform component 214 can be received by the care pathway 1100.
  • In certain implementations, authorizing parties 1214 request treatment and use of the DTx 1200. Authorizing parties 1214 can include patients 224, which can include actual patients and delegates/caregivers of patients, etc. Authorizing parties 1214 can be health care providers 238, which can include primary prescribing physicians/staff, non-treating physicians/staff, etc.
  • Entities, such as patients 224 and health care providers 238, can receive output, such as recommendations, from the care pathway 1100. The output from care pathway 1100 can be processed by application program interfaces (API) 1218. In certain implementations, a user interface (UI)/user experience (UX) can be used to provide output (e.g., recommendations) to patients 224 and health care providers 238. In certain implementations, API 1218 can include a localization layer to interface or integrate with different data sources, or a cloud environment for deployment.
  • FIG. 12B is a block diagram of another implementation of digital therapeutics or DTx that can be used to implement the system and method of the present invention. Generally, a DTx as a “Studio” can provide permutations of customizable content including domains, programs, rules, actions, conditions, adverse events (see the DTx knowledge model above) and care pathways. In addition, patients, providers, and experts can be included. The “Studio” can enable multiple deployments of topologies, geographies, and governing laws/regulations, etc. to direct customization and implementation.
  • The “Studio” can deploy entry points that can include (i) a hosted and managed environment where an organization and its patients use the product without customizations (e.g., a clinical trial at an existing cancer center with existing patients); (ii) a hosted and managed environment where an organization enhances customizable content; and (iii) an ability for a medical organization and providers to customize and manage their own DTx product.
  • Customizable content can include modified knowledge models, questions, rules, actions, AEs and the method for a care pathway. Studio customization can include the ability to add additional integration of devices, conversation flows and genres. Studio customization can include the ability to add additional integration of EHRs, third party systems and patient data sources. Studio can include selectable and customizable components for modifying care pathway that includes dialog, detect, diagnose and recommend.
  • FIG. 13 is a generalized flowchart 1300 for presenting patient related information implementing a digital deep layer patient profile. In various embodiments, the health AI system 118 is implemented. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method, or an alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or a combination thereof, without departing from the scope of the invention.
  • At block 1302 the process 1300 starts. At step 1302, data is received that includes patient data, lab result data, machine learning calculation data related to the patient, and physician result data. In certain implementations, the health AI system 118 performs the step.
  • At step 1304, mapping the data as to intensities, multiple dimensions and time is performed. For certain implementations, the view 300 of structured patient profile symptoms related to conditions and then related to Score and how these combined values can be mapped to intensities in a deep layer patient profile (DLPP) is provided.
  • At step 1306, converting the mapping to create unstructured binary data with binary correlations as a digital deep layer patient profile is performed. The binary correlations can be used to obtain data insights, recommendations and predictions. In certain implementations, when mapping longitudinal data 302, the time dimension can be incremented, and additional intensities are added 304.
  • At 1308, processing the digital deep layer patient profile with machine learning and image processing algorithms is performed. In certain implementations, the deep layer patient profile (DLPP) module 122 can store longitudinal data as a deep layer patient profile (DLPP).
  • For certain implementations, the digital deep layer patient profile is a binary multiple dimension image deidentified as to patient, disease, time or entity.
  • For certain implementations, the digital deep layer patient profile is multi-sliced into portions of the digital deep layer patient profile and is used to perform algorithms as to areas of interest.
  • For certain implementations, the digital deep layer patient profile is searched as to edge detection, rate of change, contour identification, color enhancing, color reduction, dilation, moments and masking, to deliver non intuitive trends and correlations in the data.
  • For certain implementations, the digital deep layer patient profile is compared, using machine learning trained patient profile sets, by matching sliced and transformed portions of the digital deep layer patient profile to portions of other digital deep layer patient profiles to determine similar patient profiles.
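  • As a hypothetical sketch of such a comparison, flattened slices of a query profile can be ranked against a cohort of other profiles by cosine similarity, as shown below; in practice a trained machine learning model could replace this simple metric, and the data here are synthetic.

        import numpy as np

        # Hypothetical sketch of comparing sliced, flattened DLPP portions against a
        # set of other profiles with cosine similarity to surface similar patients.
        rng = np.random.default_rng(2)
        query_slice = rng.random(40)                        # flattened area of interest
        cohort = {f"profile-{i}": rng.random(40) for i in range(5)}

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        ranked = sorted(cohort.items(), key=lambda kv: cosine(query_slice, kv[1]), reverse=True)
        print("most similar:", ranked[0][0])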
  • For certain implementations, the digital deep layer patient profile is implemented with machine learning algorithms to present a future view of the digital deep layer patient profile that shows predicted treatment changes, predicted medicine changes, expected adverse event results, and expected patient reactions.
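  • A minimal illustration of such a future view, assuming a simple per-dimension linear trend is extrapolated one time step ahead, is sketched below; a deployed system could substitute a richer machine learning model, and the data here are synthetic.

        import numpy as np

        # Hypothetical sketch of a "future view": fit a simple per-dimension linear
        # trend over past time steps and extrapolate the next column of intensities.
        rng = np.random.default_rng(3)
        dlpp = np.cumsum(rng.integers(-3, 6, size=(4, 20)), axis=1).astype(float)

        t = np.arange(dlpp.shape[1])
        X = np.vstack([t, np.ones_like(t)]).T                        # [time, intercept]
        coeffs, *_ = np.linalg.lstsq(X, dlpp.T, rcond=None)          # fit each dimension
        next_t = np.array([dlpp.shape[1], 1.0])
        predicted_next = next_t @ coeffs                             # predicted intensities at t+1

        print(np.round(predicted_next, 1))   # e.g., predicted trajectory of each dimension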
  • At block 1310, the process 1300 ends.
  • Referring to FIG. 14, an example screen presentation of a health care provider user interface 1400 is shown. The user interface 1400 provides views into patient reports, status, recommendations, etc.
  • Referring to FIG. 15, an example screen presentation of a health care provider user interface 1500 is shown. The user interface 1500 provides views into patient reports, status, recommendations, etc.
  • Referring to FIG. 16, an example screen presentation of a health care provider user interface 1600 is shown. The user interface 1600 provides views into patient reports, status, recommendations, etc.
  • Referring to FIG. 17, an example screen presentation of a health care provider user interface 1700 is shown. The user interface 1700 provides views into patient reports, status, recommendations, etc.
  • Referring to FIG. 18, an example screen presentation of a health care provider user interface 1800 is shown. The user interface 1800 provides views into patient reports, status, recommendations, etc.
  • Referring to FIG. 19, an example screen presentation of a health care provider user interface 1900 is shown. The user interface 1900 provides views into patient reports, status, recommendations, etc.
  • Referring to FIG. 20, an example screen presentation of an authoring tool user interface 2000 is shown. The user interface 2000 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • Referring to FIG. 21, an example screen presentation of an authoring tool user interface 2100 is shown. The user interface 2100 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • Referring to FIG. 22, an example screen presentation of an authoring tool user interface 2200 is shown. The user interface 2200 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • Referring to FIG. 23, an example screen presentation of an authoring tool user interface 2300 is shown. The user interface 2300 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • Referring to FIG. 24, an example screen presentation of an authoring tool user interface 2400 is shown. The user interface 2400 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • Referring to FIG. 25, an example screen presentation of an authoring tool user interface 2500 is shown. The user interface 2500 provides views for authoring questions, IDs, priorities, conditions, and languages, and for deploying DTx knowledge models, etc.
  • As will be appreciated by one skilled in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, embodiments of the invention may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an embodiment combining software and hardware. These various embodiments may all generally be referred to herein as a “component,” “circuit,” “module,” or “system.” Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
  • Any suitable computer usable or computer readable medium may be utilized. The computer-usable or computer-readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, or a magnetic storage device. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Computer program code for carrying out operations of the present invention may be written in a programming language such as JavaScript, Python, C# or the like. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Embodiments of the invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The present invention is well adapted to attain the advantages mentioned as well as others inherent therein. While the present invention has been depicted, described, and is defined by reference to particular embodiments of the invention, such references do not imply a limitation on the invention, and no such limitation is to be inferred. The invention is capable of considerable modification, alteration, and equivalents in form and function, as will occur to those ordinarily skilled in the pertinent arts. The depicted and described embodiments are examples only and are not exhaustive of the scope of the invention.
  • Consequently, the invention is intended to be limited only by the spirit and scope of the appended claims, giving full cognizance to equivalents in all respects.

Claims (21)

1.-20. (canceled)
21. A method of providing a treatment to a cancer patient, comprising:
(i) obtaining a deep layer patient profile (DLPP), wherein the DLPP comprises patient data and clinical data;
(ii) analyzing the DLPP, wherein analyzing comprises comparing the DLPP to a population of DLPPs from cancer patients;
(iii) determining a treatment based on (ii), wherein the treatment comprises initiating treatment, discontinuing treatment, or continuing treatment, wherein the treatment is selected from immunotherapy, chemotherapy, radiation therapy, surgery, targeted therapy, hormone therapy or a combination thereof; and
(iv) providing the treatment determined in (iii) to the cancer patient.
22. The method of claim 21, wherein step (ii) analyzing the DLPP comprises using machine learning and image processing algorithms.
23. The method of claim 21, wherein step (iii) determining treatment comprises using machine learning and image processing algorithms.
24. The method of claim 21, wherein step (iii) determining treatment comprises using artificial intelligence.
25. The method of claim 21, wherein step (iii) determining treatment comprises determining the presence or absence of at least one immune related adverse event (irAE) in the patient.
26. The method of claim 21, wherein initiating treatment comprises administering an immunotherapy.
27. The method of claim 21, wherein discontinuing treatment comprises altering dosage of treatment or ceasing treatment.
28. The method of claim 21, wherein continuing treatment comprises administering or performing a new treatment.
29. The method of claim 21, further comprising step (v) monitoring response by the patient to treatment.
30. The method of claim 21, wherein the DLPP is updated over time.
31. The method of claim 30, wherein the DLPP is updated using artificial intelligence.
32. The method of claim 29, wherein steps (ii)-(iv) or steps (ii)-(v) are repeated as the DLPP is updated.
33. The method of claim 21, wherein the DLPP comprises data generated from genomic sequencing of blood, tissue, stool, urine, or combinations thereof.
34. The method of claim 21, wherein clinical data comprises electronic health records (EHR).
35. The method of claim 21, wherein patient data comprises data collected from patients measured with connected devices via mobile, phone or internet.
36. The method of claim 21, wherein the DLPP comprises scientific and/or expert evidence of the cancer and/or treatment.
37. The method of claim 21, wherein step (ii) analyzing the DLPP comprises assigning weight variables to the patient data and the clinical data.
38. The method of claim 37, wherein the weight variables are assigned based on organization preference, expert opinion, artificial intelligence, a machine learning algorithm, or any combination thereof.
39. The method of claim 21, wherein the population of DLPPs comprises at least one data set where the treatment was successful.
40. A computer-implementable method of presenting patient related information comprising:
receiving data that includes patient data, lab result data, machine learning calculation data related to the patient, and physician result data;
mapping the data as to intensities, multiple dimensions and time;
converting the mapping to create an unstructured binary data with binary correlations as a digital deep layer patient profile; and
processing the digital deep layer patient profile with machine learning and image processing algorithms.
US17/554,796 2019-08-19 2021-12-17 System and method for digital therapeutics implementing a digital deep layer patient profile Abandoned US20220359050A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/554,796 US20220359050A1 (en) 2019-08-19 2021-12-17 System and method for digital therapeutics implementing a digital deep layer patient profile

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962888777P 2019-08-19 2019-08-19
US16/700,312 US20210057106A1 (en) 2019-08-19 2019-12-02 System and Method for Digital Therapeutics Implementing a Digital Deep Layer Patient Profile
US17/554,796 US20220359050A1 (en) 2019-08-19 2021-12-17 System and method for digital therapeutics implementing a digital deep layer patient profile

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/700,312 Continuation US20210057106A1 (en) 2019-08-19 2019-12-02 System and Method for Digital Therapeutics Implementing a Digital Deep Layer Patient Profile

Publications (1)

Publication Number Publication Date
US20220359050A1 true US20220359050A1 (en) 2022-11-10

Family

ID=74645789

Family Applications (5)

Application Number Title Priority Date Filing Date
US16/563,113 Pending US20210057056A1 (en) 2019-08-19 2019-09-06 System and Method for Developing Artificial Intelligent Digital Therapeutics with Drug Therapy for Precision and Personalized Care Pathway
US16/678,010 Pending US20210057051A1 (en) 2019-08-19 2019-11-08 System Architecture for Digital Therapeutics with Drug Therapy for Precision and Personalized Care Pathway
US16/690,509 Active 2040-12-18 US11923051B2 (en) 2019-08-19 2019-11-21 System and method for creating digital therapeutics directed to patient care specific to a disease
US16/700,312 Abandoned US20210057106A1 (en) 2019-08-19 2019-12-02 System and Method for Digital Therapeutics Implementing a Digital Deep Layer Patient Profile
US17/554,796 Abandoned US20220359050A1 (en) 2019-08-19 2021-12-17 System and method for digital therapeutics implementing a digital deep layer patient profile

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US16/563,113 Pending US20210057056A1 (en) 2019-08-19 2019-09-06 System and Method for Developing Artificial Intelligent Digital Therapeutics with Drug Therapy for Precision and Personalized Care Pathway
US16/678,010 Pending US20210057051A1 (en) 2019-08-19 2019-11-08 System Architecture for Digital Therapeutics with Drug Therapy for Precision and Personalized Care Pathway
US16/690,509 Active 2040-12-18 US11923051B2 (en) 2019-08-19 2019-11-21 System and method for creating digital therapeutics directed to patient care specific to a disease
US16/700,312 Abandoned US20210057106A1 (en) 2019-08-19 2019-12-02 System and Method for Digital Therapeutics Implementing a Digital Deep Layer Patient Profile

Country Status (1)

Country Link
US (5) US20210057056A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020185973A1 (en) 2019-03-11 2020-09-17 doc.ai incorporated System and method with federated learning model for medical research applications
US20210249139A1 (en) * 2020-02-11 2021-08-12 doc.ai, Inc. Artificial Intelligence-Based Drug Adherence Management and Pharmacovigilance
US11321447B2 (en) 2020-04-21 2022-05-03 Sharecare AI, Inc. Systems and methods for generating and using anthropomorphic signatures to authenticate users
US20210390262A1 (en) * 2020-06-10 2021-12-16 Mette Dyhrberg Standardized data input from language using universal significance codes
US12039012B2 (en) 2020-10-23 2024-07-16 Sharecare AI, Inc. Systems and methods for heterogeneous federated transfer learning
KR20220159867A (en) 2021-05-26 2022-12-05 재단법인대구경북과학기술원 Apparatus and operation method for digital therapeutics object store
WO2023117322A1 (en) * 2021-12-22 2023-06-29 Biotronik Ag Method for classifying a medical device and/or drug, system and training method
WO2023133573A1 (en) * 2022-01-10 2023-07-13 Mahana Therapeutics, Inc. Methods and systems for treating chronic pain conditions using digital therapeutics in combination with other therapies
KR102626457B1 (en) * 2022-02-16 2024-01-18 주식회사 레몬헬스케어 Method and system for providing digital therapeutic portal service
WO2024196685A1 (en) * 2023-03-20 2024-09-26 Symita, Inc. Systems and methods for adaptive care pathways for complex health conditions
US12106860B1 (en) * 2023-10-09 2024-10-01 Click Therapeutics, Inc. Systems and methods for regulating provision of messages with content from disparate sources based on risk and feedback data

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7260480B1 (en) * 2003-04-07 2007-08-21 Health Hero Network, Inc. Method and system for integrating feedback loops in medical knowledge development and healthcare management
US20090187425A1 (en) * 2007-09-17 2009-07-23 Arthur Solomon Thompson PDA software robots leveraging past history in seconds with software robots
US10483003B1 (en) * 2013-08-12 2019-11-19 Cerner Innovation, Inc. Dynamically determining risk of clinical condition
US10803249B2 (en) * 2017-02-12 2020-10-13 Seyed Ali Loghmani Convolutional state modeling for planning natural language conversations
WO2019055879A2 (en) * 2017-09-15 2019-03-21 PatientsLikeMe Inc. Systems and methods for collecting and analyzing comprehensive medical information
WO2019074545A1 (en) * 2017-10-13 2019-04-18 iHealthScreen Inc. Image based screening system for prediction of individual at risk of late age-related macular degeneration (amd)
US11024424B2 (en) * 2017-10-27 2021-06-01 Nuance Communications, Inc. Computer assisted coding systems and methods

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040193019A1 (en) * 2003-03-24 2004-09-30 Nien Wei Methods for predicting an individual's clinical treatment outcome from sampling a group of patient's biological profiles
US20100204920A1 (en) * 2005-04-25 2010-08-12 Caduceus Information Systems Inc. System for development of individualised treatment regimens
US20190034591A1 (en) * 2017-07-28 2019-01-31 Google Inc. System and Method for Predicting and Summarizing Medical Events from Electronic Health Records
US20200342968A1 (en) * 2019-04-24 2020-10-29 GE Precision Healthcare LLC Visualization of medical device event processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Miotto, R., Li, L., Kidd, B. et al. "Deep Patient: An Unsupervised Representation to Predict the Future of Patients from the Electronic Health Records." Scientific Reports 6, Published May 17, 2016, Article 26094, https://doi.org/10.1038/srep26094 (Year: 2016) *

Also Published As

Publication number Publication date
US20210057057A1 (en) 2021-02-25
US20210057106A1 (en) 2021-02-25
US20210057051A1 (en) 2021-02-25
US20210057056A1 (en) 2021-02-25
US11923051B2 (en) 2024-03-05

Similar Documents

Publication Publication Date Title
US20220359050A1 (en) System and method for digital therapeutics implementing a digital deep layer patient profile
Wang et al. Deep learning in medicine—promise, progress, and challenges
Park et al. Artificial intelligence in health care: current applications and issues
US11488714B2 (en) Machine learning for collaborative medical data metrics
US10872684B2 (en) System and method for medical data analysis and sharing
Chintala AI-Driven Personalised Treatment Plans: The Future of Precision Medicine
Zahid et al. A systematic review of emerging information technologies for sustainable data-centric health-care
Dorr et al. Harnessing the promise of artificial intelligence responsibly
Karthikeyan et al. A Novel Deep Learning‐Based Black Fungus Disease Identification Using Modified Hybrid Learning Methodology
Khang et al. The Analytics of Hospitality of Hospitals in a Healthcare Ecosystem
US20230052573A1 (en) System and method for autonomously generating personalized care plans
CN110709938A (en) Method and system for generating a digital twin of patients
Khang et al. Application of Computer Vision (CV) in the Healthcare Ecosystem
Kotzias et al. Industry 4.0 and healthcare: Context, applications, benefits and challenges
US20230047253A1 (en) System and Method for Dynamic Goal Management in Care Plans
Riley et al. Internet of Things-based Smart Healthcare Systems and Wireless Biomedical Sensing Devices in Monitoring, Detection, and Prevention of COVID-19.
Chen et al. Using data mining strategies in clinical decision making: a literature review
Manias et al. iHELP: Personalised Health Monitoring and Decision Support Based on Artificial Intelligence and Holistic Health Records
Shukla et al. Optimization assisted bidirectional gated recurrent unit for healthcare monitoring system in big-data
Kumar et al. NATURAL LANGUAGE PROCESSING: HEALTHCARE ACHIEVING BENEFITS VIA NLP
Omotunde et al. The Modern Impact of Artificial Intelligence Systems in Healthcare: A Concise Analysis
US20230350897A1 (en) Method and system for processing large amounts of real world evidence
Kamra et al. Diagnosis support system for general diseases by implementing a novel machine learning based classifier
US20230360802A1 (en) Medical and healthcare service platforms and uses thereof
Iliuţă et al. Digital Twin Models for Personalised and Predictive Medicine in Ophthalmology

Legal Events

Date Code Title Description
AS Assignment

Owner name: APRICITY HEALTH LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHIN, LYNDA;BAHRS, PETER;CHANCEY, RAPHAEL;AND OTHERS;SIGNING DATES FROM 20191125 TO 20191202;REEL/FRAME:058658/0063

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: APRICITY HEALTH, INC., TEXAS

Free format text: CHANGE OF NAME;ASSIGNOR:APRICITY HEALTH, LLC;REEL/FRAME:065237/0804

Effective date: 20230104

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION