GB2612931A - A computer-implemented method for providing care - Google Patents
A computer-implemented method for providing care
- Publication number
- GB2612931A GB2612931A GB2302625.5A GB202302625A GB2612931A GB 2612931 A GB2612931 A GB 2612931A GB 202302625 A GB202302625 A GB 202302625A GB 2612931 A GB2612931 A GB 2612931A
- Authority
- GB
- United Kingdom
- Prior art keywords
- user
- input
- output
- dialogue
- reply
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
- G06F40/35—Discourse or dialogue representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/40—Processing or translation of natural language
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/20—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H70/00—ICT specially adapted for the handling or processing of medical references
- G16H70/20—ICT specially adapted for the handling or processing of medical references relating to practices or guidelines
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Theoretical Computer Science (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computational Linguistics (AREA)
- Artificial Intelligence (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Data Mining & Analysis (AREA)
- Biomedical Technology (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Hospice & Palliative Care (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A conversational agent is provided for maintaining or improving the wellbeing of a user presenting with mental health needs by the delivery of a care protocol. The conversational agent comprises: a plurality of sub-dialogue units, each configured to deliver an element of the care protocol, and an orchestrator configured to present the sub-dialogue units to the user, sequentially, wherein each sub-dialogue unit and the orchestrator comprises: a natural language understanding module configured to receive an input and/or reply and determine at least one intent and, where present, at least one slot within the input and/or reply; a dialogue planning module configured to determine an output based, at least in part, on the at least one intent and/or slot associated with the input and/or reply, and a natural language generation module configured to provide the output to the user.
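The patent does not disclose an implementation, but the claimed pipeline (an NLU module extracting an intent and optional slots, a dialogue planner mapping them to an output, and an NLG module rendering that output) can be sketched minimally. All function names and the keyword-matching NLU below are illustrative assumptions, not from the patent.

```python
def understand(user_input: str) -> dict:
    """Toy NLU: determine an intent and, where present, slots within the input."""
    text = user_input.lower()
    if "sleep" in text:
        return {"intent": "report_symptom", "slots": {"symptom": "sleep"}}
    if "hello" in text or "hi" in text:
        return {"intent": "greet", "slots": {}}
    return {"intent": "unknown", "slots": {}}

def plan(nlu_result: dict) -> str:
    """Toy dialogue planner: choose an output based on the intent and/or slots."""
    intent = nlu_result["intent"]
    if intent == "greet":
        return "ask_how_user_feels"
    if intent == "report_symptom":
        return "explore_symptom"
    return "clarify"

def generate(output: str) -> str:
    """Toy NLG: render the planned output as user-facing text."""
    templates = {
        "ask_how_user_feels": "Hello! How are you feeling today?",
        "explore_symptom": "I'm sorry to hear that. Can you tell me more?",
        "clarify": "Could you rephrase that for me?",
    }
    return templates[output]

def respond(user_input: str) -> str:
    """Run one turn of the NLU -> planner -> NLG pipeline."""
    return generate(plan(understand(user_input)))
```

A real system would replace the keyword rules with trained models; the point is only the three-stage division of labour the claims describe.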
Claims (25)
1. A conversational agent for maintaining or improving the wellbeing of a user presenting with mental health needs by the delivery of a care protocol, the conversational agent comprising: a plurality of sub-dialogue units, each configured to deliver an element of the care protocol; and an orchestrator configured to present the sub-dialogue units to the user, sequentially, wherein each sub-dialogue unit and the orchestrator comprises: a natural language understanding module configured to receive an input and/or reply and determine at least one intent and, where present, at least one slot within the input and/or reply; a dialogue planning module configured to determine an output based, at least in part, on the at least one intent and/or slot associated with the input and/or reply, and a natural language generation module configured to provide the output to the user.
2. The conversational agent according to claim 1, wherein the care protocol is a clinical protocol.
3. The conversational agent according to claim 1 or claim 2, wherein the care protocol is a transdiagnostic CBT protocol.
4. The conversational agent according to any one of claims 1 to 3, wherein the conversational agent comprises in excess of ten sub-dialogue units.
5. The conversational agent according to any one of claims 1 to 4, wherein the orchestrator is configured to select which sub-dialogue unit is presented to the user.
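Claims 1 and 5 describe an orchestrator that holds a sequence of sub-dialogue units, each delivering one element of the care protocol, and selects which unit is presented next. A hypothetical sketch, assuming a simple "first unit not yet delivered" selection policy (the patent does not specify the selection logic):

```python
class SubDialogueUnit:
    """One element of the care protocol (claim 1); name is illustrative."""
    def __init__(self, name: str):
        self.name = name
        self.complete = False

    def deliver(self) -> str:
        self.complete = True
        return f"Delivering protocol element: {self.name}"

class Orchestrator:
    """Presents sub-dialogue units to the user sequentially (claims 1 and 5)."""
    def __init__(self, units):
        self.units = list(units)

    def select_next(self):
        # Assumed policy: the first unit not yet delivered; returns None
        # once every element of the protocol has been presented.
        for unit in self.units:
            if not unit.complete:
                return unit
        return None
```

Claim 4 contemplates more than ten such units; the list here is kept short only for illustration.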
6. The conversational agent according to any one of claims 1 to 5, further comprising a risk assessment sub-dialogue unit which comprises a natural language understanding module configured to receive an input and/or reply from a user and analyse the input to determine, if present within the input and/or reply, at least one intent indicating a risk and, where present, at least one slot within the input and/or reply.
7. The conversational agent according to claim 6, wherein the risk assessment sub-dialogue unit is configured to receive and analyse all inputs from the user.
8. The conversational agent according to claim 6 or claim 7, wherein the risk assessment sub-dialogue unit further comprises a dialogue planning module configured to determine an output based, at least in part, on the at least one intent and/or slot associated with the input and/or reply, and a natural language generation module configured to provide the output to the user.
9. The conversational agent according to any one of claims 6 to 8, wherein the risk assessment sub-dialogue unit is configured to identify whether one or more intents identified in each user input and/or reply correspond to a predetermined list of intents associated with risk.
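The risk check of claim 9 reduces to membership testing against a predetermined intent list. A minimal sketch; the intent names below are hypothetical, as the patent does not enumerate the list:

```python
# Hypothetical predetermined list of intents associated with risk (claim 9).
RISK_INTENTS = {"self_harm", "harm_to_others", "crisis"}

def contains_risk(identified_intents) -> bool:
    """Return True if any intent identified in the user's input and/or
    reply corresponds to the predetermined risk list."""
    return any(intent in RISK_INTENTS for intent in identified_intents)
```

Per claim 7, this check would run on every input from the user, regardless of which sub-dialogue unit is currently active.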
10. The conversational agent according to any one of claims 6 to 9, further comprising an adjudicator configured to identify each sub-dialogue unit comprising a natural language understanding module that identifies an intent; determine which of the identified sub-dialogue units meets a predetermined criterion; and select the sub-dialogue unit that meets the predetermined criterion such that only the selected sub-dialogue unit determines and provides an output to the user in response to each input.
11. The conversational agent according to claim 10, wherein the adjudicator is configured to enable the risk assessment sub-dialogue unit to provide an output to the user, where an intent relating to risk is identified.
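Claims 10 and 11 describe an adjudicator that, among the sub-dialogue units whose NLU identified an intent, selects the single unit allowed to respond. The patent leaves the "predetermined criterion" open; the sketch below assumes a priority score, with the risk assessment unit given top priority so it wins whenever a risk intent is present (claim 11):

```python
def adjudicate(candidates):
    """candidates: list of (unit_name, priority) pairs, one per sub-dialogue
    unit whose NLU identified an intent in the current input.

    Assumed criterion: highest priority wins; only the selected unit will
    determine and provide an output to the user (claim 10)."""
    if not candidates:
        return None
    return max(candidates, key=lambda c: c[1])[0]

# Hypothetical priorities; the risk assessment unit outranks all others
# so that it always responds when an intent relating to risk is identified.
PRIORITIES = {"risk_assessment": 10, "mood_check": 2, "psychoeducation": 1}
```
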
12. A computer-based system for maintaining or improving the wellbeing of a user presenting with mental health needs, the system comprising a conversational agent according to any one of claims 1 to 11, and a treatment model module configured to provide a computer-readable representation of a treatment protocol.
13. The system according to claim 12, further comprising a dialogue history module configured to store previous inputs, outputs and, where present, replies.
14. The system according to claim 12 or claim 13, further comprising a user data module configured to store information about the user.
15. The system according to any of claims 12 to 14, further comprising a content module configured to store predefined data for providing to the user.
16. A computer-implemented method for maintaining or improving the wellbeing of a user presenting with mental health needs, the method comprising: receiving an input from a user; analysing the input using a natural language understanding module configured to determine at least one intent and, where present, at least one slot within the input; determining an output using a dialogue planning module, wherein the output is based, at least in part, on the at least one intent and/or slot associated with the input; providing the output to the user using a natural language generation module; and receiving, in response to the output, a reply from the user.
17. The method according to claim 16, further comprising: determining a user engagement level, wherein the output is based, at least in part, on the user engagement level.
18. The method according to claim 17, further comprising: determining if the user engagement level is below a predetermined threshold; and sending an alert to a second user upon determining a user engagement level below the predetermined threshold.
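The engagement check of claims 17 and 18 is a simple threshold rule: when the determined engagement level falls below a predetermined value, alert a second user (e.g. a clinician). The threshold value and alert representation below are assumptions for illustration:

```python
ENGAGEMENT_THRESHOLD = 0.5  # hypothetical predetermined threshold (claim 18)

def check_engagement(engagement_level: float, alerts: list) -> bool:
    """Append an alert for the second user when the user engagement level
    is below the predetermined threshold; return whether an alert was sent."""
    if engagement_level < ENGAGEMENT_THRESHOLD:
        alerts.append("engagement_below_threshold")
        return True
    return False
```

On receiving the alert, the second user may then provide or amend the output (claim 19), with the agent reviewing those amendments (claim 20).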
19. The method according to claim 18, wherein after having received the alert, the second user provides a second output to the user and/or amends the output determined by the dialogue planning module.
20. The method according to claim 19, further comprising: reviewing, using the natural language understanding module, the second output and/or amended output and providing, using the dialogue planning module, further amendments to the second output and/or amended output where needed.
21. The method according to claim 16 or claim 17, wherein the method is entirely autonomous.
22. The method according to any one of claims 16 to 21, further comprising: reviewing a memory configured to store at least one event, wherein each event comprises: an intent and, where present, at least one slot corresponding to a previous input; a previous output corresponding to the previous input, and where present, a previous reply received in response to the previous output, wherein the output is based, at least in part, on an event stored within the memory.
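Claim 22's memory stores events, each combining the intent (and any slots) of a previous input, the corresponding previous output, and, where present, the reply received in response. A minimal sketch; the class and field names are assumptions:

```python
from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class Event:
    """One stored event per claim 22: intent and slots of a previous input,
    the previous output, and (where present) the previous reply."""
    intent: str
    slots: dict
    output: str
    reply: Optional[str] = None

class Memory:
    """Stores events so the planner can base the next output on them."""
    def __init__(self):
        self.events: List[Event] = []

    def store(self, event: Event) -> None:
        self.events.append(event)

    def last_intent(self) -> Optional[str]:
        # Example query a dialogue planner might make when conditioning
        # the next output on a stored event.
        return self.events[-1].intent if self.events else None
```
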
23. The method according to any one of claims 16 to 22, wherein the output is based, at least in part, on at least one of: the next stage in a psychotherapy treatment model for the user; the need to obtain a piece of information from the user; the piece of information required next from the user; a question contained within the input; the frequency of questions contained within the input; the frequency of questions generated by the natural language generation module; the amount of repetition within an input compared to a previous input or reply; and the amount of repetition within an output compared to a previous output.
24. The method according to any one of claims 16 to 23, wherein the input, output and/or reply is written, audio and/or visual.
25. The method according to any one of claims 16 to 24, wherein the method further comprises: alerting a second user in response to: determining when the input and/or reply is outside of a predetermined treatment model; or determining when the natural language understanding module is unable to determine the intent or, where present, slot associated with the input.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB2109185.5A GB202109185D0 (en) | 2021-06-25 | 2021-06-25 | A computer-implemented method for providing care |
PCT/GB2022/051629 WO2022269286A1 (en) | 2021-06-25 | 2022-06-24 | A computer-implemented method for providing care |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202302625D0 GB202302625D0 (en) | 2023-04-12 |
GB2612931A true GB2612931A (en) | 2023-05-17 |
Family
ID=85794136
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2302625.5A Pending GB2612931A (en) | 2021-06-25 | 2022-06-24 | A computer-implemented method for providing care |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2612931A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170324868A1 (en) * | 2016-05-06 | 2017-11-09 | Genesys Telecommunications Laboratories, Inc. | System and method for monitoring progress of automated chat conversations |
US20210098110A1 (en) * | 2019-09-29 | 2021-04-01 | Periyasamy Periyasamy | Digital Health Wellbeing |
US20210134179A1 (en) * | 2019-10-30 | 2021-05-06 | Happify, Inc. | Systems And Methods For Managing Dynamic User Interactions With Online Services For Enhancing Mental Health Of Users |
KR20210058449A (en) * | 2019-11-14 | 2021-05-24 | 주식회사 셀바스에이아이 | Apparatus and method for mental healthcare based on artificial intelligence |
2022
- 2022-06-24 GB GB2302625.5A patent/GB2612931A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
GB202302625D0 (en) | 2023-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Rogowsky et al. | Matching learning style to instructional method: Effects on comprehension. | |
Gordon et al. | To have and to hold: gratitude promotes relationship maintenance in intimate bonds. | |
Town et al. | Patient affect experiencing following therapist interventions in short-term dynamic psychotherapy | |
Callejas et al. | A computational model of social attitudes for a virtual recruiter | |
EP3543914A1 (en) | Techniques for improving turn-based automated counseling to alter behavior | |
Kang et al. | Self-identification with a virtual experience and its moderating effect on self-efficacy and presence | |
Principe et al. | False rumors and true belief: Memory processes underlying children’s errant reports of rumored events | |
US20150356573A1 (en) | Dynamic survey system | |
Miller-Day et al. | Teacher narratives and student engagement: Testing narrative engagement theory in drug prevention education | |
Womack et al. | Disfluencies as extra-propositional indicators of cognitive processing | |
Richter et al. | A qualitative exploration of clinicians’ strategies to communicate risks to patients in the complex reality of clinical practice | |
Boeijinga et al. | Health Communication| Risk Versus Planning Health Narratives Targeting Dutch Truck Drivers: Obtaining Impact Via Different Routes? | |
Terblanche et al. | Coaching at Scale: Investigating the Efficacy of Artificial Intelligence Coaching. | |
Elsborg et al. | Development and initial validation of the volition in exercise questionnaire (VEQ) | |
Henshaw et al. | Cogmed training does not generalize to real-world benefits for adult hearing aid users: Results of a blinded, active-controlled randomized trial | |
Matthews et al. | The effect of interstimulus interval on sequential effects in absolute identification | |
GB2612931A (en) | A computer-implemented method for providing care | |
GB2622548A (en) | Providing care to users with complex needs | |
Svetlova | Talking about the crisis: Performance of forecasting in financial markets | |
King et al. | Contextual effects on online pragmatic inferences of deception | |
JP7347794B2 (en) | Interactive information acquisition device, interactive information acquisition method, and program | |
Golob et al. | Interaction of number magnitude and auditory localization | |
US11526541B1 (en) | Method for collaborative knowledge base development | |
CN117788239B (en) | Multi-mode feedback method, device, equipment and storage medium for talent training | |
Koessler et al. | Promoting pro-social behavior with public statements of good intent |