AU2020257885A1 - Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis - Google Patents

Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis

Info

Publication number
AU2020257885A1
Authority
AU
Australia
Prior art keywords
feeling
selection
interface
thought
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2020257885A
Inventor
Jillian Christine Ahrens
Michael Brown
Brent Paul Kersanske
Kenneth R. Weingardt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pear Therapeutics Inc
Original Assignee
Pear Therapeutics US Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pear Therapeutics US Inc filed Critical Pear Therapeutics US Inc
Priority claimed from PCT/US2020/027919 external-priority patent/WO2020214523A1/en
Publication of AU2020257885A1 publication Critical patent/AU2020257885A1/en
Assigned to PEAR THERAPEUTICS (US), INC. reassignment PEAR THERAPEUTICS (US), INC. Request for Assignment Assignors: Pear Therapeutics, Inc.
Priority to AU2023241395A priority Critical patent/AU2023241395A1/en

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/20 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Biomedical Technology (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Educational Technology (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Certain implementations of the disclosed technology may include electronic devices and methods for treating depressive symptoms associated with multiple sclerosis. According to an example implementation, a method is provided. The method may include displaying a feeling selection interface; receiving a feeling selection input; displaying a feeling spectrum interface; receiving a first feeling intensity input; displaying an automatic thought selection interface; receiving an automatic thought selection input; displaying an alternative thought selection interface; receiving an alternative thought selection input; displaying the feeling spectrum interface again; receiving a second feeling intensity input; and generating a journal entry indicating at least any difference between the first feeling intensity input and the second feeling intensity input.

Description

ELECTRONIC DEVICES AND METHODS FOR
TREATING DEPRESSIVE SYMPTOMS ASSOCIATED WITH MULTIPLE SCLEROSIS
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This U.S. patent application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 62/835,250, filed on April 17, 2019. The disclosure of this prior application is considered part of the disclosure of this application and is hereby incorporated by reference in its entirety.
FIELD
[0002] This disclosure relates, generally, to the treatment of depression and, more particularly, to electronic devices and methods for the treatment of depressive symptoms associated with multiple sclerosis utilizing computerized behavioral therapy.
BACKGROUND
[0003] Multiple sclerosis is a chronic disease involving damage to the sheaths of nerve cells in the brain and spinal cord, causing symptoms including numbness, pain, fatigue, impaired speech and muscle coordination, and vision loss. The National Multiple Sclerosis Society estimates nearly one million people in the United States alone live with multiple sclerosis. Generally, there are four types of multiple sclerosis: relapsing-remitting multiple sclerosis, secondary-progressive multiple sclerosis, primary-progressive multiple sclerosis, and progressive-relapsing multiple sclerosis. Relapsing-remitting multiple sclerosis is the most common type, as 85% of all multiple sclerosis patients are initially diagnosed with it. Patients with relapsing-remitting multiple sclerosis experience clearly defined attacks of new or increasing neurological symptoms (relapses) followed by periods of partial or complete recovery (remission).
[0004] Relapsing-remitting multiple sclerosis is defined by inflammatory attacks on nerve fibers and myelin, which are layers of insulating membranes surrounding the nerve fibers in the central nervous system. This can cause a patient to experience common symptoms of multiple sclerosis including fatigue, walking difficulties, numbness or tingling, spasticity, weakness, vision issues, dizziness or vertigo, urinary incontinence or bowel incontinence, and cognitive or emotional changes. Each patient’s experience with relapsing-remitting multiple sclerosis is unique; no two patients will present the same symptoms or have the same disease course.
[0005] Relapsing-remitting multiple sclerosis often leads to depressive symptoms and anxiety. Depressive symptoms are a natural reaction to the unpredictable course of a disabling chronic disease like relapsing-remitting multiple sclerosis. In fact, depressive symptoms and depressive disorders are the most common psychiatric illness and comorbid disease for patients with multiple sclerosis. Patients with relapsing-remitting multiple sclerosis may be predisposed to depressive symptoms due to psychological risk factors such as inadequate coping or insufficient social support, as well as biological processes of multiple sclerosis such as changes in brain structure.
[0006] There is no association between the severity of symptoms and the likelihood of a patient experiencing depressive symptoms; any patient with relapsing-remitting multiple sclerosis can experience depressive symptoms at any point in the disease progression. But a variety of factors may influence depressive symptoms in patients with relapsing-remitting multiple sclerosis. A patient’s initial diagnosis of multiple sclerosis may be followed by a period of depressive symptoms. Patients may also experience depressive symptoms due to the physical symptoms associated with multiple sclerosis. For example, a patient suffering from fatigue may be depleted of the emotional energy required to fight depressive symptoms. Furthermore, a patient’s high level of uncertainty about new symptoms and the future may cause patients to experience depressive symptoms.
Physiological causes, such as damage to the central nervous system, and chemical changes, such as expression of pro-inflammatory protein molecules involved in cell-to-cell communications, may cause patients to experience depressive symptoms as well. Medication side effects can worsen depressive symptoms. Steroids, for example, can cause euphoria in the short term, followed by depressive symptoms once the euphoria has stopped.
[0007] Depressive symptoms significantly affect the mood of a patient suffering from multiple sclerosis, thereby negatively affecting the patient’s quality of life. Patients with multiple sclerosis may prioritize physical health over emotional health and leave depressive symptoms untreated. Leaving depressive symptoms untreated can lead to reduced quality of life and impaired cognitive function. For example, depressed patients may seek to withdraw from daily life activities, resulting in reduced social stimulation. Patients with multiple sclerosis also experience an increased risk of suicide: they are 7.5 times more likely to commit suicide than members of the general population.
[0008] Current treatment options for depressive symptoms in multiple sclerosis patients generally include antidepressant medication and face-to-face therapy with a clinician. However, these treatment options have proven sub-optimal. Accordingly, improved electronic devices and methods for treating depressive symptoms associated with multiple sclerosis are needed.
SUMMARY
[0009] The instant disclosure provides various electronic devices, methods, and digital therapeutics for treating depressive symptoms associated with multiple sclerosis. According to one aspect of the disclosure, an electronic device for treating depressive symptoms associated with multiple sclerosis is provided. The electronic device includes a display, an input device, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors. According to this aspect, the one or more programs include instructions for carrying out a method. The method includes displaying, on the display, a feeling selection interface. The feeling selection interface presents a plurality of feeling interface elements, and each feeling interface element is associated with a particular feeling. The method further includes, while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs. The first sequence of inputs includes a feeling selection input that corresponds to a particular feeling interface element. The method further includes, in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface presenting a plurality of intensities associated with the particular feeling. The method further includes, while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs. The second sequence of inputs includes a first feeling intensity input. The first feeling intensity input corresponds to a first intensity of the plurality of intensities. The method further includes, in response to receiving the first feeling intensity input, displaying, on the display, an automatic thought selection interface. The automatic thought selection interface presents a plurality of automatic thought interface elements, and each automatic thought interface element is associated with a particular automatic thought.
The method further includes, while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs. The third sequence of inputs includes an automatic thought selection input corresponding to a particular automatic thought interface element. The method further includes, in response to receiving the automatic thought selection input, displaying, on the display, an alternative thought selection interface. The alternative thought selection interface presents a plurality of alternative thought interface elements. Each alternative thought interface element is associated with a particular alternative thought. The method further includes, while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs. The fourth sequence of inputs includes an alternative thought selection input. The alternative thought selection input corresponds to a particular alternative thought interface element. The method further includes, in response to receiving the alternative thought selection input, displaying, on the display, the feeling spectrum interface. The method further includes, while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs. The fifth sequence of inputs includes a second feeling intensity input. The second feeling intensity input corresponds to a second intensity of the plurality of intensities. The method further includes generating, for display on the display, a journal entry. The journal entry indicates at least any difference between the first feeling intensity input and the second feeling intensity input.
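The staged flow described above can be pictured as a short pipeline: feeling selection, first intensity rating, automatic thought selection, alternative thought selection, and a second intensity rating, ending in a journal entry. The following Python sketch is illustrative only; every name in it (`run_session`, `JournalEntry`, and the field names) is an assumption for clarity, not the patent's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class JournalEntry:
    # Field names are illustrative assumptions, not names from the patent.
    feeling: str
    automatic_thought: str
    alternative_thought: str
    first_intensity: int
    second_intensity: int

    @property
    def intensity_difference(self) -> int:
        # The journal entry indicates at least any difference between
        # the first and second feeling intensity inputs.
        return self.second_intensity - self.first_intensity


def run_session(feeling: str, first_intensity: int,
                automatic_thought: str, alternative_thought: str,
                second_intensity: int) -> JournalEntry:
    """Mirrors the five input stages in order: feeling selection, first
    intensity rating, automatic (negative) thought selection, alternative
    thought selection, and a second rating on the re-displayed feeling
    spectrum interface."""
    return JournalEntry(feeling, automatic_thought, alternative_thought,
                        first_intensity, second_intensity)
```

A session in which the rated intensity drops after the alternative thought is selected would yield a journal entry with a negative `intensity_difference`.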
[0010] This aspect may include one or more of the following optional features as well. In some aspects, the instructions implement a method that includes, in response to receiving the automatic thought selection input, displaying, on the display, a thinking traps interface. The thinking traps interface presents a plurality of thinking trap interface elements associated with the particular automatic thought interface element. Each thinking trap interface element is associated with a particular thinking trap. The method may further include, while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs. The sixth sequence of inputs includes one or more thinking trap selection inputs. The one or more thinking trap selection inputs correspond to one or more particular thinking trap interface elements. The method may further include the journal entry being modified to further indicate the one or more particular thinking trap interface elements.
[0011] In some aspects, the method includes, in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element. The quick recap interface element indicates the particular automatic thought and the one or more particular thinking trap elements.
[0012] In one aspect, the method includes, the journal entry being modified to further indicate the particular alternative thought interface element.
[0013] In another aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a company selection interface. The company selection interface presents a plurality of company interface elements. Each company interface element is associated with a particular relationship type. The method may further include, while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs. The seventh sequence of inputs includes a company selection input. The company selection input corresponds to a particular company interface element. The method may further include the journal entry being modified to further indicate the particular company interface element.
[0014] In some aspects, the method includes, in response to receiving the feeling selection input, displaying, on the display, a location selection interface. The location selection interface presents a plurality of location interface elements. Each location interface element is associated with a particular location. The method may further include, while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs. The eighth sequence of inputs includes a location selection input. The location selection input corresponds to a particular location interface element. The method may further include the journal entry being modified to further indicate the particular location interface element.
[0015] In another aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a multiple sclerosis symptoms selection interface. The multiple sclerosis symptoms selection interface presents a plurality of multiple sclerosis symptom interface elements. Each multiple sclerosis symptom interface element is associated with a particular multiple sclerosis symptom. The method may further include, while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs. The ninth sequence of inputs includes one or more multiple sclerosis symptom selection inputs. The one or more multiple sclerosis symptom selection inputs correspond to one or more particular multiple sclerosis symptom interface elements. The method may further include the journal entry being modified to further indicate the one or more particular multiple sclerosis symptom interface elements.
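Taken together, the optional features above extend the journal entry with selections for thinking traps, company, location, and multiple sclerosis symptoms. A minimal sketch of such an extensible entry follows; the field names are hypothetical, since the disclosure names only the selections themselves, not a data model.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ContextualJournalEntry:
    # All field names are invented for illustration.
    intensity_difference: int
    thinking_traps: list = field(default_factory=list)
    company: Optional[str] = None      # relationship type, e.g. "friend"
    location: Optional[str] = None
    ms_symptoms: list = field(default_factory=list)

    def summary(self) -> dict:
        # Include only the optional selections the user actually made;
        # unselected fields are left out of the rendered entry.
        return {k: v for k, v in vars(self).items() if v not in (None, [])}
```

This mirrors the pattern in paragraphs [0010]-[0015]: each optional interface, if used, further modifies the journal entry, while unused interfaces leave it unchanged.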
[0016] According to another aspect of the disclosure, a computerized method for treating depressive symptoms associated with multiple sclerosis is provided. The method includes, at an electronic device including a display and an input device, displaying, on the display, a feeling selection interface presenting a plurality of feeling interface elements. Each interface element is associated with a particular feeling. While displaying the feeling selection interface, the method further includes receiving, via the input device, a first sequence of inputs. The first sequence of inputs includes a feeling selection input. The feeling selection input corresponds to a particular feeling interface element. The method further includes, in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface. The feeling spectrum interface presents a plurality of intensities associated with the particular feeling. The method further includes, while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs. The second sequence of inputs includes a first feeling intensity input that corresponds to a first intensity of the plurality of intensities. In response to receiving the first feeling intensity input, the method further includes displaying, on the display, an automatic thought selection interface. The automatic thought selection interface presents a plurality of automatic thought interface elements, and each automatic thought interface element is associated with a particular automatic thought. The method further includes, while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs. The third sequence of inputs includes an automatic thought selection input. The automatic thought selection input corresponds to a particular automatic thought interface element. 
In response to receiving the automatic thought selection input, the method further includes displaying, on the display, an alternative thought selection interface. The alternative thought selection interface presents a plurality of alternative thought interface elements, and each alternative thought interface element is associated with a particular alternative thought. The method further includes, while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs. The fourth sequence of inputs includes an alternative thought selection input. The alternative thought selection input corresponds to a particular alternative thought interface element. In response to receiving the alternative thought selection input, the method further includes displaying, on the display, the feeling spectrum interface. The method further includes, while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs. The fifth sequence of inputs includes a second feeling intensity input. The second feeling intensity input corresponds to a second intensity of the plurality of intensities. The method further includes generating, for display on the display, a journal entry. The journal entry indicates at least any difference between the first feeling intensity input and the second feeling intensity input.
[0017] Aspects of the disclosure may include one or more of the following features as well. In one exemplary aspect, the method includes, in response to receiving the automatic thought selection input, displaying, on the display, a thinking traps interface. The thinking traps interface presents a plurality of thinking trap interface elements associated with the particular automatic thought interface element. Each thinking trap interface element is associated with a particular thinking trap. The method may also further include, while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs. The sixth sequence of inputs includes one or more thinking trap selection inputs. The one or more thinking trap selection inputs correspond to one or more particular thinking trap interface elements. The method may also further include the journal entry being modified to further indicate the one or more particular thinking trap interface elements.
[0018] In another aspect, the method includes, in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element. The quick recap interface element indicates the particular automatic thought and the one or more particular thinking trap elements.
[0019] In still another aspect, the method includes, in response to receiving the alternative thought selection input, modifying the journal entry to further indicate the particular alternative thought interface element.
[0020] In one aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a company selection interface. The company selection interface presents a plurality of company interface elements. Each company interface element is associated with a particular relationship type. The method may further include, while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs. The seventh sequence of inputs includes a company selection input. The company selection input corresponds to a particular company interface element. The method may further include the journal entry being modified to further indicate the particular company interface element.
[0021] In another aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a location selection interface. The location selection interface presents a plurality of location interface elements, and each location interface element is associated with a particular location. The method may further include, while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs. The eighth sequence of inputs includes a location selection input. The location selection input corresponds to a particular location interface element. The method may further include the journal entry being modified to further indicate the particular location interface element.
[0022] In one aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a multiple sclerosis symptoms selection interface that presents a plurality of multiple sclerosis symptom interface elements. Each multiple sclerosis symptom interface element is associated with a particular multiple sclerosis symptom. The method may further include, while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs. The ninth sequence of inputs includes one or more multiple sclerosis symptom selection inputs. The one or more multiple sclerosis symptom selection inputs correspond to one or more particular multiple sclerosis symptom interface elements. The method may further include the journal entry being modified to further indicate the one or more particular multiple sclerosis symptom interface elements.
[0023] An exemplary non-transitory computer-readable storage medium, storing one or more programs configured to be executed by one or more processors of an electronic device with a display and an input device, the one or more programs including instructions for performing the foregoing method, is also included as part of the instant disclosure.
[0024] An exemplary method for treatment of depressive symptoms associated with multiple sclerosis in a subject in need thereof, including administering to said subject the foregoing computerized method, is also included as part of the instant disclosure.
[0025] According to another aspect of the disclosure, another computerized method for treating depressive symptoms associated with multiple sclerosis is provided. The method includes, at an electronic device including a display and an input device, receiving, via the input device, feeling assessment data describing a feeling associated with a user. The method further includes receiving, via the input device, first feeling intensity data describing a first intensity of the feeling associated with the user. The method further includes identifying a plurality of potential automatic thoughts based on the feeling associated with the user. Each potential automatic thought of the plurality of potential automatic thoughts corresponds to a negative thought. Additionally, the method includes receiving, via the input device, automatic thought selection data. The automatic thought selection data identifies a particular potential automatic thought from among the plurality of potential automatic thoughts. The method also includes identifying a plurality of potential alternative thoughts based on the automatic thought selection data. Each potential alternative thought of the plurality of potential alternative thoughts corresponds to a positive thought. Further, the method includes receiving, via the input device, alternative thought selection data. The alternative thought selection data identifies a particular potential alternative thought from among the plurality of potential alternative thoughts. Continuing, the method includes receiving, via the input device, second feeling intensity data describing a second intensity of the feeling associated with the user. The method also includes determining any difference between the first intensity and the second intensity to provide feeling intensity difference data. Finally, according to this aspect of the disclosure, the method includes displaying, on the display, the feeling intensity difference data.
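The data flow in this aspect runs from a feeling, to candidate automatic thoughts, to candidate alternative thoughts, to an intensity difference. It can be sketched with simple lookup tables; the mappings below are invented placeholders, and a real implementation would draw on clinically curated content.

```python
# Placeholder content tables; entries are illustrative, not from the patent.
AUTOMATIC_THOUGHTS = {
    "sadness": ["I will never feel better", "No one understands me"],
}
ALTERNATIVE_THOUGHTS = {
    "I will never feel better": ["Hard days have passed before"],
}


def identify_automatic_thoughts(feeling: str) -> list:
    """Each candidate corresponds to a negative thought tied to the feeling."""
    return AUTOMATIC_THOUGHTS.get(feeling, [])


def identify_alternative_thoughts(automatic_thought: str) -> list:
    """Each candidate corresponds to a positive counterpart of the selected
    automatic thought."""
    return ALTERNATIVE_THOUGHTS.get(automatic_thought, [])


def feeling_intensity_difference(first: int, second: int) -> int:
    """Any difference between the first and second intensity ratings."""
    return second - first
```

The design point is that the alternative-thought candidates are keyed to the selected automatic thought, not to the feeling directly, matching the order of steps in this aspect.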
[0026] Exemplary electronic devices for performing the foregoing method are also included as part of the instant disclosure.
[0027] An exemplary non-transitory computer-readable storage medium, storing one or more programs configured to be executed by one or more processors of an electronic device with a display and an input device, the one or more programs including instructions for performing the foregoing method, is also included as part of the instant disclosure.
[0028] An exemplary method for treatment of depressive symptoms associated with multiple sclerosis in a subject in need thereof, including administering to said subject the foregoing computerized method, is also included as part of the instant disclosure.
[0029] According to another aspect of the disclosure, a digital therapeutic for treating depressive symptoms associated with multiple sclerosis is provided. The digital therapeutic includes an automatic thought identification module. The automatic thought identification module is configured to identify a plurality of potential automatic thoughts based on feeling assessment data describing a feeling associated with a user. Each potential automatic thought of the plurality of potential automatic thoughts corresponds to a negative thought. The automatic thought identification module is also configured to receive automatic thought selection data. The automatic thought selection data identifies a particular potential automatic thought from among the plurality of potential automatic thoughts. The digital therapeutic further includes an alternative thought identification module. The alternative thought identification module is configured to identify a plurality of potential alternative thoughts based on the automatic thought selection data. Each potential alternative thought of the plurality of potential alternative thoughts corresponds to a positive thought. The alternative thought identification module is also configured to receive alternative thought selection data. The alternative thought selection data identifies a particular potential alternative thought from among the plurality of potential alternative thoughts. The digital therapeutic further includes a feeling intensity module. The feeling intensity module is configured to receive first feeling intensity data describing a first intensity of the feeling associated with the user at a first point in time. The feeling intensity module is also configured to receive second feeling intensity data describing a second intensity of the feeling associated with the user at a second point in time. The second point in time is later than the first point in time.
The feeling intensity module is also configured to generate feeling intensity difference data. The feeling intensity difference data indicates any difference between the first intensity and the second intensity. The digital therapeutic further includes a display module. The display module is configured to generate display data representing the feeling intensity difference data.
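By way of illustration only, the two-reading comparison performed by the feeling intensity module may be sketched as follows; the class and function names are hypothetical, as the disclosure does not prescribe any particular data format:

```python
from dataclasses import dataclass

@dataclass
class FeelingIntensityRecord:
    """One intensity reading for a feeling, expressed as a 0-100 percentage."""
    feeling: str
    intensity: int  # e.g. 59 for "59% scared"

def feeling_intensity_difference(first: FeelingIntensityRecord,
                                 second: FeelingIntensityRecord) -> int:
    """Return the signed change between two readings of the same feeling.

    A negative result indicates the feeling weakened between the first
    reading and the later second reading.
    """
    if first.feeling != second.feeling:
        raise ValueError("readings must describe the same feeling")
    return second.intensity - first.intensity
```

A negative difference (for example, 59% scared falling to 40% scared) is the outcome the display module would surface to the user as evidence of improvement.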
[0030] This aspect may include one or more of the following optional features as well. In some aspects, the digital therapeutic further includes a feeling assessment module. The feeling assessment module is configured to receive the feeling assessment data describing the feeling associated with the user.
[0031] According to another aspect of the disclosure, the digital therapeutic further includes a thinking traps module. The thinking traps module is configured to identify a plurality of potential thinking traps based on the feeling assessment data and receive thinking trap selection data. The thinking trap selection data identifies one or more particular potential thinking traps from among the plurality of potential thinking traps.
[0032] In another aspect, the digital therapeutic further includes a journal module.
The journal module is configured to generate a journal entry comprising at least the feeling intensity difference data.

[0033] According to another aspect, the digital therapeutic further includes a company module. The company module is configured to receive company selection data. The company selection data identifies, by relationship type, a person who accompanied the user at a time in which the user experienced the feeling. The journal entry further includes the company selection data.
[0034] In yet another aspect of the disclosure, the digital therapeutic further includes a location module. The location module is configured to receive location selection data identifying a location of the user at a time in which the user experienced the feeling. The journal entry further includes the location selection data.
[0035] According to another aspect, the digital therapeutic further includes a multiple sclerosis (MS) symptom module. The multiple sclerosis symptom module is configured to receive multiple sclerosis symptom selection data identifying one or more multiple sclerosis symptoms associated with the user. The journal entry further includes the multiple sclerosis symptom selection data.
[0036] In another aspect of the digital therapeutic, the journal entry further includes the thinking trap selection data.
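As an illustrative, non-limiting sketch, a journal entry aggregating the feeling intensity difference data with the optional company, location, MS symptom, and thinking trap selection data might be assembled as follows; all field names are hypothetical:

```python
def build_journal_entry(feeling_intensity_difference, *, company=None,
                        location=None, ms_symptoms=None, thinking_traps=None):
    """Assemble a journal entry; optional modules contribute fields only
    when their selection data is present."""
    entry = {"feeling_intensity_difference": feeling_intensity_difference}
    if company is not None:
        entry["company"] = company            # relationship type, e.g. "spouse"
    if location is not None:
        entry["location"] = location
    if ms_symptoms is not None:
        entry["ms_symptoms"] = list(ms_symptoms)
    if thinking_traps is not None:
        entry["thinking_traps"] = list(thinking_traps)
    return entry
```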
[0037] An exemplary method for treatment of depressive symptoms associated with multiple sclerosis in a subject in need thereof, including administering to said subject the foregoing digital therapeutic, is also included as part of the instant disclosure.
[0038] The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE FIGURES
[0039] Reference will now be made to the accompanying Figures, which are not necessarily drawn to scale, and wherein:
[0040] FIG. 1 is a schematic view of an example system implementing a computerized method for treating depressive symptoms associated with multiple sclerosis.

[0041] FIG. 2A illustrates a feeling selection interface in accordance with an exemplary embodiment of the disclosure.
[0042] FIG. 2B illustrates a feeling spectrum interface in accordance with an exemplary embodiment of the disclosure.
[0043] FIG. 2C illustrates an automatic thought selection interface in accordance with an exemplary embodiment of the disclosure.
[0044] FIG. 2D illustrates an alternative thought selection interface in accordance with an exemplary embodiment of the disclosure.
[0045] FIG. 2E illustrates a feeling spectrum interface in accordance with an exemplary embodiment of the disclosure.
[0046] FIG. 2F illustrates a thinking traps interface in accordance with an exemplary embodiment of the disclosure.
[0047] FIG. 2G illustrates another view of the thinking traps interface in accordance with an exemplary embodiment of the disclosure.
[0048] FIG. 2H illustrates yet another view of the thinking traps interface in accordance with an exemplary embodiment of the disclosure.
[0049] FIG. 2I illustrates a company selection interface in accordance with an exemplary embodiment of the disclosure.
[0050] FIG. 2J illustrates a location selection interface in accordance with an exemplary embodiment of the disclosure.
[0051] FIG. 2K illustrates a symptoms selection interface in accordance with an exemplary embodiment of the disclosure.
[0052] FIG. 2L illustrates a recap interface element in accordance with an exemplary embodiment of the disclosure.
[0053] FIG. 2M illustrates a journal interface in accordance with an exemplary embodiment of the disclosure.
[0054] FIG. 2N illustrates a positive feeling selection interface in accordance with an exemplary embodiment of the disclosure.
[0055] FIG. 2O illustrates a situation selection interface in accordance with an exemplary embodiment of the disclosure.

[0056] FIG. 2P illustrates a positive reflection element in accordance with an exemplary embodiment of the disclosure.
[0057] FIG. 2Q illustrates a positive journal interface in accordance with an exemplary embodiment of the disclosure.
[0058] FIG. 2R illustrates a relax-and-remind interface in accordance with an exemplary embodiment of the disclosure.
[0059] FIG. 2S illustrates a mindfulness interface in accordance with an exemplary embodiment of the disclosure.
[0060] FIG. 2T illustrates a mindfulness technique data interface in accordance with an exemplary embodiment of the disclosure.
[0061] FIG. 2U illustrates a fatigue interface in accordance with an exemplary embodiment of the disclosure.
[0062] FIG. 2V illustrates a fatigue type data interface in accordance with an exemplary embodiment of the disclosure.
[0063] FIG. 3 is a flowchart illustrating a computerized method for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure.
[0064] FIG. 4 is a flowchart illustrating another computerized method for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure.
[0065] FIG. 5 is a schematic view of an example electronic device for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure.
[0066] FIG. 6 is a functional block diagram illustrating a digital therapeutic for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure.
[0067] Like reference symbols in the various drawings indicate like elements.

DETAILED DESCRIPTION
[0068] Some implementations of the disclosed technology will be described more fully with reference to the accompanying drawings. This disclosed technology may, however, be embodied in many different forms and should not be construed as limited to the implementations set forth herein.
[0069] Example implementations of the disclosed technology provide electronic devices, methods, and digital therapeutics for treating depressive symptoms associated with multiple sclerosis.
[0070] Example implementations of the disclosed technology will now be described with reference to the accompanying figures.
[0071] Referring to FIG. 1, in some implementations, a therapy prescription system 100 provides a patient 101 access to a prescription digital therapeutic 120 prescribed to the patient 101 and monitors events associated with the patient’s 101 interaction with the prescription digital therapeutic 120. Although the digital therapeutic 120 is described herein as being a “prescription” digital therapeutic, it is understood that, according to some implementations, the digital therapeutic 120 will not require a prescription from a clinician. Rather, in such implementations, the digital therapeutic 120 may be available to a patient without a prescription, and the digital therapeutic 120 nonetheless otherwise functions in accordance with the description of the prescription digital therapeutic 120 described herein. According to implementations in which the digital therapeutic 120 is not prescribed, the person using or being administered the digital therapeutic may be referred to as a “user.” A “user” may include a patient 101 or any other person using or being administered the digital therapeutic 120, irrespective of whether the digital therapeutic 120 was prescribed to that person.
[0072] As used herein, a digital therapy may also be referred to as a digital therapeutic configured to deliver evidence-based psychosocial intervention techniques for treating a patient with a particular disease or disorder, as well as symptoms and/or behaviors associated with the particular disease or disorder. In the instant case, the patient 101 is diagnosed with multiple sclerosis (MS) and the prescription digital therapeutic 120 is specifically tailored for addressing one or more depressive symptoms associated with MS that the patient 101 may experience. An authorized healthcare provider (HCP) 109 (e.g., a doctor, nurse, etc.) supervising the patient 101 diagnosed with MS may prescribe the patient 101 the prescription digital therapeutic 120 designed to help the patient 101 identify feelings the patient 101 is experiencing and modify dysfunctional emotions, behaviors, and thoughts in order to treat depressive symptoms in the patient 101. The HCP 109 may include a physician, nurse, clinician, or other health professional qualified for treating patients diagnosed with multiple sclerosis (“MS”).
[0073] In some examples, the system 100 includes a network 106, a patient device 102, an HCP system 140, and a multiple sclerosis therapy service 160. The network 106 provides access to cloud computing resources 150 (e.g., a distributed system) that execute the multiple sclerosis therapy service 160 to provide for the performance of services on remote devices. Accordingly, the network 106 allows patients 101 and HCPs 109 to interact with the multiple sclerosis therapy service 160. For instance, the multiple sclerosis therapy service 160 may provide the patient 101 access to the prescription digital therapeutic 120 and receive event data 122, inputted by the patient 101, associated with the patient’s 101 interaction with the prescription digital therapeutic 120. In turn, the multiple sclerosis therapy service 160 may store the event data 122 on a storage resource 156.
[0074] The network 106 may include any type of network that allows sending and receiving communication signals, such as a wireless telecommunication network, a cellular telephone network, a time division multiple access (TDMA) network, a code division multiple access (CDMA) network, a Global System for Mobile Communications (GSM) network, a third generation (3G) network, a fourth generation (4G) network, a satellite communications network, and other communication networks. The network 106 may include one or more of a Wide Area Network (WAN), a Local Area Network (LAN), and a Personal Area Network (PAN). In some examples, the network 106 includes a combination of data networks, telecommunication networks, and a combination of data and telecommunication networks. The patient device 102, the HCP system 140, and the multiple sclerosis therapy service 160 communicate with each other by sending and receiving signals (wired or wireless) via the network 106. In some examples, the network 106 provides access to cloud computing resources, which may be elastic/on-demand computing and/or storage resources 156 available over the network 106. The term “cloud services” generally refers to services performed not locally on a user’s device, but rather delivered from one or more remote devices accessible via one or more networks 106.
[0075] The patient device 102 may include, but is not limited to, a portable electronic device (e.g., smartphone, cellular phone, personal digital assistant, personal computer, or wireless tablet device), a desktop computer, or any other electronic device capable of sending and receiving information via the network 106. The patient device 102 includes data processing hardware 112 (a computing device that executes instructions), memory hardware 114, and a display 116 in communication with the data processing hardware 112. In some examples, the patient device 102 includes a keyboard 148, mouse, microphones, and/or a camera for allowing the patient 101 to input data. In addition to or in lieu of the display 116, the patient device 102 may include one or more speakers to output audio data to the patient 101. For instance, audible alerts may be output by the speaker to notify the patient 101 about some time sensitive event associated with the prescription digital therapeutic 120. In some implementations, the patient device 102 executes a patient application 103 (or accesses a web-based patient application) for establishing a connection with the multiple sclerosis therapy service 160 to access the prescription digital therapeutic 120. For instance, the patient 101 may have access to the patient application 103 for a duration (e.g., 3 months) of the prescription digital therapeutic 120 prescribed to the patient 101. Here, the patient device 102 may launch the patient application 103 by initially providing an access code 104 when the prescription digital therapeutic 120 is prescribed by the HCP 109 that allows the patient 101 to access content associated with the prescription digital therapeutic 120 from the multiple sclerosis therapy service 160 that is specifically tailored for treating/addressing one or more symptoms associated with MS that the patient 101 may be experiencing. 
The patient application 103, when executing on the data processing hardware 112 of the patient device 102, is configured to display a variety of graphical user interfaces (GUIs) (e.g., the feeling selection GUI 204 shown at FIG. 2A) on the display 116 of the patient device 102 that, among other things, allow the patient 101 to input event data 122 associated with particular feelings the patient 101 is experiencing, solicit information from the patient 101, and present journal entries for the patient 101 to view.
[0076] The patient application 103 may send notifications to the patient device 102. In some embodiments, the patient application 103 may send notifications to the patient device 102 even when the application is not running on the patient device 102. The notifications may be sent to the notification center of the patient device 102. The notifications may remind the patient 101, daily, weekly, or otherwise periodically, to run and engage with the patient application 103. For example, the patient application 103 may cause a notification to be sent to the patient device 102 every evening to remind the patient 101 to open the patient application 103.
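A minimal sketch of the evening-reminder scheduling described above, assuming a fixed daily reminder time; the 7 p.m. default is an assumption, as the disclosure only states that reminders may be sent every evening:

```python
from datetime import datetime, timedelta, time

def next_reminder(now: datetime, reminder_time: time = time(19, 0)) -> datetime:
    """Return the next daily reminder instant at or after `now`.

    If today's reminder time has already passed, schedule tomorrow's.
    """
    candidate = datetime.combine(now.date(), reminder_time)
    if candidate <= now:
        candidate += timedelta(days=1)
    return candidate
```

Weekly or other periodic cadences would follow the same pattern with a different step between candidate instants.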
[0077] The storage resources 156 may provide data storage 158 for storing the event data 122 received from the patient 101 in a corresponding patient record 105, as well as the prescription digital therapeutic 120 prescribed to the patient 101. The patient record 105 may be encrypted while stored in the data storage 158 so that any information identifying the patient 101 is anonymized, but may later be decrypted when the patient 101 or supervising HCP 109 requests the patient record 105 (assuming the requester is authorized/authenticated to access the patient record 105). All data transmitted over the network 106 between the patient device 102 and the cloud computing system 150 may be encrypted and sent over secure communication channels. For instance, the patient application 103 may encrypt the event data 122 before transmitting it to the multiple sclerosis therapy service 160 via the HTTPS protocol and decrypt a patient record 105 received from the multiple sclerosis therapy service 160. When network connectivity is not available, the patient application 103 may store the event data 122 in an encrypted queue within the memory hardware 114 until network connectivity is available.
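The encrypted offline queue described above may be sketched as follows; `encrypt` and `send` are placeholders for whatever cipher and transport the application uses, which the disclosure does not specify:

```python
import json
from collections import deque

class OfflineEventQueue:
    """Holds encrypted event data until network connectivity returns."""

    def __init__(self, encrypt, send):
        self._encrypt = encrypt   # bytes -> bytes; cipher left unspecified
        self._send = send         # transport callable; raises on network failure
        self._queue = deque()

    def record(self, event: dict) -> None:
        """Encrypt an event immediately and enqueue it for later transmission."""
        self._queue.append(self._encrypt(json.dumps(event).encode()))

    def flush(self) -> int:
        """Try to transmit queued events in order; stop at the first failure
        so unsent events remain queued. Returns the number sent."""
        sent = 0
        while self._queue:
            try:
                self._send(self._queue[0])
            except ConnectionError:
                break
            self._queue.popleft()
            sent += 1
        return sent
```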
[0078] The HCP system 140 may be located at a clinic, doctor’s office, or facility administered by the HCP 109 and includes data processing hardware 142, memory hardware 144, and a display 146. The memory hardware 144 and the display 146 are in communication with the data processing hardware 142. For instance, the data processing hardware 142 may reside on a desktop computer or portable electronic device for allowing the HCP 109 to input and retrieve data to and from the multiple sclerosis therapy service 160. In some examples, the HCP 109 may initially onboard some or all of the patient data 107 at the time of prescribing the prescription digital therapeutic 120 to the patient 101. The HCP system 140 includes a keyboard 148, mouse, microphones, speakers, and/or a camera. In some implementations, the HCP system 140 (i.e., via the data processing hardware 142) executes an HCP application 110 (or accesses a web-based HCP application) for establishing a connection with the multiple sclerosis therapy service 160 to input and retrieve data therefrom. For instance, the HCP system 140 may be able to access the anonymized patient record 105 securely stored by the multiple sclerosis therapy service 160 on the storage resources 156 by providing an authentication token 108 validating that the HCP 109 is supervising the patient 101 and is authorized to access the corresponding patient record 105. The authentication token 108 may identify the particular patient 101 associated with the patient record 105 that the HCP system 140 is permitted to obtain from the multiple sclerosis therapy service 160. The patient record 105 may include time-stamped event data 122 indicating the patient’s interaction with the prescription digital therapeutic 120 through the patient application 103 executing on the patient device 102.
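A simplified sketch of token-gated access to an anonymized patient record 105; the token structure shown is hypothetical, since the disclosure does not define the format of the authentication token 108:

```python
def fetch_patient_record(records: dict, token: dict) -> dict:
    """Return the anonymized record the token authorizes, if any.

    `records` maps a patient identifier to a stored record; `token`
    carries the patient identifier the HCP is permitted to view plus a
    validation flag. Both structures are illustrative only.
    """
    patient_id = token.get("patient_id")
    if patient_id is None or not token.get("validated"):
        raise PermissionError("token does not authorize record access")
    return records[patient_id]
```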
[0079] The cloud computing resources 150 may be a distributed system (e.g., remote environment) having scalable/elastic resources 152. The resources 152 include computing resources 154 (e.g., data processing hardware) and/or the storage resources 156 (e.g., memory hardware). The cloud computing resources 150 execute the multiple sclerosis therapy service 160 for facilitating communications with the patient device 102 and the HCP system 140 and storing data on the storage resources 156 within the data storage 158. In some examples, multiple sclerosis therapy service 160 and the data storage 158 reside on a standalone computing device. The multiple sclerosis therapy service 160 may provide the patient 101 with the patient application 103 (e.g., a mobile application, a web-site application, or a downloadable program that includes a set of instructions) executable on the data processing hardware 112 and accessible through the network 106 via the patient device 102 when the patient 101 provides a valid access code 104.
Similarly, the multiple sclerosis therapy service 160 may provide the HCP 109 with the HCP application 110 (e.g., a mobile application, a web-site application, or a downloadable program that includes a set of instructions) executable on the data processing hardware 142 and accessible through the network 106 via the HCP system 140.
[0080] FIGS. 2A-2Q illustrate schematic views of exemplary GUIs of the prescription digital therapeutic 120 (e.g., by execution of the patient application 103) displayed on the display 116 of the patient device 102 for treating depressive symptoms associated with MS. The example GUIs are configured to display graphical elements (e.g., buttons) that the patient 101 may select via user inputs such as touch inputs, speech inputs, or other input techniques such as via a mouse, stylus, keyboard, gesture, or eye gaze.
[0081] Referring to FIG. 2A, in some implementations, upon launching the patient application 103 associated with the prescription digital therapeutic 120 prescribed to the patient 101, the patient application 103 displays a feeling selection GUI 204 that allows the patient 101 to input a particular feeling they are presently experiencing or have recently experienced. In the example shown, the feeling selection GUI 204 provides a plurality of feeling interface elements 205, each element 205a-205n associated with a corresponding feeling the patient 101 is experiencing or has recently experienced. While the example shown depicts interface elements 205a-205g, the patient 101 may view additional interface elements 205n by scrolling (e.g., via a swipe gesture). The plurality of feeling interface elements 205 may be prepopulated based on common feelings a typical patient diagnosed with MS may be experiencing. The patient 101 may indicate their current feelings by selecting the corresponding feeling interface element 205 displayed in the feeling selection GUI 204. In the example shown, a first feeling interface element 205a (“Anxious”) indicates that the patient 101 is feeling anxious, a second feeling interface element 205b (“Scared”) indicates that the patient 101 is feeling scared, a third feeling interface element 205c (“Dreadful”) indicates that the patient 101 is feeling dreadful, a fourth feeling interface element 205d (“Panicked”) indicates that the patient 101 is feeling panicked, a fifth feeling interface element 205e (“Angry”) indicates that the patient 101 is feeling angry, a sixth feeling interface element 205f (“Frustrated”) indicates that the patient 101 is feeling frustrated, and a seventh feeling interface element 205g (“Grieved”) indicates that the patient 101 is feeling grieved.
[0083] The feeling interface elements 205a-205g do not represent an exhaustive list of all feeling interface elements, but rather an exemplary list of feeling interface elements that may be included as part of the feeling selection GUI 204. Furthermore, the feeling selection GUI 204 may include other feeling interface elements in addition to feeling interface elements 205a-205g, or may omit one or more of feeling interface elements 205a-205g, without departing from the teachings herein. In some implementations, each of the plurality of feeling interface elements 205 is categorized as being associated with one of “Negative” feelings or “Positive” feelings, such that additional feeling interface elements 205 within the Positive category (e.g., FIG. 2N) may be associated with feelings such as calm (“Calm”), neutral (“Okay”), prideful (“Proud”), optimistic (“Hopeful”), or content (“Happy”).
[0083] In the example shown, the patient device 102 detects a first sequence of inputs, the first sequence of inputs including a feeling selection input 206 (e.g., touch or spoken) corresponding to the feeling interface element 205b (“Scared”) indicating that the patient 101 is feeling scared. As used herein, a sequence of inputs can be a single input. In some implementations, the feeling selection input 206 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling scared.
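A time-stamped selection event of the kind transmitted as event data 122 might be serialized as follows; the field names are illustrative only, as the disclosure does not define a wire format:

```python
import json
from datetime import datetime, timezone

def make_event(selection: str, value: str) -> str:
    """Serialize a time-stamped selection event for transmission to the
    therapy service. Field names are hypothetical."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "selection": selection,   # e.g. "feeling"
        "value": value,           # e.g. "scared"
    })
```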
[0084] In some examples, the feeling selection input 206 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected feeling. In other examples, the feeling selection input 206 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected feeling.
[0085] After detecting selection of a feeling interface element 205, the patient application 103 advances to display a feeling spectrum GUI 207 (FIG. 2B) on the display 116 of the patient device 102. In some configurations, the feeling selection input 206 selecting the feeling interface element 205 causes the patient application 103 to automatically display the feeling spectrum GUI 207. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected feeling interface element 205 by selecting a Feeling Selection Done Button 237 (e.g., as shown in FIG. 2A). In these configurations, the patient application 103 displays the feeling spectrum GUI 207 in response to a selection indication indicating selection of the Feeling Selection Done Button 237.
[0086] At FIG. 2B, in some configurations, the patient application 103 causes the patient device 102 to display the feeling spectrum GUI 207 that allows the patient 101 to input a feeling intensity of the particular feeling that they are presently experiencing. In the example shown, the feeling spectrum GUI 207 provides a plurality of intensities 208, each individual intensity 208a-208e being associated with a corresponding intensity of the particular feeling the patient 101 may be presently experiencing. The patient 101 may indicate the present intensity of their current feelings by moving a Slider button 238 to select a corresponding intensity. In some configurations, the Slider button 238 translates up and down a Scale 241, and the position of the Slider button 238 relative to the Scale 241 indicates a particular intensity. For example, the location of the Slider button 238 relative to the Scale 241 is reflected in an intensity value 239. The intensity value 239 provides the patient 101 with a numerical percentage value of the intensity of their current feeling. For example, if the patient 101 translates the Slider button 238 more than halfway up the Scale 241, the intensity value 239 will reflect a higher percentage value. As seen in FIG. 2B, the location of the Slider button 238 relative to the Scale 241 indicates the intensity of the feeling of being scared, and the intensity value 239 indicates that the patient 101 is 59% scared.
[0087] With continued reference to FIG. 2B, in some configurations, the location of Slider button 238 relative to the Scale 241 will correspond to one of the plurality of intensities 208. The patient 101 may indicate a feeling intensity of the particular feeling that they are currently feeling by translating the Slider button 238 relative to the Scale 241 to correspond to one of the plurality of intensities 208 displayed in the feeling spectrum GUI 207. The plurality of intensities 208 correspond to the feeling selection input 206 that was selected in the prior GUI, feeling selection GUI 204. In the example shown, the plurality of intensities 208 correspond to the feeling of“scared”; a first intensity 208a (“Extremely”) indicates that the patient is feeling extremely scared, a second intensity 208b (“Very”) indicates that the patient is feeling very scared, a third intensity 208c (“Fairly”) indicates that the patient is feeling fairly scared, a fourth intensity 208d (“A little”) indicates that the patient is feeling a little scared, and the fifth intensity 208e (“Barely”) indicates that the patient is feeling barely scared. The intensities 208a-208e do not represent an exhaustive list of all intensities, but rather an exemplary list of feeling interface elements that may be included on the feeling spectrum GUI 207. Furthermore, feeling spectrum GUI 207 may include other intensities in addition to the intensities 208a-208e, or may omit one or more intensities 208a-208e.
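The mapping from the slider position (expressed as the percentage intensity value 239) to one of the five intensity labels 208a-208e may be sketched as follows; the band boundaries are assumptions, since FIG. 2B shows the labels but not their exact cut-offs:

```python
def intensity_label(percent: int) -> str:
    """Map a 0-100 slider value to one of the five intensity bands of
    FIG. 2B. The 20-point bands are illustrative assumptions."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be within 0-100")
    bands = [(20, "Barely"), (40, "A little"), (60, "Fairly"),
             (80, "Very"), (100, "Extremely")]
    for upper, label in bands:
        if percent <= upper:
            return label
```

Under these assumed bands, the 59% reading shown in FIG. 2B falls in the "Fairly" band, matching the third intensity 208c.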
[0088] In the example shown, the patient device 102 detects a second sequence of inputs, the second sequence of inputs including a first feeling intensity input 209 (e.g., touch or spoken) that selects the intensity 208c, corresponding to the intensity value 239, indicating that they are feeling fairly scared. In some implementations, the first feeling intensity input 209 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling fairly scared.
[0089] In some examples, the first feeling intensity input 209 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected feeling intensity. In other examples, the first feeling intensity input 209 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected feeling intensity.
[0090] After detecting selection of the plurality of intensities 208, the patient application 103 advances to display an automatic thought selection GUI 210 (FIG. 2C) on the display 116 of the patient device 102. In some configurations, the first feeling intensity input 209 selecting one of the plurality of intensities 208 causes the patient application 103 to automatically display the automatic thought selection GUI 210. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected one of the plurality of intensities 208 by selecting a Feeling Spectrum Done Button 240 (e.g., as shown in FIG. 2B). In these configurations, the patient application 103 displays the automatic thought selection interface GUI 210 in response to a selection indication indicating selection of the Feeling Spectrum Done Button 240. According to some examples, and as shown in FIG. 2B, the text included within the Feeling Spectrum Done Button 240 may be based on the selected feeling intensity.
[0091] At FIG. 2C, in some configurations, the patient application 103 causes the patient device 102 to display the automatic thought selection GUI 210 that allows the patient 101 to input a particular automatic thought corresponding to their thoughts. In the example shown, the automatic thought selection GUI 210 provides a plurality of automatic thought interface elements 211, each individual automatic thought interface element 211a-211n being associated with a corresponding automatic thought that the patient 101 may have recently had, or currently has. While the example shown depicts automatic thought interface elements 211a-211j, the patient 101 may view additional interface elements 211n by scrolling (e.g., via a swipe gesture). The automatic thoughts represent thoughts that are common in patients with MS. As depicted in FIG. 2C, in the example shown, the particular thoughts are negative thoughts that users with MS experience that can cause depressive symptoms. Displaying common automatic thoughts advantageously allows the patient 101 to identify a particular thought that the patient has that may be associated with one or more depressive symptoms. The plurality of automatic thought interface elements 211 may be prepopulated based on common automatic thoughts a typical patient diagnosed with MS may have had or currently has. The patient 101 may indicate the automatic thought associated with them by selecting the corresponding automatic thought interface element 211 displayed in the automatic thought selection GUI 210. In the example shown, a first automatic thought interface element 211a (“Relax and calm down”) indicates that the patient 101 has or had the thought to relax and calm down, a second automatic thought interface element 211b (“When you get her/him going you can’t stop her at all.”) indicates that the patient 101 has or had the thought that when you get him/her going you can’t stop her at all, a third automatic thought interface element 211c (“I need to calm down”) indicates that the patient 101 has or had the thought that they need to calm down, a fourth automatic thought interface element 211d (“Why is my wife with me?”) indicates that the patient 101 has or had the thought asking why their wife is still with them, a fifth automatic thought interface element 211e (“Why can’t I have that?”) indicates that the patient 101 has or had the thought asking why they can’t have that, a sixth automatic thought interface element 211f (“I hate to bother people”) indicates that the patient 101 has or had the thought that they hate to bother people, a seventh automatic thought interface element 211g (“I’m not good enough”) indicates that the patient 101 has or had the thought that they are not good enough, an eighth automatic thought interface element 211h (“I’m worthless”) indicates that the patient 101 has or had the thought that they are worthless, a ninth automatic thought interface element 211i (“I can’t do anything correctly”) indicates that the patient 101 has or had the thought that they can’t do anything correctly, and a tenth automatic thought interface element 211j (“No one is ever going to be able to rely on me”) indicates that the patient 101 has or had the thought that no one is ever going to be able to rely on them.
[0092] The automatic thought interface elements 211a-211j do not represent an exhaustive list of all automatic thought interface elements, but rather an exemplary list of automatic thought interface elements that may be included on the automatic thought selection GUI 210. Furthermore, the automatic thought selection GUI 210 may include other automatic thought interface elements in addition to automatic thought interface elements 211a-211j, or may omit one or more automatic thought interface elements 211a-211j.
[0093] In the example shown, the patient device 102 detects a third sequence of inputs, the third sequence of inputs including an automatic thought selection input 212 (e.g., touch or spoken) corresponding to the automatic thought interface element 211f (“I hate to bother people”) indicating that the patient 101 has or has recently had the thought that they hate to bother people. In some implementations, the automatic thought selection input 212 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient 101 has or had the thought that they hate to bother people.

[0094] In some examples, the automatic thought selection input 212 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected automatic thought. In other examples, the automatic thought selection input 212 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected automatic thought.
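The time-stamped event data 122 described above can be pictured, purely as an illustrative sketch, as a small structured payload carrying the selection indication. The patent does not specify a wire format, so every field name below is an assumption:

```python
import json
import time


def build_event_data(event_type: str, selection: str) -> str:
    """Build a time-stamped event payload for the therapy service.

    Field names here are illustrative assumptions; the patent describes
    transmitting time-stamped event data 122 with a selection indication,
    but does not define a schema for it.
    """
    payload = {
        "timestamp": time.time(),           # when the selection input was detected
        "event_type": event_type,           # e.g. "automatic_thought_selection"
        "selection_indication": selection,  # text of the selected interface element
    }
    return json.dumps(payload)
```

A payload built this way would be sent once per selection input, allowing the service to reconstruct the order of the patient's interactions from the timestamps.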
[0095] After detecting selection of an automatic thought interface element 211, the patient application 103 advances to display an alternative thought selection GUI 213 (FIG. 2D) on the display 116 of the patient device 102. In some configurations, the automatic thought selection input 212 selecting the automatic thought interface element 211 causes the patient application 103 to automatically display the alternative thought selection GUI 213. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected automatic thought interface element 211 by selecting an Automatic Thought Selection Done Button 242. In these configurations, the patient application 103 displays the alternative thought selection GUI 213 in response to a selection indication indicating selection of the Automatic Thought Selection Done Button 242.
[0096] At FIG. 2D, in some configurations, the patient application 103 causes the patient device 102 to display the alternative thought selection GUI 213 that allows a patient 101 to input a particular alternative thought corresponding to their thoughts. In the example shown, the alternative thought selection GUI 213 provides a plurality of alternative thought interface elements 214, each individual alternative thought interface element 214a-214n being associated with a corresponding alternative thought that the patient 101 can use to modify their thoughts and feelings. While the example shown depicts alternative thought interface elements 214a-214h, the patient 101 may view additional interface elements 214n by scrolling (e.g., via a swipe gesture). The alternative thoughts represent thoughts that can help users with MS modify their automatic thoughts by changing the distortion of their thoughts. The alternative thoughts reflect positive thoughts that patients with depressive symptoms associated with MS can think about to modify their automatic thought(s) that are related to their depressive symptoms. The plurality of alternative thought interface elements 214 may be prepopulated based on recommended alternative thoughts a typical patient diagnosed with MS would find beneficial to think about in order to modify automatic thought(s). The patient 101 may indicate the alternative thought that they would like to use to modify their feelings and thoughts by selecting the corresponding alternative thought interface element 214 displayed in the alternative thought selection GUI 213. 
In the example shown, a first alternative thought interface element 214a (“I’m going to get through this eventually”) indicates that the patient 101 would like to modify their thoughts to thinking that they are going to get through this eventually, a second alternative thought interface element 214b (“I cannot hurt myself, my kids need me”) indicates that the patient 101 would like to modify their thoughts to thinking that they cannot hurt themselves and their kids need them, a third alternative thought interface element 214c (“Trying to talk myself out of the depths of despair and looking at the good things I have”) indicates that the patient 101 would like to modify their thoughts to trying to talk themselves out of the depths of despair and looking at the good things they have, a fourth alternative thought interface element 214d (“Try not to worry about tomorrow”) indicates that the patient 101 would like to modify their thoughts to try not to worry about tomorrow, a fifth alternative thought interface element 214e (“You have to keep pushing, be the man that you have always wanted to be”) indicates that the patient 101 would like to modify their thoughts to thinking that they have to keep pushing to be the person that they have always wanted to be, a sixth alternative thought interface element 214f (“My family is going to be ok”) indicates that the patient 101 would like to modify their thoughts to thinking that their family is going to be ok, a seventh alternative thought interface element 214g (“Take your time and complete the job right”) indicates that the patient 101 would like to modify their thoughts to thinking about taking their time and completing the job right, and an eighth alternative thought interface element 214h (“Somehow someway God will provide”) indicates that the patient 101 would like to modify their thoughts to thinking that somehow someway God will provide.
[0097] The alternative thought interface elements 214a-214h do not represent an exhaustive list of all alternative thought interface elements, but rather an exemplary list of alternative thought interface elements that may be included on the alternative thought selection GUI 213. Furthermore, the alternative thought selection GUI 213 may include other alternative thought interface elements in addition to alternative thought interface elements 214a-214h, or may omit one or more alternative thought interface elements 214a-214h.
[0098] In the example shown, the patient device 102 detects a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input 215 (e.g., touch or spoken) corresponding to the alternative thought interface element 214d (“Try not to worry about tomorrow”) indicating that the patient 101 would like to modify their thoughts to try not to worry about tomorrow. In some implementations, the alternative thought selection input 215 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient would like to modify their thoughts to try not to worry about tomorrow.
[0099] In some examples, the alternative thought selection input 215 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected alternative thought. In other examples, the alternative thought selection input 215 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected alternative thought.
[00100] After detecting selection of an alternative thought interface element 214, the patient application 103 advances to display the feeling spectrum GUI 207 (FIG. 2E) on the display 116 of the patient device 102. In some configurations, the alternative thought selection input 215 selecting the alternative thought interface element 214 causes the patient application 103 to automatically display the feeling spectrum GUI 207. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected alternative thought interface element 214 by selecting an Alternative Thought Selection Done Button 243 (e.g., as shown in FIG. 2D). In these configurations, the patient application 103 displays the feeling spectrum GUI 207 in response to a selection indication indicating selection of the Alternative Thought Selection Done Button 243.
[00101] At FIG. 2E, in some configurations, the patient application 103 causes the patient device 102 to display again the feeling spectrum GUI 207 that allows a patient 101 to, again, input a feeling intensity of the particular feeling that they are presently experiencing or recently felt. In the example shown, the patient device 102 detects a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input 216 (e.g., touch or spoken) that selects the fifth intensity 208e, corresponding to an updated intensity value 244, indicating that they are feeling barely scared. In some implementations, the second feeling intensity input 216 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling barely scared.
[00102] In some examples, the second feeling intensity input 216 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating at least any difference between the first feeling intensity input 209 and the second feeling intensity input 216 (e.g., as reflected through a percentage decrease or the like). In other examples, the second feeling intensity input 216 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate at least any difference between the first feeling intensity input 209 and the second feeling intensity input 216.
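The difference between the first feeling intensity input 209 and the second feeling intensity input 216 may be reflected as a percentage decrease or the like. A minimal sketch of one such computation, assuming intensities are expressed on a 0-100 scale (the function name and scale are assumptions, not taken from the patent):

```python
def intensity_change_percent(first: int, second: int) -> float:
    """Percentage decrease from the first to the second feeling intensity.

    Assumes a 0-100 intensity scale; a negative result would indicate
    that the feeling intensified rather than eased. This is one plausible
    way to express "any difference" between the two inputs, not the
    application's actual formula.
    """
    if first == 0:
        return 0.0
    return round((first - second) / first * 100, 1)
```

Alternatively, the journal could report a simple percentage-point difference (`first - second`); the patent leaves the exact presentation open ("a percentage decrease or the like").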
[00103] After detecting selection of one of the plurality of intensities 208, the patient application 103 advances to display a next GUI on the display 116 of the patient device 102. In some configurations, the second feeling intensity input 216 selecting one of the plurality of intensities 208 causes the patient application 103 to automatically display the next GUI. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected one of the plurality of intensities 208 by selecting the Feeling Spectrum Done Button 240. In these configurations, the patient application 103 displays the next GUI in response to a selection indication indicating selection of the Feeling Spectrum Done Button 240.
[00104] Referring now to FIGs. 2F-2M, the patient application 103 may display some or all of the GUIs corresponding to the figures. The GUIs corresponding to FIGs. 2F-2M may be displayed, if at all, in any particular order at any time the patient 101 interacts with the patient application 103.
[00105] At FIGs. 2F-2H, in some configurations, the patient application 103 causes the patient device to display a thinking traps GUI 217 that allows the patient 101 to input a thinking trap associated with the particular thoughts they are having. In the examples shown, the thinking traps GUI 217 provides a plurality of thinking trap interface elements 218, each individual thinking trap interface element 218a-218n being associated with a corresponding thinking trap the patient 101 may be presently thinking or may have recently thought. It should be noted that while the example shown depicts the thinking traps GUI 217 displaying the plurality of thinking trap interface elements 218, in other examples, the thinking traps GUI 217 can display other types of cognitive distortions besides thinking traps. While the example shown depicts thinking trap interface elements 218a-218b, the patient 101 may view additional thinking trap interface elements 218n by scrolling (e.g., via a swipe gesture). The plurality of thinking trap interface elements 218 may be prepopulated based on thinking traps a typical patient diagnosed with MS may be thinking. In some examples, the particular thinking trap interface elements 218a-218b identified for presentation via the GUI 217 may be based on the feeling selected by the patient 101 via, for example, GUI 204 (see FIG. 2A). The patient 101 may indicate their thinking by selecting one or more corresponding thinking trap interface elements 218a-218b displayed in the thinking traps GUI 217. In the examples shown (e.g., as shown in FIGs. 2F-2H), a first thinking trap interface element 218a (“Overgeneralizing”) indicates that the patient 101 is overgeneralizing, and a second thinking trap interface element 218b (“Catastrophizing”) indicates that the patient 101 is catastrophizing.
The thinking trap interface elements 218a-218b do not represent an exhaustive list of all thinking trap interface elements, but rather an exemplary list of thinking trap interface elements that may be included as part of the thinking traps GUI 217. Furthermore, the thinking traps GUI 217 may include other thinking trap interface elements in addition to thinking trap interface elements 218a-218b, or may omit one or more of thinking trap interface elements 218a-218b.
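One way the feeling-based prepopulation mentioned above could be implemented is a simple lookup from the selected feeling to a list of thinking traps. The mapping, feeling names, and fallback below are illustrative assumptions, not the application's actual pairing:

```python
# Hypothetical pairing of feelings to the thinking trap elements shown;
# only "Overgeneralizing" and "Catastrophizing" appear in the example GUIs.
THINKING_TRAPS_BY_FEELING = {
    "scared": ["Overgeneralizing", "Catastrophizing"],
    "sad": ["All-or-Nothing Thinking", "Mental Filtering"],
}

DEFAULT_TRAPS = ["Overgeneralizing", "Catastrophizing"]


def traps_for_feeling(feeling: str) -> list:
    """Return the thinking trap elements to prepopulate for a selected
    feeling, falling back to a generic list for unrecognized feelings."""
    return THINKING_TRAPS_BY_FEELING.get(feeling.lower(), DEFAULT_TRAPS)
```

A table-driven approach like this keeps the feeling-to-trap pairing editable by clinicians without code changes, which fits the prepopulation behavior the specification describes.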
[00106] In the example shown, the patient device 102 detects a sixth sequence of inputs, the sixth sequence of inputs including a thinking trap selection input 219a (e.g., touch or spoken) corresponding to a Sounds Like Me Button 245a that corresponds to the thinking trap interface element 218a (“Overgeneralizing”) indicating that the patient 101 is overgeneralizing. In some implementations, the patient 101 can select one or more thinking trap interface elements by selecting more than one Sounds Like Me Button 245, each Sounds Like Me Button 245 corresponding to a thinking trap interface element 218. In other implementations, the patient 101 may opt not to select any thinking trap interface elements. In an example in which the patient opts to select one or more thinking trap interface elements, the patient 101 could select the Sounds Like Me Button 245a that corresponds to the thinking trap interface element 218a and a Sounds Like Me Button 245b that corresponds to the thinking trap interface element 218b, indicating that the patient 101 is both overgeneralizing and catastrophizing.
[00107] In some implementations, the thinking trap selection input 219a causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently overgeneralizing.
[00108] In some examples, the thinking trap selection input 219 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected thinking trap. In other examples, the thinking trap selection input 219 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected thinking trap.
[00109] In some examples, a company selection GUI 221 (FIG. 2I) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the company selection GUI 221, according to one example, in response to the patient 101 selecting one or more thinking trap interface elements 218a-218b. In some configurations, the thinking trap selection input 219 selecting the Sounds Like Me button
245 causes the patient application 103 to automatically display the company selection GUI 221. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected thinking trap interface element 218 by selecting a Done button
246 (e.g., as shown in FIG. 2F). In these configurations, the patient application 103 displays the company selection GUI 221 in response to a selection indication indicating selection of the Done button 246.
[00110] At FIG. 2I, in some configurations, the patient application 103 causes the patient device 102 to display the company selection GUI 221 that allows a patient 101 to input the company that they were with when they felt the particular feeling. In the examples shown, the company selection GUI 221 provides a plurality of company interface elements 233, each individual company interface element 233a-n being associated with a corresponding person (as identified by relationship type) that the patient 101 may have been with prior to, or when experiencing, the particular feeling. While the example shown depicts interface elements 233a-233e, the patient 101 may view additional company interface elements 233n by scrolling (e.g., via a swipe gesture). The plurality of company interface elements 233 may be prepopulated based on company a typical patient diagnosed with MS may be with when they experience a particular feeling. The patient 101 may indicate the company that they were with when they experienced the particular feeling by selecting the corresponding company interface element 233 displayed in the company selection GUI 221. In the example shown, a first company interface element 233a (“My Self”) indicates that the patient 101 was alone when they experienced the particular feeling, a second company interface element 233b (“My Partner”) indicates that the patient 101 was with their partner when they experienced the particular feeling, a third company interface element 233c (“My Children”) indicates that the patient 101 was with their children when they experienced the particular feeling, a fourth company interface element 233d (“My Sibling”) indicates that the patient 101 was with their sibling when they experienced the particular feeling, and a fifth company interface element 233e (“My Parent”) indicates that the patient 101 was with their parent when they experienced the particular feeling.
[00111] The company interface elements 233a-e do not represent an exhaustive list of all company interface elements, but rather an exemplary list of company interface elements that may be included on company selection GUI 221. Furthermore, company selection GUI 221 may include other company interface elements in addition to company interface elements 233a-233e, or may omit one or more of company interface elements 233a-233e.
[00112] In the example shown, the patient device 102 detects a seventh sequence of inputs, the seventh sequence of inputs including a company selection input 223 (e.g., touch or spoken) corresponding to the company interface element 233d (“My Sibling”) indicating that the patient 101 was with their sibling when they felt the particular feeling. In some implementations, the company selection input 223 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient was with their sibling when they felt the particular feeling.
[00113] In some examples, the company selection input 223 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected company. In other examples, the company selection input 223 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected company.
[00114] In some examples, a location selection GUI 224 (FIG. 2J) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the location selection GUI 224, according to one example, in response to the patient 101 selecting a company interface element 233. In some configurations, the company selection input 223 selecting the company interface element 233 causes the patient application 103 to automatically display the location selection GUI 224. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected company interface element 233 by selecting a Company Selection Done Button 247 (e.g., as shown in FIG. 2I). In these configurations, the patient application 103 displays the location selection GUI 224 in response to a selection indication indicating selection of the Company Selection Done Button 247.
[00115] At FIG. 2J, in some configurations, the patient application 103 causes the patient device 102 to display the location selection GUI 224 that allows a patient 101 to input the location that patient 101 was at prior to, or when, the patient 101 felt the particular feeling. In the examples shown, the location selection GUI 224 provides a plurality of location interface elements 225, each individual location interface element 225a-n being associated with a corresponding location that the patient 101 may have been at prior to, or when, experiencing the particular feeling. While the example shown depicts location interface elements 225a-225e, the patient 101 may view additional location interface elements 225n by scrolling (e.g., via a swipe gesture). The plurality of location interface elements 225 may be prepopulated based on locations commonly frequented by patients diagnosed with MS. The patient 101 may indicate the location that they were at prior to, or when, they experienced the particular feeling by selecting the corresponding location interface element 225 displayed in the location selection GUI 224. In the example shown, a first location interface element 225a (“Home”) indicates that the patient 101 was at home when they experienced the particular feeling, a second location interface element 225b (“Doctor”) indicates that the patient 101 was at their doctor’s office when they experienced the particular feeling, a third location interface element 225c (“Work”) indicates that the patient 101 was at their work or place of employment when they experienced the particular feeling, a fourth location interface element 225d (“Commute”) indicates that the patient 101 was commuting to and/or from a location when they experienced the particular feeling, and a fifth location interface element 225e (“Store”) indicates that the patient 101 was at a store when they experienced the particular feeling.
[00116] The location interface elements 225a-e do not represent an exhaustive list of all location interface elements, but rather an exemplary list of location interface elements that may be included on location selection GUI 224. Furthermore, location selection GUI 224 may include other location interface elements in addition to location interface elements 225a-225e, or may omit one or more of location interface elements 225a-225e.

[00117] In the example shown, the patient device 102 detects an eighth sequence of inputs, the eighth sequence of inputs including a location selection input 226 (e.g., touch or spoken) corresponding to the location interface element 225d (“Commute”) indicating that the patient 101 was commuting to or from a location when they felt the particular feeling. In some implementations, the location selection input 226 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient was commuting to or from a location when they experienced the particular feeling.
[00118] In some examples, the location selection input 226 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected location. In other examples, the location selection input 226 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected location.
[00119] In some examples, an MS symptoms selection GUI 227 (FIG. 2K) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the MS symptoms selection GUI 227, according to one example, in response to the patient 101 selecting a location interface element 225. In some configurations, the location selection input 226 selecting the location interface element 225 causes the patient application 103 to automatically display the MS symptoms selection GUI 227. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected location interface element 225 by selecting a Location Selection Done Button 248. In these configurations, the patient application 103 displays the MS symptom selection GUI 227 in response to a selection indication indicating selection of the Location Selection Done Button 248.
[00120] At FIG. 2K, in some configurations, the patient application 103 causes the patient device 102 to display the MS symptom selection GUI 227 that allows a patient 101 to input one or more MS symptoms that they experienced associated with the particular feeling. In the examples shown, the MS symptom selection GUI 227 provides a plurality of MS symptom interface elements 228, each individual MS symptom interface element 228a-n being associated with a corresponding symptom that the patient 101 may have experienced associated with the particular feeling. While the example shown depicts MS symptom interface elements 228a-228h, the patient 101 may view additional MS symptom interface elements 228n by scrolling (e.g., via a swipe gesture). The plurality of MS symptom interface elements 228 may be prepopulated based on MS symptoms a patient diagnosed with MS may experience related to the selected feeling (e.g., as selected through GUI 204 shown at FIG. 2A). The patient 101 may indicate the MS symptom that they experienced associated with the particular feeling by selecting the corresponding MS symptom interface element 228 displayed in the MS symptom selection GUI 227.
In the example shown, a first MS symptom interface element 228a (“Relapse”) indicates that the patient 101 had a relapse associated with the particular feeling, a second MS symptom interface element 228b (“Fatigue”) indicates that the patient 101 experienced fatigue associated with the particular feeling, a third MS symptom interface element 228c (“Brain Fog”) indicates that the patient 101 experienced brain fog associated with the particular feeling, a fourth MS symptom interface element 228d (“Tremor”) indicates that the patient 101 experienced at least one tremor associated with the particular feeling, a fifth MS symptom interface element 228e (“Focus”) indicates that the patient 101 experienced difficulty focusing associated with the particular feeling, a sixth MS symptom interface element 228f (“Memory”) indicates that the patient 101 experienced memory problems associated with the particular feeling, a seventh MS symptom interface element 228g (“Balance Problems”) indicates that the patient 101 experienced balance problems associated with the particular feeling, and an eighth MS symptom interface element 228h (“Vision”) indicates that the patient 101 experienced vision problems associated with the particular feeling.
[00121] The MS symptoms interface elements 228a-h do not represent an exhaustive list of all MS symptom interface elements, but rather an exemplary list of symptom interface elements that may be included on MS symptom selection GUI 227.
Furthermore, MS symptom selection GUI 227 may include other symptom interface elements in addition to symptom interface elements 228a-228h, or may omit one or more of MS symptom interface elements 228a-228h.

[00122] In the example shown, the patient device 102 detects a ninth sequence of inputs, the ninth sequence of inputs including an MS symptom selection input 229 (e.g., touch or spoken) corresponding to the MS symptom interface element 228d (“Tremor”) indicating that the patient 101 felt one or more tremors when they experienced the particular feeling. In some implementations, the MS symptom selection input 229 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient felt tremors when they experienced the particular feeling.
[00123] In some examples, the MS symptom selection input 229 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected MS symptom. In other examples, the MS symptom selection input 229 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected MS symptom.
[00124] In some examples, a journal GUI 230 (FIG. 2M) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the journal GUI 230, according to one example, in response to the patient 101 selecting an MS symptom interface element 228. In some configurations, the MS symptom selection input 229 selecting the MS symptom interface element 228 causes the patient application 103 to automatically display the journal GUI 230. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected MS symptom interface element 228 by selecting an MS Symptoms Selection Done Button 249. In these configurations, the patient application 103 displays the journal GUI 230 in response to a selection indication indicating selection of the MS Symptoms Selection Done Button 249.
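Each GUI in this flow repeats the same confirm-then-advance pattern: the application either advances automatically when a selection input is detected, or waits for the patient to confirm via a Done button. A small decision function sketching that described behavior (names and structure are illustrative, not the application's code):

```python
def next_gui_action(selection_made: bool,
                    auto_advance: bool,
                    done_pressed: bool) -> str:
    """Decide whether the patient application advances to the next GUI.

    Models the two configurations the specification describes:
    - auto_advance=True: the selection input itself triggers the next GUI;
    - auto_advance=False: the application waits for a Done button press.
    """
    if not selection_made:
        return "stay"  # nothing selected yet; remain on the current GUI
    if auto_advance or done_pressed:
        return "advance"
    return "stay"      # selection made, but confirmation still required
```

Centralizing the rule in one function would let every selection GUI (feeling, thought, company, location, symptom) share identical navigation behavior.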
[00125] At FIG. 2M, in some configurations, the patient application 103 causes the patient device 102 to display the journal GUI 230 that allows a patient 101 to view information corresponding to a history of past interactions between the patient 101 and the patient application 103. In the examples shown, the journal GUI 230 provides a timestamp interface element 232 associated with a particular time and date that the patient application recorded the interaction between the patient 101 and the patient application 103, and a plurality of journal interface elements 231, each individual journal interface element being associated with corresponding journal information that the patient 101 may have entered while interacting with the patient application 103. While the example shown depicts journal interface elements 231a-231h, the patient 101 may view additional journal interface elements 231n by scrolling (e.g., via a swipe gesture). The plurality of journal interface elements 231 may be prepopulated based on interactions between the patient 101 and the patient application 103 at the time and day corresponding to the timestamp interface element 232. The patient 101 may view past interactions between the patient 101 and the patient application 103.
In the example shown, at a time and day corresponding to the timestamp interface element 232 (“January 30th 2019, 2:58pm”), a first journal interface element 231a (“Start Feeling”) indicates that the patient 101 first selected the scared feeling and a feeling intensity of 59%, a second journal interface element 231b (“Who I Was With”) indicates that the patient 101 was alone when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a third journal interface element 231c (“Where”) indicates that the patient 101 was at the doctor’s office when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a fourth journal interface element 231d (“MS Symptoms”) indicates that the patient 101 felt fatigue when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a fifth journal interface element 231e (“Automatic Thought”) indicates that the patient 101 had the automatic thought that the patient needs to calm down when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a sixth journal interface element 231f (“Thinking Traps”) indicates that the patient 101 overgeneralized when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a seventh journal interface element 231g (“Alternative Thought”) indicates that the patient 101 chose the alternative thought that the patient 101 is going to get through this eventually when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, and an eighth journal interface element 231h (“End Feeling”) indicates the patient feels 41% less scared at the ending of the interaction between the patient 101 and the patient application 103 at the time and day corresponding to the timestamp interface element 232.
[00126] The journal interface elements 231a-h do not represent an exhaustive list of all journal interface elements, but rather an exemplary list of journal interface elements that may be included on journal GUI 230. Furthermore, journal GUI 230 may include other journal interface elements in addition to journal interface elements 231a-231h, or may omit one or more of journal interface elements 231a-231h.
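For illustration only, the journal entry described above can be modeled as a timestamped record whose labeled fields mirror the journal interface elements. The `JournalEntry` class, its field names, and the mapping of labels to values are assumptions for this sketch, not the patent's actual implementation:

```python
# Hypothetical model of one journal GUI entry; names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class JournalEntry:
    """One timestamped interaction, rendered as labeled journal elements."""
    timestamp: datetime
    elements: dict = field(default_factory=dict)  # element label -> recorded value

# Example entry mirroring the "January 30th 2019, 2:58pm" interaction.
entry = JournalEntry(
    timestamp=datetime(2019, 1, 30, 14, 58),
    elements={
        "Start Feeling": ("Scared", 59),        # feeling and intensity (%)
        "Who I Was With": "Alone",
        "Where": "Doctor's office",
        "MS Symptoms": "Fatigue",
        "Automatic Thought": "I need to calm down",
        "Thinking Traps": "Overgeneralizing",
        "Alternative Thought": "I'm going to get through this eventually",
        "End Feeling": "41% less scared",
    },
)
```

Prepopulating the journal GUI would then amount to iterating over `entry.elements` in order and rendering one interface element per label.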
[00127] At FIG. 2L, in some configurations, the patient application 103 causes the patient device 102 to display a recap interface element 220. This may occur at any point during the interaction between the patient 101 and the patient application 103, but in the example shown, occurs at least after the patient 101 has selected an automatic thought and one or more thinking traps. In the example shown, the recap interface element 220 provides information to the patient 101 corresponding to an automatic thought and a thinking trap selected by the patient 101 while the patient 101 interacted with the patient application 103. The information in the recap interface element 220 does not represent an exhaustive list of all information capable of representation in the recap interface element 220, but rather an example of the type of information that can be presented in the recap interface element 220. Furthermore, the recap interface element 220 may include other information in addition to the information depicted in the example in FIG. 2L, or may omit information depicted in the example in FIG. 2L.
[00128] At FIG. 2N, in some configurations, the patient application 103 causes the patient device 102 to display a positive feeling selection GUI 250 that allows the patient 101 to input a particular feeling they are presently experiencing, or have recently experienced. In the example shown, the positive feeling selection GUI 250 provides a plurality of positive feeling interface elements 251, each element 251a-251n being associated with a corresponding feeling the patient 101 is experiencing or has recently experienced. While the example shown depicts interface elements 251a-251h, the patient 101 may view additional interface elements 251n by scrolling (e.g., via a swipe gesture). The plurality of positive feeling interface elements 251 may be prepopulated based on common feelings a typical patient with MS may be experiencing. The patient 101 may indicate their current feelings by selecting the corresponding positive feeling interface element 251 displayed in the positive feeling selection GUI 250. In the example shown, a first positive feeling interface element 251a (“Calm”) indicates that the patient 101 is feeling calm, a second positive feeling interface element 251b (“Okay”) indicates that the patient 101 is feeling okay, a third positive feeling interface element 251c (“Proud”) indicates that the patient 101 is feeling proud, a fourth positive feeling interface element 251d (“Hopeful”) indicates that the patient 101 is feeling hopeful, a fifth positive feeling interface element 251e (“Happy”) indicates that the patient 101 is feeling happy, a sixth positive feeling interface element 251f (“Optimistic”) indicates that the patient 101 is feeling optimistic, a seventh positive feeling interface element 251g (“Determined”) indicates that the patient 101 is feeling determined, and an eighth positive feeling interface element 251h (“Grateful”) indicates that the patient 101 is feeling grateful.
[00129] The positive feeling interface elements 251a-251h do not represent an exhaustive list of all positive feeling interface elements, but rather an exemplary list of positive feeling interface elements that may be included as part of the positive feeling selection GUI 250. Furthermore, the positive feeling selection GUI 250 may include other positive feeling interface elements in addition to positive feeling interface elements 251a-251h, or may omit one or more of positive feeling interface elements 251a-251h, without departing from the teachings herein. In some implementations, each of the plurality of positive feeling interface elements 251 is categorized as being associated with one of “Negative” feelings (e.g., FIG. 2A) or “Positive” feelings.
[00130] In the example shown, the patient device 102 detects a tenth sequence of inputs, the tenth sequence of inputs including a positive feeling selection input 254 (e.g., touch or spoken) corresponding to the positive feeling interface element 251c (“Proud”) indicating they are feeling proud. In some implementations, the positive feeling selection input 254 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling proud. [00131] In some examples, the positive feeling selection input 254 causes the patient application 103 to generate, for display on the patient device 102, a positive journal interface element of a plurality of journal interface elements 260 (FIG. 2Q), the positive journal interface element indicating the selected feeling. In other examples, the positive feeling selection input 254 causes the patient application 103 to modify the already-generated plurality of positive journal interface elements 260 to indicate the selected feeling.
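As a minimal sketch, the time-stamped event data 122 produced by a selection input could be serialized as a small JSON payload before transmission to the therapy service. The field names and the `make_event` helper are assumptions for illustration, not the format the patent specifies:

```python
# Hypothetical serializer for time-stamped event data 122; all field names
# are assumptions, not the patent's actual wire format.
import json
import time

def make_event(event_type, selection, ts=None):
    """Serialize a selection indication as time-stamped event data."""
    payload = {
        "timestamp": ts if ts is not None else time.time(),
        "event": event_type,
        "selection": selection,
    }
    return json.dumps(payload)

# A "Proud" positive-feeling selection, with a fixed timestamp for clarity.
evt = make_event("positive_feeling_selected", "Proud", ts=1548882000.0)
parsed = json.loads(evt)  # round-trip check of the serialized payload
```

Logging events this way (one timestamped record per input) is what would allow the therapy service to maintain the per-patient history described elsewhere in the disclosure.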
[00132] After detecting selection of a positive feeling interface element 251, in some embodiments, the patient application 103 advances to display a situation selection GUI 255 (FIG. 2O) on the display 116 of the patient device 102. In some configurations, the positive feeling selection input 254 selecting the positive feeling interface element 251 causes the patient application 103 to automatically display the situation selection GUI 255. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected positive feeling interface element 251 by selecting a Positive Feeling Selection Done Button 253 (e.g., as shown in FIG. 2O). In these configurations, the patient application 103 displays the situation selection GUI 255 in response to a selection indication indicating selection of the Positive Feeling Selection Done Button 253.
[00133] At FIG. 2O, in some configurations, the patient application 103 causes the patient device 102 to display the situation selection GUI 255 that allows the patient 101 to input a situation corresponding to what they did. The situation may correspond to an activity the patient 101 did recently. The situation may also correspond to an activity the patient 101 engaged in when the patient 101 felt the selected positive feeling, or when the patient 101 felt a positive feeling. In the example shown, the situation selection GUI 255 provides a plurality of situation interface elements 256, each situation interface element 256a-256n being associated with a corresponding situation that the patient 101 may have recently been involved in, or currently is involved in. While the example shown depicts situation interface elements 256a-256j, the patient 101 may view additional interface elements 256n by scrolling (e.g., via a swipe gesture). The plurality of situation interface elements 256 may be prepopulated based on situations that patients with MS are commonly involved with, or activities that patients with MS commonly partake in. The patient 101 may indicate the situation associated with them by selecting the
corresponding situation interface element 256 displayed in the situation selection GUI 255. In the example shown, a first situation interface element 256a (“Catch it, Check it, Change it”) indicates that the patient 101 engaged in the activity of Catch it, Check it, Change it, a second situation interface element 256b (“Meditated”) indicates that the patient 101 meditated, a third situation interface element 256c (“Spent time with a loved one”) indicates that the patient 101 spent time with a loved one, a fourth situation interface element 256d (“Spent time with a pet”) indicates that the patient 101 spent time with a pet, a fifth situation interface element 256e (“Ate healthy”) indicates that the patient 101 ate healthy, a sixth situation interface element 256f (“I got a good check up at the doctor”) indicates that the patient 101 got a good check up at the doctor, a seventh situation interface element 256g (“I accomplished something”) indicates that the patient 101 accomplished something, an eighth situation interface element 256h (“I just feel good”) indicates that the patient 101 just feels good, a ninth situation interface element 256i (“Exercised”) indicates that the patient 101 exercised, a tenth situation interface element 256j (“Yoga”) indicates that the patient 101 did yoga, and an eleventh situation interface element 256k (“Spiritual Activity”) indicates that the patient 101 engaged in a spiritual activity.
[00134] The situation interface elements 256a-256k do not represent an exhaustive list of all situation interface elements, but rather an exemplary list of situation interface elements that may be included on the situation selection GUI 255. Furthermore, the situation selection GUI 255 may include other situation interface elements in addition to situation interface elements 256a-256k, or may omit one or more situation interface elements 256a-256k.
[00135] In the example shown, the patient device 102 detects an eleventh sequence of inputs, the eleventh sequence of inputs including a situation selection input 257 (e.g., touch or spoken) corresponding to the situation interface element 256e (“Ate healthy”) indicating that the patient 101 ate healthy. In some implementations, the situation selection input 257 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient 101 ate healthy.
[00136] In some examples, the situation selection input 257 causes the patient application 103 to generate, for display on the patient device 102, a positive journal interface element of a plurality of journal interface elements 260 (FIG. 2Q), the positive journal interface element indicating the selected situation. In other examples, the situation selection input 257 causes the patient application 103 to modify the already-generated plurality of positive journal interface elements 260 to indicate the selected situation.
[00137] At FIG. 2P, in some configurations, the patient application 103 causes the patient device 102 to display a positive reflection element 258. This may occur at any point during the interaction between the patient 101 and the patient application 103, but in the example shown, occurs at least after the patient 101 has selected a positive feeling and a situation. In the example shown, the positive reflection element 258 provides information to the patient 101 corresponding to a positive feeling and a situation selected by the patient 101 while the patient 101 interacted with the patient application 103. The information in the positive reflection element 258 does not represent an exhaustive list of all information capable of representation in the positive reflection element 258, but rather an example of the type of information that can be presented in the positive reflection element 258. Furthermore, the positive reflection element 258 may include other information in addition to the information depicted in the example in FIG. 2P, or may omit information depicted in the example in FIG. 2P.
[00138] At FIG. 2Q, in some configurations, the patient application 103 causes the patient device 102 to display a positive journal GUI 259 that allows a patient 101 to view information corresponding to a history of past interactions between the patient 101 and the patient application 103. In the example shown, the positive journal GUI 259 provides a timestamp interface element 261 associated with a particular time and date that the patient application recorded the interaction between the patient 101 and the patient application 103, and the plurality of positive journal interface elements 260, each individual positive journal interface element being associated with corresponding journal information that the patient 101 may have entered while interacting with the patient application 103. While the example shown depicts positive journal interface elements 260a-260e, the patient 101 may view additional positive journal interface elements 260n by scrolling (e.g., via a swipe gesture). The plurality of journal interface elements 260 may be prepopulated based on interactions between the patient 101 and the patient application 103 at the time and day corresponding to the timestamp interface element 261. The patient 101 may view past interactions between the patient 101 and the patient application 103.
In the example shown, at a time and day corresponding to the timestamp interface element 261 (“Monday, Jan 7th, 1:00PM”), a first positive journal interface element 260a (“Feeling”) indicates that the patient 101 felt proud at the time and day corresponding to the timestamp interface element 261, a second positive journal interface element 260b (“Where you were”) indicates that the patient 101 was at their house when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 261, a third positive journal interface element 260c (“Who you were with”) indicates that the patient 101 was alone when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 261, a fourth positive journal interface element 260d (“Ate Healthy”) indicates that the patient 101 ate healthy when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 261, and a fifth positive journal interface element 260e (“Positive Reflection”) indicates that the patient 101 felt proud of themselves for taking control and following through with a healthier diet and believed that it really helped with their symptoms when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 261.
[00139] The positive journal interface elements 260a-e do not represent an exhaustive list of all journal interface elements, but rather an exemplary list of positive journal interface elements that may be included on the positive journal GUI 259. Furthermore, the positive journal GUI 259 may include other positive journal interface elements in addition to positive journal interface elements 260a-260e, or may omit one or more of positive journal interface elements 260a-260e. [00140] At FIG. 2R, in some configurations, the patient application 103 causes the patient device 102 to display a relax-and-remind GUI 262 that provides a mindfulness interface element 264 and a fatigue interface element 266. This may occur at any point during the interaction between the patient 101 and the patient application 103. In some examples, the patient device 102 displays the relax-and-remind GUI 262 at least after the patient 101 has selected a relax-and-remind selection input.
[00141] At FIG. 2S, upon the patient application 103 detecting a mindfulness selection input 265 selecting the mindfulness interface element 264, the patient application 103 is configured to display a mindfulness GUI 268 that provides a plurality of mindfulness technique interface elements 270a-f, each mindfulness technique interface element 270 being associated with a particular mindfulness technique. In some implementations, the mindfulness techniques correspond to current thoughts or emotions experienced by the patient 101. For example, the mindfulness techniques may correspond to stress relief, feeling stressed, resolving shame, going through shame, less lonely now, lingering loneliness, clearing depression, lingering depression, resolving grief, still grieving, surrender frustration, feeling frustrated, goodbye anger, anger persists, less anxious, more anxious, letting go of panic, panic stricken, etc.
[00142] The mindfulness technique interface elements 270a-f do not represent an exhaustive list of all mindfulness technique interface elements, but rather an exemplary list that may be included on the mindfulness GUI 268. Furthermore, the mindfulness GUI 268 may include other mindfulness technique interface elements in addition to the mindfulness technique interface elements 270a-f, or may omit one or more of the mindfulness technique interface elements 270a-f.
[00143] At FIG. 2T, upon the patient application 103 detecting a mindfulness technique selection input 271 selecting one of the mindfulness technique interface elements 270, e.g., the mindfulness technique interface element 270d corresponding to “Feeling Frustrated,” the patient application 103 is configured to display a mindfulness technique data GUI 272 that provides the data corresponding to the selected mindfulness technique. The plurality of mindfulness techniques may include audio data, video data, audio/video data, interactive data, etc. The mindfulness technique data GUI 272 may provide other interface elements, such as a play/pause button, an “I’m Done” button, etc. While FIG. 2T illustrates a single audio and/or video display, it should be understood that multiple selectable presentations may be presented. In some implementations, the mindfulness technique selection input 271 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient 101 has or had feelings of being frustrated. As noted previously, by sending the time-stamped event data 122 to the multiple sclerosis therapy service 160, a log of the patient’s inputs into the interface can be maintained, for example for diagnostic or research purposes, or to allow tracking of the progress of the digital therapy.
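One way to picture the step from a technique selection input to the mindfulness technique data GUI is a lookup from technique name to its media presentation. The catalog below and the `media_for` helper are hypothetical stand-ins, assuming each technique maps to a single audio or video asset:

```python
# Hypothetical catalog mapping mindfulness techniques to media presentations.
# Technique names are drawn from the examples above; asset names are invented.
MINDFULNESS_MEDIA = {
    "Feeling Stressed": {"kind": "audio", "asset": "stress_relief.mp3"},
    "Feeling Frustrated": {"kind": "video", "asset": "surrender_frustration.mp4"},
    "Letting Go of Panic": {"kind": "audio", "asset": "letting_go_of_panic.mp3"},
}

def media_for(technique):
    """Return the presentation data for a selected technique, if any."""
    return MINDFULNESS_MEDIA.get(technique, {"kind": "none", "asset": None})

# Selecting the "Feeling Frustrated" interface element resolves its media.
selected = media_for("Feeling Frustrated")
```

The fatigue type data GUI described below could use the same pattern with a fatigue-type catalog in place of the mindfulness one.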
[00144] At FIG. 2U, upon the patient application 103 detecting a fatigue selection input 267 selecting the fatigue interface element 266 (FIG. 2R), the patient application 103 is configured to display a fatigue GUI 274 providing a plurality of fatigue type interface elements 276a-h, each fatigue type interface element 276 being associated with a particular fatigue type that may be experienced by patients suffering from multiple sclerosis. The plurality of fatigue types may correspond to lassitude, diet, sleep, environment, cognitive, emotional, overstimulation, inactivity, heat, etc.
[00145] The fatigue type interface elements 276a-h do not represent an exhaustive list of all fatigue type interface elements, but rather an exemplary list that may be included on the fatigue GUI 274. Furthermore, the fatigue GUI 274 may include other fatigue type interface elements in addition to the fatigue type interface elements 276a-h, or may omit one or more of the fatigue type interface elements 276a-h.
[00146] At FIG. 2V, upon the patient application 103 detecting a fatigue type selection input 277 selecting one of the fatigue type interface elements 276, e.g., the fatigue type interface element 276c corresponding to “Sleep,” the patient application 103 is configured to display a fatigue type data GUI 278 that provides the data corresponding to the selected fatigue type. In some examples, the data corresponding to the selected fatigue type includes a plurality of presentations 280a-c. The plurality of fatigue types may include audio data, video data, audio/video data, interactive data, etc. The fatigue type data GUI 278 may provide other interface elements, such as a play/pause button, an “I’m Done” button, etc. In some implementations, the fatigue type selection input 277 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating the fatigue type the patient 101 is or was experiencing. As noted previously, by sending the time-stamped event data 122 to the multiple sclerosis therapy service 160, a log of the patient’s inputs into the interface can be maintained, for example for diagnostic or research purposes, or to allow tracking of the progress of the digital therapy.
[00147] In implementations where the mindfulness techniques provide audio and/or video data and the fatigue types provide audio and/or video data, the audio and/or video data may be presented by a patient suffering from multiple sclerosis to provide a sense of community and empathy that may not be conveyed through the use of, e.g., a paid actor.
[00148] FIG. 3 is a flow chart illustrating a method 300 for treating depressive symptoms associated with multiple sclerosis in accordance with an example
implementation of the disclosed technology. According to one example, the method 300 may be performed by an electronic device, such as the patient device 102. The method 300 begins at block 302 where a feeling selection interface (e.g., the feeling selection GUI 204) is displayed. The feeling selection interface presents a plurality of feeling interface elements (e.g., the plurality of feeling interface elements 205), each feeling interface element being associated with a particular feeling. At block 304, a first sequence of inputs including a feeling selection input (e.g., the feeling selection input 206) is received. The feeling selection input corresponds to a particular feeling interface element (e.g., the second feeling interface element 205b). At block 306, the electronic device displays a feeling spectrum interface (e.g., the feeling spectrum GUI 207). The feeling spectrum interface presents a plurality of intensities (e.g., the plurality of intensities 208) associated with the particular feeling.
[00149] At block 308, the electronic device receives a second sequence of inputs including a first feeling intensity input (e.g., the first feeling intensity input 209). The first feeling intensity input corresponds to a first intensity (e.g., the third intensity 208c) of the plurality of intensities. At block 310, the electronic device displays an automatic thought selection interface (e.g., the automatic thought selection GUI 210). The automatic thought selection interface presents a plurality of automatic thought interface elements (e.g., the plurality of automatic thought interface elements 211). Each automatic thought interface element is associated with a particular automatic thought. At block 312, the electronic device receives a third sequence of inputs including an automatic thought selection input (e.g., the automatic thought selection input 212). The automatic thought selection input corresponds to a particular automatic thought interface element. At block 314, the electronic device displays an alternative thought selection interface (e.g., the alternative thought selection GUI 213). The alternative thought selection interface presents a plurality of alternative thought interface elements (e.g., the plurality of alternative thought interface elements 214). Each alternative thought interface element is associated with a particular alternative thought.
[00150] At block 316, the electronic device receives a fourth sequence of inputs including an alternative thought selection input (e.g., the alternative thought selection input 215). The alternative thought selection input corresponds to a particular alternative thought interface element. At block 318, the electronic device displays the feeling spectrum interface. At block 320, the electronic device receives a fifth sequence of inputs including a second feeling intensity input (e.g., the second feeling intensity input 216). The second feeling intensity input corresponds to a second intensity (e.g., the fifth intensity 208e) of the plurality of intensities. At block 322, the electronic device generates a journal entry (e.g., the eighth journal interface element 231h). The journal entry indicates at least any difference between the first feeling intensity input and the second feeling intensity input. Following block 322, the method 300 concludes.
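The final step of method 300 (block 322), generating a journal entry from the difference between the two intensity inputs, can be sketched as a small function. The function name and the summary wording are assumptions; the only behavior taken from the disclosure is that the entry reflects any difference between the first and second intensity:

```python
# Hypothetical sketch of block 322: summarize the change between the first
# and second feeling-intensity inputs for the "End Feeling" journal element.
def journal_end_feeling(feeling, first_pct, second_pct):
    """Describe the intensity change between the two inputs, if any."""
    diff = first_pct - second_pct
    if diff > 0:
        return f"{diff}% less {feeling.lower()}"
    if diff < 0:
        return f"{-diff}% more {feeling.lower()}"
    return f"feeling {feeling.lower()} unchanged"

# E.g., a start intensity of 59% and an end intensity of 18%.
summary = journal_end_feeling("Scared", 59, 18)
```

With these example inputs the function reports a 41% reduction, matching the "41% less scared" entry shown in the journal GUI example.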
[00151] FIG. 4 is a flow chart illustrating another method 400 for treating depressive symptoms associated with multiple sclerosis in accordance with an example
implementation of the disclosed technology. According to one example, the method 400 may be performed by an electronic device, such as the patient device 102. The method 400 begins at block 402 where the electronic device receives feeling assessment data describing a feeling associated with a user (e.g., as shown in FIG. 2A). At block 404, the electronic device receives first feeling intensity data describing a first intensity of the feeling associated with the user (e.g., as shown in FIG. 2B). [00152] At block 406, the electronic device identifies a plurality of potential automatic thoughts based on the feeling associated with the user (e.g., as shown in FIG. 2C). Each potential automatic thought of the plurality of potential automatic thoughts corresponds to a negative thought. At block 408, the electronic device receives automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts (e.g., as shown in FIG. 2C).
[00153] At block 410, the electronic device identifies a plurality of potential alternative thoughts based on the automatic thought selection data (e.g., as shown in FIG. 2D). Each potential alternative thought of the plurality of potential alternative thoughts corresponds to a positive thought. At block 412, the electronic device receives alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts (e.g., as shown in FIG. 2D).
[00154] At block 414, the electronic device receives second feeling intensity data describing a second intensity of the feeling associated with the user (e.g., as shown in FIG. 2E). At block 416, the electronic device determines any difference between the first intensity and the second intensity to provide feeling intensity difference data. At block 418, the electronic device displays the feeling intensity difference data (e.g., as shown in FIG. 2M). Following block 418, the method 400 concludes.
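Blocks 406 and 410 of method 400 can be pictured as two lookups: candidate automatic thoughts keyed by the selected feeling, then candidate alternative thoughts keyed by the selected automatic thought. The lookup tables below are hypothetical stand-ins for whatever database the application might consult; the thought strings echo the examples used earlier in this disclosure:

```python
# Hypothetical lookup tables for blocks 406 and 410 of method 400.
AUTOMATIC_THOUGHTS = {
    "Scared": ["I need to calm down", "Something is wrong with me"],
    "Anxious": ["I can't handle this", "It will never get better"],
}

ALTERNATIVE_THOUGHTS = {
    "I need to calm down": ["I'm going to get through this eventually"],
    "I can't handle this": ["I have handled hard things before"],
}

def potential_automatic_thoughts(feeling):
    """Block 406: candidate automatic thoughts for the user's feeling."""
    return AUTOMATIC_THOUGHTS.get(feeling, [])

def potential_alternative_thoughts(automatic_thought):
    """Block 410: candidate alternative thoughts for the selected thought."""
    return ALTERNATIVE_THOUGHTS.get(automatic_thought, [])

candidates = potential_automatic_thoughts("Scared")
alternatives = potential_alternative_thoughts(candidates[0])
```

Blocks 408 and 412 would then record which candidate the user selected, and blocks 414-418 compute and display the intensity difference as described above.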
[00155] FIG. 5 is a schematic view of an example electronic device 500 (e.g., a computing device) that may be used to implement the systems and methods described in this document. The electronic device 500 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.
[00156] The electronic device 500 includes a processor 510, memory 520, a storage device 530, a high-speed interface/controller 540 connecting to the memory 520 and high-speed expansion ports 550, and a low speed interface/controller 560 connecting to a low speed bus 570 and the storage device 530. Each of the components 510, 520, 530, 540, 550, and 560, is interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 510 can process instructions for execution within the electronic device 500, including instructions stored in the memory 520 or on the storage device 530 to display graphical information for a graphical user interface (GUI) on an external input/output device, such as display 580 coupled to high speed interface 540. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple electronic devices 500 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
[00157] The memory 520 stores information non-transitorily within the electronic device 500. The memory 520 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s). The non-transitory memory 520 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the electronic device 500. Examples of non-volatile memory include, but are not limited to, flash memory and read-only memory (ROM) / programmable read-only memory (PROM) / erasable programmable read-only memory (EPROM) / electronically erasable programmable read only memory (EEPROM) (e.g., typically used for firmware, such as boot programs). Examples of volatile memory include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM) as well as disks or tapes.
[00158] The storage device 530 is capable of providing mass storage for the electronic device 500. In some implementations, the storage device 530 is a computer-readable medium. In various different implementations, the storage device 530 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. In additional implementations, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 520, the storage device 530, or memory on processor 510.
[00159] The high speed controller 540 manages bandwidth-intensive operations for the electronic device 500, while the low speed controller 560 manages lower bandwidth-intensive operations. Such allocation of duties is exemplary only. In some
implementations, the high-speed controller 540 is coupled to the memory 520, the display 580 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 550, which may accept various expansion cards (not shown).
[00160] The electronic device 500 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 500a or multiple times in a group of such servers 500a, as a laptop computer 500b, or as part of a rack server system 500c.
[00161] Referring now to FIG. 6, one example of a digital therapeutic 600 for treating depressive symptoms associated with multiple sclerosis is illustrated, in functional block form. As shown, the digital therapeutic 600 includes a feeling assessment module 604, an automatic thought identification module 606, an alternative thought identification module 614, a feeling intensity module 622, a thinking traps module 634, a company module 644, a location module 648, a multiple sclerosis symptom module 652, a journal module 654, and a display module 630. According to one example, the digital therapeutic 600 may be implemented as a computer program executed on an electronic device, such as device 102. According to this example, executing the computer program on the electronic device may serve to administer therapeutic treatment to a user of the electronic device in a manner designed to mitigate, or alleviate, depressive symptoms associated with multiple sclerosis.
[00162] In operation, the digital therapeutic 600 may function as follows. The feeling assessment module 604 is configured to receive feeling assessment data 602 (e.g., input 206; block 304). The feeling assessment data 602 may constitute data describing a feeling associated with a user (e.g., anxious, scared, dreadful, etc.). According to one example, the feeling assessment data 602 may be provided to the feeling assessment module 604 via user input as discussed, for example, with regard to FIG. 2A above. [00163] The automatic thought identification module 606 is configured to receive the feeling assessment data 602 from the feeling assessment module 604. In addition, the automatic thought identification module 606 is configured to identify a plurality of potential automatic thoughts 608 based on the feeling assessment data 602. By way of example and not limitation, the plurality of potential automatic thoughts 608 may be identified from within a database or the like (not shown) storing a variety of automatic thoughts. Each potential automatic thought of the plurality of potential automatic thoughts 608 may correspond to a negative thought (although, according to some examples, one or more potential automatic thoughts may correspond to a positive thought). Further, the automatic thought identification module 606 is configured to receive automatic thought selection data 612 (e.g., input 212; block 312). The automatic thought selection data 612 may identify a particular potential automatic thought 610 from among the plurality of potential automatic thoughts 608. According to one example, the automatic thought selection data 612 may be provided to the automatic thought identification module 606 via user input as discussed, for example, with regard to FIG.
2C above.
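The lookup-and-select flow of paragraphs [00162]–[00163] can be sketched as follows. This is an illustrative sketch only: the feeling-to-thought mapping, the function names, and the example thought strings are all hypothetical, since the disclosure does not specify a concrete database schema for the automatic thought store.

```python
# Hypothetical in-memory stand-in for the automatic-thought database
# (the patent references a database "or the like (not shown)").
AUTOMATIC_THOUGHTS = {
    "anxious": [
        "Something bad is going to happen.",
        "I can't handle this.",
    ],
    "scared": [
        "I'm going to lose control.",
        "No one can help me.",
    ],
}

def identify_potential_automatic_thoughts(feeling: str) -> list[str]:
    """Identify candidate automatic thoughts based on feeling assessment data."""
    return AUTOMATIC_THOUGHTS.get(feeling, [])

def select_automatic_thought(candidates: list[str], index: int) -> str:
    """Resolve a user's selection input to a particular potential thought."""
    return candidates[index]

# A feeling reported via the feeling assessment module drives the lookup,
# and a subsequent selection input picks one candidate.
thoughts = identify_potential_automatic_thoughts("anxious")
chosen = select_automatic_thought(thoughts, 0)
```

A production implementation would presumably back this with persistent storage and validated selection inputs; the sketch only shows the identify-then-select data flow.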
[00164] The alternative thought identification module 614 is configured to receive the automatic thought selection data 612. In addition, the alternative thought identification module 614 is configured to identify a plurality of potential alternative thoughts 616 based on the automatic thought selection data 612. By way of example and not limitation, the plurality of potential alternative thoughts 616 may be identified from within a database or the like (not shown) storing a variety of alternative thoughts. Each potential alternative thought of the plurality of potential alternative thoughts 616 may correspond to a positive thought. Further, the alternative thought identification module 614 is configured to receive alternative thought selection data 620 (e.g., input 215; block 316). The alternative thought selection data 620 may identify a particular potential alternative thought 618 from among the plurality of potential alternative thoughts 616. According to one example, the alternative thought selection data 620 may be provided to the alternative thought identification module 614 via user input as discussed, for example, with regard to FIG. 2D above. [00165] The feeling intensity module 622 is configured to receive first feeling intensity data 624 and second feeling intensity data 626 (e.g., input 209 and input 216; block 308 and block 320). The first feeling intensity data 624 may describe a first intensity of the feeling associated with the user (e.g., as indicated by the feeling assessment data 602) being treated via the digital therapeutic 600 at a first point in time. The second feeling intensity data 626 may describe a second intensity of the feeling associated with the user at a second point in time. According to one example, the second point in time is later than the first point in time. 
According to one example, the first feeling intensity data 624 may be provided to the feeling intensity module 622 via user input as discussed, for example, with regard to FIG. 2B above. Similarly, in one example, the second feeling intensity data 626 may be provided to the feeling intensity module 622 via user input as discussed, for example, with regard to FIG. 2E above.
[00166] In response to receiving the first feeling intensity data 624 and the second feeling intensity data 626, the feeling intensity module 622 is configured to generate feeling intensity difference data 628 (e.g., interface element 231h of FIG. 2M; block 322). The feeling intensity difference data 628 may indicate any difference (including, in some examples, no difference) between the first feeling intensity data 624 and the second feeling intensity data 626. For example, and as discussed with respect to element 231h of FIG. 2M above, the feeling intensity difference data 628 may indicate a change (e.g., a drop) in the intensity of a particular feeling experienced by the user receiving treatment via the digital therapeutic 600.
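A minimal sketch of the difference computation described in paragraph [00166] follows. The numeric intensity scale and the returned field names are assumptions; the disclosure states only that the difference data may indicate a change (including no change) between the two intensities.

```python
def feeling_intensity_difference(first: int, second: int) -> dict:
    """Compare two feeling intensity ratings taken at different points in time.

    `first` is the earlier rating, `second` the later one; the scale
    (e.g., 0-10) is assumed, not specified by the disclosure.
    """
    delta = second - first
    if delta < 0:
        direction = "drop"
    elif delta > 0:
        direction = "rise"
    else:
        direction = "no change"
    return {"first": first, "second": second, "delta": delta, "direction": direction}

# A later rating of 5 against an earlier rating of 8 indicates a drop.
result = feeling_intensity_difference(8, 5)
```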
[00167] The thinking traps module 634 is configured to receive the automatic thought selection data 612. In addition, the thinking traps module 634 is configured to identify a plurality of potential thinking traps 636 based on the feeling assessment data 602. By way of example and not limitation, the plurality of potential thinking traps 636 may be identified from within a database or the like (not shown) storing a variety of thinking traps. Each potential thinking trap of the plurality of potential thinking traps 636 may correspond to a negative emotional tendency, such as overgeneralizing, catastrophizing, etc. Further, the thinking traps module 634 is configured to receive thinking trap selection data 640 (e.g., input 219). The thinking trap selection data 640 may identify a particular potential thinking trap 638 from among the plurality of potential thinking traps 636. According to one example, the thinking trap selection data 640 may be provided to the thinking traps module 634 via user input as discussed, for example, with regard to FIGS. 2F-2H above.
[00168] The company module 644 is configured to receive company selection data 642 (e.g., input 223). The company selection data 642 may identify, by relationship type (e.g., partner, children, sibling, parent, friend, co-worker, etc.), a person who
accompanied a user of the digital therapeutic 600 at a time in which the user experienced the feeling described by the feeling assessment data 602, or whether the user was alone when they experienced the feeling described by the feeling assessment data 602. As discussed in additional detail below, in some examples, the company selection data 642 may be provided to the journal module 654 for use in generating a journal entry 656.
[00169] The location module 648 is configured to receive location selection data 646 (e.g., input 226). The location selection data 646 may identify a location (e.g., home, doctor, work, commute, store, etc.) of the user at the time in which the user experienced the feeling described by the feeling assessment data 602. As discussed in additional detail below, in some examples, the location selection data 646 may be provided to the journal module 654 for use in generating a journal entry 656.
[00170] The multiple sclerosis symptom module 652 is configured to receive multiple sclerosis symptom selection data 650 (e.g., input 229). The multiple sclerosis symptom selection data 650 may identify one or more multiple sclerosis symptoms (e.g., relapse, fatigue, brain fog, tremor, focus, memory, balance problems, vision problems, etc.) associated with the user. As discussed in additional detail below, in some examples, the multiple sclerosis symptom selection data 650 may be provided to the journal module 654 for use in generating a journal entry 656.
[00171] The journal module 654 is configured to receive the company selection data 642, the location selection data 646, the multiple sclerosis symptom selection data 650, the particular potential thinking trap 638, the feeling intensity difference data 628, the particular potential automatic thought 610, and the particular potential alternative thought 618. In response to receiving one or more of the foregoing types of data, the journal module 654 is configured to generate a journal entry 656 including some or all of the foregoing types of data. One example of a generated journal entry 656 is shown with regard to FIG. 2M and discussed above.
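The aggregation behavior of the journal module in paragraph [00171] — accepting some or all of the collected inputs — can be sketched with optional fields. The field names and types are hypothetical; the disclosure does not define a journal entry record format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class JournalEntry:
    """Illustrative journal entry; every field is optional because the
    module generates an entry from whichever inputs were received."""
    automatic_thought: Optional[str] = None
    alternative_thought: Optional[str] = None
    thinking_trap: Optional[str] = None
    intensity_difference: Optional[int] = None
    company: Optional[str] = None
    location: Optional[str] = None
    ms_symptoms: Optional[list] = None

def generate_journal_entry(**fields) -> JournalEntry:
    """Build an entry from the subset of inputs that were provided."""
    return JournalEntry(**fields)

# Only some inputs were received; the rest remain unset.
entry = generate_journal_entry(
    automatic_thought="I can't handle this.",
    thinking_trap="catastrophizing",
    intensity_difference=-3,
    company="friend",
    location="home",
)
```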
[00172] The display module 630 is configured to receive the generated journal entry 656 and generate display data 632 representing the generated journal entry 656. For example, according to one embodiment, the display module 630 is configured to generate display data 632 representing a generated journal entry 656 that includes all of the following types of data: company selection data 642, location selection data 646, multiple sclerosis symptom selection data 650, particular potential thinking trap 638, feeling intensity difference data 628, particular potential automatic thought 610, and particular potential alternative thought 618, as shown, for example, in FIG. 2M. According to another embodiment, the display module 630 is configured to generate display data 632 representing a generated journal entry 656 that includes some, but not all, of the foregoing types of data. Regardless, the generated display data 632 may take the form of pixel data or the like capable of generating an image on a suitable display device, such as display 116 discussed above with regard to FIG. 1.
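The display module of paragraph [00172] is described as producing pixel data for a display device. As a hedged approximation, the sketch below renders an entry's populated fields as display text rather than pixels, just to make the entry-to-display data flow concrete; the formatting choices are assumptions, not the disclosed implementation.

```python
def render_journal_entry(entry: dict) -> str:
    """Render only the populated fields of a journal entry as display lines."""
    lines = []
    for field, value in entry.items():
        if value is not None:
            # e.g., "thinking_trap" -> "Thinking Trap"
            lines.append(f"{field.replace('_', ' ').title()}: {value}")
    return "\n".join(lines)

# Fields that were never received (here, "company") are simply omitted.
display = render_journal_entry({
    "location": "home",
    "company": None,
    "thinking_trap": "catastrophizing",
})
```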
[00173] Among other advantages, the present disclosure provides electronic devices and methods for implementing a prescription digital therapeutic configured to treat depressive symptoms associated with MS. The digital therapeutic may administer cognitive behavioral therapy (CBT) to treat the depressive symptoms. More specifically, the digital therapeutic may implement both cognitive therapy as well as behavioral activation as part of the administered CBT. Administration of CBT via the digital therapeutics described herein may serve to correct distorted cognitions that can cause patients to have a negative view of themselves, the world, and the future.
[00174] The present disclosure also provides a digital therapeutic that includes a plurality of GUIs to help a user/patient understand situations, symptoms, and automatic thoughts related to their negative feelings; check their thoughts against a set of common cognitive distortions or “thinking traps”; and identify alternative thoughts that are more helpful and realistic. The patient/user may be provided with examples of automatic and alternative thoughts that were obtained from a large sample of people with MS. [00175] The present disclosure also provides a digital therapeutic to help patients/users focus on developing skills to cope with MS symptoms, such as brain fog and fatigue, related to depression. The digital therapeutic of the present disclosure provides 24/7 access to support and resources for treating depressive symptoms associated with MS.
[00176] The present disclosure also provides a digital therapeutic to reduce depressive symptoms associated with multiple sclerosis according to clinical measurements. For example, the digital therapeutic described herein improves patient condition according to one or more of the following clinical measurements: MADRS, BDI-II, and PHQ-9. For example, the digital therapeutic described herein creates physiological changes in patients.
[00177] Certain implementations of the disclosed technology are described above with reference to block and flow diagrams of systems and methods and/or computer program products according to example implementations of the disclosed technology. It will be understood that one or more blocks of the block diagrams and flow diagrams, and combinations of blocks in the block diagrams and flow diagrams, respectively, can be implemented by computer-executable program instructions. Likewise, some blocks of the block diagrams and flow diagrams may not necessarily need to be performed in the order presented, may be repeated, or may not necessarily need to be performed at all, according to some implementations of the disclosed technology.
[00178] The terminology used herein is for the purpose of describing particular exemplary configurations only and is not intended to be limiting. As used herein, the singular articles “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms “comprises,” “comprising,” “including,” and “having,” are inclusive and therefore specify the presence of features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. Additional or alternative steps may be employed. [00179] Although the following description uses the terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touch.
[00180] Various implementations of the electronic devices, systems, techniques, and modules described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage resource, at least one input device, and at least one output device.
[00181] These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, non-transitory computer readable medium, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
[00182] The processes and logic flows described in this specification can be performed by one or more programmable processors, also referred to as data processing hardware, executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[00183] To provide for interaction with a user, one or more aspects of the disclosure can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, or touch screen for displaying information to the user and optionally a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser. [00184] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims, including the following implementations, expressed as interrelated items:
Item 1. An electronic device for displaying feeling intensity inputs, the electronic device comprising:
a display;
an input device;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
generating, for display on the display, a journal entry, the journal entry indicating at least any difference between a first feeling intensity input and a second feeling intensity input.
Item 2. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for:
displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling; and
while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element.
Item 3. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for:
displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling; and while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities.
Item 4. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for:
displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought; and
while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element.
Item 5. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for:
displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought; and
while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element.
Item 6. The electronic device as in any of the preceding items, wherein the journal entry is modified to further indicate one or more particular thinking trap interface elements. Item 7. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for:
in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap elements.
Item 8. The electronic device as in any of the preceding items, wherein the journal entry is modified to further indicate the particular alternative thought interface element.
Item 9. A computerized method for displaying feeling intensity inputs, the method comprising:
at an electronic device including a display and an input device:
receiving data corresponding to a first feeling intensity input; receiving data corresponding to a second feeling intensity input;
generating, for display on the display, a journal entry, the journal entry indicating at least any difference between the data corresponding to the first feeling intensity input and the data corresponding to the second feeling intensity input.
Item 10. The computerized method of Item 9, wherein the method further comprises: at the electronic device including a display and an input device:
displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling; and
while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element. Item 11. The computerized method as in any one of Items 9 and 10, wherein the method further comprises:
at the electronic device including a display and an input device:
displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling; and
while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities.
Item 12. The computerized method as in any one of Items 9, 10, and 11, wherein the method further comprises:
at the electronic device including a display and an input device:
displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought; and
while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element.
Item 13. The computerized method as in any one of Items 9, 10, 11, and 12, wherein the method further comprises:
at the electronic device including a display and an input device:
displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought; while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element.
Item 14. The computerized method as in any one of Items 9, 10, 11, 12, and 13, wherein the method further comprises:
at the electronic device including a display and an input device:
displaying, on the display, the feeling spectrum interface; and while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input, the second feeling intensity input corresponding to a second intensity of the plurality of intensities.
Item 15. The computerized method as in any one of Items 9, 10, 11, 12, 13, and 14, wherein the journal entry is modified to further indicate one or more particular thinking trap interface elements.
Item 16. The computerized method as in any one of Items 9, 10, 11, 12, 13, 14, and 15, wherein the method further comprises:
at the electronic device including a display and an input device:
displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap elements.
Item 17. The computerized method as in any one of Items 9, 10, 11, 12, 13, 14, 15, and 16, wherein the journal entry is modified to further indicate the particular alternative thought interface element. Item 18. A computerized method for displaying feeling intensity inputs, the method comprising:
at an electronic device including a display and an input device:
determining any difference between a first intensity and a second intensity to provide feeling intensity difference data; and
displaying, on the display, the feeling intensity difference data.
Item 19. The computerized method of Item 18, the method further comprising:
receiving, via the input device, feeling assessment data, the feeling assessment data describing a feeling associated with a user; and
receiving, via the input device, first feeling intensity data, the first feeling intensity data describing a first intensity of the feeling associated with the user.
Item 20. The computerized method as in any one of Items 18 and 19, the method further comprising:
identifying a plurality of potential automatic thoughts based on the feeling associated with the user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought; and
receiving, via the input device, automatic thought selection data, the automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts.
Item 21. The computerized method as in any one of Items 18, 19, and 20, the method further comprising:
identifying a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought; and
receiving, via the input device, alternative thought selection data, the alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts. Item 22. An electronic device, comprising:
a display;
an input device;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
determining any difference between a first intensity and a second intensity to provide feeling intensity difference data; and
displaying, on the display, the feeling intensity difference data.
Item 23. The electronic device of Item 22, wherein the one or more programs also include instructions for:
receiving, via the input device, feeling assessment data, the feeling assessment data describing a feeling associated with a user; and
receiving, via the input device, first feeling intensity data, the first feeling intensity data describing a first intensity of the feeling associated with the user.
Item 24. The electronic device as in any one of Items 22 and 23, wherein the one or more programs also include instructions for:
identifying a plurality of potential automatic thoughts based on the feeling associated with the user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought; and
receiving, via the input device, automatic thought selection data, the automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts.
Item 25. The electronic device as in any one of Items 22, 23, and 24, wherein the one or more programs also include instructions for: identifying a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought; and
receiving, via the input device, alternative thought selection data, the alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts.
Item 26. A digital therapeutic for treating depressive symptoms associated with multiple sclerosis, the digital therapeutic comprising:
a display module, the display module configured to generate display data representing feeling intensity difference data.
Item 27. The digital therapeutic of Item 26 further comprising:
an automatic thought identification module, the automatic thought identification module configured to (i) identify a plurality of potential automatic thoughts based on feeling assessment data describing a feeling associated with a user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought and (ii) receive automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts.
Item 28. The digital therapeutic as in any one of Items 26 and 27, further comprising:
an alternative thought identification module, the alternative thought identification module configured to (i) identify a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought and (ii) receive alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts.
Item 29. The digital therapeutic as in any one of Items 26, 27, and 28, further comprising:
a feeling intensity module, the feeling intensity module configured to (i) receive first feeling intensity data describing a first intensity of the feeling associated with the user at a first point in time; (ii) receive second feeling intensity data describing a second intensity of the feeling associated with the user at a second point in time, the second point in time being later than the first point in time; and (iii) generate feeling intensity difference data, the feeling intensity difference data indicating any difference between the first intensity and the second intensity.
Item 30. The digital therapeutic as in any one of Items 26, 27, 28, and 29, further comprising:
a feeling assessment module, the feeling assessment module configured to receive the feeling assessment data describing the feeling associated with the user.
Item 31. The digital therapeutic as in any one of Items 26, 27, 28, 29, and 30, further comprising:
a thinking traps module, the thinking traps module configured to (i) identify a plurality of potential thinking traps based on the feeling assessment data and (ii) receive thinking trap selection data identifying one or more particular potential thinking traps from among the plurality of potential thinking traps.
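The patent discloses no source code, but the feeling intensity module of Item 29 can be sketched as a small class that takes a before-and-after reading and reports the difference. All names here are illustrative assumptions, not from the specification:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the feeling intensity module of Item 29: it receives
# a first intensity reading, a later second reading, and generates the
# difference between them. Field and method names are illustrative only.
@dataclass
class FeelingIntensityModule:
    first_intensity: Optional[int] = None
    second_intensity: Optional[int] = None

    def receive_first(self, intensity: int) -> None:
        # First reading, taken at the earlier point in time.
        self.first_intensity = intensity

    def receive_second(self, intensity: int) -> None:
        # Second reading, taken at the later point in time.
        self.second_intensity = intensity

    def difference(self) -> int:
        # A positive value means the feeling's intensity decreased
        # between the two readings.
        if self.first_intensity is None or self.second_intensity is None:
            raise ValueError("both intensity readings are required")
        return self.first_intensity - self.second_intensity

module = FeelingIntensityModule()
module.receive_first(8)   # e.g. a feeling rated 8 of 10 at the start
module.receive_second(5)  # rated 5 of 10 at the later point in time
print(module.difference())  # → 3
```

The display module of Item 26 would then render this difference value, however the deployed application chooses to present it.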

Claims (23)

CLAIMS
What is claimed is:
1. An electronic device for treating depressive symptoms associated with multiple sclerosis, the electronic device comprising:
a display;
an input device;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling;
while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element;
in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling;
while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities;
in response to receiving the first feeling intensity input, displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought;
while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element;
in response to receiving the automatic thought selection input, displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought;
while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element;
in response to receiving the alternative thought selection input, displaying, on the display, the feeling spectrum interface;
while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input, the second feeling intensity input corresponding to a second intensity of the plurality of intensities;
generating, for display on the display, a journal entry, the journal entry indicating at least any difference between the first feeling intensity input and the second feeling intensity input;
in response to receiving the automatic thought selection input, displaying, on the display, a thinking traps interface, the thinking traps interface presenting a plurality of thinking trap interface elements associated with the particular automatic thought interface element, each thinking trap interface element being associated with a particular thinking trap;
while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs, the sixth sequence of inputs including one or more thinking trap selection inputs, the one or more thinking trap selection inputs corresponding to one or more particular thinking trap interface elements; and
wherein the journal entry is modified to further indicate the one or more particular thinking trap interface elements.
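Claim 1 recites a fixed screen sequence: feeling selection, intensity rating, automatic thought, thinking traps, alternative thought, then a second intensity rating, with a journal entry recording the change. The sketch below is a hypothetical condensation of that flow; the dictionary keys and function name are assumptions for illustration, not terms from the claim:

```python
# Illustrative sketch (not from the patent) of the session flow of claim 1.
# Each key corresponds to the selection made on one interface in the
# recited order; the journal entry indicates at least the difference
# between the first and second feeling intensity inputs.
def run_session(selections):
    entry = {
        "feeling": selections["feeling"],                      # feeling selection interface
        "first_intensity": selections["first_intensity"],      # feeling spectrum interface
        "automatic_thought": selections["automatic_thought"],  # automatic thought interface
        "thinking_traps": selections["thinking_traps"],        # thinking traps interface
        "alternative_thought": selections["alternative_thought"],
        "second_intensity": selections["second_intensity"],    # feeling spectrum, revisited
    }
    entry["intensity_difference"] = entry["first_intensity"] - entry["second_intensity"]
    return entry

entry = run_session({
    "feeling": "sad",
    "first_intensity": 7,
    "automatic_thought": "I can't do anything right",
    "thinking_traps": ["all-or-nothing thinking"],
    "alternative_thought": "I handled yesterday's setback well",
    "second_intensity": 4,
})
print(entry["intensity_difference"])  # → 3
```

A positive difference here reflects the intended therapeutic effect: the feeling's intensity dropped after the thought-reframing steps.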
2. The electronic device of claim 1, wherein the one or more programs further include instructions for:
in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap interface elements.
3. The electronic device of claim 1, wherein the journal entry is modified to further indicate the particular alternative thought interface element.
4. The electronic device of claim 1, wherein the one or more programs further include instructions for:
in response to receiving the feeling selection input:
displaying, on the display, a company selection interface, the company selection interface presenting a plurality of company interface elements, each company interface element being associated with a particular relationship type;
while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs, the seventh sequence of inputs including a company selection input, the company selection input corresponding to a particular company interface element; and
wherein the journal entry is modified to further indicate the particular company interface element.
5. The electronic device of claim 4, wherein the one or more programs further include instructions for:
in response to receiving the feeling selection input:
displaying, on the display, a location selection interface, the location selection interface presenting a plurality of location interface elements, each location interface element being associated with a particular location;
while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs, the eighth sequence of inputs including a location selection input, the location selection input corresponding to a particular location interface element; and
wherein the journal entry is modified to further indicate the particular location interface element.
6. The electronic device of claim 5, wherein the one or more programs further include instructions for:
in response to receiving the feeling selection input:
displaying, on the display, a multiple sclerosis symptoms selection interface, the multiple sclerosis symptoms selection interface presenting a plurality of multiple sclerosis symptom interface elements, each multiple sclerosis symptom interface element being associated with a particular multiple sclerosis symptom;
while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs, the ninth sequence of inputs including one or more multiple sclerosis symptom selection inputs, the one or more multiple sclerosis symptom selection inputs corresponding to one or more particular multiple sclerosis symptom interface elements; and
wherein the journal entry is modified to further indicate the one or more particular multiple sclerosis symptom interface elements.
7. The electronic device of claim 1, wherein the one or more programs further include instructions for:
in response to receiving a mindfulness selection input, displaying, on the display, a mindfulness technique interface, the mindfulness technique interface presenting a plurality of mindfulness technique interface elements, each mindfulness technique interface element being associated with a particular mindfulness technique; and
in response to receiving a mindfulness technique selection input indicating selection of a mindfulness technique interface element corresponding to a particular mindfulness technique, displaying, on the display, mindfulness data corresponding to the particular mindfulness technique, the mindfulness data including at least one of audio, video, or interactive data.
8. The electronic device of claim 1, wherein the one or more programs further include instructions for:
in response to receiving a fatigue selection input, displaying, on the display, a fatigue type interface, the fatigue type interface presenting a plurality of fatigue type interface elements, each fatigue type interface element being associated with a particular fatigue type; and
in response to receiving a fatigue type selection input indicating selection of a fatigue type interface element corresponding to a particular fatigue type, displaying, on the display, fatigue type data corresponding to the particular fatigue type, the fatigue type data including at least one of audio, video, or interactive data.
9. A computerized method for treating depressive symptoms associated with multiple sclerosis, the method comprising:
at an electronic device including a display and an input device:
displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling;
while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element;
in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling;
while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities;
in response to receiving the first feeling intensity input, displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought;
while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element;
in response to receiving the automatic thought selection input, displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought;
while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element;
in response to receiving the alternative thought selection input, displaying, on the display, the feeling spectrum interface;
while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input, the second feeling intensity input corresponding to a second intensity of the plurality of intensities;
generating, for display on the display, a journal entry, the journal entry indicating at least any difference between the first feeling intensity input and the second feeling intensity input;
in response to receiving the automatic thought selection input, displaying, on the display, a thinking traps interface, the thinking traps interface presenting a plurality of thinking trap interface elements associated with the particular automatic thought interface element, each thinking trap interface element being associated with a particular thinking trap;
while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs, the sixth sequence of inputs including one or more thinking trap selection inputs, the one or more thinking trap selection inputs corresponding to one or more particular thinking trap interface elements; and
wherein the journal entry is modified to further indicate the one or more particular thinking trap interface elements.
10. The computerized method of claim 9, further comprising:
in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap interface elements.
11. The computerized method of claim 9, wherein the journal entry is modified to further indicate the particular alternative thought interface element.
12. The computerized method of claim 9, further comprising:
in response to receiving the feeling selection input:
displaying, on the display, a company selection interface, the company selection interface presenting a plurality of company interface elements, each company interface element being associated with a particular relationship type;
while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs, the seventh sequence of inputs including a company selection input, the company selection input corresponding to a particular company interface element; and
wherein the journal entry is modified to further indicate the particular company interface element.
13. The computerized method of claim 12, further comprising:
in response to receiving the feeling selection input:
displaying, on the display, a location selection interface, the location selection interface presenting a plurality of location interface elements, each location interface element being associated with a particular location;
while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs, the eighth sequence of inputs including a location selection input, the location selection input corresponding to a particular location interface element; and
wherein the journal entry is modified to further indicate the particular location interface element.
14. The computerized method of claim 13, further comprising:
in response to receiving the feeling selection input:
displaying, on the display, a multiple sclerosis symptoms selection interface, the multiple sclerosis symptoms selection interface presenting a plurality of multiple sclerosis symptom interface elements, each multiple sclerosis symptom interface element being associated with a particular multiple sclerosis symptom;
while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs, the ninth sequence of inputs including one or more multiple sclerosis symptom selection inputs, the one or more multiple sclerosis symptom selection inputs corresponding to one or more particular multiple sclerosis symptom interface elements; and
wherein the journal entry is modified to further indicate the one or more particular multiple sclerosis symptom interface elements.
15. The computerized method of claim 9, further comprising:
in response to receiving a mindfulness selection input, displaying, on the display, a mindfulness technique interface, the mindfulness technique interface presenting a plurality of mindfulness technique interface elements, each mindfulness technique interface element being associated with a particular mindfulness technique; and
in response to receiving a mindfulness technique selection input indicating selection of a mindfulness technique interface element corresponding to a particular mindfulness technique, displaying, on the display, mindfulness data corresponding to the particular mindfulness technique, the mindfulness data including at least one of audio, video, or interactive data.
16. The computerized method of claim 9, further comprising:
in response to receiving a fatigue selection input, displaying, on the display, a fatigue type interface, the fatigue type interface presenting a plurality of fatigue type interface elements, each fatigue type interface element being associated with a particular fatigue type; and
in response to receiving a fatigue type selection input indicating selection of a fatigue type interface element corresponding to a particular fatigue type, displaying, on the display, fatigue type data corresponding to the particular fatigue type, the fatigue type data including at least one of audio, video, or interactive data.
17. A digital therapeutic for treating depressive symptoms associated with multiple sclerosis, the digital therapeutic comprising:
an automatic thought identification module, the automatic thought identification module configured to (i) identify a plurality of potential automatic thoughts based on feeling assessment data describing a feeling associated with a user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought and (ii) receive automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts;
an alternative thought identification module, the alternative thought identification module configured to (i) identify a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought and (ii) receive alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts;
a feeling intensity module, the feeling intensity module configured to (i) receive first feeling intensity data describing a first intensity of the feeling associated with the user at a first point in time; (ii) receive second feeling intensity data describing a second intensity of the feeling associated with the user at a second point in time, the second point in time being later than the first point in time; and (iii) generate feeling intensity difference data, the feeling intensity difference data indicating any difference between the first intensity and the second intensity;
a thinking traps module, the thinking traps module configured to (i) identify a plurality of potential thinking traps based on the feeling assessment data and (ii) receive thinking trap selection data identifying one or more particular potential thinking traps from among the plurality of potential thinking traps; and
a display module, the display module configured to generate display data representing the feeling intensity difference data.
18. The digital therapeutic of claim 17, further comprising:
a feeling assessment module, the feeling assessment module configured to receive the feeling assessment data describing the feeling associated with the user.
19. The digital therapeutic of claim 17, further comprising:
a journal module, the journal module configured to generate a journal entry comprising at least the feeling intensity difference data.
20. The digital therapeutic of claim 19, further comprising:
a company module, the company module configured to receive company selection data identifying, by relationship type, a person who accompanied the user at a time in which the user experienced the feeling; and
wherein the journal entry further comprises the company selection data.
21. The digital therapeutic of claim 20, further comprising:
a location module, the location module configured to receive location selection data identifying a location of the user at a time in which the user experienced the feeling; and
wherein the journal entry further comprises the location selection data.
22. The digital therapeutic of claim 21, further comprising:
a multiple sclerosis symptom module, the multiple sclerosis symptom module configured to receive multiple sclerosis symptom selection data identifying one or more multiple sclerosis symptoms associated with the user; and
wherein the journal entry further comprises the multiple sclerosis symptom selection data.
23. The digital therapeutic of claim 19, wherein the journal entry further comprises the thinking trap selection data.
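The journal module of claims 19-22 aggregates the feeling intensity difference with optional context data: company (relationship type), location, and multiple sclerosis symptoms. A minimal sketch of that aggregation, with field names that are illustrative assumptions rather than terms from the claims:

```python
# Hypothetical sketch of the journal module of claims 19-22. The entry always
# carries the feeling intensity difference data (claim 19) and optionally the
# company (claim 20), location (claim 21), and MS symptom (claim 22) data.
def build_journal_entry(intensity_difference, company=None, location=None, symptoms=None):
    entry = {"intensity_difference": intensity_difference}
    if company is not None:   # relationship type of the accompanying person
        entry["company"] = company
    if location is not None:  # user's location when the feeling was experienced
        entry["location"] = location
    if symptoms:              # one or more multiple sclerosis symptoms
        entry["symptoms"] = list(symptoms)
    return entry

entry = build_journal_entry(2, company="friend", symptoms=["fatigue"])
print(sorted(entry))  # → ['company', 'intensity_difference', 'symptoms']
```

Because each context field is optional, the same entry structure serves both the base claim 19 device and the dependent claims that layer on additional data.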
AU2020257885A 2019-04-17 2020-04-13 Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis Abandoned AU2020257885A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2023241395A AU2023241395A1 (en) 2019-04-17 2023-10-08 Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201962835250P 2019-04-17 2019-04-17
US62/835,250 2019-04-17
DKPA201970328A DK201970328A1 (en) 2019-04-17 2019-05-24 Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis
DKPA201970328 2019-05-24
PCT/US2020/027919 WO2020214523A1 (en) 2019-04-17 2020-04-13 Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2023241395A Division AU2023241395A1 (en) 2019-04-17 2023-10-08 Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis

Publications (1)

Publication Number Publication Date
AU2020257885A1 true AU2020257885A1 (en) 2021-11-11

Family

ID=70482875

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020257885A Abandoned AU2020257885A1 (en) 2019-04-17 2020-04-13 Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis

Country Status (3)

Country Link
AU (1) AU2020257885A1 (en)
CA (1) CA3136946A1 (en)
DK (1) DK201970328A1 (en)

Also Published As

Publication number Publication date
DK201970328A1 (en) 2020-11-23
CA3136946A1 (en) 2020-10-22


Legal Events

Date Code Title Description
PC1 Assignment before grant (sect. 113)

Owner name: PEAR THERAPEUTICS (US), INC.

Free format text: FORMER APPLICANT(S): PEAR THERAPEUTICS, INC.

MK5 Application lapsed section 142(2)(e) - patent request and compl. specification not accepted