AU2020200809A1 - Mental health application - Google Patents


Info

Publication number
AU2020200809A1
Authority
AU
Australia
Prior art keywords
user interface
displaying
user
icon
goal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2020200809A
Inventor
Nicole McIntosh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Endeavour Mental Health Recovery Clubhouse
Original Assignee
Endeavour Mental Health Recovery Clubhouse
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Endeavour Mental Health Recovery Clubhouse filed Critical Endeavour Mental Health Recovery Clubhouse
Priority to AU2020200809A (critical)
Publication of AU2020200809A1
Legal status: Pending


Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/70: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus

Abstract

A computer-implemented method is described including displaying user interfaces including information on selected topics and receiving, at a user input device, user input indicating goals in relation to a first topic and a second topic. A selectable icon is displayed that includes or is displayed with an indication that it relates to goals. Responsive to input selecting the selectable icon, a user interface displays indicia based on the goals input by the user. Also described is a computer-implemented method that includes displaying a plurality of icons, each icon displayed with at least one of colour and indicia representative of an emotion or mood. Selections are made by the user. After expiration of a time period, a diagrammatic representation of a proportion of selections of the plurality of icons within a first plurality of distinct selections can be displayed.

Description

[Drawing sheet 1/11, Fig 1: block diagram of the computational system, showing processing device(s) 102, bus 104, memory 106 containing operating system 124, health application 126, location services 128, contacts 130 and calendar 132, input/output interface 108 with display controller 110, touch screen controller 112, network interface 114 and other device or communication controller 116, touch-sensitive display 118, communications circuitry 120, other input or output devices 122, and server system 100A.]
Mental health application
Field
The present disclosure relates generally to a method at a device, for example a portable electronic computing and communication device executing instructions of one or more applications, for providing useful functionality for or in relation to persons with a mental health dysfunction. The disclosure also relates to devices and computer readable media configured to perform the method.
Background
An increasing number of people are experiencing a mental health dysfunction. For example it has been reported that 3 million Australians are living with anxiety or depression. Mental health support is available in a number of forms, including telephone and online chat support services, access to mental health professionals and to the counselling, programs and pharmaceuticals that they can offer, online reading resources and online forums, amongst others.
Many of the available resources describe actions a person may take to achieve good outcomes. Some of these are based on a questionnaire or similar tool to assist in focussing the areas of action. The collection of actions may be called a tool-set or similar.
More recently a range of mental health applications have been developed. A publication by P Chandrashekar, "Do mental health mobile apps work: evidence and recommendations for designing high-efficacy mental health mobile apps", mHealth Vol 4, No 3 (March 2018), reported that smartphone-based applications can be effective, but to be effective they must be evidence-based and carefully designed.
The inventors have researched the needs of mental health patients and have devised features implementable in electronic devices, such as smartphones, that may assist to achieve favourable mental health outcomes.
Summary
The disclosure generally relates to methods at a computing device, for example a smartphone or tablet. The methods provide user interfaces for human-computer interaction.
According to some embodiments, a computer-implemented method includes, at a computing device with a display and a user input device: displaying, on the display, a first user interface and a second user interface, the first and second user interfaces including information on a topic selected from the group: connecting with oneself, connecting with others, physical activity, life skills, mental activities, wherein the first user interface includes information on a first topic from the group and the second user interface includes information on a second topic from the group, different to the first topic; receiving, at the user input device, user input indicating a request to input a goal in relation to the first topic and user input indicating a goal in relation to the first topic; receiving, at the user input device, user input indicating a request to input a goal in relation to the second topic and user input indicating a goal in relation to the second topic; storing the user input indicating a goal in relation to the first topic and the user input indicating a goal in relation to the second topic; displaying, on the display, a first selectable icon, wherein the selectable icon includes or is displayed with an indication that it relates to goals; receiving, at the user input device, input selecting the selectable icon and responsive to the input displaying a third user interface, the third user interface displaying indicia based on the stored user input that indicates the goal in relation to the first topic and the goal in relation to the second topic.
According to some embodiments, a computer-implemented method includes, at a computing device with a display and a user input device: displaying, on the display, a plurality of icons, each icon displayed with at least one of colour and indicia representative of an emotion or mood; within a first time period including at least a day receiving, at the user input device, a first plurality of distinct selections by the user, each selection selecting an icon of the plurality of icons and retaining a record in memory, the record based on the respective received selection; after expiration of the first time period, displaying on the display a diagrammatic representation of a proportion of selections of the plurality of icons within the first plurality of distinct selections; within a second time period including at least a day receiving, at the user input device, a second plurality of distinct selections by the user, each selection selecting an icon of the plurality of icons and retaining a record in memory, the record based on the respective received selection, wherein the second time period is different to the first time period; after expiration of the second time period, displaying on the display a diagrammatic representation of a proportion of selections of the plurality of icons within the second plurality of distinct selections; wherein the number of distinct selections in the first and second plurality of distinct selections is different and dependent on the number of times a user makes a selection from the plurality of icons.
As used herein, unless the context clearly requires otherwise, the terms "first", "second", "third" and so forth are used to indicate different instances of the item referred to. The terms are not intended to indicate any particular order or other temporal relationship. For example "a first user interface" is different to "a second user interface", but the two interfaces may be displayed at any time relative to each other.
Further embodiments will become apparent from the following description, given by way of example and with reference to the accompanying drawings and/or from the appended claims.
Brief description of the drawings
Figure 1 represents, in block diagram form, an exemplary computational system.
Figures 2 to 20 show diagrammatic presentations of aspects of a user interface provided by a computational system, for example the computational system of Figure 1.
Detailed description of the embodiments
Figure 1 represents, in block diagram form, an exemplary computational system 100 configured to perform functions according to the present disclosure.
System 100 is a general purpose computer processing system. It will be appreciated that Figure 1 does not illustrate all functional or physical components of a computer processing system. It will also be appreciated that the particular type of computer processing system will determine the appropriate hardware and architecture, and alternative computer processing systems suitable for implementing features of the present disclosure may have additional, alternative, or fewer components than those depicted.
In some embodiments the system 100 is a portable computing device, such as a smartphone or portable computer. In some embodiments the system 100 is a mobile computing device, in particular a handheld mobile computing device such as a smartphone.
Computer processing system 100 includes at least one processing unit 102. The processing unit 102 may be a single computer processing device (e.g. a central processing unit, graphics processing unit, or other computational device), or may include a plurality of computer processing devices. In some instances all processing will be performed by processing unit 102, however in other instances processing may also be performed by remote processing devices accessible and useable (either in a shared or dedicated manner) by the system 100.
Through a communications bus 104, which may include one or more than one communication bus or communication lines, the processing unit 102 is in data communication with computer memory 106, which includes one or more machine readable storage (memory) devices storing instructions and/or data for controlling operation of the processing system 100. The memory includes a system memory (e.g. a BIOS), volatile memory (e.g. random access memory such as one or more DRAM modules), and non-volatile memory (e.g. one or more hard disk or solid state drives).
System 100 also includes an input/output interface 108, via which system 100 interfaces with various devices and/or networks. Generally speaking, other devices may be integral with system 100, or may be separate. Where a device is separate from system 100, connection between the device and system 100 may be via wired or wireless hardware and communication protocols, and may be a direct or an indirect (e.g. networked) connection.
The example system 100 includes interfaces to a display and a touch screen. The I/O system 108 includes a display controller 110 and a touch screen controller 112 operably connected with a touch sensitive display 118. The system also includes communications controllers, for example a network interface 114 operably connected to communications circuitry 120, for example radio circuitry for wireless communication with a network access point. The system may include other input and output devices 122 and associated controllers 116. For example the devices may include physical buttons, point and click devices, keyboards, accelerometers, GPS devices, cameras, speakers, microphones, amongst others. Any of these input/output devices may be configured to provide aspects of the user input or output as described herein.
System 100 stores or has access to computer applications (also referred to as software or programs) - i.e. computer readable instructions and data which, when executed by the processing unit 102, configure system 100 to receive, process, and output data. Instructions and data can be stored on non-transient machine readable medium accessible to system 100. For example, instructions and data may be stored on a non-transient memory component of memory 106.
Applications accessible to system 100 will typically include an operating system application 124 such as Microsoft Windows®, Apple OS X, Apple iOS, Android, Unix, or Linux. System 100 also stores or has access to applications which, when executed by the processing unit 102, configure system 100 to perform various computer implemented processing operations described herein. In some cases part or all of a given computer-implemented method will be performed by system 100 itself, while in other cases processing may be performed by other devices in data communication with system 100. Another application accessible to system 100 is the health application 126.
The following description of user interfaces is made with reference to the health application 126. The user interfaces are shown and described in functional terms. It will be appreciated that the aesthetic appearance of the user interfaces may be varied widely, including the use of various pictorial and textual indicia. The text provided is by way of example only and different text or no text (e.g. a picture or diagram only) may be used instead, while providing the same navigation, input, output or other user interface functionality as described.
Figure 2 shows a diagrammatic presentation of a user interface 200. The user interface 200 includes a plurality of user selectable icons, selection of which causes the system 100 to perform one or more operations, as described herein below. The user selectable icons are selected by operating a touch-sensitive display or other input device of the system.
In some embodiments the user interface 200 is displayed responsive to and following launch of an application, for example the health application 126. In other embodiments the user interface 200 is displayed following display of another user interface of the health application 126 for receiving input identifying a user of the system 100, for example a page for entering login information such as a user name and password.
The user interface is structured to provide ready access to a range of functions and information of use to a user for managing health, in particular mental health. The structure provides an intuitive interface to the user that reduces the burden on the user in finding and operating the system 100 to achieve health objectives and to perform certain health monitoring activities. This structure includes one or more of the aspects described below.
An icon 201, in this example called "Clubhouse" provides an access point to recovery tools. In some embodiments in response to selection of the icon 201 the system 100 transitions to a user interface 300 depicted in Figure 3, which includes icons depicting the recovery tools. The recovery tools include one or more of a connections tool, a physical activity tool, a life skills tool, a mental activity tool and a goals tool. Each tool includes one or more of the functions as described below.
A connections tool, indicated by Relate icon 302 in user interface 300, when selected, provides access to information and functions relating to connecting. For example, when Relate icon 302 is selected, a user interface 400 as shown in Figure 4 is displayed. User interface 400 includes information on connecting, in this example both information 401 relating to connecting with one-self and information 402, 403 for connecting with others. The information on connecting with others includes information 402 for connecting with family and friends and information 403 for connecting with the wider community.
The user interface 400 includes an icon 404, in this example labelled "Goal" for setting one or more goals. Selection of the Goal icon 404 opens a text selection or entry input 405. The text selection or entry input 405 may appear as part of the user interface 400 or appear as a new user interface. The text selection or entry input 405 includes a field or selector 406 for entering a goal and an icon 407 for saving the entered text or selection. The input is saved by the system, for example to memory 106, for use as described herein.
A physical activity tool, indicated by Do icon 303 in user interface 300, when selected, provides access to information and functions relating to physical activity. For example, when Do icon 303 is selected, a user interface 500 as shown in Figure 5 is displayed. User interface 500 includes information on physical activity, in this example information 501 relating to activities for the home, information 502 relating to activities for completing with friends and family and information 503 relating to activities for completing in the community.
The user interface 500 includes a Goal icon 504. Selection of the Goal icon 504 causes the system 100 to operate to enable recording of a goal, as described in relation to the Goal icon 404 and text selection or entry input 405. The goal is saved in association with information identifying it as a physical activity goal.
A life skills tool, indicated by Live icon 304 in user interface 300, when selected, provides access to information and functions relating to certain life skills. For example, when Live icon 304 is selected, a user interface 600 as shown in Figure 6 is displayed. User interface 600 includes information on life skills, in this example information 601 relating to diet, information 602 relating to sleep, information 603 relating to giving and information 604 relating to budgeting and finances.
The user interface 600 includes a Goal icon 605. Selection of the Goal icon 605 causes the system 100 to operate to enable recording of a goal, as described in relation to the Goal icon 404 and text selection or entry input 405. The goal is saved in association with information identifying it as a life skills goal.
A mental activity tool, indicated by Be icon 305 in user interface 300, when selected, provides access to information and functions relating to mental activities for promoting mental health. For example, when Be icon 305 is selected, a user interface 700 as shown in Figure 7 is displayed. User interface 700 includes information on mental health exercises, in this example information 701 relating to activities for relaxing or remaining calm, information 702 relating to activities for grounding oneself and information 703 relating to activities for meditation.
The user interface 700 includes a Goal icon 704. Selection of the Goal icon 704 causes the system 100 to operate to enable recording of a goal, as described in relation to the Goal icon 404 and text selection or entry input 405. The goal is saved in association with information identifying it as a mental activity goal.
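By way of illustration only, the following Kotlin sketch shows one possible in-memory representation of the goals saved through the Goal icons 404, 504, 605 and 704, with each goal stored in association with the tool it was entered from so that the goals tool can later group them by category. The names GoalCategory, Goal and GoalStore are assumptions introduced for this sketch; the disclosure does not prescribe any particular data structure.

```kotlin
import java.time.Instant

// Illustrative only: each saved goal keeps the text entered via input 405
// together with the tool it was entered from, so that user interface 800
// can group goals by category.
enum class GoalCategory { RELATE, DO, LIVE, BE }

data class Goal(
    val text: String,
    val category: GoalCategory,
    val createdAt: Instant = Instant.now()
)

class GoalStore {
    private val goals = mutableListOf<Goal>()

    // Called when a save icon (e.g. icon 407) is selected.
    fun save(text: String, category: GoalCategory): Goal =
        Goal(text.trim(), category).also { goals += it }

    // Used by the goals tool (trophy icon 306) to build user interface 800.
    fun groupedByCategory(): Map<GoalCategory, List<Goal>> =
        goals.groupBy { it.category }
}
```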
The information provided in user interfaces 400, 500, 600 and 700 may be stored locally, for example in memory 106 and/or may be accessed from a remote location (e.g. a server system 100A shown in Figure 1) and received by the system 100 via communications circuitry 120. In some embodiments the information is variable. For example, location information may be entered into, received by, or determined by, the system 100. In one example the location information is entered during a registration process and/or during a log in process. In another example the location information is determined based on location services 128 available at the system 100, for instance based on a GPS receiver. The location information may determine what information is provided.
For example, the information 402, 403 for connecting with others may be based on the determined or entered location of the user relative to other people or facilities. Friends and family may for example have the health application 126 installed and the application may be configured to communicate, based on preferences, location information with selected other users of the health application 126. The location of community clubs and facilities may be stored in a database and the information 502, 503 may be selected from a database at or accessible by the server system 100A based on location information of the user received from the system 100. It will be appreciated that information in the other user interfaces may be customised in an analogous manner, with the customisation based on location information, user preferences, date and/or time and/or other variables.
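As a concrete illustration of this kind of location-based customisation, the sketch below selects community facilities within a given radius of the user's determined or entered location. The Place type, the great-circle distance helper and the 10 km default radius are assumptions for illustration and are not specified by the disclosure.

```kotlin
import kotlin.math.asin
import kotlin.math.cos
import kotlin.math.pow
import kotlin.math.sin
import kotlin.math.sqrt

// Illustrative sketch: one possible way to customise information 502/503.
data class Place(val name: String, val latitude: Double, val longitude: Double)

// Haversine great-circle distance between two coordinates, in kilometres.
fun distanceKm(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val earthRadiusKm = 6371.0
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * earthRadiusKm * asin(sqrt(a))
}

// Facilities within radiusKm of the user's location, nearest first.
fun nearbyFacilities(
    all: List<Place>,
    userLat: Double,
    userLon: Double,
    radiusKm: Double = 10.0
): List<Place> =
    all.filter { distanceKm(userLat, userLon, it.latitude, it.longitude) <= radiusKm }
        .sortedBy { distanceKm(userLat, userLon, it.latitude, it.longitude) }
```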
Each of the user interfaces 400, 500, 600 and 700 include an icon 410, 510, 610, 710 respectively, in this example labelled "Add", for editing the information provided on the user interface. Responsive to selecting the Add icon, the system 100 provides a user interface to enter additional information and/or edit existing information provided in the associated user interface.
For example, the family and friends information may be updated with contact information and/or links to access social media of the family and friends. The health application 126 may include an icon for linking a contact to the family and friends information. The health application 126 may receive the information from a contacts application 130 on the system 100. For example, contacts application 130 and health application 126 may be configured so that the contacts application has a "share" function available to make information on a contact available to the health application and/or the health application 126 can "import" contact information.
A goals tool, indicated by a trophy icon 306 in user interface 300, when selected, provides access to information and functions relating to goals of the user. The goals of the user may include information entered through any of the Goal icons described herein. For example, when the trophy icon 306 is selected, a user interface 800 as shown in Figure 8 is displayed. The user interface 800 is formed, for example, by the system 100 by reading the memory 106 and retrieving the information or selections entered by the user through one or more user interfaces, for example a text selection or entry input provided responsive to selecting a Goal icon 404, 504, 605, 704.
In some embodiments, the user interface 800 is structured to group the display of goals relating to each module described above together. Each group of goals may be displayed with an indicator of its category, for example displayed below a heading "Relate", "Be", "Do" or "Live". In some embodiments the user interface 800 includes an icon to enter or select timing information for a goal. The timing information may, for example, be a target date for completing the goal (e.g. this Saturday) or a periodic target for repeatedly completing the goal (e.g. every Saturday). For example, selection of a displayed calendar icon 801 may open a calendar application 132, allowing entry of an appointment or recurring appointment to achieve the goal. Some information may be automatically populated, for example based on information associated with a currently selected goal displayed in the user interface 800. Alternatively the timing information may be entered into a field in user interface 800 and either automatically populated into the calendar application 132 or updated responsive to a user action, for example selection of a "Save & add to calendar" icon 802, to save additions or changes to goals and/or their timing and for the information to be stored for use by the calendar application 132. In other embodiments the save function and/or entry to the calendar may be automatic after entry of a goal.
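The sketch below illustrates, under assumed types, how timing information entered for a goal, either a one-off target date (e.g. this Saturday) or a periodic target (e.g. every Saturday), could be expanded into entries for the calendar application 132. GoalTiming, CalendarEvent and the eight-week expansion horizon are illustrative assumptions, not features recited in the disclosure.

```kotlin
import java.time.DayOfWeek
import java.time.LocalDate
import java.time.temporal.TemporalAdjusters

// Illustrative sketch of the "Save & add to calendar" behaviour.
sealed interface GoalTiming
data class Once(val date: LocalDate) : GoalTiming
data class Weekly(val dayOfWeek: DayOfWeek) : GoalTiming

data class CalendarEvent(val title: String, val date: LocalDate)

// Expand a goal and its timing into concrete calendar entries.
fun toCalendarEvents(
    goalText: String,
    timing: GoalTiming,
    today: LocalDate = LocalDate.now(),
    weeksAhead: Long = 8
): List<CalendarEvent> = when (timing) {
    is Once -> listOf(CalendarEvent(goalText, timing.date))
    is Weekly -> {
        val first = today.with(TemporalAdjusters.nextOrSame(timing.dayOfWeek))
        (0 until weeksAhead).map { CalendarEvent(goalText, first.plusWeeks(it)) }
    }
}
```

For example, Weekly(DayOfWeek.SATURDAY) would yield an entry for each of the next eight Saturdays, which a calendar application could then use to raise reminders.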
In some embodiments the calendar application 132 is specific to the health application 126. For example the calendar application 132 may be implemented as a module, routine or other part of the health application 126 or as a dedicated calendar application for the health application 126. In other embodiments the calendar application 132 is a general calendar application, used both for goals from the health application and other purposes, for example the user arranging meetings and other events as per existing calendar applications. In either case, the calendar application 132 may be configured to cause the system 100 to provide one or more notifications when an event associated with a goal is due in the calendar. For example, the notifications may include a reminder to perform a physical activity, to make contact with a connection, to perform some meditation, or to go shopping to buy food.
The user interface 800 may also include an icon 802 for adding or editing goals. Selection of the icon 802 may, for example, allow the goals displayed in the user interface 800 to be edited directly, for example using a keyboard of the system. The keyboard may be a hardware keyboard or a soft keyboard displayed on a touch screen display. In some embodiments editing of a goal, for example a name or description of a goal and/or timing information associated with the goal is automatically populated in the calendar application 132. The population may occur, for example, responsive to the selection of the icon 801.
Returning to the user interface 200, an icon 202, in this example called "Care plan/clinical document" provides access to a user interface 900 shown in Figure 9. The user interface 900 includes icons 901, 902 and 903 to enable documents or information to be uploaded, deleted and sent respectively. For example, upload icon 901 causes the system to display a user interface for selecting a document to be uploaded to a server. In some embodiments the document is stored associated with a document category. The document category may be selected by the user when uploading the document. Three example categories are care plan, prescriptions and ect. Subsequently documents saved to each category may be viewed or downloaded by selecting the corresponding icon 904, 905 or 906. The user interface 900 may include an icon 907 to access the calendar application 132, for example to enter appointments for taking medicine prescribed to the user, for example as described in a prescription document and/or a care plan document.
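A minimal sketch of this category-based document handling is given below. The enum values loosely follow the example categories, with the third left generic because the source text is ambiguous; the actual upload transport to the server is out of scope here, and all type and member names are assumptions.

```kotlin
// Illustrative sketch of storing uploaded documents against a user-selected
// category (upload icon 901, view/download icons 904-906, delete icon 902).
enum class DocumentCategory { CARE_PLAN, PRESCRIPTION, OTHER }

data class StoredDocument(
    val fileName: String,
    val category: DocumentCategory,
    val content: ByteArray
)

class DocumentStore {
    private val documents = mutableListOf<StoredDocument>()

    // Upload: keep the document together with its category.
    fun upload(fileName: String, category: DocumentCategory, content: ByteArray) {
        documents += StoredDocument(fileName, category, content)
    }

    // View or download by category.
    fun byCategory(category: DocumentCategory): List<StoredDocument> =
        documents.filter { it.category == category }

    // Delete by file name.
    fun delete(fileName: String) {
        documents.removeAll { it.fileName == fileName }
    }
}
```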
Figure 10 shows a user interface 1000, which is an interface of the calendar application 132. In some embodiments the calendar application 132 includes functionality to add goals, including one or more goals enterable following selection of one of the Goal icons 404, 504, 605 or 704 described herein. For example, selection of icon 1001 may open a user interface like that shown in Figure 4B for entry of the goal. New goals entered through the calendar application may also be categorised, allowing for example display in the relevant group in the user interface 800. A goal may be displayed in the calendar application. For example the "Action 1" and "Action 2" indicia in user interface 1000 may each relate to a goal and may be associated with a time (e.g. 2 pm on a day) or time period (e.g. the day without any specific associated time).
In addition to entering goals, the calendar application 132 may include functionality to add reminders, as per a traditional calendar application. A new reminder icon 1002 may initiate the functionality of the calendar application 132 to enter a new reminder. The reminders for example may include reminders to take medication according to a prescription or care plan.
The calendar application 132 may further include an icon 1003 for indicating that a goal has been completed. For example a user may select a goal displayed in the calendar application, for example "Action 1", and then select icon 1003. The system 100 records, for example in memory 106, that the user has indicated that the action has been completed and the goal achieved. An analogous function may be provided for other reminders. Goals and/or reminders may each be associated with an icon 1004 displayed adjacent to or otherwise proximate to the goal or reminder to enable indication of and/or to indicate achievement of the goal or completion of an action in relation to a reminder. For example each of the icons 1004 may change from white to green when the associated goal or reminder has been indicated by user input to have been completed. The indication may be selection of the icon 1004 itself via a touch screen display. In some embodiments the user interface 800 includes an indication that goals have been completed. In some embodiments the system 100 monitors for completion of a set of one or more goals. Upon receiving input that all goals in the set have been completed, the system may display an interface with a celebratory message, for example indicia 1101 shown in Figure 11.
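One possible way to monitor a set of goals for completion and trigger the celebratory message is sketched below. The callback-based GoalCompletionTracker is an assumption for illustration; the disclosure only requires that completion input is recorded and that an interface such as indicia 1101 is displayed once all goals in the set are complete.

```kotlin
// Illustrative sketch of tracking completion of a set of goals and showing the
// celebratory indicia 1101 once every goal in the set has been marked complete
// (e.g. via icon 1003 or 1004).
class GoalCompletionTracker(
    goalIds: Set<String>,
    private val onAllCompleted: () -> Unit
) {
    private val remaining = goalIds.toMutableSet()
    private val completed = mutableSetOf<String>()

    // Called when the user indicates a goal or reminder has been completed.
    fun markCompleted(goalId: String) {
        if (remaining.remove(goalId)) {
            completed += goalId
            if (remaining.isEmpty()) onAllCompleted()
        }
    }

    // E.g. used to switch an icon 1004 from white to green.
    fun isCompleted(goalId: String): Boolean = goalId in completed
}

fun main() {
    val tracker = GoalCompletionTracker(setOf("Action 1", "Action 2")) {
        println("All goals in the set completed - show celebratory message")
    }
    tracker.markCompleted("Action 1")
    tracker.markCompleted("Action 2") // triggers the celebratory message
}
```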
In the user interface 200, an icon 203, in this example called "Journal" may provide access to user interface 1200 for entering information into an electronic journal. The user interface 1200 may include time information 1201, in the example shown the days of the current week, and one or more text entry fields 1202 for making journal entries, in this example entries for each day of the week. Entries are stored, for example in memory 106, and may be retrieved. For example a toggle between the journal and calendar interfaces may enable selection of a time period in the calendar and then the display of journal entries for that time period in the journal. User interface 1200 may include an icon 1203 for switching to the calendar application 132 to enable this. Other mechanisms for viewing past entries may be provided without switching to the calendar application.
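A minimal sketch of the journal storage, assuming one free-text entry per day held in an in-memory map, is shown below; the Journal type and its methods are illustrative only.

```kotlin
import java.time.LocalDate

// Illustrative sketch of the journal: one free-text entry per day (fields 1202),
// with retrieval over a period selected via the calendar toggle. An in-memory
// map stands in for storage in memory 106.
class Journal {
    private val entries = sortedMapOf<LocalDate, String>()

    fun write(date: LocalDate, text: String) {
        entries[date] = text
    }

    // Entries for a selected period, e.g. the days of the current week.
    fun entriesBetween(from: LocalDate, to: LocalDate): Map<LocalDate, String> =
        entries.filterKeys { !it.isBefore(from) && !it.isAfter(to) }
}
```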
Another icon 204 in the user interface 200, in this example called "Mood Map" provides an access point to one or more self-evaluation tools. In some embodiments the one or more self-evaluation tools includes a mood monitor as described below.
In some embodiments a user interface 1300 of a self-evaluation tool displays a plurality of selectable icons 1301 to 1305. Each selectable icon 1301-1305 has a colour and/or diagrammatic and/or textual representation of a mood or emotion. For example colours may include green, blue, pink, red and yellow. Diagrammatic representations may include a smiling face, sad face, angry face, scared face and excited face. Textual representations may include, for example, text alone or labels for the colours or diagrams. An example user interface 1300 is shown in Figure 13.
In some embodiments, responsive to selection of a mood or emotion in user interface 1300, the system 100 displays another selection user interface 1400, for example as shown in Figure 14. The user interface 1400 includes selectable icons 1401 to 1405 that provide an option to select further details of the selected mood or emotion. For example, if a smiling face was selected in user interface 1300, user interface 1400 may display selectable items for each of gratitude, excited, pleased, joy and hope.
In the example shown in Figure 14, an icon 1406 indicates the mood or emotion selected in user interface 1300, for example by replicating it. Further, field 1407 provides an option for a user to enter still further details, for example as text. A save icon 1408 causes the system to save the selection(s) to the memory 106. The selections are saved associated with time information that indicates when the selection was made. The time information may include for example the date and time. A skip icon 1409 allows a user to enter a selection without providing additional details, for example to select an icon in user interface 1300 without further selection or entry of information in user interface 1400.
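The record saved at this point could take a form like the following sketch: the mood selected in user interface 1300, an optional sub-mood from user interface 1400, an optional note from field 1407 and a timestamp. The MoodEntry and MoodLog names and fields are assumptions for illustration.

```kotlin
import java.time.Instant

// Illustrative sketch of what is saved via icon 1408 (or icon 1409 when the
// detail step is skipped).
data class MoodEntry(
    val mood: String,            // e.g. "happy", selected in user interface 1300
    val detail: String? = null,  // e.g. "gratitude", selected in user interface 1400
    val note: String? = null,    // free text entered in field 1407
    val recordedAt: Instant = Instant.now()
)

class MoodLog {
    private val entries = mutableListOf<MoodEntry>()

    // There is no limit on how many selections a user may record in a day.
    fun record(entry: MoodEntry) {
        entries += entry
    }

    fun all(): List<MoodEntry> = entries.toList()
}
```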
In some embodiments, after entering selections or information in user interface 1300 and (optionally) user interface 1400, a message interface 1450 is displayed. Depending on what selection of mood or emotion was made, the system 100 displays a well-being suggestion to (further) improve mood. For example, as shown in Figure 15, the message may be "we noticed that you've been feeling _, maybe have a look at our Clubhouse for strategies to lift your mood". The user can then choose to go back to the Clubhouse or to the weekly mood view. The message interface 1450 may include a non-specific message for all users, which may change over time. Selectable icons 1451 and 1452 allow a user to navigate to an overview of selected mood and emotions or to the Clubhouse user interface 300.
Figure 16 shows an example display of an overview of selected mood and emotions. As described above the system 100 records selections. In some embodiments there is no restriction on the number of times a user may utilise the self-evaluation tool. For example, the system 100 will accept any number of selections during the course of a single day. The example of Figure 16 shows a weekly overview. Additional or alternative periods of time, for example a day, 3 days, a fortnight, a month or a year, may also be accessible through navigation icons provided by the system 100. The example of Figure 16 shows a pie-chart representation of the number of times that each emotion or mood was selected. In some embodiments the pie-chart displays the higher level selections, for example those from user interface 1300. In some embodiments lower level selections, for example selections from user interface 1400, are displayed or displayable. For example, on selecting a segment from the pie chart of Figure 16, a further pie chart may show the number of times each sub-category was selected. If functionality is provided that enables saving a selection without selection of a subcategory, this may also be indicated in the further pie chart. Additionally, any text entered, for example in user interface 1400, is also displayed or displayable, for example on selecting the relevant segment. It will be appreciated that there are display mechanisms other than a pie chart to represent the selection frequency or volume of moods or emotions that may be used instead of or in addition to the pie chart.
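The sketch below, reusing the hypothetical MoodEntry type from the earlier sketch, shows one way the overview data could be prepared: counting how often each mood was selected within a period and converting the counts into the proportions represented by pie-chart segments, with the same counting applied to the optional sub-mood for drill-down.

```kotlin
import java.time.Instant
import java.time.temporal.ChronoUnit

// Proportion of selections per mood within the last periodDays (default: a week),
// i.e. the relative size of each pie-chart segment.
fun proportionsByMood(entries: List<MoodEntry>, periodDays: Long = 7): Map<String, Double> {
    val cutoff = Instant.now().minus(periodDays, ChronoUnit.DAYS)
    val recent = entries.filter { it.recordedAt >= cutoff }
    if (recent.isEmpty()) return emptyMap()
    return recent.groupingBy { it.mood }
        .eachCount()
        .mapValues { (_, count) -> count.toDouble() / recent.size }
}

// Counts for one segment's drill-down, e.g. the sub-moods recorded for "happy",
// including selections saved without a sub-category.
fun detailCounts(entries: List<MoodEntry>, mood: String): Map<String, Int> =
    entries.filter { it.mood == mood }
        .groupingBy { it.detail ?: "(no detail)" }
        .eachCount()
```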
In some embodiments, the user interface 200 provides access to other records and resources. The reference materials may be stored centrally, for example at a web server accessible by the system 100. For example, selection of a Library icon 205 may cause the system 100 to display an information resources user interface 1700 (see Figure 17) containing reference materials. Selection of an Emergency services icon 206 may cause the system 100 to display a user interface 1800 (see Figure 18) with contact information for emergency services providers relevant to mental health. Selection of a Forum/testimonials icon 207 may cause the system 100 to display user interfaces at which users can enter and review testimonials from other users and enter and review discussions in a forum. Selection of a personal info icon 208 will cause display of basic information such as name, DOB, likes, dislikes, email and address. This information may also include some wellness support questions to be answered. The information may be entered during registration or later, for instance by selecting a displayed text field. An example user interface 1900 (see Figure 19) shows personal information that may be included. Selection of a resource directory icon 209 may cause a user interface 2000 (see Figure 20) to be displayed containing resources for obtaining products and services. The example user interface in Figure 20 includes several categories of widely used products and services. An icon 2001 allows a user to add a resource to the directory. In some embodiments the added resources are entered locally only. In some embodiments additions by users are sent to a central location for evaluation and potential inclusion in a common resources directory for all users. In some embodiments the resources are stored centrally relative to a location and filtered for relevance dependent on location information from the relevant user device.
It will be understood that the invention disclosed and defined in this specification extends to all alternative combinations of two or more of the individual features mentioned or evident from the text or drawings. All of these different combinations constitute various alternative aspects of the invention.
For example, the locations of particular items within user interfaces described may be varied, while providing substantially the same functionality. Examples include combining icons from user interfaces 200 and 300 into a single user interface or separating icons from those user interfaces into more than two user interfaces. In another example the functionality of the applications stored in memory 106 may be provided in distinct software applications or as modules or routines within a single application.

Claims (15)

  1. A computer-implemented method including: at a computing device with a display and a user input device: displaying, on the display, a first user interface and a second user interface, the first and second user interfaces including information on a topic selected from the group: connecting with oneself, connecting with others, physical activity, life skills, mental activities, wherein the first user interface includes information on a first topic from the group and the second user interface includes information on a second topic from the group, different to the first topic; receiving, at the user input device, user input indicating a request to input a goal in relation to the first topic and user input indicating a goal in relation to the first topic; receiving, at the user input device, user input indicating a request to input a goal in relation to the second topic and user input indicating a goal in relation to the second topic; storing the user input indicating a goal in relation to the first topic and the user input indicating a goal in relation to the second topic; displaying, on the display, a first selectable icon, wherein the selectable icon includes or is displayed with an indication that it relates to goals; receiving, at the user input device, input selecting the selectable icon and responsive to the input displaying a third user interface, the third user interface displaying indicia based on the stored user input that indicates the goal in relation to the first topic and the goal in relation to the second topic.
  2. The computer implemented method of claim 1, wherein displaying the first user interface includes displaying a second selectable icon and wherein the user input indicating a request to input a goal in relation to the first topic includes selection of the second selectable icon; and displaying the second user interface includes displaying a third selectable icon and wherein the user input indicating a request to input a goal in relation to the second topic includes selection of the third selectable icon.
  3. The computer implemented method of claim 1 or claim 2, further including the computing device: displaying, on the display, a fourth user interface, wherein displaying the fourth user interface includes simultaneously displaying a fourth selectable icon and a fifth selectable icon, displaying the first user interface in response to selection of the fourth selectable icon; displaying the second user interface in response to selection of the fifth selectable icon.
  4. The computer implemented method of claim 3, wherein displaying the fourth user interface includes simultaneously displaying with the fourth and fifth selectable icons, a sixth selectable icon and a seventh selectable icon, and wherein the method includes: in relation to selection of the sixth selectable icon, displaying on the display, a fifth user interface including information on a topic selected from the group: connecting with oneself, connecting with others, physical activity, life skills, mental activities; in relation to selection of the seventh selectable icon, displaying on the display, a sixth user interface including information on a topic selected from the group: connecting with oneself, connecting with others, physical activity, life skills, mental activities; wherein each of the first, second, fifth and sixth user interface include information on different topics in said group; the fifth and sixth user interfaces are each configured with functionality to enter a goal in relation to their respective topics; the third user interface displays indicia that indicates an entered goal for each topic associated with the first, second, fifth and sixth user interface.
  5. The method of claim 3 or claim 4, further including the computing device: displaying, on the display, one or more interactive user interfaces for a user to input information on current mental state and then displaying a seventh user interface, the seventh user interface including an eighth selectable icon; receiving, at the user input device, a selection of the eighth selectable icon and in response displaying, on the display, the fourth user interface.
  6. The method of any one of claims 3 to 5, wherein displaying the first selectable icon includes displaying the first selectable icon in the fourth user interface, simultaneously with the fourth and fifth selectable icons.
  7. The method of any one of claims 1 to 6, further including the computing device: maintaining a calendar for the user; receiving, at the user input device, user input indicating a time associated with the goal in relation to the first topic and entering into the calendar an event based on the received goal and associated time; receiving a request to display a user interface of the calendar over a time period that includes the associated time; displaying indicia representing the received goal within the user interface of the calendar.
  8. The method of claim 7, wherein the third user interface includes a ninth selectable icon and the method includes displaying a user interface for entering the calendar event responsive to selection of the ninth selectable icon.
  9. The method of claim 7 or claim 8, wherein the third user interface includes indicia representing the time associated with the goal in relation to the first topic, displayed to indicate a connection between the indicia representing the time and the indicia indicating the goal in relation to the first topic.
  10. A computer-implemented method including: at a computing device with a display and a user input device: displaying, on the display, a plurality of icons, each icon displayed with at least one of colour and indicia representative of an emotion or mood; within a first time period including at least a day receiving, at the user input device, a first plurality of distinct selections by the user, each selection selecting an icon of the plurality of icons and retaining a record in memory, the record based on the respective received selection; after expiration of the first time period, displaying on the display a diagrammatic representation of a proportion of selections of the plurality of icons within the first plurality of distinct selections; within a second time period including at least a day receiving, at the user input device, a second plurality of distinct selections by the user, each selection selecting an icon of the plurality of icons and retaining a record in memory, the record based on the respective received selection, wherein the second time period is different to the first time period; after expiration of the second time period, displaying on the display a diagrammatic representation of a proportion of selections of the plurality of icons within the second plurality of distinct selections; wherein the number of distinct selections in the first and second plurality of distinct selections is different and dependent on the number of times a user makes a selection from the plurality of icons.
  11. The method of claim 10, wherein the process of displaying, on the display, a plurality of icons, each icon displayed with at least one of colour and indicia representative of an emotion or mood includes: receiving a request and displaying the plurality of icons responsive to the request; ceasing display of the plurality of icons after a said selection by a user; receiving a further request and displaying the plurality of icons responsive to the further request.
  12. The method of claim 10 or claim 11, further including: after receiving a selection of an icon in the plurality of icons, displaying on the display a further plurality of icons, each icon in the further plurality of icons displayed with at least one of colour and indicia representative of an emotion or mood related to the emotion or mood represented by the selected icon.
  13. The method of any one of claims 10 to 12, further including: within the first time period, after receiving a selection of the plurality of distinct selections, displaying on the display: feedback to the user, the feedback based on the selection; and a further selectable icon; initiating a process for performing the method of any one of claims 1 to 9 responsive to selection of the further selectable icon.
  14. An electronic device including a display and a user input device operably connected to a processor and associated memory, the memory containing instructions that cause the processor to implement the method of any one of claims 1 to 13.
  15. Non-transient computer readable memory containing instructions to cause an electronic device to perform the method of any one of claims 1 to 13.
AU2020200809A 2020-02-05 2020-02-05 Mental health application Pending AU2020200809A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020200809A AU2020200809A1 (en) 2020-02-05 2020-02-05 Mental health application


Publications (1)

Publication Number Publication Date
AU2020200809A1 true AU2020200809A1 (en) 2021-08-19

Family

ID=77274238

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020200809A Pending AU2020200809A1 (en) 2020-02-05 2020-02-05 Mental health application

Country Status (1)

Country Link
AU (1) AU2020200809A1 (en)
