US20170046496A1 - Methods for tracking and responding to mental health changes in a user - Google Patents


Info

Publication number
US20170046496A1
US20170046496A1 (application US 15/233,732)
Authority
US
United States
Prior art keywords
user
emotion value
value
emotion
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/233,732
Inventor
Amanda Johnstone
Oliver Rozynski
Current Assignee
Social Health Innovations Inc
Original Assignee
Social Health Innovations Inc
Priority date
Filing date
Publication date
Application filed by Social Health Innovations Inc filed Critical Social Health Innovations Inc
Priority to US15/233,732 priority Critical patent/US20170046496A1/en
Assigned to Social Health Innovations, Inc. reassignment Social Health Innovations, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JOHNSTONE, AMANDA, ROZYNSKI, OLIVER
Publication of US20170046496A1 publication Critical patent/US20170046496A1/en
Priority to US16/424,299 priority patent/US11430567B2/en
Priority to US17/864,166 priority patent/US20230178229A1/en

Classifications

    • G06F 19/3406 (legacy code)
    • G16H 40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • G06F 16/2228: Indexing structures
    • G06F 17/30321 (legacy code)
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI], based on specific properties of the displayed interaction object
    • G06F 3/04845: GUI techniques for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04883: Touch-screen or digitiser input of data by handwriting, e.g. gesture or traced input
    • G06F 3/04886: Partitioning the touch-screen display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06Q 50/01: Social networking
    • G16Z 99/00: Subject matter not provided for in other main groups of this subclass

Definitions

  • This invention relates generally to the field of mental health and more specifically to a new and useful method for tracking and responding to mental health changes in a user.
  • FIG. 1 is a flowchart representation of a method.
  • FIG. 2 is a flowchart representation of one variation of the method.
  • FIGS. 3A and 3B are graphical representations of one variation of the method.
  • FIG. 4 is a flowchart representation of one variation of the method.
  • FIG. 5 is a flowchart representation of one variation of the method.
  • FIG. 6 is a flowchart representation of one variation of the method.
  • FIG. 7 is a flowchart representation of one variation of the method.
  • FIG. 8 is a graphical representation of one variation of the method.
  • FIG. 9 is a graphical representation of one variation of the method.
  • FIG. 10 is a graphical representation of one variation of the method.
  • a method S 100 for tracking and responding to mental health changes includes: rendering a graphical object within a graphical user interface in Block S 110 ; in response to a swipe input over the graphical user interface by a user, indexing an emotion value represented on the graphical object in Block S 120 ; updating a color value and a virtual viscosity represented on the graphical object based on the emotion value represented on the graphical object in Block S 130 ; in response to an input on the graphical user interface, submitting the emotion value to an external entity in Block S 150 ; and in response to the emotion value equaling a trigger value, prompting the external entity to contact the user in Block S 150 .
  • one variation of the method S 100 includes: rendering a graphical object within a graphical user interface in Block S 110 ; indexing an emotion value assigned to the graphical object through a spectrum of emotion values according to a direction of an input into the graphical user interface in Block S 120 ; within the graphical user interface, updating the graphical object to visually correspond to the emotion value assigned to the graphical object in Block S 130 ; recording submission of a final emotion value through the graphical user interface in Block S 140 ; and in response to the final emotion value equaling a trigger value, distributing a prompt to monitor the user to an external entity in Block S 150 .
  • another variation of the method S 100 includes: rendering a graphical object within a graphical user interface in Block S 110 ; indexing an emotion value assigned to the graphical object according to a direction of a swipe input over the graphical object within the graphical user interface in Block S 120 ; updating a color value and a virtual viscosity of the graphical object to correspond to the emotion value assigned to the graphical object in Block S 130 ; distributing a dynamic visual object, representing the graphical object in a color value and a virtual viscosity corresponding to a final emotion value selected by the user, to a first recipient elected by the user in Block S 150 ; in response to the final emotion value equaling a first trigger value, prompting a second recipient to contact the user in Block S 150 ; and in response to the final emotion value equaling a second trigger value representing a less content state of the user than the first trigger value, distributing the final emotion value to a mental health entity in Block S 150 .
  • yet another variation of the method S 100 includes: at a computing device linked to the user, receiving an emotion value, on a spectrum of emotion values, selected by the user in Block S 140 ; recording the emotion value in a database of emotion values entered by the user in Block S 150 ; enabling access to a process on the computing device in response to receipt of the emotion value in Block S 150 ; and, in response to the emotion value equaling a trigger value corresponding to a low state of contentment, distributing a second prompt to a mental health representative to monitor the user in Block S 150 .
  • Blocks of the method S 100 can be executed on a user's computing device and/or within a computer network to track the user's emotional state—such as the user's current mood, general feeling, perceived emotional health, perceived relationship condition, and/or perceived financial condition, etc.—through simple gestures entered manually by the user over time and to selectively prompt others to support the user based on quantitative or qualitative values entered by the user to represent the user's emotional state.
  • Blocks of the method S 100 can be implemented by a standalone native mental health application, a native employment application issued by an employer, an alternate lock screen, and/or an alternate keyboard, etc. (hereinafter a “graphical user interface”) executing on a smartphone, tablet, or other computing device owned by or assigned to a user.
  • the user can access the graphical user interface to submit an emotion value representing the user's current emotional state, such as in response to a prompt automatically issued by the computing device or in order to access additional content or functions on the computing device.
  • a computing device executing Blocks of the method S 100 renders a virtual sphere (i.e., a “graphical object”) on a touchscreen, moves the virtual sphere according to swipe inputs entered by the user over the touchscreen, and updates an emotion value represented by (e.g., displayed on), a color of, and/or a geometry of the virtual sphere in response to each swipe input entered by the user.
  • the user can then confirm the emotion value represented by the virtual sphere, such as by selecting a confirm region of the touchscreen; and the computing device can upload the final emotion value to a remote database, push the emotion value to a user-elected recipient, and/or distribute prompts to friends, family, therapists, emergency responders, etc. based on the emotion value, such as if the emotion value equals one of a preset trigger value associated with such prompts.
  • a computing device (e.g., a smartphone, tablet, augmented or virtual reality headset, heads-up or eyes-up display, monitoring device, wearable sensor, implanted device, smart glasses, or smartwatch, etc.) executing Blocks of the method S 100 can therefore enable a user to quickly and seamlessly enter a qualitative or quantitative value representing her perceived emotional state or perceived emotional wellbeing and then selectively engage others to support the user when emotion values entered by the user indicate that the user may be in need of such support.
  • an alternate keyboard executing Blocks of the method S 100 can enable the user to send a graphical object (e.g., a dynamic virtual sphere, a GIF, an emoticon) representing the user's perceived emotional state to one selected recipient through a native text messaging application; the alternate keyboard can also store the emotion value selected by the user for the graphical object, prompt a friend or family member of the user other than the selected recipient to contact or otherwise support the user if the selected emotion value equals a low trigger value, and prompt yet another entity (e.g., a therapist, an emergency responder) to contact or monitor the user if the selected emotion value equals an even lower trigger value.
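The tiered trigger behavior described above (one trigger prompting a friend or family member, a lower trigger prompting a therapist or emergency responder) can be sketched as a small dispatch function. This is a minimal illustration; the numeric thresholds and recipient names are assumptions, since the description does not fix exact trigger values:

```python
# Hypothetical escalation tiers keyed by emotion value (1 = worst, 10 = best).
# The thresholds (<= 2, == 1) and recipient labels are illustrative only.
def escalation_actions(emotion_value, recipient):
    """Return the notifications to dispatch for a submitted emotion value."""
    actions = [("send_graphic", recipient)]  # always deliver the visual object
    if emotion_value <= 2:  # assumed "low" trigger: friend/family outreach
        actions.append(("prompt_contact", "friend_or_family"))
    if emotion_value == 1:  # assumed lower trigger: professional monitoring
        actions.append(("prompt_monitor", "therapist_or_responder"))
    return actions
```

A value of "5" yields only the message to the user-elected recipient, while "1" additionally notifies both support tiers.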
  • a computing device (e.g., a smartphone or tablet) can integrate Blocks of the method S 100 into a lock screen, prompt the user to both enter a passcode and to select (or enter) an emotion value in order to unlock the computing device, record the selected emotion value upon receipt of a correct passcode, and then selectively prompt friends, family, a therapist, or an emergency responder, etc. to connect with the patient if the selected emotion value is sufficiently low or equals corresponding trigger values.
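The lock-screen variation reduces to a simple gate: the device unlocks only when both a correct passcode and an emotion value are supplied, and the value is recorded on success. A minimal sketch; the function name, parameters, and boolean return convention are assumptions:

```python
def try_unlock(passcode, correct_passcode, emotion_value, value_log):
    """Gate device access on a correct passcode AND an entered emotion value.

    value_log collects recorded emotion values; downstream prompting of
    supporters on low values would be handled separately.
    """
    if passcode != correct_passcode or emotion_value is None:
        return False  # stay locked: missing value or wrong passcode
    value_log.append(emotion_value)  # record the value upon a correct passcode
    return True
```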
  • the method S 100 can be: implemented in a consumer application (e.g., a communication application) or process to enable a user to enter emotion values, to track the user's emotion values over time, and to automatically prompt other humans to support the user; and/or implemented in an enterprise application to track the emotional state of employees, to gate (i.e., limit) access to sensitive processes or information when the user's selected emotion values are sufficiently low, and to automatically involve a mental health professional to monitor the user when such need is indicated by the user's selected emotion values.
  • Blocks of the method S 100 can be implemented by a computing device (e.g., a smartphone, a tablet, a smart watch), within a computer system, across a computer network and can be hosted by a mental health clinic, a hospital, an employer, a school, a government or agency, a military or defense force, or any other entity to collect simple (e.g., basic) mental health-related data manually entered by one or more users within a user population.
  • Blocks of the method S 100 can be executed on any suitable computing device.
  • Blocks of the method S 100 collect, handle, and respond to emotion values entered by the user through a computing device.
  • an emotion value can represent the user's perceived overall emotional state, perceived emotional wellbeing, perceived intensity of a mood, or emotional stability, etc. at a particular moment in time in a single quantitative or qualitative value that can be simply and quickly entered into the computing device by the user.
  • the computing device maintains a range of quantitative emotion values from “1” through “10,” inclusive; indexes through this range of values according to inputs into the computing device; and renders a single current emotion value from this range on the graphical object (e.g., a virtual sphere) shown within the graphical user interface at any one time, as shown in FIG. 1 .
  • each emotion value within the range of emotion values can correspond to a particular response (or particular response range) on a continuum of responses, including: “emergency” for an emotion value of “1”; “need support” for an emotion value of “2”; “very poor” for an emotion value of “3”; “poor” for an emotion value of “4”; “average” for an emotion value of “5”; “ok” for an emotion value of “6”; “good” for an emotion value of “7”; “great” for an emotion value of “8”; “excellent” for an emotion value of “9”; and “extraordinary” for an emotion value of “10” (or vice versa).
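The continuum of responses above maps directly to a lookup table; a minimal sketch of that correspondence:

```python
# Response continuum from the description, keyed by quantitative emotion value.
EMOTION_LABELS = {
    1: "emergency", 2: "need support", 3: "very poor", 4: "poor",
    5: "average", 6: "ok", 7: "good", 8: "great",
    9: "excellent", 10: "extraordinary",
}

def label_for(emotion_value):
    """Return the qualitative response paired with a quantitative value."""
    return EMOTION_LABELS[emotion_value]
```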
  • the computing device executing Blocks of the method S 100 can index through these quantitative emotion values in order based on a direction of each swipe input entered into the computing device by the user.
  • the computing device can enable the user to reach any emotion value representing her current emotional state with as little as five simple and identical swipe (or scroll) inputs into the computing device.
  • the computer system can also update a color of, a virtual viscosity of, and/or a number value shown on a graphical object (e.g., a virtual sphere) rendered in the graphical user interface to represent the current quantitative emotion value selected by the user, as described below and shown in FIG. 2 .
  • an augmented reality headset, a virtual reality headset, or a pair of smart glasses executing Blocks of the method S 100 and worn by a user—advances through the set of emotion values in response to a detected eye wink or head nod performed by the user wearing the computing device.
  • a smartwatch executing Blocks of the method S 100 and worn by a user—advances through the set of emotion values in response to a detected raise of the user's hand.
  • a smartphone or tablet executing Blocks of the method S 100 and carried by a user—advances forward through the set of emotion values in response to a detected tilt of the smartphone in a first direction and advances backward through the set of emotion values in response to a detected tilt of the smartphone in an opposite direction.
  • the smartphone can advance through the set of emotion values when the smartphone is shaken.
  • the computing device can present a text field within the graphical user interface and enable the user to type a single digit number (e.g., from “0” to “9”)—corresponding to a quantitative emotion value—into the text field.
  • the graphical user interface can render a number pad with discrete input regions labeled “1” through “10” (or “0” through “9”), and the user can select one of these numbers to enter a quantitative emotion value.
  • Blocks of the method S 100 are executed remotely, such as by a remote server or internal server within a network, upon receipt of a number value from a device issued to a user.
  • a user (e.g., an astronaut, naval pilot, or military personnel) can enter an emotion value as an SMS text message on a satellite phone; the satellite phone can transmit the SMS text message to a remote server (e.g., hosted by a space agency or military group) that then processes the emotion value as described below in order to track and respond to the user's emotions over time.
  • the method S 100 can implement any other quantity of discrete integer values across any other range of values to represent the user's perceived emotional state.
  • a computing device executing Blocks of the method S 100 implements a continuous range of quantitative emotion values, such as from “1.00” to “10.00,” inclusive (e.g., 901 discrete possible values).
  • the computing device executing Blocks of the method S 100 can implement a graphical slider bar and enable the user to move a slider to any of 901 positions between “1.00” (e.g., for “the worst”) and “10.00” (e.g., for “the best”) on the bar to enter a quantitative emotion value. Therefore, the computing device can enable the user to select an emotion value representing her current emotional state by moving the slider along the bar in a single left or right swipe input.
  • the computing device can also update the graphical user interface rendered on the display of the computing device to show a quantitative value (e.g., “7.14” or “3.56”) and/or a qualitative value (e.g., “great” or “poor”) corresponding to each position of the slider on the bar as the user manipulates the slider.
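The 901-position slider maps each discrete position to a hundredth-step value between "1.00" and "10.00". A sketch of that quantization (the function name and the zero-based position index are assumptions):

```python
def slider_value(position, positions=901, lo=1.00, hi=10.00):
    """Map a discrete slider position (0 .. positions-1) to an emotion value."""
    step = (hi - lo) / (positions - 1)  # 9.00 / 900 = 0.01 per position
    return round(lo + position * step, 2)
```

Position 0 yields "1.00" ("the worst"), position 900 yields "10.00" ("the best"), and each intermediate position advances by 0.01.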
  • a computing device executing Blocks of the method S 100 can collect, handle, and respond to a range of qualitative emotion values.
  • the computing device can store a set of qualitative emotion values including: “my life is ending,” “worst day ever,” “things are terrible,” “I've been worse,” “I'm fine,” “I'm good,” “I'm great,” “I'm excellent,” and “I'm phenomenal.”
  • the computer system can index through this set of qualitative emotion values based on inputs entered by the user until the user reaches a relevant qualitative emotion value.
  • the computing device can render a slider bar within the graphical user interface, label discrete positions along the bar with one of each of these qualitative emotion values, and enable the user to select one of these qualitative emotion values by moving the slider along the bar to a corresponding position.
  • the computing device can populate a dropdown menu within the graphical user interface with each of these qualitative emotion values and enable the user to select one of these qualitative emotion values by navigating through the dropdown menu.
  • a computing device executing Blocks of the method S 100 can automatically transform brainwave data received from a headset—such as an Electroencephalography (EEG) or Quantitative Electroencephalography (qEEG) headset containing one or more EEG electrodes—worn by a user into an emotion value.
  • Blocks of the method S 100 are described as collecting, handling, and responding to a set of quantitative emotion values spanning values “1” through “10,” inclusive.
  • the method S 100 can implement any other range of quantitative or qualitative emotion values and can pair each emotion value within this range with any other suitable response, response type, or emotion intensity, etc. of the user at the time the emotion value was received.
  • Block S 110 of the method S 100 recites rendering a graphical object within a graphical user interface
  • Block S 120 of the method S 100 recites, in response to a swipe input over the graphical user interface by a user, indexing an emotion value represented on the graphical object
  • Block S 130 of the method S 100 recites updating a color value and a virtual viscosity represented on the graphical object based on the emotion value represented on the graphical object.
  • the computing device can execute Block S 110 to represent a current emotion value in the form of a graphical (i.e., visual) object; and the computing device can execute Blocks S 120 and S 130 to update the size, shape, geometry, color, and/or emotion value represented by (or on) the graphical object in response to an input entered by the user into the computing device.
  • the computing device renders the graphical object on a touchscreen in Block S 110 ; and indexes through the range of emotion values in response to a swipe on the touchscreen and an index direction (i.e., increasing or decreasing) corresponding to the direction of the swipe on the touchscreen (e.g., up or down, respectively) in Block S 120 , as shown in FIG. 2 .
  • the computer system can index through the range of emotion values by a single step per discrete swipe input on the touchscreen in order to preserve deliberate emotion value changes and selections entered by the user.
  • in response to an upward swipe input over the graphical object, the computing device can index from an emotion value of “5” to an emotion value of “6”; similarly, in response to a downward swipe input over the graphical object, the computing device can index from an emotion value of “5” to an emotion value of “4.”
  • the computer system can implement an inertial model to scroll through the range of emotion values based on speed, distance, and/or direction of a swipe input in order to index through multiple emotion values within the range of emotion values in response to a single swipe input entered by the user in Block S 120 .
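Both indexing modes, the deliberate one-step-per-swipe mode and the inertial multi-step mode, reduce to a clamped increment; a sketch under the assumption of the "1" to "10" integer range (parameter names are illustrative):

```python
def index_emotion(current, direction, steps=1, lo=1, hi=10):
    """Index through the range of emotion values, clamped at its ends.

    direction: +1 for an upward swipe, -1 for a downward swipe.
    steps: 1 in the deliberate single-step mode; greater than 1 when an
    inertial model translates swipe speed/distance into multiple steps.
    """
    return max(lo, min(hi, current + direction * steps))
```

Clamping preserves the range ends, so an upward swipe at "10" or a fast downward swipe near "1" never leaves the valid spectrum.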
  • the computing device can then refresh the graphical user interface rendered on the display of the computing device with an updated graphical object—representing the current emotion value—as the selected emotion value is changed by the user.
  • the computing device can thus record this new emotion value in Block S 120 and visually alter (or “update”) the virtual sphere to visually correspond to the current emotion value in Block S 130.
  • the computing device can refresh the touchscreen to illustrate the virtual sphere rolling about its spherical center (and about an axis parallel to the lateral axis of the touchscreen) in a direction corresponding to the direction of the swipe input on the touchscreen.
  • the computing device can update the graphical object with an alternate color, geometry, and/or virtual viscosity, etc. in response to a change in the emotion value selection.
  • the computing device renders the virtual sphere in a warmer color and in an increasingly amorphous form in response to selection of emotion values corresponding to increasingly content states of the user (e.g., in response to selections of greater emotion values) in Block S 130 , and vice versa.
  • the computing device renders the graphical object: in blue for a current emotion value of “1”; in blue-green for a current emotion value of “2”; in green for a current emotion value of “3”; in yellow-green for a current emotion value of “4”; and in a similar range of colors up to orange for an emotion value of “10.”
  • the computing device can also set a virtual viscosity of the sphere that is inversely proportional to the emotion value currently selected.
  • the computing device can render a wax-like virtual sphere that appears relatively rigid for a selected emotion value of “1” but appears more malleable as the selected emotion value increases, such as exhibiting less than 2% deformation of its perimeter from a circle in the plane of the touchscreen for a selected emotion value of “1” but exhibiting up to 20% deformation of its perimeter from the circle for a selected emotion value of “10,” and the computing device can update the graphical object rendered on the touchscreen in Block S 130 accordingly based on the emotion value currently selected. Therefore, in this example, the virtual sphere can appear as hard wax for a selected emotion value of “1” but appear to warm and soften into a more malleable wax ball as the selected emotion value approaches “10.”
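The color and deformation mappings can be sketched as a simple linear interpolation. This is a minimal sketch: the exact hue endpoints (240 degrees for blue, 30 degrees for orange) are assumptions; the description names only the blue-to-orange color sequence and the 2% to 20% deformation range:

```python
def render_params(value, lo=1, hi=10):
    """Map an emotion value to a hue and a maximum perimeter deformation.

    Hue sweeps from blue (assumed 240 deg at value 1) toward orange
    (assumed 30 deg at value 10); maximum deformation grows from 2% to
    20% of the perimeter, so the sphere reads as hard wax at "1" and
    soft, malleable wax at "10".
    """
    t = (value - lo) / (hi - lo)   # normalize value to 0..1
    hue = 240 - t * 210            # blue -> orange across the range
    deformation = 0.02 + t * 0.18  # 2% -> 20% perimeter deformation
    return hue, deformation
```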
  • the computing device can also render a graphical object of any other virtual shape, geometry, color, viscosity, etc., such as a virtual cube, a virtual three-dimensional amoeba, or a graphical avatar.
  • Blocks of the method S 100 can be implemented by any other device or system and can collect emotion values from the user in any other way and in any other format.
  • Block S 140 recites recording submission of a final emotion value through the graphical user interface.
  • the computing device can record a final emotion value selected by the user, store the final emotion value with a time that the emotion value was entered (e.g., in the form of a timestamp) locally on the computing device, and/or upload the emotion value and related metadata (e.g., timestamp, user ID, computing device location) to a remote server or remote database for storage and subsequent handling in Block S 150 .
  • the computing device records the current emotion value shown within the graphical user interface (e.g., rendered on the graphical object) in response to a double-tap input over the graphical object, in response to a single-tap input over a “submit” region or other virtual button outside the perimeter of the graphical object rendered on the touchscreen, or in response to any other secondary input into the computing device.
  • the computing device can also combine metadata with the selected emotion value to generate an emotion package for this emotion value submission, and the computing device can store this emotion package locally, upload the emotion package to a remote server or remote database for remote storage, and/or transmit the emotion package to a computing device associated with a contact or mental health professional affiliated with the user.
  • the computing device can combine the emotion value entered by the user with: the user's username, email address, device ID, or anonymized UUID paired with the user's name or ID in a remote DNS; the last GPS location of the computing device; a time and date of submission of the emotion value (e.g., a “timestamp”); a speed of selection and submission of the emotion value (i.e., the duration of time between a first swipe input over the graphical object and submission of the emotion value); and/or action permissions and/or action triggers (described below) stored locally on the computing device.
  • the computer system can thus query an integrated geospatial position sensor for the geospatial location of the computing device once an emotion value is submitted by the user and then store this location of the computing device at an approximate time the final emotion value was entered by the user with the selected emotion value in the emotion package.
  • the computing device can also combine an emotion value selected by the user with a subset of metadata types specific to the selected emotion value to generate the emotion package for the emotion value submission.
  • the computing device can: combine a selected emotion value and a username only into an emotion package for selected emotion values above and including “4”; combine a selected emotion value, a username, a contact trigger (described below), and contact information for a personal contact entered previously by the user for a selected emotion value of “3”; combine a selected emotion value, username, and a therapist trigger (described below) for selected emotion value of “2”; and combine a selected emotion value, a username, an emergency trigger (described below), and a last known GPS location of the computing device for selected emotion value of “1.”
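The tiered emotion-package assembly in this example can be sketched as below. The dictionary field names (`"trigger"`, `"gps"`, and so on) are assumptions for illustration; the value-to-metadata tiers follow the example in the text.

```python
# A minimal sketch of the tiered emotion-package assembly described above:
# values of 4 and above carry only the username, while lower values add a
# contact, therapist, or emergency trigger (and, for "1", a GPS location).
import time

def build_emotion_package(value, username, contact_info=None, gps=None):
    package = {"value": value, "timestamp": time.time()}
    if value >= 4:
        package["username"] = username
    elif value == 3:
        package.update(username=username, trigger="contact", contact=contact_info)
    elif value == 2:
        package.update(username=username, trigger="therapist")
    elif value == 1:
        package.update(username=username, trigger="emergency", gps=gps)
    return package
```

The package could then be stored locally or uploaded to the remote server for handling in Block S150.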
  • the computing device can combine such metadata with an emotion value selected by the user within an alternate lock screen, within a native mental health application, within a native messaging (e.g., email, SMS text message) application executing an alternate keyboard, or within any other interface executing on the computing device, such as described below.
  • Block S 160 which recites, prior to rendering the graphical object within the graphical user interface, prompting the user to populate a support group with a set of contacts.
  • the computing device can prompt and/or enable the user to identify one or more friends, family members, mentors, therapists, or other persons who may offer support, guidance, consolation, and/or compassion to the user.
  • the computer system can record names, email addresses, phone number, usernames, account IDs, or other contact information of persons thus selected by the user, populate a support group affiliated with the user with these selected persons, and associate one or more of these members of the user's support group with select triggers implemented in Block S 150 .
  • Blocks of the method S 100 are executed locally on the computing device by a native mental health application
  • the native mental health application accesses a list of contacts stored on the computing device, such as in a native contacts application, and prompts the user to select a subset of (e.g., up to five) contacts from the contacts list to form a support group for the user.
  • the native mental health application can then retrieve names, phone numbers, email addresses, and/or other contact information for these selected contacts from the contacts list.
  • the native mental health application can prompt the user to select one or more contacts from a list of the user's friends or acquaintances enumerated within an online social network, such as by accessing this list of friends or acquaintances through a native social networking application executing on the computing device; the native mental health application can then implement similar methods and techniques to populate the user's support group with relevant contact information.
  • the native mental health application functions as a user portal into an emotion tracking and support platform hosting many users
  • the native mental health application can prompt the user to select one or more other users also on the emotion tracking and support platform.
  • the native mental health application can thus link the user's account to accounts of other users on the emotion tracking and support platform to define a support group for the user.
  • the native mental health application (or a remote computer system or remote server hosting the native mental health application) can communicate a prompt to each contact selected by the user—such as through a text message, email, communication with the online social network, or in-application notification—to confirm that the contact agrees to support the user.
  • the native mental health application (and/or the emotion tracking and support platform) can therefore enable the user to select one or more contacts to offer support to the user and to confirm that these contacts agree to join the user's support group.
  • the native mental health application can also prompt the user to join a support network for one or more contacts in her support group or to join a support group for one other person on the emotion tracking and support platform for each contact the user adds to her support group in order to maintain a target or minimum ratio of supporting and supported users on the emotion tracking and support platform.
  • the native mental health application can also gate (i.e., restrict) the user's access to an emotion value submitted by a contact for whom the user has joined a support group until the user enters her own emotion value.
  • the native mental health application (or the emotion tracking and support platform) can reveal a second emotion value—previously submitted by a contact of the user through a second instance of the native mental health application executing on a second computing device—to the user at the user's computing device in response to receipt of an emotion value from the user.
  • the native mental health application can: generate a support group of five friends, family members, and/or mentors of the user; incorporate the user into the support group of each of these contacts; and only present recent emotion values of these other contacts to the user once the user has entered her own emotion value.
  • the native mental health application can require that the user enter at least one emotion value per twenty-four-hour period in order to see all emotion values entered by these related contacts within the same period of time. Therefore, in order to access information needed to support another contact, the user must supply her own emotion value to members of her support group, thereby enabling these other members of the user's support group to access information needed to support the user.
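The reciprocal gating rule described above (a user sees her contacts' recent emotion values only after entering at least one of her own within the same twenty-four-hour period) can be sketched as a simple check. The in-memory store and class shape are assumptions for illustration.

```python
# Sketch of the reciprocal gating rule described above: access to
# support-group members' emotion values is gated on the viewer having
# submitted her own value within the last twenty-four hours.
import time

WINDOW = 24 * 60 * 60  # twenty-four hours, in seconds

class EmotionGate:
    def __init__(self):
        self.last_submission = {}  # user_id -> timestamp of last emotion value

    def submit(self, user_id, value, now=None):
        # The value itself would be stored elsewhere; only the time matters here.
        self.last_submission[user_id] = now if now is not None else time.time()

    def can_view(self, viewer_id, now=None):
        """True only if the viewer submitted a value within the window."""
        now = now if now is not None else time.time()
        last = self.last_submission.get(viewer_id)
        return last is not None and (now - last) <= WINDOW
```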
  • the native mental health application can thus execute on the user's computing device: to populate and maintain a support group for the user; to enable the user to modify account settings and to adjust her support group over time; to control privacy settings; etc.
  • the computing device can also execute Blocks S 110 , S 120 , S 130 , etc. through the native mental health application or through separate applications, such as within an alternate keyboard accessed in a native email, text messaging, or other application or through a lock screen on the computing device, as described below.
  • the computing device can then distribute prompts to monitor, contact, or show support for the user in Block S 150 , such as by distributing an emotion value to one or more members of the user's support group, regardless of recipients selected by the user, when the emotion value equals “2” or “3” as described below.
  • Block S 170 which recites: receiving a first category label for the final emotion value and storing the final emotion value with the first category label; and recording submission of a second emotion value through the graphical user interface, receiving a second category label different from the first category label for the second emotion value, and storing the second emotion value with the second category label.
  • the computing device can associate an emotion value entered by the user with a specific emotion category and thus automatically determine the context of each emotion value and/or enable members of the user's support group to manually determine the context of the user's emotion value.
  • the computing device defaults to labeling the graphical object—representing a current emotion value—with an “overall emotional state” category label and writes an “overall emotional state” label to an emotion value entered by the user when the overall emotional state is current.
  • the computing device can also enable the user to select an alternative category label, such as from a drop down menu, by swiping through a sequence of preset category labels, or by typing a custom category label into a text field.
  • the computing device can enable the user to select one of perceived financial stability, perceived relationship comfort, perceived sleep quality, perceived anxiety level, perceived anger level, perceived physical health, perceived level of confidence, perceived energy level, and/or perceived stress level, etc. of the user before submitting an emotion value, as shown in FIG.
  • the computing device can then write this label to metadata stored with the emotion value.
  • the computing device can enable the user to enter a second emotion value and to select the same or other label for the second emotion value. For example, within one use episode, the computing device can record: a first emotion value with a first category label corresponding to an overall emotional state of the user; and a second emotion value with a second category label for the second emotion value corresponding to perceived financial stability.
  • the computing device can link a unique support group to each category selected by the user.
  • the computing device can also implement different trigger values to trigger transmission of a prompt to a member of a support group for each category label.
  • the computing device can transmit the user's ID, the emotion value, and/or a prompt to contact the user to each member of the user's primary support group in Block S 150 ; in this example, when the user enters an emotion value of “1” or “2” with a “financial security” category label, the computing device can transmit the user's ID (e.g., name), the emotion value, the category label, and/or a prompt—to contact the user—to the user's financial advisor noted in a second finance support group.
  • the computing device can: prompt a member of a first support group affiliated with the user and associated with a first category label to contact the user in response to an emotion value equaling the trigger value for the first category label; and prompt a member of a second support group affiliated with the user and associated with a second category label to contact the user in response to an emotion value equaling the trigger value for the second category label.
  • the computing device can selectively distribute a prompt to contact the user to members of the user's support group based on the emotion value, the corresponding category label, and an expertise noted for each member of the user's support group. For example, when the user enters an emotion value of “2” with an “overall emotional state” category label, the computing device can transmit the user's ID, the emotion value, and/or a prompt to support the user to each member of the user's primary support group in Block S 150 ; in this example, when the user enters an emotion value of “2” with a “financial security” category label, the computing device can transmit the user's ID, the emotion value, the category label, and/or a prompt—to contact the user—to the user's parents noted in the user's support group; and, when the user enters an emotion value of “2” with a “friendships” category label, the computing device can transmit the user's ID, the emotion value, the category label, and/or a prompt to contact the user to the persons—
  • the computing device can thus prioritize transmission of prompts—to contact the user—to select members of the user's support group in response to receipt of an emotion value based on each member's known (e.g., user- or member-entered) expertise and the category label assigned to the emotion value.
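The category-aware routing above (each category label carrying its own trigger value and its own support group) can be sketched as a lookup. The data shapes and member names are assumptions, and treating any value at or below the trigger as meeting it is an assumption; the financial-security example follows the text.

```python
# Sketch of category-aware prompt routing: each category label maps to its
# own support group and trigger value, as in the financial-security example.

SUPPORT_GROUPS = {
    "overall emotional state": {"members": ["parent", "best_friend"], "trigger": 2},
    "financial security": {"members": ["financial_advisor"], "trigger": 2},
}

def route_prompt(value, category):
    """Return the members to prompt when a value meets its category's trigger."""
    group = SUPPORT_GROUPS.get(category)
    if group and value <= group["trigger"]:
        return group["members"]
    return []
```

A prompt to contact the user would then be sent to each returned member in Block S150.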
  • Block S 150 of the method S 100 recites, in response to the final emotion value equaling a trigger value, distributing a prompt—to monitor or contact the user—to an external entity.
  • the computing device or a remote computer system in cooperation with the computing device selectively communicates a prompt to an external entity to contact the user—such as immediately or within some time window via textual communication (e.g., SMS text message), in a phone or video conferencing call, or in person—upon receipt of a user-entered emotion value indicating that the user may benefit from immediate support, as shown in FIGS. 1 and 6 .
  • the computing device (or the native mental health application, the emotion tracking and support platform, etc.) distributes prompts to support the user to external entities only when an emotion value equals a trigger value indicating that the user is experiencing relatively low contentment, relatively high anxiety, and/or relatively high concern such that persons possibly in positions to support the user are notified of the user's emotional state only when the user is in greatest need.
  • the computing device can: communicate a first prompt to monitor the user to a computing device associated with a therapist, psychologist, or other mental health professional affiliated with the user upon receipt of an emotion value equaling a first trigger value of “1” from the user; push a notification to interact with the user to a computing device associated with a known contact of the user (e.g., all or a subset of members of the user's support group) in response to the final emotion value equaling a second trigger value of “2” or “3”; and withhold communication of emotion values to persons other than those explicitly selected by the user to receive the user's emotion value (e.g., a recipient of a private text message selected by the user, as described below) for emotion values greater than “3,” exclusive.
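The tiered dispatch rule in this example ("1" alerts a mental health professional, "2" or "3" notifies support-group members, and values above "3" are withheld from everyone but recipients the user explicitly selected) reduces to a small decision function. The return labels are assumptions for illustration.

```python
# Sketch of the tiered dispatch rule described above: lower emotion values
# escalate to progressively more urgent recipients.

def dispatch_target(value: int) -> str:
    if value == 1:
        return "mental_health_professional"
    if value in (2, 3):
        return "support_group"
    # Values above "3" are withheld from all but explicitly chosen recipients.
    return "none"
```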
  • the computing device (or the native mental health application, the emotion tracking and support platform) can therefore: distribute a dynamic visual object—representing the graphical object in a color value and a virtual viscosity corresponding to an emotion value selected by the user—to a first recipient elected by the user, such as when the user elects to send an emotion value to the recipient through a native text messaging application or through a native social networking application executing on the computing device as described below; prompt a second recipient (e.g., a member of the user's support group) to contact the user when the emotion value entered by the user equals a first trigger value (e.g., “2” or “3”); and distribute the emotion value to a mental health entity (e.g., a therapist, a psychologist, an emergency responder) when the final emotion value equals a second trigger value (e.g., “1”) representing a less content state of the user than the first trigger value.
  • the native mental health application, lock screen, or alternate keyboard, etc., executing Blocks of the method S 100 on the computing device can communicate a static prompt to the user, such as “How are you?”; the user can thus submit an emotion value corresponding to her general feeling or mood.
  • the computing device can communicate one or more distinct prompts to the user at any one time. For example, the computing device can cycle through a prepopulated set of prompts directed toward the user's current feeling or mood about a particular life area, such as a first prompt for general feeling, a second prompt for emotional health, a third prompt for relationship condition, and/or a fourth prompt for financial condition.
  • the computing device can represent responses to each prompt with a discrete graphical object (e.g., a virtual sphere) specific to the prompt, wherein the graphical object is numbered and colored according to the emotion value selected by the user.
  • the computing device can render these discrete graphical objects together or independently across a set of slides.
  • the computing device can render one graphical object representing a response to the first prompt related to the user's general feeling or mood, and the computing device can render a second graphical object representing responses to emotional health-, relationship-, and finance-related prompts, as shown in FIG. 3B .
  • the computing device can therefore combine the selected emotion value with a pointer to a corresponding prompt or prompt type issued to the user prior to submission of the emotion value, as shown in FIG. 1 .
  • the computing device can dispatch the police and an ambulance to the user's last recorded GPS location in response to submission of an emotion value of “1.”
  • in response to a submitted emotion value of “2,” the computing device can also transmit a notification to the user's personal or company therapist to call the user, or automatically access a calendar on the user's computing device, compare the user's calendar to the therapist's calendar, automatically schedule a call or in-person meeting as soon as possible based on availability of the user and the therapist (or move less important meetings off of the therapist's calendar in order to see the user), and then notify the user and/or the therapist of the scheduled meeting time.
  • the computing device can push a textual notification to one contact in a preset shortlist of contacts (i.e., members of the user's support group) previously entered by the user in response to a submitted emotion value of “3.”
  • the computing device can cycle through a shortlist of contacts previously entered by the user, such as entered by the user upon installation and first login into a standalone native mental health application, an alternate mental health keyboard, an alternate lock screen application, etc. to populate a personal contacts shortlist, including a phone number and email address for each of a parent, a sibling, a spouse, a best friend, and a mentor.
  • the computing device can execute no action in response to a submitted emotion value of “4” or greater other than to store the submitted emotion value in a local or remote database of emotion values entered by the user and/or to send the emotion value to a contact explicitly selected by the user to receive the emotion value.
  • the computing device can: serve a prompt to the user to indicate the user's feeling regarding her finances; transmit a trigger to the user's bank and credit institutions to reject all transactions over $20 if the user submits an emotion value of “1” in response to a finance-related prompt; and access the user's calendar and automatically schedule a call or a meeting with a financial planner if the user submits an emotion value of “2” or less in response to this prompt.
  • the computing device (or the native mental health application, the emotion tracking and support platform) can implement the foregoing methods and techniques in Block S 150 based on a category selected by the user when entering an emotion value (i.e., rather than based on textual or audio prompt served by the computing device to the user over time), as described above.
  • the computing device can implement standard action trigger and emotion value pairs across a population of users.
  • the computing device can match standard action triggers (e.g., notify a therapist, notify support group members) to select emotion values based on trends or frequencies in emotion values entered by the particular user.
  • the computing device can “normalize” emotion values assigned to action triggers for a particular user based on historical emotion values entered by the user, such as by shifting an action trigger relative to the spectrum of possible quantitative emotion values based on a recent trend in emotion values submitted by the user or based on an average emotion value submitted by the user over an extended period of time (e.g., six months).
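One way to picture this normalization is to shift the trigger by how far the user's historical average sits from a neutral midpoint, so a chronically low-scoring user is not flagged constantly while a typically high-scoring user is flagged sooner. The offset arithmetic, the neutral value of 5.5, and the function name below are all assumptions; only the idea of shifting a trigger relative to historical values comes from the text.

```python
# Sketch of "normalizing" a trigger value against a user's historical
# baseline: the trigger shifts with the user's long-run average emotion value.

def normalized_trigger(history, base_trigger=3, neutral=5.5):
    """Shift the trigger by how far the user's average sits from neutral."""
    if not history:
        return base_trigger
    average = sum(history) / len(history)
    shifted = base_trigger + (average - neutral)
    # Clamp to the valid 1-10 emotion-value range.
    return max(1, min(10, round(shifted)))
```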
  • the computing device can also maintain custom action triggers and corresponding emotion values specific to the user (or to a subset of users within a population), such as triggers to notify a therapist for users who have currently engaged therapists and triggers to notify a best friend for users who have not currently engaged therapists.
  • the computing device can assign trigger values to emotion values entered by the user based on historical user emotion value data. For example, the computing device can quantify an emotional stability of the user based on a current emotion value and emotion values previously entered by the user based on a magnitude of total deviation from an average emotion value entered by the user over a period of time.
  • the computing device can label the user as unstable and set a trigger value to connect the user with others at “5” in order to “catch” or slow the user's transition to emotion values of “1.” If the user regularly enters emotion values between “3” and “7,” the computing device can label the user as moderately stable and set a trigger value to connect the user with others at “3” in order to prompt support of the user when the user is most in need.
  • the computing device can label the user as highly stable and set a trigger value to connect the user with others at “4” in order to prompt support of the user when the user's emotional state is declining outside of a regular bound.
  • the computing device can characterize the user's emotional stability according to any other schema or schedule and can implement this characterization in any other way to set trigger values for the user.
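The stability classification above (wide swings across the scale yield an "unstable" label and a trigger of "5"; values ranging roughly between "3" and "7" yield "moderately stable" and a trigger of "3"; a tight range yields "highly stable" and a trigger of "4") can be sketched from the spread of recent values. The spread thresholds chosen below are assumptions; the label-to-trigger pairs follow the examples in the text.

```python
# Sketch of the stability classification above: the spread of recent
# emotion values selects a stability label and a connection-trigger value.

def classify_stability(history):
    if not history:
        return ("unknown", 3)
    spread = max(history) - min(history)
    if spread >= 7:      # swings across most of the 1-10 scale
        return ("unstable", 5)
    if spread >= 3:      # values ranging roughly between "3" and "7"
        return ("moderately stable", 3)
    return ("highly stable", 4)
```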
  • the computing device can also implement a mental health history of the user to assign triggers to emotion values entered by the user. For example, for a user with a history of self-harm, the computing device can: prompt members of the user's support group to offer support to the user in response to entry of an emotion value of “3” or “4”; prompt a mental health professional to monitor the user in response to an emotion value of “1” or “2”; and dispatch an emergency responder to the user's last known location if the user does not check in with another person or with the user's therapist within one hour of entering an emotion value of “1.” However, for a user with a history of harm to others, the computing device can: prompt a mental health professional to monitor the user in response to an emotion value of “1,” “2,” or “3”; dispatch an emergency responder to the user's last known location if the user does not check in with another person or with the user's therapist within one hour of entering an emotion value of “2”; and immediately dispatch an emergency responder to the user's
  • the computing device can automatically assign such trigger values to emotion values for a user, such as based on digital medical records of the user.
  • alternatively, a mental health professional, an employer, a member of the user's support group, or any other entity can manually assign these triggers to emotion values for the user (or for a population of users).
  • the computing device can also implement dynamic trigger values.
  • the computing device can prompt the support group member to contact the user when the user's entered emotion values are trending downward and take a rapid decline.
  • the computing device can prompt members of the user's support group to contact the user when the user subsequently enters an emotion value of “5.”
  • the computing device can notify members of the user's support group to contact the user to support the user following this rapid decline in the user's emotional state.
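A dynamic trigger of this kind can fire on a sharp drop from the user's recent average even when the latest value alone would not meet a static trigger (as in the "5" example above). The window handling and drop threshold below are assumptions for illustration.

```python
# Sketch of a dynamic trigger for the rapid-decline case above: a prompt
# fires when the newest emotion value falls sharply below the average of
# the values preceding it.

def rapid_decline(values, drop=3):
    """True if the newest value fell sharply from the recent average."""
    if len(values) < 3:
        return False
    *earlier, latest = values
    average = sum(earlier) / len(earlier)
    return latest <= average - drop
```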
  • the computing device can serve a variety of prompts to external entities.
  • the computing device (or the native mental health application, the emotion tracking and support platform) can transmit an electronic textual notification—including the user's name, location, and concern (e.g., for self-harm or harm to others) from emotion value metadata—to the emergency responder.
  • the computing device can: place an automated call to the emergency responder; or transmit an electronic notification to a mental health professional with metadata for the low emotion value and a prompt to alert an emergency responder.
  • the computing device can notify an emergency responder in any other way in Block S 150 .
  • the computing device can transmit an electronic textual notification—such as in the form of a text message, email, or in-application notification—to the mental health professional in Block S 150 .
  • the user has an existing relationship with the mental health professional—such as for therapy, psychological testing, or pain management, etc.
  • the computing device can generate a textual communication including the user's name, the user's recent emotion value, the user's location, the user's contact information (e.g., phone number), and/or a prompt to call, message, or otherwise monitor the user.
  • the computing device can prompt a mental health professional to contact the user directly or to monitor the user, such as by tracking the location of the user's computing device through a web-based doctor portal or by contacting persons physically near the user or family or friends of the user.
  • the computing device can issue a notification to the user to confirm that the user would like to connect to a mental health professional, and the computing device can automatically select a mental health professional from a directory and connect the user to the mental health professional, such as via text, email, or phone call, in Block S 150 following confirmation from the user.
  • the computer system can prompt a mental health professional to contact or monitor the user in any other way in Block S 150 in response to entry of an emotion value equaling a corresponding trigger value.
  • in another implementation in which the user enters an emotion value equaling a trigger value for contacting a peer or a member of a support group, the computing device (or the native mental health application, the emotion tracking and support platform) can push a notification to show concern for the user to such a recipient.
  • the computing device can push an electronic notification—such as a text message, email, or in-application notification to call or message the user—to a member of the user's support group, regardless of the recipient of the emotion value elected by the user, in response to entry of an emotion value of “2” or “3.”
  • the computing device can generate a notification including the user's phone number, the user's email address, a link to a messaging center in which the recipient can message the user, etc., and the recipient can use this contact information to contact the user.
  • the computing device can additionally or alternatively generate such a notification that includes prompts to the recipient to show care or concern for the user in other ways.
  • the computing device can generate a notification that recites, “Jess is feeling down. Send her a song that has special meaning to your relationship with her.”
  • the computing device can incorporate a link (e.g., a website URL, an in-application link) to a digital music service (e.g., a commercial music streaming service) into the notification; upon selection of the link, the recipient can access the digital music service, navigate to a particular song, and select this song to share with the user; the emotion tracking and support platform or other messaging service can load a link to this song into a new communication from the recipient to the user and transmit this new communication to the user; upon receipt of this new communication at the computing device, the user can select this link, and the computing device can stream the song from the digital music service.
  • the computing device can thus prompt and enable a contact of the user (e.g., a member of the user's support group) to send the user a song that may have special meaning to the user rather than calling or text messaging the user.
  • the computing device can generate a notification that recites, “Jess is feeling down. Send her an old photo of the two of you doing something you really enjoyed,” and the computing device can send this notification to a recipient upon entry of a low emotion value by the user.
  • the emotion tracking and support platform can then communicate a digital photograph—selected by the recipient at the recipient's computing device (e.g., smartphone)—back to the user's computing device, and the user's computing device can render the digital photograph within a notification or within the native mental health application executing on the user's computing device.
  • the computing device can similarly prompt the recipient to send a video, digital sketch, or other media to the user in response to entry of a low emotion value by the user.
  • the computing device can generate a notification that includes a prompt to physically visit the user and can send this notification to a particular recipient, such as a recipient physically near the user.
  • the computing device can: query an internal GPS sensor for the user's current location; cross reference the user's location to locations of members of the user's support group to prioritize members of the support group by proximity to the user (e.g., by building, street, campus, city, region, state); and select a member of the support group nearest the user to receive this notification.
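As a sketch of this proximity ranking, the bullet above can be approximated with a great-circle distance between the user's GPS fix and each support-group member's last known location. The member list layout and field names here are hypothetical, and a production implementation might instead bucket by building, street, campus, city, region, or state as described above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_member(user_location, members):
    """Rank support-group members by distance to the user; return the nearest.

    `members` is a list of (name, (lat, lon)) tuples; `user_location` is (lat, lon).
    """
    ranked = sorted(members, key=lambda m: haversine_km(*user_location, *m[1]))
    return ranked[0][0] if ranked else None
```

The nearest member returned here would then receive the "physically visit the user" notification described above.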
  • the computing device can generate a notification that includes a prompt to send the user a physical gift.
  • the computing device can include the user's current location or address, a link to an online florist, and a prompt to send flowers to the user in a notification sent to the recipient; upon receipt of the notification, the recipient can select the link to access the online florist, select a flower arrangement, and submit payment and delivery details in order to initiate delivery of flowers to the user.
  • the computing device can prompt the recipient to send a pair of socks, a meal, a coffee, or other tangible good to the user in response to entry of a low emotion value by the user, such as by incorporating links to online retailers through which the recipient may select and purchase a good for delivery to the user.
  • the computing device (or the native mental health application, the emotion tracking and support platform) can also generate a notification to subsidize a session with the mental health professional and then communicate this notification to one or more members of the user's support group.
  • the computing device can prompt members of the user's support group to supply all or partial payment for the user to complete a session with a therapist or other mental health professional, such as if the user enters a low emotion value (e.g., a “1” or a “2”) with an “overall emotional state” category label and enters a similarly low emotion value (e.g., a “1,” “2,” or “3”) with a “financial security” category label.
  • the emotion tracking and support platform can thus collect funds from these recipients and release these funds to the user or to a selected therapist following completion of a session.
  • recipients of such a prompt can transfer funds to the user or to the therapist through other external currency transfer services.
  • the computing device (or the native mental health application, the emotion tracking and support platform) can generate and transmit a notification including a prompt for any other action to a member of the user's support group or other recipient in response to entry by the user of an emotion value equaling a corresponding trigger value.
  • the computing device can also index through these notification types over time.
  • the computing device can also selectively issue these various notification types to recipients based on: one or more category labels entered by the user; proximity of recipients to the user; perceived financial security of recipients (e.g., by sending a prompt to send funds or to send a physical gift only to recipients with current emotion values with “financial security” category labels exceeding “5”); a familial or friendship status with the user; an age gap with the user (e.g., by prompting recipients of the same age to send a song but prompting recipients substantially older than the user to call the user); etc.
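The selection criteria in the bullet above can be read as an ordered rule set. The sketch below is illustrative only: the field names, thresholds, and rule ordering are assumptions, not part of the disclosure, though the "financial security above 5" and same-age-sends-a-song heuristics mirror the examples given:

```python
def select_notification_type(recipient, user):
    """Choose a notification type for a recipient using simple ordered rules.

    `recipient` and `user` are dicts with hypothetical keys:
    'distance_km', 'financial_security' (last emotion value under that
    category label), and 'age'.
    """
    if recipient.get("distance_km", float("inf")) < 5:
        return "visit"   # nearby contacts are prompted to visit in person
    if recipient.get("financial_security", 0) > 5:
        return "gift"    # financially secure contacts may send funds or a gift
    if abs(recipient.get("age", 0) - user.get("age", 0)) <= 5:
        return "song"    # contacts of similar age are prompted to send a song
    return "call"        # substantially older contacts are prompted to call
```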
  • the computing device may disguise a consequence of entry of an emotion value from the user.
  • the user may be more likely to enter honest emotion values over time.
  • receipt of flowers, a song, or other digital or physical media following entry of a low emotion value may abstract such a gift or media from the low emotion value, thereby curbing the user's expectations that a gift of other media may follow entry of a low emotion value such that the user may be more likely to enter honest emotion values over time.
  • the computing device executes Blocks of the method S 100 within an alternate lock screen, as shown in FIG. 8 .
  • in Block S 110, the computing device can render the graphical object on the touchscreen in response to an initial input that transitions the computing device out of a standby mode.
  • the user can then swipe up or down on the touchscreen to pull an emotion value—in the range of emotion values—into the graphical object, and the computing device can update the graphical object rendered in the lock screen accordingly in Block S 120 .
  • the user can continue to swipe over the lock screen to cycle through the range of emotion values until an appropriate emotion value is reached, and the computing device can update the graphical object—including its color, viscosity, and emotion value—in Blocks S 120 and S 130 according to each additional swipe entered by the user.
  • the user can enter a passcode through an alphanumeric keypad adjacent the graphical object or in an alternate screen of the lock screen to unlock the computing device and enter the emotion value.
  • the user can double tap the graphical object, select an alternate region on the touchscreen, trace a passcode pattern, or enter any other input into a touchscreen of the computing device to unlock the computing device and to submit the selected emotion value.
  • the computing device can then record this emotion value in Block S 130 , unlock the computing device, and then handle the selected emotion value as described herein.
  • the computing device can alter the initial emotion value applied to the graphical object with each subsequent unlock cycle (or with each subsequent entry of an emotion value) in order to require the user's active attention when swiping to the appropriate number.
  • the computing device can encourage the user to deliberately and attentively swipe to an appropriate number rather than simply thoughtlessly swiping to an emotion value and selecting a submit key.
  • the computing device can change how swipe directions (e.g., up or down) correspond to index directions (e.g., increasing and decreasing) for cycling through the range of emotion values and can modify swipe distances that trigger an index event for cycling through the range of emotion values for each new unlock cycle to achieve similar effects.
  • the computing device can: pseudorandomly assign an initial emotion value—in a spectrum of emotion values corresponding to unique contentment states of the user—to the graphical object; and index through this spectrum of emotion values in a loop—sequentially or in a pseudorandom order—according to a sequence of inputs into the graphical user interface.
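A minimal sketch of this lock-screen dial, assuming a ten-point scale: the starting value is pseudorandom and the mapping from swipe direction to index direction is re-randomized per unlock cycle, so the user must attend to the displayed value rather than swipe by habit. Class and method names are hypothetical:

```python
import random

SCALE = list(range(1, 11))  # emotion values "1" through "10"

class EmotionDial:
    """Lock-screen dial: pseudorandom initial value, looped indexing on swipe."""

    def __init__(self, rng=None):
        self.rng = rng or random.Random()
        self.index = self.rng.randrange(len(SCALE))   # pseudorandom starting value
        self.direction = self.rng.choice((1, -1))     # swipe-to-index mapping varies per unlock

    @property
    def value(self):
        return SCALE[self.index]

    def swipe(self, up=True):
        """Index through the range of emotion values in a loop."""
        step = self.direction if up else -self.direction
        self.index = (self.index + step) % len(SCALE)
        return self.value
```

Because the scale loops, ten swipes in one direction always return the dial to its starting value, and an up-swipe followed by a down-swipe cancels out.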
  • the computing device can prompt the user to pay greater attention to which emotion value is selected before submitting a final emotion value, which may elicit greater honesty in emotion value submissions by the user.
  • the computing device can characterize an integrity of an emotion value entered by the user based on a number of swipe (or other) inputs entered by the user into the computing device before reaching a final emotion value. For example, if the number of swipe inputs entered by the user historically falls between no swipe input and two swipe inputs for a pseudorandom order of emotion values but the user's selected emotion values exhibit large swings between "1" and "10," the computing device can characterize the honesty of the user's inputs (or engagement with the native mental health application) as low. However, if the user regularly swipes between one and six times when selecting an emotion value and enters emotion values that exhibit relatively smooth changes over time, the computing device can characterize the honesty of the user's inputs (or engagement with the native mental health application) as high.
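The two examples above can be condensed into a heuristic over the user's swipe-count history and value-to-value jumps. The thresholds below mirror the stated examples ("no swipe input and two swipe inputs," "between one and six times") but are otherwise assumptions:

```python
from statistics import mean

def characterize_integrity(swipe_counts, values):
    """Heuristic honesty score for a user's emotion-value submissions.

    Few swipes with large value swings -> low engagement; moderate swipe
    effort with smooth value changes -> high engagement.
    """
    avg_swipes = mean(swipe_counts)
    jumps = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_jump = mean(jumps) if jumps else 0
    if avg_swipes <= 2 and avg_jump >= 5:
        return "low"
    if 1 <= avg_swipes <= 6 and avg_jump <= 2:
        return "high"
    return "uncertain"
```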
  • the computing device executes Blocks of the method S 100 within an alternate keyboard, as shown in FIG. 7 .
  • the computing device can render the graphical object within a keyboard area as the user drafts or reviews textual content within any one or more native applications executing on the computing device.
  • the alternate keyboard includes an instance of the graphical object rendered within a defined keyboard area, and the alternate keyboard updates the emotion value, color, and/or geometry, etc. of the graphical object within the keyboard area in Blocks S 120 and S 130 in response to swipe gestures entered by the user over the keyboard area, such as described above.
  • the user can switch to the alternate keyboard and swipe over the graphical object to select an emotion value, as in Blocks S 110 and S 120 .
  • the alternate keyboard can load a static image of the graphical object—including the final color, shape, and emotion value of the graphical object—into a message preview within the text messaging application in Block S 140 , as shown in FIG. 1 .
  • the text messaging application can then transmit the image of the graphical object to a selected recipient in response to selection of a virtual “send” button by the user.
  • the alternate keyboard can alternatively load a dynamic image, such as a GIF or video file showing the virtual object rotating about its center, deforming according to its assigned viscosity corresponding to the selected emotion value, and/or pulsing within a narrow color band corresponding to selected emotion value, such as described above.
  • the alternate keyboard can also generate, store, and/or distribute the selected emotion value and select metadata.
  • the alternate keyboard can store the selected emotion value, the time and date the emotion value was sent to the recipient, and other metadata, as described above, in local memory on the computing device.
  • the alternate keyboard can also encrypt and push these data to a remote database for storage with other user-specific data, or the alternate keyboard can anonymize and push these data to a remote database for storage with other anonymized population data.
  • the recipient can push to the user—through a similar native text messaging application—a prompt to submit an emotion value corresponding to the user's general mood or feeling, the user's emotional health, the user's relationship condition, or the user's financial condition, and the user can switch to the alternate keyboard, swipe the graphical object to select an appropriate emotion value, and transmit this emotion value back to the recipient.
  • the recipient's computing device can thus execute a second instance of the alternate keyboard, and the recipient can open this second instance of the alternate keyboard within a text messaging application executing on the recipient's computing device to select from a prepopulated list of prompts to send to the user.
  • the instance of the alternate keyboard executing on the recipient's computing device can include options to send any one or more of the following prompts to the user: “How are you feeling?,” “How are things with your spouse?,” “How is your emotional health?,” and/or “How are things going financially?”
  • the recipient can select a particular prompt from this prepopulated list of prompts and send this prompt to the user, and the instance of the alternate keyboard executing on the user's computing device can write a flag for a prompt type corresponding to the particular metadata stored with the emotion value submitted in response to the particular prompt.
  • the instance of the alternate keyboard executing on the user's computing device can implement natural language processing techniques to determine a prompt type from one or more messages sent by the recipient to the user—or vice versa—leading up to the time that the user submitted the emotion value to the recipient; the instance of the alternate keyboard executing on the user's computing device can thus add a flag for the corresponding prompt type to metadata associated with the selected emotion value.
  • the alternate keyboard can also automatically transmit (or trigger transmission of) a prompt to an emergency responder, such as described above, in response to transmission of an emotion value of “1” to the recipient.
  • the alternate keyboard can similarly connect the user with a therapist or to a contact on a preset contact shortlist, such as described above, in response to transmission of an emotion value of “2” and an emotion value of “3,” respectively, to the recipient.
  • the alternate keyboard can implement similar methods and techniques within a native email application executing on the user's computing device. For example, while composing an email within the native email application, the user can select the alternate keyboard, swipe the graphical object to select a relevant emotion value, and then insert a static or dynamic image representing the selected emotion value and the graphical object into an open email draft.
  • the alternate keyboard can store the selected emotion value and related metadata and distribute relevant action triggers, as described above.
  • the alternate keyboard can also scan the body of the user's email, such as including text entered by the user and text in earlier emails in the same email thread—and implement natural language processing techniques to determine the context of (i.e., a prompt type associated with) the selected emotion value, as described above.
  • the alternate keyboard can thus interface with a native text messaging application and/or a native email application executing on the user's computing device to enable the user to send an emotion value to a recipient while enabling additional emotion value tracking and action trigger handling, as described above.
  • the alternate keyboard can also interface with a native application specific to a health clinic or to an employer to enter emotion values into fields corresponding to physical health-, mental health-, and/or productivity-related prompts.
  • the native messaging application can characterize emotion values typed into the message preview by the user and can implement methods and techniques described herein accordingly.
  • the computing device can access an alternate keyboard containing the graphical object and enabling the user to cycle through emotion values assigned to the graphical object by entering swipe (or other) inputs into the alternate keyboard.
  • the computing device can write the graphical object—including color, virtual viscosity, and/or other parameters corresponding to the final emotion value—to a message or other text field within a textual communication application (e.g., a native email application, a native SMS text messaging application, or a native social networking application, etc.) executing on the computing device and then transmit this message, including the graphical object, to a recipient selected by the user.
  • the recipient of the graphical object can thus quickly and visually distinguish the user's emotional state based on the color, emotion number, and other parameters represented by the graphical object.
  • the graphical object can be colored and move according to the user's selected emotion value, as described above.
  • the computing device can share this graphical object with a recipient explicitly selected by the user (e.g., one person in a private text message, the user's social feed in an online social network) when the emotion value exceeds general or user-specific trigger values with the textual communication application executing on the computing device.
  • the computing device can share the graphical object and/or emotion value with other entities (e.g., member of the user's support group, the user's therapist) if the entered emotion value equals a trigger value assigned to the user, as described above, such as automatically or following additional confirmation from the user to share the emotion value.
  • the computing device can: access an alternate digital keyboard—in replacement of an alphanumeric keyboard—within a native textual communication application executing on a mobile computing device and render the graphical object within a message preview area of the alternate digital keyboard in Block S 110 ; index the emotion value represented on the graphical object in response to and in a direction corresponding to a swipe input over the message preview area in Block S 120 ; transmit the graphical object to a recipient selected in the native textual communication application through a private messaging channel in response to receipt of a command at the native textual communication application to send the graphical object to the recipient, such as in the form of a digital file, in Block S 150 ; and withhold the final emotion value from the external entity in response to the final emotion value differing from the trigger value in Block S 150 .
  • the computing device can also: record a final emotion value selected by the user within the native text messaging application to a database of emotion values entered by the user in Block S 140 ; transmit a form of the final emotion value and an identifier of the user to a mental health representative in response to the final emotion value equaling the trigger value (e.g., “1”) in Block S 150 ; and/or communicate a form of the final emotion value to a contact specified in the user's support group and distinct from the original user-elected recipient in response to the final emotion value exceeding the trigger value and a threshold intervention value exceeding the final emotion value (e.g., equal to a second trigger value of “2” or “3”).
  • the computing device can therefore execute Blocks of the method S 100 to distribute the user's emotion value from one recipient selected by the user to many recipients suited to support the user in a time of need.
  • the computing device can: access an alternate digital keyboard within a native online social networking application executing on a mobile computing device and render the graphical object within a message preview area of the alternate digital keyboard in Block S 110 ; update a visual form of the graphical object within the preview area to correspond to the emotion value in response to the input into the graphical user interface in Block S 120 ; post the graphical object to a social feed within an online social network in response to receipt of a command at the native online social networking application to post the graphical object to the online social network; and communicate a prompt to a contact affiliated with the user in response to the final emotion value equaling a trigger value, wherein the prompt directs the contact to the social feed or to contact the user directly.
  • the computing device can therefore execute Blocks of the method S 100 to distribute the user's emotion value to many recipients and then prompt one particular contact well-suited to support the user in a time of need.
  • Blocks of the method S 100 are executed within a native mental health application executing on the user's computing device.
  • a mental health clinic, rehabilitation center, hospital, or similar institution can track mental health statuses of in-patient and out-patient populations over time through instances of the native mental health application executing on personal or institution-issued computing devices.
  • each instance of the native mental health application executes on a computing device affiliated with a user (e.g., a patient).
  • the native mental health application can issue regular prompts—such as general, mental health-, relationship, or finance-related prompts, as described above—to the user to check-in with an emotion value submission, such as every morning, every afternoon, and every evening.
  • the native mental health application can issue such prompts at dynamic intervals.
  • the native mental health application can issue a higher frequency of check-in prompts to the user for lower emotion values last submitted by the user.
  • the native mental health application can issue prompts to the user at a frequency of every ten minutes for a last emotion value of “1,” every 30 minutes for a last emotion value of “2,” every hour for a last emotion value of “3,” every two hours for a last emotion value of “4,” etc.
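The dynamic check-in schedule above amounts to a lookup from the last submitted emotion value to a prompt interval. The mapping for values "1" through "4" follows the stated example; the fallback interval for higher values is an assumption:

```python
# Minutes until the next check-in prompt, keyed by the last submitted
# emotion value (per the example: 10 min for "1", 30 for "2", etc.).
CHECKIN_INTERVAL_MIN = {1: 10, 2: 30, 3: 60, 4: 120}

def next_checkin_minutes(last_value, default=240):
    """Return the check-in interval for the last emotion value; values above
    "4" fall back to a default interval (an assumption, not stated above)."""
    return CHECKIN_INTERVAL_MIN.get(last_value, default)
```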
  • the native mental health application can also issue prompts to the user at frequencies based on whether the user is inpatient or outpatient (e.g., 20% higher check-in prompt frequencies for outpatients than for inpatients) and/or based on trends in emotion values submitted by the user (e.g., a lower frequency of check-in prompts if the user is trending toward higher submitted emotion values).
  • a therapist or mental health specialist can access a dashboard (e.g., within a native mental health management application, within a web browser) to manually issue a check-in prompt to all or a subset of the affiliated patient population.
  • the native mental health application can also automatically connect the user with this therapist, this specialist, an advisor, or an other contact based on an emotion value submitted by the user, as described above.
  • each patient in the patient population can be associated with a particular health condition or set of health conditions, and the native mental health application (or the native mental health management application, the remote computer system, a remote database, etc.) can assign a general prompt, a set of lower-level prompts, and/or a set of action triggers to a particular patient based on the particular patient's current health condition.
  • an instance of the native mental health application executing on the first user's smartphone can connect the first user with a member in the first user's support group in response to submission of an emotion value of “3”; connect the first user with her sponsor in response to submission of an emotion value of “2”; and dispatch the first user's sponsor, a psychotherapist, and/or a police officer to the last known GPS location of the first user's smartphone in response to submission of an emotion value of “1.”
  • an instance of the native mental health application executing on the second user's smartphone can connect the second user with a close friend in response to submission of an emotion value of "3"; connect the second user with her psychiatrist in response to submission of an emotion value of "2"; and dispatch the second user's psychiatrist and an ambulance to the last known GPS location of the second user's smartphone in response to submission of an emotion value of "1" by the second user.
  • Blocks of the method S 100 are executed within a native employer application executing on a computing device (e.g., a smartphone, a tablet) affiliated with the user.
  • an employer can track mental health statuses of its employees during (and outside of) work hours through instances of the native employer application executing on personal or employer-issued computing devices.
  • each instance of the native employer application executes on a computing device affiliated with a user (e.g., an employee).
  • emotion values entered by an employee in response to a general feeling, mood, anxiety, and/or other mental or physical health-related prompt may remain concealed to the employer.
  • an instance of the native employer application or a remote computer system cooperating with the instance of the native employer application can anonymize emotion values submitted by employees before communicating these data through an employer portal to track mental health, anxiety, etc. across an employee population, or the native application and/or remote computer system can withhold employee-entered emotion values from representatives of the employer (e.g., managers) and instead automatically distribute these data to select third-party services contracted by the employer.
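One minimal way to anonymize a submission before it leaves the device, as described above, is to replace direct identifiers with a salted one-way hash and retain only the fields needed for population-level trends. Record field names here are hypothetical:

```python
import hashlib

def anonymize(record, salt):
    """Strip direct identifiers from an emotion-value record, keeping a
    salted token so repeat submissions from one employee can still be
    grouped for population trends without revealing the employee.
    """
    token = hashlib.sha256((salt + record["employee_id"]).encode()).hexdigest()[:16]
    return {
        "token": token,
        "emotion_value": record["emotion_value"],
        "category": record.get("category", "overall"),
        "timestamp": record["timestamp"],
    }
```

A simple salted hash is a sketch only; a deployment would also need to consider re-identification risk from timestamps and small populations.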
  • the native employer application and/or the affiliated remote computer system can automatically distribute notifications to emergency personnel in response to emotion values of 1 submitted by employees and automatically distribute employee data and prompts to third-party contracted therapists or psychologists in response to emotion values of 2 submitted by employees, as described above.
  • the native employer application executing Blocks of the method S 100 can also serve productivity-related prompts to users through an internal graphical user interface, through the alternate lock screen, and/or through an alternate keyboard within a native messaging application.
  • the native employer application can communicate emotion values entered by employees in response to productivity-related prompts to an employer representative, such as to a human resources representative or to the user's manager to track the user's feelings about productivity during work hours.
  • an instance of the native employer application or a remote computer system hosting the native employer application can automatically identify trends in an employee's feeling about her productivity and submit related suggestions to a representative of the employer (e.g., a manager) accordingly.
  • the remote computer system can aggregate and process these emotion values to determine that the user tends to feel most productive between the hours of 8 AM and 1 PM (e.g., a time window corresponding to high frequencies of emotion values above 6), tends to feel much less productive between the hours of 1 PM and 3 PM (e.g., a time window corresponding to high frequencies of emotion values below 5), and tends to regain productivity between 3 PM and 5 PM on weekdays.
  • the remote computer system can issue a suggestion to the employee's manager to implement a break period from 1 PM to 2 PM for the employee during a one-week trial period.
  • the remote computer system can continue to receive and track emotion values entered by the employee in response to productivity-related prompts issued during work hours; if the employee shows increased productivity in the hour from 2 PM to 3 PM, the remote computer system can transmit a suggestion to the employee's manager to extend the trial period or to make the break period mandatory for the user.
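The aggregation step in this productivity example can be sketched as bucketing emotion values by hour of day and labeling each hour against the "above 6" / "below 5" thresholds given above; the function name and input layout are assumptions:

```python
from collections import defaultdict

def productivity_windows(entries, high=6, low=5):
    """Bucket productivity emotion values by hour of day and flag hours where
    the employee tends to feel productive or unproductive.

    `entries` is a list of (hour, value) pairs from productivity-related
    prompts; thresholds mirror the example in the text.
    """
    by_hour = defaultdict(list)
    for hour, value in entries:
        by_hour[hour].append(value)
    summary = {}
    for hour, vals in sorted(by_hour.items()):
        avg = sum(vals) / len(vals)
        label = "productive" if avg > high else "unproductive" if avg < low else "neutral"
        summary[hour] = (round(avg, 1), label)
    return summary
```

A run of "unproductive" hours (e.g., 1 PM to 3 PM) would then back a suggestion to the employee's manager such as the trial break period described above.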
  • an airline issues a native employer application to its grounds crews, pilots, and/or stewards to collect regular check-ins from its employees during operating hours.
  • the native employer application can prompt the pilot to check-in before a flight assigned to the pilot, such as: 1 day before, 12 hours before, 8 hours before, 4 hours before, 2 hours before, 1 hour before, 30 minutes before, and 5 minutes before doors are closed to initiate the flight.
  • the airline can automatically ground the pilot if he enters an emotion value of “1” or “2” within three days of the departure of an assigned flight, and the airline can automatically ground the pilot if he enters more than three emotion values of “3” within 24 hours of the scheduled departure of a flight assigned to the pilot.
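The two automatic grounding rules above can be expressed directly over the pilot's check-in history. This is a sketch under the stated rules only; the function signature and data layout are assumptions:

```python
from datetime import datetime, timedelta

def should_ground(checkins, departure):
    """Apply the airline's two grounding rules: any emotion value of "1" or
    "2" within three days of departure grounds the pilot, as does more than
    three values of "3" within 24 hours of the scheduled departure.

    `checkins` is a list of (datetime, value) pairs; `departure` a datetime.
    """
    recent_threes = 0
    for ts, value in checkins:
        lead = departure - ts
        if value in (1, 2) and timedelta(0) <= lead <= timedelta(days=3):
            return True
        if value == 3 and timedelta(0) <= lead <= timedelta(hours=24):
            recent_threes += 1
    return recent_threes > 3
```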
  • the native employer application can prompt the pilot to check-in during the flight, such as every hour throughout the flight.
  • the native employer application can also increase a check-in frequency during a flight, such as once at the first hour, again at the fourth hour, at the eighth hour, at the tenth hour, at the twelfth hour, at the thirteenth hour, at the fourteenth hour, etc. during the flight.
  • the native employer application can interface with a remote computer system (e.g., a remote server) and computing devices issued to a second pilot and stewards on the plane with the pilot to issue prompts related to the emotion values submitted by the pilot.
  • the remote server can broadcast notifications to smartphones affiliated with the second pilot and onboard stewards to remove the pilot from the cockpit.
  • an instance of the native employer application can execute on a tablet issued to a grounds crewman to collect emotion values from the grounds crewman through an alternate lock screen, as described above.
  • the native employer application can require the grounds crewman to unlock the tablet by entering an emotion value in response to a prompt of one or more types, as described above, in order to access a digital pre-flight checklist for the aircraft or a refueling checklist, as shown in FIG. 9 .
  • the employer can require the grounds crewman to open the native employer application and enter an emotion value at select times during work hours or before performing certain tasks, such as before opening bay doors on an aircraft to load or unload luggage.
  • the native employer application can execute on a digital control panel mounted on or near a cockpit door of an aircraft and can execute Blocks of the method S 100 to prompt a pilot (or steward) to enter an emotion value into the digital control panel before unlocking the cockpit door.
  • the computing device can execute Blocks of the method S 100 to gate access to content on the computing device (e.g., an application executing on the computing device) until an emotion value is selected and submitted by the user.
  • a mining operation contracts a mental health application that prompts miners employed by the mining operation to submit emotion values in response to mental health-related prompts when punching in, when punching out, at the beginning of each shift, and/or at the beginning of each break period, such as through a digital punch clock arranged at an entrance of a mining facility or through computing devices issued to each miner or miner group within the mining operation.
  • the mental health application and/or an associated remote computer system can automatically dispatch a therapist—employed by the mining operation or contracted by the operation—to a particular miner or automatically instruct the particular miner to visit an onsite therapist if the particular miner enters an emotion value of "1" or "2," such as described above.
  • Blocks of the method S100 can be executed by equipment rather than by discrete computing devices (e.g., a smartphone, a tablet, a smart watch) assigned to particular employees. For example, when an employee swipes a badge, enters a fingerprint, or logs into a machine with a username and password within an employer's facility, an interlock on the machine can execute Blocks of the method S100 to present the user with a prompt to enter an emotion value corresponding to the user's current mood, feeling, confidence, or productivity, etc., such as by swiping a rendered graphical object or by typing an emotion value on a touch screen within the interlock, as described above. The interlock can then grant access to the user upon receipt of an emotion value thus entered.
  • the interlock can associate the submitted emotion value with an employee ID, name, or other identifier of the user to generate an emotion package, as described above, and then push this emotion package to a remote computer system; the computer system can then implement methods and techniques described above to selectively connect the employee with a preselected contact, a therapist, or an emergency responder based on general or custom action triggers, as described herein.
  • Blocks of the method S100 are executed within a native student application executing on a computing device assigned to or accessed by a student.
  • an elementary school, middle school, or high school can issue mobile computing devices executing Blocks of the method S100 to students in order to track mental health statuses of its students during (and outside of) school hours.
  • each instance of the native student application can implement methods and techniques described above to collect emotion values entered by a corresponding student in response to one or more prompts, such as in response to check-in prompts issued during certain times of the day (e.g., at the beginning of a school day and following a lunch break) or at the start of each class.
  • Each instance of the native student application can thus respond to action triggers, as described above, by connecting a student to a counselor, advisor, or teacher based on select emotion values entered by the student.
  • Each instance of the native student application can also collect student-entered emotion values into a journal for select distribution to parents and/or teachers affiliated with the school.
  • the native mental health application is executed by an electronic gun case, an electronic gun lock installed on a gun, or a smartphone or other mobile device wirelessly paired to the electronic gun case or electronic gun lock.
  • the native mental health application can prompt a user to enter an emotion value before unlocking the electronic gun case or electronic gun lock and can implement methods and techniques described above to notify a member of a support group, a therapist, or an emergency responder (e.g., police, a security agency, etc.) if the user enters corresponding emotion values, such as "3," "2," or "1," respectively, before retrieving or unlocking a gun.
  • Blocks of the method S100 can be executed by a native gun ownership application executing on a smartphone.
  • the native gun ownership application can implement methods and techniques described above to regularly (e.g., daily) prompt a gun owner to enter an emotion value.
  • if the gun owner enters a substantially low emotion value, such as "1" or "2," the native gun ownership application can communicate the location of the smartphone—and therefore the location of the gun owner—and a warning to an emergency responder (e.g., a police department, a security agency) and/or to other people near the gun owner, such as the gun owner's family members, friends, neighbors, coworkers, etc.
  • the native mental health application—executing on a computing device—executes Blocks of the method S100 to prompt a user to enter an emotion value when the computing device attempts to connect to a wireless network, such as a cellular network or a Wi-Fi hub within a hospital, airport terminal, or military base.
  • the computing device can return an emotion value entered by the user to the wireless network in order to gain access to the wireless network, and a remote computer system can execute other Blocks of the method S100 to respond to the user's emotion value, as described above.
  • Blocks of the method S100 can be executed in any other employment, teaching, or operations environment to collect and track feedback pertaining to mental health, physical health, satisfaction, and/or productivity, etc. from one or more employees or affiliates.
  • the systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof.
  • Other systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above.
  • the computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device.
  • the computer-executable component can be a processor, but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
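The equipment-interlock flow described above—gating access until an emotion value is entered, then packaging the value with the user's identifier and routing it per action triggers—can be sketched as follows. All identifiers here (the trigger constants, `make_interlock_package`, `route_interlock_package`) are illustrative assumptions, not names defined by the method S100.

```python
# Sketch (assumed names) of the equipment interlock described above:
# access is gated until an emotion value is entered; the value is then
# packaged with the user's identifier and routed per action triggers.

EMERGENCY_TRIGGER = 1   # connect the employee with an emergency responder
THERAPIST_TRIGGER = 2   # connect the employee with a therapist
CONTACT_TRIGGER = 3     # connect the employee with a preselected contact

def make_interlock_package(employee_id, emotion_value):
    """Associate the submitted emotion value with the employee's ID."""
    return {"employee_id": employee_id, "emotion_value": emotion_value}

def route_interlock_package(package):
    """Pick a recipient for the package based on general action triggers."""
    value = package["emotion_value"]
    if value == EMERGENCY_TRIGGER:
        return "emergency_responder"
    if value == THERAPIST_TRIGGER:
        return "therapist"
    if value == CONTACT_TRIGGER:
        return "preselected_contact"
    return None  # no trigger fired; the value is only logged

def interlock_grants_access(emotion_value):
    """The interlock unlocks only once an emotion value has been entered."""
    return emotion_value is not None
```

Custom action triggers, as described herein, would replace the three general constants with per-employer or per-user thresholds.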

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Economics (AREA)
  • Computing Systems (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

One variation of a method for tracking and responding to mental health changes in a user includes: rendering a graphical object within a graphical user interface; indexing an emotion value assigned to the graphical object through a spectrum of emotion values according to a direction of an input into the graphical user interface; within the graphical user interface, updating the graphical object to visually correspond to the emotion value assigned to the graphical object; recording submission of a final emotion value through the graphical user interface; and, in response to the final emotion value equaling a trigger value, distributing a prompt to monitor the user to an external entity.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 62/203,083, filed on 10 Aug. 2015, which is incorporated in its entirety by this reference.
  • TECHNICAL FIELD
  • This invention relates generally to the field of mental health and more specifically to a new and useful method for tracking and responding to mental health changes in the field of mental health.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a flowchart representation of a method;
  • FIG. 2 is a flowchart representation of one variation of the method;
  • FIGS. 3A and 3B are graphical representations of one variation of the method;
  • FIG. 4 is a flowchart representation of one variation of the method;
  • FIG. 5 is a flowchart representation of one variation of the method;
  • FIG. 6 is a flowchart representation of one variation of the method;
  • FIG. 7 is a flowchart representation of one variation of the method;
  • FIG. 8 is a graphical representation of one variation of the method;
  • FIG. 9 is a graphical representation of one variation of the method; and
  • FIG. 10 is a graphical representation of one variation of the method.
  • DESCRIPTION OF THE EMBODIMENTS
  • The following description of the embodiments of the invention is not intended to limit the invention to these embodiments but rather to enable a person skilled in the art to make and use this invention. Variations, configurations, implementations, example implementations, and examples described herein are optional and are not exclusive to the variations, configurations, implementations, example implementations, and examples they describe. The invention described herein can include any and all permutations of these variations, configurations, implementations, example implementations, and examples.
  • 1. Method
  • As shown in FIG. 1, a method S100 for tracking and responding to mental health changes includes: rendering a graphical object within a graphical user interface in Block S110; in response to a swipe input over the graphical user interface by a user, indexing an emotion value represented on the graphical object in Block S120; updating a color value and a virtual viscosity represented on the graphical object based on the emotion value represented on the graphical object in Block S130; in response to an input on the graphical user interface, submitting the emotion value to an external entity in Block S150; and in response to the emotion value equaling a trigger value, prompting the external entity to contact the user in Block S150.
  • As shown in FIG. 1, one variation of the method S100 includes: rendering a graphical object within a graphical user interface in Block S110; indexing an emotion value assigned to the graphical object through a spectrum of emotion values according to a direction of an input into the graphical user interface in Block S120; within the graphical user interface, updating the graphical object to visually correspond to the emotion value assigned to the graphical object in Block S130; recording submission of a final emotion value through the graphical user interface in Block S140; and in response to the final emotion value equaling a trigger value, distributing a prompt to monitor the user to an external entity in Block S150.
  • As shown in FIG. 6, another variation of the method S100 includes: rendering a graphical object within a graphical user interface in Block S110; indexing an emotion value assigned to the graphical object according to a direction of a swipe input over the graphical object within the graphical user interface in Block S120; updating a color value and a virtual viscosity of the graphical object to correspond to the emotion value assigned to the graphical object in Block S130; distributing a dynamic visual object, representing the graphical object in a color value and a virtual viscosity corresponding to a final emotion value selected by the user, to a first recipient elected by the user in Block S150; in response to the final emotion value equaling a first trigger value, prompting a second recipient to contact the user in Block S150; and in response to the final emotion value equaling a second trigger value representing a less content state of the user than the first trigger value, distributing the final emotion value to a mental health entity in Block S150.
  • As shown in FIGS. 6 and 9, yet another variation of the method S100 includes: at a computing device linked to the user, receiving an emotion value, on a spectrum of emotion values, selected by the user in Block S140; recording the emotion value in a database of emotion values entered by the user in Block S150; enabling access to a process on the computing device in response to receipt of the emotion value in Block S150; and, in response to the emotion value equaling a trigger value corresponding to a low state of contentment, distributing a second prompt to a mental health representative to monitor the user in Block S150.
  • 2. Applications
  • Generally, Blocks of the method S100 can be executed on a user's computing device and/or within a computer network to track the user's emotional state—such as the user's current mood, general feeling, perceived emotional health, perceived relationship condition, and/or perceived financial condition, etc.—through simple gestures entered manually by the user over time and to selectively prompt others to support the user based on quantitative or qualitative values entered by the user to represent the user's emotional state. In particular, Blocks of the method S100 can be implemented by a standalone native mental health application, a native employment application issued by an employer, an alternate lock screen, and/or an alternate keyboard, etc. (hereinafter a “graphical user interface”) executing on a smartphone, tablet, or other computing device owned by or assigned to a user. At various instances over time, the user can access the graphical user interface to submit an emotion value representing the user's current emotional state, such as in response to a prompt automatically issued by the computing device or in order to access additional content or functions on the computing device.
  • In one example, a computing device executing Blocks of the method S100 renders a virtual sphere (i.e., a "graphical object") on a touchscreen, moves the virtual sphere according to swipe inputs entered by the user over the touchscreen, and updates an emotion value represented by (e.g., displayed on), a color of, and/or a geometry of the virtual sphere in response to each swipe input entered by the user. The user can then confirm the emotion value represented by the virtual sphere, such as by selecting a confirm region of the touchscreen; and the computing device can upload the final emotion value to a remote database, push the emotion value to a user-elected recipient, and/or distribute prompts to friends, family, therapists, emergency responders, etc. based on the emotion value, such as if the emotion value equals a preset trigger value associated with such prompts.
  • A computing device (e.g., a smartphone, tablet, augmented or virtual reality headset, heads-up or eyes-up display, monitoring device, wearable sensor, implanted device, smart glasses, or smartwatch, etc.) executing Blocks of the method S100 can therefore enable a user to quickly and seamlessly enter a qualitative or quantitative value representing her perceived emotional state or perceived emotional wellbeing and then selectively engage others to support the user when emotion values entered by the user indicate that the user may be in need of such support. For example, an alternate keyboard executing Blocks of the method S100 can enable the user to send a graphical object (e.g., a dynamic virtual sphere, a GIF, an emoticon) representing the user's perceived emotional state to one selected recipient through a native text messaging application; the alternate keyboard can also store the emotion value selected by the user for the graphical object, prompt a friend or family member of the user other than the selected recipient to contact or otherwise support the user if the selected emotion value equals a low trigger value, and prompt yet another entity (e.g., a therapist, an emergency responder) to contact or monitor the user if the selected emotion value equals an even lower trigger value. In another example, a computing device (e.g., a smartphone or tablet) can incorporate Blocks of the method S100 into a lock screen, prompt the user both to enter a passcode and to select (or enter) an emotion value in order to unlock the computing device, record a selected emotion value upon receipt of a correct passcode, and then selectively prompt friends, family, a therapist, or an emergency responder, etc. to connect with the user if the selected emotion value is sufficiently low or equals corresponding trigger values.
Therefore, the method S100 can be: implemented in a consumer application (e.g., a communication application) or process to enable a user to enter emotion values, to track the user's emotion values over time, and to automatically prompt other humans to support the user; and/or implemented in an enterprise application to track the emotional state of employees, to gate (i.e., limit) access to sensitive processes or information when the user's selected emotion values are sufficiently low, and to automatically involve a mental health professional to monitor the user when such need is indicated by the user's selected emotion values.
  • Blocks of the method S100 can be implemented by a computing device (e.g., a smartphone, a tablet, a smart watch), within a computer system, or across a computer network, and can be hosted by a mental health clinic, a hospital, an employer, a school, a government or agency, a military or defense force, or any other entity to collect simple (e.g., basic) mental health-related data manually entered by one or more users within a user population. However, Blocks of the method S100 can be executed on any other computer system and in conjunction with any other entity or service.
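The lock-screen application described above—requiring both a correct passcode and an emotion value before unlocking the device—can be sketched as follows. The function name, the journal structure, and the stored-passcode comparison are illustrative assumptions, not elements recited by the method S100.

```python
# Minimal sketch of the alternate lock screen described above: the device
# unlocks only when a correct passcode and an emotion value are both
# entered, and the entered value is recorded for later handling.
# Function and field names are illustrative assumptions.

def try_unlock(entered_passcode, stored_passcode, emotion_value, journal):
    """Unlock the device and journal the emotion value, or refuse."""
    if entered_passcode != stored_passcode:
        return False                  # wrong passcode: stay locked
    if emotion_value is None:
        return False                  # an emotion value is required to unlock
    journal.append(emotion_value)     # record the value with this unlock event
    return True
```

A real implementation would additionally timestamp each journal entry and forward low values to the trigger-handling Blocks described below.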
  • 3. Emotion Value
  • Blocks of the method S100 collect, handle, and respond to emotion values entered by the user through a computing device. Generally, an emotion value can represent the user's perceived overall emotional state, perceived emotional wellbeing, perceived intensity of a mood, or emotional stability, etc. at a particular moment in time in a single quantitative or qualitative value that can be simply and quickly entered into the computing device by the user.
  • In one implementation, the computing device: maintains a range of quantitative emotion values from "1" through "10," inclusive; indexes through this range of values according to inputs into the computing device; and renders a single current emotion value from this range on the graphical object (e.g., a virtual sphere) shown within the graphical user interface at any one time, as shown in FIG. 1. For example, each emotion value within the range of emotion values can correspond to a particular response (or particular response range) on a continuum of responses, including: "emergency" for an emotion value of "1"; "need support" for an emotion value of "2"; "very poor" for an emotion value of "3"; "poor" for an emotion value of "4"; "average" for an emotion value of "5"; "ok" for an emotion value of "6"; "good" for an emotion value of "7"; "great" for an emotion value of "8"; "excellent" for an emotion value of "9"; and "extraordinary" for an emotion value of "10" (or vice versa). In this example and as described below, the computing device executing Blocks of the method S100 can index through these quantitative emotion values in order based on a direction of each swipe input entered into the computing device by the user. In particular, by first presenting the emotion value "5" in a virtual sphere shown in the graphical user interface on a display of the computing device, indexing toward "1" in response to each swipe-down (or scroll-down) input event at the computing device, and indexing toward "10" in response to each swipe-up (or scroll-up) input event at the computing device, the computing device can enable the user to reach any emotion value representing her current emotional state with as few as five simple and identical swipe (or scroll) inputs into the computing device.
The computer system can also update a color of, a virtual viscosity of, and/or a number value shown on a graphical object (e.g., a virtual sphere) rendered in the graphical user interface to represent the current quantitative emotion value selected by the user, as described below and shown in FIG. 2.
  • In another example, an augmented reality headset, a virtual reality headset, or a pair of smart glasses—executing Blocks of the method S100 and worn by a user—advances through the set of emotion values in response to a detected eye wink or head nod performed by the user wearing the computing device. In another example, a smartwatch—executing Blocks of the method S100 and worn by a user—advances through the set of emotion values in response to a detected raise of the user's hand. In yet another example, a smartphone or tablet—executing Blocks of the method S100 and carried by a user—advances forward through the set of emotion values in response to a detected tilt of the smartphone in a first direction and advances backward through the set of emotion values in response to a detected tilt of the smartphone in an opposite direction. Alternatively, the smartphone can advance through the set of emotion values when the smartphone is shaken.
  • In another example, the computing device can present a text field within the graphical user interface and enable the user to type a single digit number (e.g., from “0” to “9”)—corresponding to a quantitative emotion value—into the text field. Similarly, the graphical user interface can render a number pad with discrete input regions labeled “1” through “10” (or “0” through “9”), and the user can select one of these numbers to enter a quantitative emotion value. In one variation, Blocks of the method S100 are executed remotely, such as by a remote server or internal server within a network, upon receipt of a number value from a device issued to a user. For example, a user (e.g., an astronaut, naval pilot, or military personnel) can draft a SMS text message containing a numeric value at a satellite phone; the satellite phone can transmit the SMS text message to a remote server (e.g., hosted by a space agency or military group) that then processes the emotion value as described below in order to track and respond to the user's emotions over time. However, the method S100 can implement any other quantity of discrete integer values across any other range of values to represent the user's perceived emotional state.
  • In a similar implementation, a computing device executing Blocks of the method S100 implements a continuous range of quantitative emotion values, such as from “1.00” to “10.00,” inclusive (e.g., 901 discrete possible values). For example, the computing device executing Blocks of the method S100 can implement a graphical slider bar and enable the user to move a slider to any of 901 positions between “1.00” (e.g., for “the worst”) and “10.00” (e.g., for “the best”) on the bar to enter a quantitative emotion value. Therefore, the computing device can enable the user to select an emotion value representing her current emotional state by moving the slider along the bar in a single left or right swipe input. In this example, the computing device can also update the graphical user interface rendered on the display of the computing device to show a quantitative value (e.g., “7.14” or “3.56”) and/or a qualitative value (e.g., “great” or “poor”) corresponding to each position of the slider on the bar as the user manipulates the slider.
  • Alternatively, a computing device executing Blocks of the method S100 can collect, handle, and respond to a range of qualitative emotion values. For example, the computing device can store a set of qualitative emotion values including: "my life is ending," "worst day ever," "things are terrible," "I've been worse," "I'm fine," "I'm good," "I'm great," "I'm excellent," and "I'm phenomenal." In this example, the computer system can index through this set of qualitative emotion values based on inputs entered by the user until the user reaches a relevant qualitative emotion value. For example, the computing device can render a slider bar within the graphical user interface, label discrete positions along the bar with one of each of these qualitative emotion values, and enable the user to select one of these qualitative emotion values by moving the slider along the bar to a corresponding position. Similarly, the computing device can populate a dropdown menu within the graphical user interface with each of these qualitative emotion values and enable the user to select one of these qualitative emotion values by navigating through the dropdown menu.
  • In yet another implementation, a computing device executing Blocks of the method S100 can automatically transform brainwave data received from a headset—such as an Electroencephalography (EEG) or Quantitative Electroencephalography (qEEG) headset containing one or more EEG electrodes—worn by a user into an emotion value.
  • Hereinafter, Blocks of the method S100 are described as collecting, handling, and responding to a set of quantitative emotion values spanning values “1” through “10,” inclusive. However, the method S100 can implement any other range of quantitative or qualitative emotion values and can pair each emotion value within this range with any other suitable response, response type, or emotion intensity, etc. of the user at the time the emotion value was received.
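The ten-step quantitative scale above, with its per-value response labels and one-step-per-swipe indexing starting from "5," can be sketched as follows. Clamping at the ends of the range and the function name are assumptions for illustration.

```python
# Sketch of the "1" through "10" scale and its response labels, indexed
# one step per discrete swipe input starting from "5"; clamping at the
# range ends is an assumption, and the function name is illustrative.

RESPONSES = {1: "emergency", 2: "need support", 3: "very poor",
             4: "poor", 5: "average", 6: "ok", 7: "good",
             8: "great", 9: "excellent", 10: "extraordinary"}

def index_emotion_value(current, swipe_direction):
    """Index the emotion value by one step per discrete swipe input."""
    step = 1 if swipe_direction == "up" else -1
    return min(10, max(1, current + step))
```

Starting from "5," two downward swipes select "3" ("very poor"), and five upward swipes reach "10," matching the as-few-as-five-inputs property described above.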
  • 4. Graphical Object and Graphical Object Updates
  • Block S110 of the method S100 recites rendering a graphical object within a graphical user interface; Block S120 of the method S100 recites, in response to a swipe input over the graphical user interface by a user, indexing an emotion value represented on the graphical object; and Block S130 of the method S100 recites updating a color value and a virtual viscosity represented on the graphical object based on the emotion value represented on the graphical object. Generally, the computing device can execute Block S110 to represent a current emotion value in the form of a graphical (i.e., visual) object; and the computing device can execute Blocks S120 and S130 to update the size, shape, geometry, color, and/or emotion value represented by (or on) the graphical object in response to an input entered by the user into the computing device.
  • In one implementation, the computing device: renders the graphical object on a touchscreen in Block S110; and indexes through the range of emotion values in response to a swipe on the touchscreen and an index direction (i.e., increasing or decreasing) corresponding to the direction of the swipe on the touchscreen (e.g., up or down, respectively) in Block S120, as shown in FIG. 2. In this implementation, the computer system can index through the range of emotion values by a single step per discrete swipe input on the touchscreen in order to preserve deliberate emotion value changes and selections entered by the user. For example, in response to an upward swipe input over the graphical object, the computing device can index from an emotion value of “5” to an emotion value of “6”; similarly, in response to a downward swipe input over the graphical object, the computing device can index from an emotion value of “5” to an emotion value of “4.” Alternatively, the computer system can implement an inertial model to scroll through the range of emotion values based on speed, distance, and/or direction of a swipe input in order to index through multiple emotion values within the range of emotion values in response to a single swipe input entered by the user in Block S120. The computing device can then refresh the graphical user interface rendered on the display of the computing device with an updated graphical object—representing the current emotion value—as the selected emotion value is changed by the user.
  • As the user cycles through the preset range of emotion values, selects an emotion value, or enters an emotion value, the computing device can thus record this new emotion value in Block S120 and visually alter (or "update") the virtual sphere to visually correspond to the current emotion value in Block S130. For example, as the user enters a swipe input into the touchscreen, the computing device can refresh the touchscreen to illustrate the virtual sphere rolling about its spherical center (and about an axis parallel to the lateral axis of the touchscreen) in a direction corresponding to the direction of the swipe input on the touchscreen. Furthermore, in Block S130, the computing device can update the graphical object with an alternate color, geometry, and/or virtual viscosity, etc. in response to a change in the emotion value selection. In one implementation, the computing device renders the virtual sphere in a warmer color and in an increasingly amorphous form in response to selection of emotion values corresponding to increasingly content states of the user (e.g., in response to selections of greater emotion values) in Block S130, and vice versa. In one example, the computing device renders the graphical object: in blue for a current emotion value of "1"; in blue-green for a current emotion value of "2"; in green for a current emotion value of "3"; in yellow-green for a current emotion value of "4"; and in a similar range of colors up to orange for an emotion value of "10." The computing device can also set a virtual viscosity of the sphere that is inversely proportional to the emotion value currently selected.
For example, in Block S110, the computing device can render a wax-like virtual sphere that appears relatively rigid for a selected emotion value of "1" but appears more malleable as the selected emotion value increases, such as exhibiting less than 2% deformation of its perimeter from a circle in the plane of the touchscreen for a selected emotion value of "1" but exhibiting up to 20% deformation of its perimeter from the circle for a selected emotion value of "10," and the computing device can update the graphical object rendered on the touchscreen in Block S130 accordingly based on the emotion value currently selected. Therefore, in this example, the virtual sphere can appear as hard wax for a selected emotion value of "1" but appear to warm and soften into a more malleable wax ball as the selected emotion value approaches "10."
  • However, the computing device can also render a graphical object of any other virtual shape, geometry, color, viscosity, etc., such as a virtual cube, a virtual three-dimensional amoeba, or a graphical avatar. Furthermore, Blocks of the method S100 can be implemented by any other device or system and can collect emotion values from the user in any other way and in any other format.
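The Block S130 mapping above pairs each emotion value with a color and a virtual viscosity; a numerical sketch follows, assuming linear interpolation between the 2% perimeter deformation cited for a value of "1" and the 20% cited for "10" (the interpolation scheme and function name are assumptions, and intermediate hues are elided in the text above).

```python
# Sketch of the Block S130 visual mapping: a color per emotion value and
# a maximum perimeter deformation standing in for "virtual viscosity".
# Linear interpolation between the cited 2% and 20% endpoints is an
# assumption.

COLORS = {1: "blue", 2: "blue-green", 3: "green",
          4: "yellow-green", 10: "orange"}  # intermediate hues elided

def max_deformation(emotion_value):
    """Fraction of perimeter deformation: 2% at value 1, 20% at value 10."""
    return 0.02 + (emotion_value - 1) * (0.20 - 0.02) / 9
```

Because deformation grows with the emotion value, the implied viscosity falls as the value rises, consistent with the inversely proportional relationship described above.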
  • 5. Emotion Value Submission
  • Block S140 recites recording submission of a final emotion value through the graphical user interface. Generally, in Block S140, the computing device can record a final emotion value selected by the user, store the final emotion value with a time that the emotion value was entered (e.g., in the form of a timestamp) locally on the computing device, and/or upload the emotion value and related metadata (e.g., timestamp, user ID, computing device location) to a remote server or remote database for storage and subsequent handling in Block S150.
  • In one implementation, the computing device records the current emotion value shown within the graphical user interface (e.g., rendered on the graphical object) in response to a double-tap input over the graphical object, in response to a single-tap input over a “submit” region or other virtual button outside the perimeter of the graphical object rendered on the touchscreen, or in response to any other secondary input into the computing device.
  • The computing device can also combine metadata with the selected emotion value to generate an emotion package for this emotion value submission, and the computing device can store this emotion package locally, upload the emotion package to a remote server or remote database for remote storage, and/or transmit the emotion package to a computing device associated with a contact or mental health professional affiliated with the user. For example, in Block S140, the computing device can combine the emotion value entered by the user with: the user's username, email address, device ID, or anonymized UUID paired with the user's name or ID in a remote DNS; the last GPS location of the computing device; a time and date of submission of the emotion value (e.g., a “timestamp”); a speed of selection and submission of the emotion value (i.e., the duration of time between a first swipe input over the graphical object and submission of the emotion value); and/or action permissions and/or action triggers (described below) stored locally on the computing device. In this example, the computer system can thus query an integrated geospatial position sensor for the geospatial location of the computing device once an emotion value is submitted by the user and then store this location of the computing device at an approximate time the final emotion value was entered by the user with the selected emotion value in the emotion package.
  • In Block S140, the computing device can also combine an emotion value selected by the user with a subset of metadata types specific to the selected emotion value to generate the emotion package for the emotion value submission. For example, the computing device can: combine a selected emotion value and a username only into an emotion package for selected emotion values above and including “4”; combine a selected emotion value, a username, a contact trigger (described below), and contact information for a personal contact entered previously by the user for a selected emotion value of “3”; combine a selected emotion value, a username, and a therapist trigger (described below) for a selected emotion value of “2”; and combine a selected emotion value, a username, an emergency trigger (described below), and a last known GPS location of the computing device for a selected emotion value of “1.”
  • The computing device can combine such metadata with an emotion value selected by the user within an alternate lock screen, within a native mental health application, within a native messaging (e.g., email, SMS text message) application executing an alternate keyboard, or within any other interface executing on the computing device, such as described below.
  • 6. Support Group
  • As shown in FIG. 5, one variation of the method S100 includes Block S160, which recites, prior to rendering the graphical object within the graphical user interface, prompting the user to populate a support group with a set of contacts. Generally, in Block S160, the computing device can prompt and/or enable the user to identify one or more friends, family members, mentors, therapists, or other persons who may offer support, guidance, consolation, and/or compassion to the user. The computer system can record names, email addresses, phone numbers, usernames, account IDs, or other contact information of persons thus selected by the user, populate a support group affiliated with the user with these selected persons, and associate one or more of these members of the user's support group with select triggers implemented in Block S150.
  • In one implementation in which Blocks of the method S100 are executed locally on the computing device by a native mental health application, the native mental health application accesses a list of contacts stored on the computing device, such as in a native contacts application, and prompts the user to select a subset of (e.g., up to five) contacts from the contacts list to form a support group for the user. The native mental health application can then retrieve names, phone numbers, email addresses, and/or other contact information for these selected contacts from the contacts list. Similarly, the native mental health application can prompt the user to select one or more contacts from a list of the user's friends or acquaintances enumerated within an online social network, such as by accessing this list of friends or acquaintances through a native social networking application executing on the computing device; the native mental health application can then implement similar methods and techniques to populate the user's support group with relevant contact information. In another implementation in which the native mental health application functions as a user portal into an emotion tracking and support platform hosting many users, the native mental health application can prompt the user to select one or more other users also on the emotion tracking and support platform. The native mental health application can thus link the user's account to accounts of other users on the emotion tracking and support platform to define a support group for the user. In the foregoing implementation, the native mental health application (or a remote computer system or remote server hosting the native mental health application) can communicate a prompt to each contact selected by the user—such as through a text message, email, communication with the online social network, or in-application notification—to confirm that the contact agrees to support the user.
  • The native mental health application (and/or the emotion tracking and support platform) can therefore enable the user to select one or more contacts to offer support to the user and to confirm that these contacts agree to join the user's support group. The native mental health application can also prompt the user to join a support network for one or more contacts in her support group or to join a support group for one other person on the emotion tracking and support platform for each contact the user adds to her support group in order to maintain a target or minimum ratio of supporting and supported users on the emotion tracking and support platform.
  • As shown in FIG. 5, the native mental health application can also gate (i.e., restrict) the user's access to an emotion value submitted by a contact for whom the user has joined a support group until the user enters her own emotion value. In particular, the native mental health application (or the emotion tracking and support platform) can reveal a second emotion value—previously submitted by a contact of the user through a second instance of the native mental health application executing on a second computing device—to the user at the user's computing device in response to receipt of an emotion value from the user. For example, the native mental health application can: generate a support group of five friends, family members, and/or mentors of the user; incorporate the user into the support group of each of these contacts; and only present recent emotion values of these other contacts to the user once the user has entered her own emotion value. In this example, the native mental health application can require that the user enter at least one emotion value per twenty-four-hour period in order to see all emotion values entered by these related contacts within the same period of time. Therefore, in order to access information needed to support another contact, the user must supply her own emotion value to members of her support group, thereby enabling these other members of the user's support group to access information needed to support the user.
  • The native mental health application can thus execute on the user's computing device: to populate and maintain a support group for the user; to enable the user to modify account settings and to adjust her support group over time; to control privacy settings; etc. The computing device can also execute Blocks S110, S120, S130, etc. through the native mental health application or through separate applications, such as within an alternate keyboard accessed in a native email, text messaging, or other application or through a lock screen on the computing device, as described below.
  • The computing device can then distribute prompts to monitor, contact, or show support for the user in Block S150, such as by distributing an emotion value to one or more members of the user's support group, regardless of recipients selected by the user, when the emotion value equals “2” or “3” as described below.
  • 7. Categories
  • As shown in FIGS. 4 and 10, one variation of the method S100 includes Block S170, which recites: receiving a first category label for the final emotion value and storing the final emotion value with the first category label; and recording submission of a second emotion value through the graphical user interface, receiving a second category label different from the first category label for the second emotion value, and storing the second emotion value with the second category label. Generally, in this variation, the computing device can associate an emotion value entered by the user with a specific emotion category and thus automatically determine the context of each emotion value and/or enable members of the user's support group to manually determine the context of the user's emotion value.
  • In one implementation, the computing device defaults to labeling the graphical object—representing a current emotion value—with an “overall emotional state” category label and writes an “overall emotional state” label to an emotion value entered by the user when this default label is current. In this implementation, the computing device can also enable the user to select an alternative category label, such as from a drop-down menu, by swiping through a sequence of preset category labels, or by typing a custom category label into a text field. For example, the computing device can enable the user to select one of perceived financial stability, perceived relationship comfort, perceived sleep quality, perceived anxiety level, perceived anger level, perceived physical health, perceived level of confidence, perceived energy level, and/or perceived stress level, etc. of the user before submitting an emotion value, as shown in FIG. 3A. The computing device can then write this label to metadata stored with the emotion value. Upon receipt of a first labeled (i.e., categorized) emotion value, the computing device can enable the user to enter a second emotion value and to select the same or another label for the second emotion value. For example, within one use episode, the computing device can record: a first emotion value with a first category label corresponding to an overall emotional state of the user; and a second emotion value with a second category label corresponding to perceived financial stability.
  • In the foregoing variation in which the computing device populates and maintains a support group for the user, the computing device can link a unique support group to each category selected by the user. The computing device can also implement different trigger values to trigger transmission of a prompt to a member of a support group for each category label. For example, when the user enters an emotion value of “2” or “3” with an “overall emotional state” category label, the computing device can transmit the user's ID, the emotion value, and/or a prompt to contact the user to each member of the user's primary support group in Block S150; in this example, when the user enters an emotion value of “1” or “2” with a “financial security” category label, the computing device can transmit the user's ID (e.g., name), the emotion value, the category label, and/or a prompt—to contact the user—to the user's financial advisor noted in a second finance support group. Therefore, the computing device can: prompt a member of a first support group affiliated with the user and associated with a first category label to contact the user in response to an emotion value equaling the trigger value for the first category label; and prompt a member of a second support group affiliated with the user and associated with a second category label to contact the user in response to an emotion value equaling the trigger value for the second category label.
  • Alternatively, the computing device can selectively distribute a prompt to contact the user to members of the user's support group based on the emotion value, the corresponding category label, and an expertise noted for each member of the user's support group. For example, when the user enters an emotion value of “2” with an “overall emotional state” category label, the computing device can transmit the user's ID, the emotion value, and/or a prompt to support the user to each member of the user's primary support group in Block S150; in this example, when the user enters an emotion value of “2” with a “financial security” category label, the computing device can transmit the user's ID, the emotion value, the category label, and/or a prompt—to contact the user—to the user's parents noted in the user's support group; and, when the user enters an emotion value of “2” with a “friendships” category label, the computing device can transmit the user's ID, the emotion value, the category label, and/or a prompt to contact the user to the persons—noted in the user's support group—of approximately the same age as the user but not related to the user. The computing device can thus prioritize transmission of prompts—to contact the user—to select members of the user's support group in response to receipt of an emotion value based on each member's known (e.g., user- or member-entered) expertise and the category label assigned to the emotion value.
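The category- and expertise-based routing described above can be sketched as follows (the member records, expertise tags, and trigger-value table are illustrative assumptions, not the disclosed implementation):

```python
# Hypothetical routing sketch: a prompt is sent to the whole primary support
# group for an "overall emotional state" value at its trigger threshold, and
# otherwise only to members whose noted expertise matches the category label.

def select_recipients(emotion_value: int, category: str,
                      support_group: list[dict],
                      trigger_values: dict[str, set[int]]) -> list[str]:
    if emotion_value not in trigger_values.get(category, set()):
        return []  # value above the trigger threshold: notify no one
    if category == "overall emotional state":
        return [m["name"] for m in support_group]  # whole primary group
    return [m["name"] for m in support_group
            if category in m.get("expertise", set())]
```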
  • 8. Action Triggers
  • Block S150 of the method S100 recites, in response to the final emotion value equaling a trigger value, distributing a prompt—to monitor or contact the user—to an external entity. Generally, in Block S150, the computing device (or a remote computer system in cooperation with the computing device) selectively communicates a prompt to an external entity to contact the user—such as immediately or within some time window via textual communication (e.g., SMS text message), in a phone or video conferencing call, or in person—upon receipt of a user-entered emotion value indicating that the user may benefit from immediate support, as shown in FIGS. 1 and 6.
  • In one implementation, the computing device (or the native mental health application, the emotion tracking and support platform, etc.) distributes prompts to support the user to external entities only when an emotion value equals a trigger value indicating that the user is experiencing relatively low contentment, relatively high anxiety, and/or relatively high concern such that persons possibly in positions to support the user are notified of the user's emotional state only when the user is in greatest need. For example, the computing device can: communicate a first prompt to monitor the user to a computing device associated with a therapist, psychologist, or other mental health professional affiliated with the user upon receipt of an emotion value equaling a first trigger value of “1” from the user; push a notification to interact with the user to a computing device associated with a known contact of the user (e.g., all or a subset of members of the user's support group) in response to the final emotion value equaling a second trigger value of “2” or “3”; and withhold communication of emotion values to persons other than those explicitly selected by the user to receive the user's emotion value (e.g., a recipient of a private text message selected by the user, as described below) for emotion values greater than “3,” exclusive.
  • In Block S150, the computing device (or the native mental health application, the emotion tracking and support platform) can therefore: distribute a dynamic visual object—representing the graphical object in a color value and a virtual viscosity corresponding to an emotion value selected by the user—to a first recipient elected by the user, such as when the user elects to send an emotion value to a recipient through a native text messaging application or through a native social networking application executing on the computing device as described below; prompt a second recipient (e.g., a member of the user's support group) to contact the user when the emotion value entered by the user equals a first trigger value (e.g., “2” or “3”); and distribute the emotion value to a mental health entity (e.g., a therapist, a psychologist, an emergency responder) when the final emotion value equals a second trigger value (e.g., “1”) representing a less content state of the user than the first trigger value.
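The tiered example thresholds above can be sketched as a minimal dispatch rule (the action names are illustrative assumptions; only the thresholds come from the example):

```python
# Sketch of the tiered action triggers under the example thresholds in the
# text: "1" escalates to a mental health entity, "2" or "3" notifies
# support-group members, and higher values trigger no outreach beyond any
# recipient the user explicitly selected. Action names are illustrative.

def action_for(emotion_value: int) -> str:
    if emotion_value == 1:
        return "notify_mental_health_entity"
    if emotion_value in (2, 3):
        return "notify_support_group"
    return "store_only"
```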
  • In another implementation, the native mental health application, lock screen, or alternate keyboard, etc., executing Blocks of the method S100 on the computing device can communicate a static prompt to the user, such as “How are you?”; the user can thus submit an emotion value corresponding to her general feeling or mood. Alternatively, the computing device can communicate one or more distinct prompts to the user at any one time. For example, the computing device can cycle through a prepopulated set of prompts directed toward the user's current feeling or mood about a particular life area, such as a first prompt for general feeling, a second prompt for emotional health, a third prompt for relationship condition, and/or a fourth prompt for financial condition. In this example, the computing device can represent responses to each prompt with a discrete graphical object (e.g., a virtual sphere) specific to the prompt, wherein the graphical object is numbered and colored according to the emotion value selected by the user. The computing device can render these discrete graphical objects together or independently across a set of slides. Alternatively, the computing device can render one graphical object representing a response to the first prompt related to the user's general feeling or mood, and the computing device can render a second graphical object representing responses to emotional health-, relationship-, and finance-related prompts, as shown in FIG. 3B. In Block S140, the computing device can therefore combine the selected emotion value with a pointer to a corresponding prompt or prompt type issued to the user prior to submission of the emotion value, as shown in FIG. 1.
  • In a similar example, for a prompt relating to the user's general mood or feeling, the computing device (or remote computer system) can dispatch the police and an ambulance to the user's last recorded GPS location in response to submission of an emotion value of “1.” In particular, the computing device (or the native mental health application, the emotion tracking and support platform) can dispatch an emergency responder to a last geospatial location of the user's device—extracted from emotion value metadata, as described above—in response to the most recent emotion value entered by the user equaling a trigger value corresponding to a low state of contentment, such as an emotion value of “1” indicating a lowest level of contentment, a highest anxiety, or a greatest level of stress. In this example, in response to a submitted emotion value of “2,” the computing device can also transmit a notification to the user's personal or company therapist to call the user or automatically access a calendar on the user's computing device, compare the user's calendar to the therapist's calendar, automatically schedule a call or in-person meeting as soon as possible based on availability of the user and the therapist (or move less important meetings off of the therapist's calendar in order to see the user), and then notify the user and/or the therapist of the scheduled meeting time. Furthermore, in this example, the computing device can push a textual notification to one contact in a preset shortlist of contacts (i.e., members of the user's support group) previously entered by the user in response to a submitted emotion value of “3.” In this example, the computing device can cycle through a shortlist of contacts previously entered by the user, such as entered by the user upon installation and first login into a standalone native mental health application, an alternate mental health keyboard, an alternate lock screen application, etc. 
to populate a personal contacts shortlist, including a phone number and email address for each of a parent, a sibling, a spouse, a best friend, and a mentor. However, in this example, the computing device can execute no action in response to a submitted emotion value of “4” or greater other than to store the submitted emotion value in a local or remote database of emotion values entered by the user and/or to send the emotion value to a contact explicitly selected by the user to receive the emotion value.
  • In another example, the computing device can: serve a prompt to the user to indicate the user's feeling regarding her finances; transmit a trigger to the user's bank and credit institutions to reject all transactions over $20 if the user submits an emotion value of “1” in response to a finance-related prompt; and access the user's calendar and automatically schedule a call or a meeting with a financial planner if the user submits an emotion value of “2” or less in response to this prompt.
  • The computing device (or the native mental health application, the emotion tracking and support platform) can implement the foregoing methods and techniques in Block S150 based on a category selected by the user when entering an emotion value (i.e., rather than based on textual or audio prompt served by the computing device to the user over time), as described above.
  • In Block S150, the computing device can implement standard action trigger and emotion value pairs across a population of users. Alternatively, the computing device can match standard action triggers (e.g., notify a therapist, notify support group members) to select emotion values based on trends or frequencies in emotion values entered by the particular user. For example, the computing device can “normalize” emotion values assigned to action triggers for a particular user based on historical emotion values entered by the user, such as by shifting an action trigger relative to the spectrum of possible quantitative emotion values based on a recent trend in emotion values submitted by the user or based on an average emotion value submitted by the user over an extended period of time (e.g., six months). The computing device can also maintain custom action triggers and corresponding emotion values specific to the user (or to a subset of users within a population), such as triggers to notify a therapist for users who have currently engaged therapists and triggers to notify a best friend for users who have not currently engaged therapists.
  • Therefore, in the foregoing implementation, the computing device can assign trigger values to emotion values entered by the user based on historical user emotion value data. For example, the computing device can quantify an emotional stability of the user based on a current emotion value and emotion values previously entered by the user based on a magnitude of total deviation from an average emotion value entered by the user over a period of time. In this example, if the user regularly enters emotion values between “1” and “10” with similar frequency, the computing device can label the user as unstable and set a trigger value to connect the user with others at “5” in order to “catch” or slow the user's transition to emotion values of “1.” If the user regularly enters emotion values between “3” and “7,” the computing device can label the user as moderately stable and set a trigger value to connect the user with others at “3” in order to prompt support of the user when the user is most in need. Furthermore, if the user regularly enters emotion values between “6” and “9,” the computing device can label the user as highly stable and set a trigger value to connect the user with others at “4” in order to prompt support of the user when the user's emotional state is declining outside of a regular bound. However, the computing device can characterize the user's emotional stability according to any other schema or schedule and can implement this characterization in any other way to set trigger values for the user.
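The stability characterization in this worked example can be sketched as follows (the spread cutoffs are inferred from the three example ranges and are assumptions; the disclosure notes that any other schema may be used):

```python
# Sketch of the stability characterization: the spread of a user's
# historical values classifies her as unstable, moderately stable, or
# highly stable, and a trigger value is assigned accordingly. The numeric
# cutoffs are inferred from the worked example and are assumptions.

def trigger_for_history(history: list[int]) -> tuple[str, int]:
    spread = max(history) - min(history)
    if spread >= 7:
        return "unstable", 5           # catch rapid slides toward "1" early
    if spread >= 4:
        return "moderately stable", 3  # prompt support at greatest need
    return "highly stable", 4          # flag decline outside the usual band
```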
  • The computing device can also implement a mental health history of the user to assign triggers to emotion values entered by the user. For example, for a user with a history of self-harm, the computing device can: prompt members of the user's support group to offer support to the user in response to entry of an emotion value of “3” or “4”; prompt a mental health professional to monitor the user in response to an emotion value of “1” or “2”; and dispatch an emergency responder to the user's last known location if the user does not check in with another person or with the user's therapist within one hour of entering an emotion value of “1.” However, for a user with a history of harm to others, the computing device can: prompt a mental health professional to monitor the user in response to an emotion value of “1,” “2,” or “3”; dispatch an emergency responder to the user's last known location if the user does not check in with another person or with the user's therapist within one hour of entering an emotion value of “2”; and immediately dispatch an emergency responder to the user's last known location in response to entry of an emotion value of “1.” Furthermore, for a user with no known history of self-harm or harm to others, the computing device can: prompt a mental health professional to contact the user in response to an emotion value of “1”; and prompt members of the user's support group to contact or otherwise support the user in response to entry of an emotion value of “2” or “3.”
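The history-dependent trigger assignments in this example can be sketched as a lookup table (profile keys and action names are illustrative assumptions):

```python
# Illustrative lookup of the history-dependent trigger assignments
# described above; the profile keys and action names are assumptions.

TRIGGERS_BY_HISTORY = {
    "self_harm": {
        "support_group": {3, 4},
        "monitor_professional": {1, 2},
        "emergency_if_no_checkin": {1},   # within one hour of entry
    },
    "harm_to_others": {
        "monitor_professional": {1, 2, 3},
        "emergency_if_no_checkin": {2},
        "emergency_immediate": {1},
    },
    "none": {
        "contact_professional": {1},
        "support_group": {2, 3},
    },
}

def actions_for(profile: str, value: int) -> set[str]:
    return {action for action, values in TRIGGERS_BY_HISTORY[profile].items()
            if value in values}
```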
  • The computing device (or the native mental health application, the emotion tracking and support platform) can automatically assign such trigger values to emotion values for a user, such as based on digital medical records of the user. Alternatively, the user, a mental health professional, an employer, members of the user's support group, or any other entity can manually assign these triggers to emotion values for the user (or for a population of users).
  • The computing device (or the native mental health application, the emotion tracking and support platform) can also implement dynamic trigger values. In particular, rather than notifying a member of the user's support group when the user enters an emotion value of “2,” the computing device can prompt the support group member to contact the user when the user's entered emotion values are trending downward and take a rapid decline. For example, if the user enters a sequence of emotion values from “9” to “8,” then “7,” and then “6,” over a period of several hours or two days, the computing device can prompt members of the user's support group to contact the user when the user subsequently enters an emotion value of “5.” Similarly, if the user regularly enters emotion values of “8,” “9,” and “10” but now enters a “4,” the computing device can notify members of the user's support group to contact the user to support the user following this rapid decline in the user's emotional state.
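A dynamic trigger of this kind can be sketched as follows (the three-entry window and the drop threshold are illustrative assumptions):

```python
from statistics import mean

# Sketch of a dynamic trigger: alert the support group either when values
# trend steadily downward over recent entries or when a new value drops far
# below the user's recent baseline. Window size and threshold are assumed.

def should_alert(history: list[int], new_value: int,
                 drop_threshold: int = 4) -> bool:
    recent = history[-3:]
    steady_decline = (len(recent) == 3
                      and recent[0] > recent[1] > recent[2] > new_value)
    sharp_drop = (len(history) >= 3
                  and mean(history[-3:]) - new_value >= drop_threshold)
    return steady_decline or sharp_drop
```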
  • 9. Prompts
  • In Block S150, the computing device can serve a variety of prompts to external entities. In one implementation in which the user enters an emotion value equaling a trigger value for contacting an emergency responder, the computing device (or the native mental health application, the emotion tracking and support platform) can transmit an electronic textual notification—including the user's name, location, and concern (e.g., for self-harm or harm to others) from emotion value metadata—to the emergency responder. Alternatively, the computing device can: place an automated call to the emergency responder; or transmit an electronic notification to a mental health professional with metadata for the low emotion value and a prompt to alert an emergency responder. However, the computing device can notify an emergency responder in any other way in Block S150.
  • In another implementation in which the user enters an emotion value equaling a trigger value for contacting a mental health professional (or other professional service provider), the computing device (or the native mental health application, the emotion tracking and support platform) can transmit an electronic textual notification—such as in the form of a text message, email, or in-application notification—to the mental health professional in Block S150. For example, if the user has an existing relationship with the mental health professional—such as for therapy, psychological testing, or pain management, etc. or if the mental health professional is affiliated with an employer or other organization employing the user—the computing device can generate a textual communication including the user's name, the user's recent emotion value, the user's location, the user's contact information (e.g., phone number), and/or a prompt to call, message, or otherwise monitor the user. In particular, in Block S150, the computing device can prompt a mental health professional to contact the user directly or to monitor the user, such as by tracking the location of the user's computing device through a web-based doctor portal or by contacting persons physically near the user or family or friends of the user. However, if the user is not currently affiliated with a mental health professional, the computing device can issue a notification to the user to confirm that the user would like to connect to a mental health professional, and the computing device can automatically select a mental health professional from a directory and connect the user to the mental health professional, such as via text, email, or phone call, in Block S150 following confirmation from the user. However, the computer system can prompt a mental health professional to contact or monitor the user in any other way in Block S150 in response to entry of an emotion value equaling a corresponding trigger value.
  • In another implementation in which the user enters an emotion value equaling a trigger value for contacting a peer or a member of a support group, the computing device (or the native mental health application, the emotion tracking and support platform) can push a notification to show concern for the user to such a recipient. For example and as described above, the computing device can push an electronic notification—such as a text message, email, or in-application notification to call or message the user—to a member of the user's support group, regardless of the recipient of the emotion value elected by the user, in response to entry of an emotion value of “2” or “3.” In this example, the computing device can generate a notification including the user's phone number, the user's email address, a link to a messaging center in which the recipient can message the user, etc., and the recipient can use this contact information to contact the user.
  • The computing device can additionally or alternatively generate such a notification that includes prompts to the recipient to show care or concern for the user in other ways. For example, the computing device can generate a notification that recites, “Jess is feeling down. Send her a song that has special meaning to your relationship with her.” In this example, the computing device can incorporate a link (e.g., a website URL, an in-application link) to a digital music service (e.g., a commercial music streaming service) into the notification; upon selection of the link, the recipient can access the digital music service, navigate to a particular song, and select this song to share with the user; the emotion tracking and support platform or other messaging service can load a link to this song into a new communication from the recipient to the user and transmit this new communication to the user; upon receipt of this new communication at the computing device, the user can select this link, and the computing device can stream the song from the digital music service. The computing device can thus prompt and enable a contact of the user (e.g., a member of the user's support group) to send the user a song that may have special meaning to the user rather than calling or text messaging the user. However, in this example, the computing device (or the native mental health application, the emotion tracking and support platform) can interface with a music streaming service in any other way to replay a song selected by the user's contact for the user.
  • In another example, the computing device can generate a notification that recites, “Jess is feeling down. Send her an old photo of the two of you doing something you really enjoyed,” and the computing device can send this notification to a recipient upon entry of a low emotion value by the user. In this example, the emotion tracking and support platform can then communicate a digital photograph—selected by the recipient at the recipient's computing device (e.g., smartphone)—back to the user's computing device, and the user's computing device can render the digital photograph within a notification or within the native mental health application executing on the user's computing device. The computing device can similarly prompt the recipient to send a video, digital sketch, or other media to the user in response to entry of a low emotion value by the user. However, the computing device (or the native mental health application, the emotion tracking and support platform) can prompt a recipient to send any other media to the user and can present such media to the user in any other way.
  • In another example, the computing device can generate a notification that includes a prompt to physically visit the user and can send this notification to a particular recipient, such as a recipient physically near the user. In this example, if the user enters a low emotion value into her computing device, the computing device can: query an internal GPS sensor for the user's current location; cross reference the user's location to locations of members of the user's support group to prioritize members of the support group by proximity to the user (e.g., by building, street, campus, city, region, state); and select a member of the support group nearest the user to receive this notification.
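The proximity prioritization described in the foregoing example can be sketched as follows. The member records, coordinate fields, and the haversine distance model are illustrative assumptions, not details recited in this description:

```python
import math

def haversine_km(a, b):
    """Approximate great-circle distance between two (lat, lon) pairs in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def rank_by_proximity(user_location, members):
    """Return support-group members sorted nearest-first relative to the user."""
    return sorted(members, key=lambda m: haversine_km(user_location, m["location"]))

# Hypothetical support group; the nearest member would receive the visit prompt.
user = (37.7749, -122.4194)  # user's current GPS fix (San Francisco)
support_group = [
    {"name": "Alex", "location": (40.7128, -74.0060)},   # New York
    {"name": "Sam",  "location": (37.8044, -122.2712)},  # Oakland
]
ranked = rank_by_proximity(user, support_group)
```

In practice, coarser buckets (building, street, campus, city, region, state) could replace raw distances, per the example above.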
  • In yet another example, the computing device can generate a notification that includes a prompt to send the user a physical gift. In this example, the computing device can include the user's current location or address, a link to an online florist, and a prompt to send flowers to the user in a notification sent to the recipient; upon receipt of the notification, the recipient can select the link to access the online florist, select a flower arrangement, and submit payment and delivery details in order to initiate delivery of flowers to the user. Similarly, the computing device can prompt the recipient to send a pair of socks, a meal, a coffee, or other tangible good to the user in response to entry of a low emotion value by the user, such as by incorporating links to online retailers through which the recipient may select and purchase a good for delivery to the user.
  • In another example, if the user enters a low emotion value equaling a trigger value for involvement of a therapist or other mental health professional, the computing device (or the native mental health application, the emotion tracking and support platform) can also generate a notification to subsidize a session with the mental health professional and then communicate this notification to one or more members of the user's support group. In particular, the computing device (or the native mental health application, the emotion tracking and support platform) can prompt members of the user's support group to supply all or partial payment for the user to complete a session with a therapist or other mental health professional, such as if the user enters a low emotion value (e.g., a “1” or a “2”) with an “overall emotional state” category label and enters a similarly low emotion value (e.g., a “1,” “2,” or “3”) with a “financial security” category label. The emotion tracking and support platform can thus collect funds from these recipients and release these funds to the user or to a selected therapist following completion of a session. Alternatively, recipients of such a prompt can transfer funds to the user or to the therapist through other external currency transfer services.
  • However, the computing device (or the native mental health application, the emotion tracking and support platform) can generate and transmit a notification including a prompt for any other action to a member of the user's support group or other recipient in response to entry by the user of an emotion value equaling a corresponding trigger value. The computing device can also index through these notification types over time. The computing device can also selectively issue these various notification types to recipients based on: one or more category labels entered by the user; proximity of recipients to the user; perceived financial security of recipients (e.g., by sending a prompt to send funds or to send a physical gift only to recipients with current emotion values with “financial security” category labels exceeding “5”); a familial or friendship status with the user; an age gap with the user (e.g., by prompting recipients of the same age to send a song but prompting recipients substantially older than the user to call the user); etc.
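The selective notification-type logic above might be sketched as a simple rule cascade. All thresholds, field names, and type labels here are illustrative assumptions:

```python
def select_notification_type(recipient, user_age, distance_km):
    """Pick a prompt type for a recipient per the heuristics described above."""
    # Prompt a gift only when the recipient appears financially secure
    # (e.g., a current "financial security" emotion value exceeding "5").
    if recipient.get("financial_security", 0) > 5:
        return "send_gift"
    # Prompt a nearby recipient to visit the user in person.
    if distance_km < 10:
        return "visit"
    # Prompt same-age recipients to send a song; older recipients to call.
    if abs(recipient["age"] - user_age) <= 5:
        return "send_song"
    return "call"
```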
  • Therefore, by varying the notification type sent to a recipient in Block S150 in response to a low emotion value entered by the user, the computing device may disguise the consequence of entry of an emotion value from the user. In particular, if the user is not certain that a phone call or text message from a family member or friend will follow entry of a low emotion value, but rather knows that a wider variety of support modes are possible, the user may be more likely to enter honest emotion values over time. Similarly, receipt of flowers, a song, or other digital or physical media following entry of a low emotion value may abstract such a gift or media from the low emotion value, thereby curbing the user's expectation that a gift or other media will follow entry of a low emotion value, such that the user may be more likely to enter honest emotion values over time.
  • 8. Lock Screen
  • In one variation, the computing device executes Blocks of the method S100 within an alternate lock screen, as shown in FIG. 8.
  • In this variation, in Block S110, the computing device can render the graphical object on the touchscreen in response to an initial input that transitions the computing device out of a standby mode. The user can then swipe up or down on the touchscreen to pull an emotion value—in the range of emotion values—into the graphical object, and the computing device can update the graphical object rendered in the lock screen accordingly in Block S120. The user can continue to swipe over the lock screen to cycle through the range of emotion values until an appropriate emotion value is reached, and the computing device can update the graphical object—including its color, viscosity, and emotion value—in Blocks S120 and S130 according to each additional swipe entered by the user. Once the appropriate emotion value is reached, the user can enter a passcode through an alphanumeric keypad adjacent the graphical object or in an alternate screen of the lock screen to unlock the computing device and enter the emotion value. Alternatively, the user can double tap the graphical object, select an alternate region on the touchscreen, trace a passcode pattern, or enter any other input into a touchscreen of the computing device to unlock the computing device and to submit the selected emotion value. The computing device can then record this emotion value in Block S130, unlock the computing device, and then handle the selected emotion value as described herein.
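The swipe-driven cycling through the range of emotion values might be sketched as below, assuming a range of “1” through “10” that wraps in a loop (as described in this variation) and a swipe-to-direction mapping of +1 for an upward swipe and -1 for a downward swipe:

```python
EMOTION_RANGE = list(range(1, 11))  # emotion values "1" (lowest) through "10"

def apply_swipe(current_value, direction):
    """Cycle the graphical object's emotion value in response to one swipe.
    direction: +1 for an upward swipe, -1 for a downward swipe (assumed)."""
    i = EMOTION_RANGE.index(current_value)
    # Wrap around so continued swiping loops through the full range.
    return EMOTION_RANGE[(i + direction) % len(EMOTION_RANGE)]
```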
  • Therefore, in this variation, the computing device (e.g., a smartphone) can: render a graphical object (e.g., a virtual sphere) within a lock screen, such as adjacent a numeric keypad in Block S110; record an emotion value selected by the user in response to entry of a correct passcode into the lock screen (and store this emotion value in a local or remote database of emotion values entered by the user) in Block S140; and distribute a prompt—to contact the user—to an external entity in response to the emotion value—entered through the lock screen upon receipt of the correct passcode—equaling a trigger value.
  • In this variation (and in other variations), the computing device can alter the initial emotion value applied to the graphical object with each subsequent unlock cycle (or with each subsequent entry of an emotion value) in order to require the user's active attention when swiping to the appropriate number. In particular, by both pseudorandomly changing the initial emotion value inserted into the graphical object upon each new unlock cycle and automatically executing actions in response to entry of select emotion values (e.g., 1, 2, and 3), the computing device can encourage the user to deliberately and attentively swipe to an appropriate number rather than simply thoughtlessly swiping to an emotion value and selecting a submit key. Similarly, the computing device can change how swipe directions (e.g., up or down) correspond to index directions (e.g., increasing and decreasing) for cycling through the range of emotion values and can modify swipe distances that trigger an index event for cycling through the range of emotion values for each new unlock cycle to achieve similar effects.
  • Furthermore, in the foregoing and other implementations, the computing device can: pseudorandomly assign an initial emotion value—in a spectrum of emotion values corresponding to unique contentment states of the user—to the graphical object; and index through this spectrum of emotion values in a loop—sequentially or in a pseudorandom order—according to a sequence of inputs into the graphical user interface. In particular, by varying a start value and/or an order through which emotion values in the range or spectrum of emotion values are cycled through responsive to user inputs into the computing device, the computing device can prompt the user to pay greater attention to which emotion value is selected before submitting a final emotion value, which may elicit greater honesty in emotion value submissions by the user. Furthermore, the computing device (or the native mental health application, the emotion tracking and support platform, etc.) can characterize an integrity of an emotion value entered by the user based on a number of swipe (or other) inputs entered by the user into the computing device before reaching a final emotion value. For example, if the number of swipe inputs entered by the user historically falls between no swipe input and two swipe inputs for a pseudorandom order of emotion values but the user's selected emotion values exhibit large swings between “1” and “10,” the computing device can characterize the honesty of the user's inputs (or engagement with the native mental health application) as low. However, if the user regularly swipes between one and six times when selecting an emotion value and enters emotion values that exhibit relatively smooth changes over time, the computing device can characterize the honesty of the user's inputs (or engagement with the native mental health application) as high.
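The pseudorandom seeding and integrity characterization above can be sketched as follows. The specific thresholds follow the example (at most two swipes on average paired with swings near the full width of the range), and the function names are assumptions:

```python
import random

def initial_emotion_value(rng=random):
    """Pseudorandomly seed the graphical object for each new unlock cycle."""
    return rng.randint(1, 10)

def characterize_integrity(swipe_counts, submitted_values):
    """Label engagement 'low' when very few swipes precede large value swings.
    Thresholds are illustrative, per the example in the description."""
    mean_swipes = sum(swipe_counts) / len(swipe_counts)
    max_swing = max(abs(a - b) for a, b in zip(submitted_values, submitted_values[1:]))
    if mean_swipes <= 2 and max_swing >= 7:
        return "low"
    return "high"
```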
  • 9. Keyboard
  • In another variation, the computing device executes Blocks of the method S100 within an alternate keyboard, as shown in FIG. 7. In this variation, in Block S110, the computing device can render the graphical object within a keyboard area as the user drafts or reviews textual content within any one or more native applications executing on the computing device. In particular, rather than discrete keys representing discrete alphanumeric characters, the alternate keyboard includes an instance of the graphical object rendered within a defined keyboard area, and the alternate keyboard updates the emotion value, color, and/or geometry, etc. of the graphical object within the keyboard area in Blocks S120 and S130 in response to swipe gestures entered by the user over the keyboard area, such as described above.
  • In one example application, while in a native text messaging application executing on a computing device, the user can switch to the alternate keyboard and swipe over the graphical object to select an emotion value, as in Blocks S110 and S120. In response to a double tap input over the graphical object, a selection within the keyboard area and outside of the graphical object, or any other suitable input, the alternate keyboard can load a static image of the graphical object—including the final color, shape, and emotion value of the graphical object—into a message preview within the text messaging application in Block S140, as shown in FIG. 1. The text messaging application can then transmit the image of the graphical object to a selected recipient in response to selection of a virtual “send” button by the user. In this example application, the alternate keyboard can alternatively load a dynamic image, such as a GIF or video file showing the virtual object rotating about its center, deforming according to its assigned viscosity corresponding to the selected emotion value, and/or pulsing within a narrow color band corresponding to selected emotion value, such as described above. Furthermore, in addition to sending the static or dynamic image of the graphical object to the recipient, the alternate keyboard can also generate, store, and/or distribute the selected emotion value and select metadata. For example, the alternate keyboard can store the selected emotion value, the time and date the emotion value was sent to the recipient, and other metadata, as described above, in local memory on the computing device. The alternate keyboard can also encrypt and push these data to a remote database for storage with other user-specific data, or the alternate keyboard can anonymize and push these data to a remote database for storage with other anonymized population data.
  • In this foregoing example application, the recipient can push to the user—through a similar native text messaging application—a prompt to submit an emotion value corresponding to the user's general mood or feeling, the user's emotional health, the user's relationship condition, or the user's financial condition, and the user can switch to the alternate keyboard, swipe the graphical object to select an appropriate emotion value, and transmit this emotion value back to the recipient.
  • The recipient's computing device can thus execute a second instance of the alternate keyboard, and the recipient can open this second instance of the alternate keyboard within a text messaging application executing on the recipient's computing device to select from a prepopulated list of prompts to send to the user. For example, the instance of the alternate keyboard executing on the recipient's computing device can include options to send any one or more of the following prompts to the user: “How are you feeling?,” “How are things with your spouse?,” “How is your emotional health?,” and/or “How are things going financially?” Thus, the recipient can select a particular prompt from this prepopulated list of prompts and send this prompt to the user, and the instance of the alternate keyboard executing on the user's computing device can write a flag for a prompt type corresponding to the particular metadata stored with the emotion value submitted in response to the particular prompt. Alternatively, the instance of the alternate keyboard executing on the user's computing device can implement natural language processing techniques to determine a prompt type from one or more messages sent by the recipient to the user—or vice versa—leading up to the time that the user submitted the emotion value to the recipient; the instance of the alternate keyboard executing on the user's computing device can thus add a flag for the corresponding prompt type to metadata associated with the selected emotion value.
  • In the foregoing example application, the alternate keyboard can also automatically transmit (or trigger transmission of) a prompt to an emergency responder, such as described above, in response to transmission of an emotion value of “1” to the recipient. The alternate keyboard can similarly connect the user with a therapist or to a contact on a preset contact shortlist, such as described above, in response to transmission of an emotion value of “2” and an emotion value of “3,” respectively, to the recipient.
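The action triggers just described reduce to a lookup from transmitted emotion value to action. A minimal sketch, with action names assumed for illustration:

```python
# Map trigger emotion values to the actions described above.
TRIGGER_ACTIONS = {
    1: "notify_emergency_responder",
    2: "connect_therapist",
    3: "connect_shortlist_contact",
}

def handle_submission(emotion_value):
    """Return the action triggered by a transmitted emotion value, or None
    for non-trigger values (no automatic action)."""
    return TRIGGER_ACTIONS.get(emotion_value)
```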
  • The alternate keyboard can implement similar methods and techniques within a native email application executing on the user's computing device. For example, while composing an email within the native email application, the user can select the alternate keyboard, swipe the graphical object to select a relevant emotion value, and then insert a static or dynamic image representing the selected emotion value and the graphical object into an open email draft. The alternate keyboard can store the selected emotion value and related metadata and distribute relevant action triggers, as described above. The alternate keyboard can also scan the body of the user's email, such as including text entered by the user and text in earlier emails in the same email thread—and implement natural language processing techniques to determine the context of (i.e., a prompt type associated with) the selected emotion value, as described above.
  • The alternate keyboard can thus interface with a native text messaging application and/or a native email application executing on the user's computing device to enable the user to send an emotion value to a recipient while enabling additional emotion value tracking and action trigger handling, as described above. The alternate keyboard can also interface with a native application specific to a health clinic or to an employer to enter emotion values into fields corresponding to physical health-, mental health-, and/or productivity-related prompts. Alternatively, the native messaging application can characterize emotion values typed into the message preview by the user and can implement methods and techniques described herein accordingly.
  • In this example application, the computing device can access an alternate keyboard containing the graphical object and enabling the user to cycle through emotion values assigned to the graphical object by entering swipe (or other) inputs into the alternate keyboard. Once a final emotion value is selected by the user, the computing device can write the graphical object—including color, virtual viscosity, and/or other parameters corresponding to the final emotion value—to a message or other text field within a textual communication application (e.g., a native email application, a native SMS text messaging application, or a native social networking application, etc.) executing on the computing device and then transmit this message, including the graphical object, to a recipient selected by the user. Upon receipt of the graphical object, such as in a private text message or through a public post in an online social network, the recipient of the graphical object can thus quickly and visually discern the user's emotional state based on the color, emotion number, and other parameters represented by the graphical object. In particular, the graphical object can be colored and move according to the user's selected emotion value, as described above. The computing device can share this graphical object, via the textual communication application executing on the computing device, with a recipient explicitly selected by the user (e.g., one person in a private text message, the user's social feed in an online social network) when the emotion value exceeds general or user-specific trigger values. However, the computing device can share the graphical object and/or emotion value with other entities (e.g., a member of the user's support group, the user's therapist) if the entered emotion value equals a trigger value assigned to the user, as described above, such as automatically or following additional confirmation from the user to share the emotion value.
  • Therefore, the computing device can: access an alternate digital keyboard—in place of an alphanumeric keyboard—within a native textual communication application executing on a mobile computing device and render the graphical object within a message preview area of the alternate digital keyboard in Block S110; index the emotion value represented on the graphical object in response to and in a direction corresponding to a swipe input over the message preview area in Block S120; transmit the graphical object to a recipient selected in the native textual communication application through a private messaging channel in response to receipt of a command at the native textual communication application to send the graphical object to the recipient, such as in the form of a digital file, in Block S150; and withhold the final emotion value from an external entity in response to the final emotion value differing from the trigger value in Block S150. The computing device can also: record a final emotion value selected by the user within the native text messaging application to a database of emotion values entered by the user in Block S140; transmit a form of the final emotion value and an identifier of the user to a mental health representative in response to the final emotion value equaling the trigger value (e.g., “1”) in Block S150; and/or communicate a form of the final emotion value to a contact specified in the user's support group and distinct from the original user-elected recipient in response to the final emotion value exceeding the trigger value and a threshold intervention value exceeding the final emotion value (e.g., equal to a second trigger value of “2” or “3”). The computing device can therefore execute Blocks of the method S100 to distribute the user's emotion value from one recipient selected by the user to many recipients suited to support the user in a time of need.
  • Similarly, the computing device can: access an alternate digital keyboard within a native online social networking application executing on a mobile computing device and render the graphical object within a message preview area of the alternate digital keyboard in Block S110; update a visual form of the graphical object within the preview area to correspond to the emotion value in response to the input into the graphical user interface in Block S120; post the graphical object to a social feed within an online social network in response to receipt of a command at the native online social networking application to post the graphical object to the online social network; and communicate a prompt to a contact affiliated with the user in response to the final emotion value equaling a trigger value, wherein the prompt directs the contact to the social feed or to contact the user directly. The computing device can therefore execute Blocks of the method S100 to distribute the user's emotion value from many recipients to one particular contact well-suited to support the user in a time of need.
  • 10. Application: Mental Health Clinic
  • In another variation, Blocks of the method S100 are executed within a native mental health application executing on the user's computing device. In this variation, a mental health clinic, rehabilitation center, hospital, or similar institution can track mental health statuses of in-patient and out-patient populations over time through instances of the native mental health application executing on personal or institution-issued computing devices. Each instance of the native mental health application executing on a computing device affiliated with a user (e.g., a patient) can implement an internal graphical user interface, the alternate keyboard, and/or the alternate lock screen described above.
  • In this variation, the native mental health application can issue regular prompts—such as general, mental health-, relationship-, or finance-related prompts, as described above—to the user to check in with an emotion value submission, such as every morning, every afternoon, and every evening. Alternatively, the native mental health application can issue such prompts at dynamic intervals. For example, the native mental health application can issue a higher frequency of check-in prompts to the user for lower emotion values last submitted by the user. In this example, the native mental health application can issue prompts to the user at a frequency of every ten minutes for a last emotion value of “1,” every 30 minutes for a last emotion value of “2,” every hour for a last emotion value of “3,” every two hours for a last emotion value of “4,” etc. The native mental health application can also issue prompts to the user at frequencies based on whether the user is inpatient or outpatient (e.g., 20% higher check-in prompt frequencies for outpatients than for inpatients) and/or based on trends in emotion values submitted by the user (e.g., a lower frequency of check-in prompts if the user is trending toward higher submitted emotion values).
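The dynamic check-in cadence above reduces to a small lookup. A sketch following the example intervals, with the outpatient scaling (roughly 20% more frequent prompts, so a shorter interval) treated as an assumed approximation:

```python
def checkin_interval_minutes(last_value, outpatient=False):
    """Map the last submitted emotion value to a check-in prompt interval,
    per the example cadence above; 240 min for higher values is assumed."""
    base = {1: 10, 2: 30, 3: 60, 4: 120}.get(last_value, 240)
    # Prompt outpatients roughly 20% more often than inpatients (assumed scaling).
    return base * 0.8 if outpatient else base
```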
  • In this variation, a therapist or mental health specialist can access a dashboard (e.g., within a native mental health management application, within a web browser) to manually issue a check-in prompt to all or a subset of the affiliated patient population. The native mental health application can also automatically connect the user with this therapist, this specialist, an advisor, or another contact based on an emotion value submitted by the user, as described above.
  • In this variation, each patient in the patient population can be associated with a particular health condition or set of health conditions, and the native mental health application (or the native mental health management application, the remote computer system, a remote database, etc.) can assign a general prompt, a set of lower-level prompts, and/or a set of action triggers to a particular patient based on the particular patient's current health condition. In one example, for a first user who is an alcoholic, an instance of the native mental health application executing on the first user's smartphone can: connect the first user with a member of the first user's support group in response to submission of an emotion value of “3”; connect the first user with her sponsor in response to submission of an emotion value of “2”; and dispatch the first user's sponsor, a psychotherapist, and/or a police officer to the last known GPS location of the first user's smartphone in response to submission of an emotion value of “1.” In this example, for a second user who is clinically depressed and exhibits suicidal tendencies, an instance of the native mental health application executing on the second user's smartphone can: connect the second user with a close friend in response to submission of an emotion value of “3”; connect the second user with her psychiatrist in response to submission of an emotion value of “2”; and dispatch the second user's psychiatrist and an ambulance to the last known GPS location of the second user's smartphone in response to submission of an emotion value of “1” by the second user.
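The per-condition trigger assignment above can be sketched as a nested lookup mirroring the two examples; the condition labels and action names are illustrative assumptions:

```python
# Action triggers keyed on health condition, per the two examples above.
CONDITION_TRIGGERS = {
    "alcoholism": {
        3: "connect_support_group",
        2: "connect_sponsor",
        1: "dispatch_sponsor_therapist_police",
    },
    "depression": {
        3: "connect_close_friend",
        2: "connect_psychiatrist",
        1: "dispatch_psychiatrist_ambulance",
    },
}

def action_for(condition, emotion_value):
    """Return the action assigned to this condition and emotion value, if any."""
    return CONDITION_TRIGGERS.get(condition, {}).get(emotion_value)
```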
  • 11. Application: Employer
  • In another variation, Blocks of the method S100 are executed within a native employer application executing on a computing device (e.g., a smartphone, a tablet) affiliated with the user. In this variation, an employer can track mental health statuses of its employees during (and outside of) work hours through instances of the native employer application executing on personal or employer-issued computing devices. Each instance of the native employer application executing on a computing device affiliated with a user (e.g., an employee) can implement an internal graphical user interface, the alternate keyboard, and/or the alternate lock screen described above and as shown in FIG. 9.
  • In this variation, emotion values entered by an employee in response to a general feeling, mood, anxiety, and/or other mental or physical health-related prompt may remain concealed to the employer. In particular, an instance of the native employer application or a remote computer system cooperating with the instance of the native employer application can anonymize emotion values submitted by employees before communicating these data through an employer portal to track mental health, anxiety, etc. across an employee population, or the native application and/or remote computer system can withhold employee-entered emotion values from representatives of the employer (e.g., managers) and instead automatically distribute these data to select third-party services contracted by the employer. For example, the native employer application and/or the affiliated remote computer system can automatically distribute notifications to emergency personnel in response to emotion values of 1 submitted by employees and automatically distribute employee data and prompts to third-party contracted therapists or psychologists in response to emotion values of 2 submitted by employees, as described above.
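Anonymizing employee submissions before they reach the employer portal could take many forms; a salted-hash token is one possible approach, sketched below. The record fields and scheme are assumptions, not details from this description:

```python
import hashlib

def anonymize(records, salt="rotate-me"):
    """Replace employee identities with opaque tokens before exposing records
    in an employer portal. The salted SHA-256 token is an assumed scheme."""
    out = []
    for r in records:
        token = hashlib.sha256((salt + r["employee_id"]).encode()).hexdigest()[:12]
        out.append({
            "token": token,  # stable per employee per salt, but not reversible
            "emotion_value": r["emotion_value"],
            "timestamp": r["timestamp"],
        })
    return out
```

Rotating the salt periodically would prevent long-term re-identification across reporting periods.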
  • In one implementation, the native employer application executing Blocks of the method S100 can also serve productivity-related prompts to users through an internal graphical user interface, through the alternate lock screen, and/or through an alternate keyboard within a native messaging application. In this implementation, the native employer application can communicate emotion values entered by employees in response to productivity-related prompts to an employer representative, such as to a human resources representative or to the user's manager to track the user's feelings about productivity during work hours. For example, an instance of the native employer application or a remote computer system hosting the native employer application can automatically identify trends in an employee's feelings about her productivity and submit related suggestions to a representative of the employer (e.g., a manager) accordingly. In one such example, as an employee enters emotion values in response to productivity-related prompts over time, the remote computer system can aggregate and process these emotion values to determine that the user tends to feel most productive between the hours of 8 AM and 1 PM (e.g., a time window corresponding to high frequencies of emotion values above 6), tends to feel much less productive between the hours of 1 PM and 3 PM (e.g., a time window corresponding to high frequencies of emotion values below 5), and tends to regain productivity between 3 PM and 5 PM on weekdays. To preserve employee work satisfaction, the remote computer system can issue a suggestion to the employee's manager to implement a break period from 1 PM to 2 PM for the employee during a one-week trial period.
In this example, the remote computer system can continue to receive and track emotion values entered by the employee in response to productivity-related prompts issued during work hours; if the employee shows increased productivity in the hour from 2 PM to 3 PM, the remote computer system can transmit a suggestion to the employee's manager to extend the trial period or to make the break period mandatory for the user.
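The trend-detection step in the foregoing example can be sketched as bucketing submissions by hour and labeling each hour against the example thresholds (values above 6 as productive, below 5 as a slump); the function and label names are assumptions:

```python
def productivity_windows(samples, high=6, low=5):
    """Bucket (hour, emotion_value) samples by hour of day and label each hour,
    per the 8 AM-1 PM / 1 PM-3 PM example above."""
    by_hour = {}
    for hour, value in samples:
        by_hour.setdefault(hour, []).append(value)
    labels = {}
    for hour, values in by_hour.items():
        mean = sum(values) / len(values)
        labels[hour] = (
            "productive" if mean > high
            else "slump" if mean < low
            else "neutral"
        )
    return labels
```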
  • In one example application, an airline issues a native employer application to its grounds crews, pilots, and/or stewards to collect regular check-ins from its employees during operating hours. For example, an instance of the native employer application executing on a smartphone issued to a pilot can prompt the pilot to check in before a flight assigned to the pilot, such as: 1 day before, 12 hours before, 8 hours before, 4 hours before, 2 hours before, 1 hour before, 30 minutes before, and 5 minutes before doors are closed to initiate the flight. In this example, the airline can automatically ground the pilot if he enters an emotion value of “1” or “2” within three days of the departure of an assigned flight, and the airline can automatically ground the pilot if he enters more than three emotion values of “3” within 24 hours of the scheduled departure of a flight assigned to the pilot. Furthermore, in this example application, the native employer application can prompt the pilot to check in during the flight, such as every hour throughout the flight. The native employer application can also increase a check-in frequency during a flight, such as once at the first hour, again at the fourth hour, at the eighth hour, at the tenth hour, at the twelfth hour, at the thirteenth hour, at the fourteenth hour, etc. during the flight. The native employer application can interface with a remote computer system (e.g., a remote server) and computing devices issued to a second pilot and stewards on the plane with the pilot to issue prompts related to the emotion values submitted by the pilot. For example, if the instance of the native employer application executing on the pilot's smartphone transmits an emotion value of “2” or less to an affiliated remote server during the flight, the remote server can broadcast notifications to smartphones affiliated with the second pilot and onboard stewards to remove the pilot from the cockpit.
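The airline grounding rules in this example can be sketched directly: ground on any “1” or “2” within three days (72 hours) of departure, or on more than three “3”s within 24 hours of departure. The check-in representation is an assumption:

```python
def should_ground(checkins):
    """Apply the example airline rules to a pilot's pre-flight check-ins.
    checkins: list of (hours_before_departure, emotion_value) pairs."""
    recent_threes = 0
    for hours_before, value in checkins:
        # Any "1" or "2" within three days of departure grounds the pilot.
        if value <= 2 and hours_before <= 72:
            return True
        # Count "3"s submitted within 24 hours of scheduled departure.
        if value == 3 and hours_before <= 24:
            recent_threes += 1
    return recent_threes > 3
```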
  • In the foregoing example application, an instance of the native employer application can execute on a tablet issued to a grounds crewman to collect emotion values from the grounds crewman through an alternate lock screen, as described above. For example, the native employer application can require the grounds crewman to unlock the tablet by entering an emotion value in response to a prompt of one or more types, as described above, in order to access a digital pre-flight checklist for the aircraft or a refueling checklist, as shown in FIG. 9. Alternatively, the employer can require the grounds crewman to open the native employer application and enter an emotion value at select times during work hours or before performing certain tasks, such as before opening bay doors on an aircraft to load or unload luggage. In another example, the native employer application can execute on a digital control panel mounted on or near a cockpit door of an aircraft and can execute Blocks of the method S100 to prompt a pilot (or steward) to enter an emotion value into the digital control panel before unlocking the cockpit door.
  • Therefore, in this example application, the computing device can execute Blocks of the method S100 to gate access to content on the computing device (e.g., an application executing on the computing device) until an emotion value is selected and submitted by the user. For example, the computing device—issued to the user—can: restrict access to an employer-issued checklist until the final emotion value is selected by the user and submitted through the graphical user interface; and then communicate the emotion value entered by the user and an identifier of the user to a mental health professional or other entity affiliated with the employer in response to the entered emotion value equaling a trigger value.
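The access-gating behavior described here can be expressed as a small guard function. The checklist name, trigger set, and notification payload below are illustrative assumptions, not a definitive implementation:

```python
TRIGGER_VALUES = {1, 2}  # assumed trigger values, per the examples above

def request_checklist(user_id, emotion_value, notify):
    """Withhold an employer-issued checklist until an emotion value is
    submitted; forward trigger-level values, with an identifier of the
    user, to a mental health professional or other employer-affiliated
    entity via the `notify` callback."""
    if emotion_value is None:
        # No check-in yet: access to the checklist remains gated
        return {"granted": False, "reason": "emotion value required"}
    if emotion_value in TRIGGER_VALUES:
        notify({"user": user_id, "emotion_value": emotion_value})
    return {"granted": True, "checklist": "pre-flight"}
```

In use, the computing device would call this guard before rendering the checklist, so access and reporting happen in a single step.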
  • In another example application, a mining operation contracts a mental health application that prompts miners employed by the mining operation to submit emotion values in response to mental health-related prompts when punching in, when punching out, at the beginning of each shift, and/or at the beginning of each break period, such as through a digital punch clock arranged at an entrance of a mining facility or through computing devices issued to each miner or miner group within the mining operation. In this example implementation, the mental health application and/or an associated remote computer system can automatically dispatch a therapist—employed by the mining operation or contracted by the operation—to a particular miner or automatically instruct the particular miner to visit an onsite therapist if the particular miner enters an emotion value of “1” or “2,” such as described above.
  • In this variation, Blocks of the method S100 can be executed by equipment rather than by discrete computing devices (e.g., a smartphone, a tablet, a smart watch) assigned to particular employees. For example, when an employee swipes a badge, enters a fingerprint, or logs into a machine with a username and password within an employer's facility, an interlock on the machine can execute Blocks of the method S100 to present the user with a prompt to enter an emotion value corresponding to the user's current mood, feeling, confidence, or productivity, etc., such as by swiping a rendered graphical object or by typing an emotion value on a touch screen within the interlock, as described above. The interlock can then grant access to the user upon receipt of an emotion value thus entered. In this example, the interlock can associate the submitted emotion value with an employee ID, name, or other identifier of the user to generate an emotion package, as described above, and then push this emotion package to a remote computer system; the computer system can then implement methods and techniques described above to selectively connect the employee with a preselected contact, a therapist, or an emergency responder based on general or custom action triggers, as described herein.
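One interlock cycle described above—pair the submitted value with an identifier of the user, push the package to a remote computer system, then unlock the machine—can be sketched as follows. The field names and callback are illustrative assumptions:

```python
import json
import time

def build_emotion_package(employee_id, emotion_value, machine_id):
    """Assemble the 'emotion package' described above: the submitted
    emotion value tagged with an identifier of the user (and here, of
    the machine). Field names are illustrative."""
    return {
        "employee_id": employee_id,
        "machine_id": machine_id,
        "emotion_value": emotion_value,
        "timestamp": int(time.time()),
    }

def interlock_cycle(employee_id, emotion_value, machine_id, push):
    """Push the package to the remote computer system (`push` stands in
    for that uplink), then grant access to the machine."""
    package = build_emotion_package(employee_id, emotion_value, machine_id)
    push(json.dumps(package))
    return True  # unlock the machine for the employee
```

The remote computer system receiving these packages can then apply the action triggers described above to each new emotion value.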
  • 11. School
  • In another variation, Blocks of the method S100 are executed within a native student application executing on a computing device assigned to or accessed by a student. For example, an elementary school, middle school, or high school can issue mobile computing devices executing Blocks of the method S100 to students in order to track mental health statuses of its students during (and outside of) school hours. In this example, each instance of the native student application can implement methods and techniques described above to collect emotion values entered by a corresponding student in response to one or more prompts, such as in response to check-in prompts issued during certain times of the day (e.g., at the beginning of a school day and following a lunch break) or at the start of each class. Each instance of the native student application can thus respond to action triggers, as described above, by connecting a student to a counselor, advisor, or teacher based on select emotion values entered by the student. Each instance of the native student application can also collect student-entered emotion values into a journal for select distribution to parents and/or teachers affiliated with the school.
  • 12. Guns
  • In another example, the native mental health application is executed by an electronic gun case, an electronic gun lock installed on a gun, or a smartphone or other mobile device wirelessly paired to the electronic gun case or electronic gun lock. In this example, the native mental health application can prompt a user to enter an emotion value before unlocking the electronic gun case or electronic gun lock and can implement methods and techniques described above to notify a member of a support group, a therapist, or an emergency responder (e.g., police, a security agency, etc.) if the user enters corresponding emotion values, such as “3,” “2,” or “1,” respectively, before retrieving or unlocking a gun.
  • In a similar example, Blocks of the method S100 can be executed by a native gun ownership application executing on a smartphone. In this example, the native gun ownership application can implement methods and techniques described above to regularly (e.g., daily) prompt a gun owner to enter an emotion value. Upon receipt of a substantially low emotion value, such as “1” or “2,” the native gun ownership application can communicate the location of the smartphone—and therefore the location of the gun owner—and a warning to an emergency responder (e.g., a police department, a security agency) and/or to other people near the gun owner, such as the gun owner's family members, friends, neighbors, coworkers, etc.
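The tiered escalation in the two gun-related examples above reduces to a mapping from emotion value to the party notified. This is a minimal sketch; the recipient labels are illustrative assumptions:

```python
def escalation_recipient(emotion_value):
    """Map a submitted emotion value to the party notified in the
    gun-lock examples above: a support-group member for "3", a
    therapist for "2", and an emergency responder for "1"."""
    routing = {
        3: "support_group_member",
        2: "therapist",
        1: "emergency_responder",
    }
    return routing.get(emotion_value)  # None: no notification required
```

In the gun ownership variation, a result of `"emergency_responder"` would additionally trigger transmission of the smartphone's location, as described above.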
  • 13. Network Access
  • In another example, the native mental health application—executing on a computing device—executes Blocks of the method S100 to prompt a user to enter an emotion value when the computing device attempts to connect to a wireless network, such as a cellular network or a Wi-Fi hub within a hospital, airport terminal, or military base. The computing device can return an emotion value entered by the user to the wireless network in order to gain access to the wireless network, and a remote computer system can execute other Blocks of the method S100 to respond to the user's emotion value, as described above.
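The exchange in this network-access example—return an emotion value to the network, gain access, and let a remote computer system respond to the value—can be sketched as below. The callbacks and payload are illustrative assumptions:

```python
def handle_join_request(device_id, emotion_value, grant, respond):
    """Sketch of the network-access variation above: the device must
    return an emotion value to gain access to the wireless network;
    `respond` stands in for the remote computer system that executes
    other Blocks of the method in response to the value."""
    if emotion_value is None:
        return False  # no check-in, no network access
    respond({"device": device_id, "emotion_value": emotion_value})
    grant(device_id)
    return True
```

A network controller in a hospital, airport terminal, or military base could run this guard per association request before admitting the device.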
  • However, Blocks of the method S100 can be executed in any other employment, teaching, or operations environment to collect and track feedback pertaining to mental health, physical health, satisfaction, and/or productivity, etc. from one or more employees or affiliates.
  • The systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof. Other systems and methods of the embodiments can likewise be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions, the instructions executed by computer-executable components integrated with apparatuses and networks of the type described above. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component can be a processor, but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
  • As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the embodiments of the invention without departing from the scope of this invention as defined in the following claims.

Claims (21)

I claim:
1. A method for tracking and responding to mental health changes in a user, the method comprising:
rendering a graphical object within a graphical user interface;
indexing an emotion value assigned to the graphical object through a spectrum of emotion values according to a direction of an input into the graphical user interface;
within the graphical user interface, updating the graphical object to visually correspond to the emotion value assigned to the graphical object;
recording submission of a final emotion value through the graphical user interface; and
in response to the final emotion value equaling a trigger value, distributing a prompt to an external entity to monitor the user.
2. The method of claim 1:
wherein indexing the emotion value through the spectrum of emotion values comprises:
indexing the emotion value forward through a set of integers from “1” through “10,” inclusive, in response to a swipe input in a first direction into the graphical user interface; and
indexing the emotion value backward through the set of integers in response to a swipe input in a second direction into the graphical user interface, the second direction opposite the first direction; and
wherein updating the graphical object within the graphical user interface comprises updating the graphical object to show a particular integer, from the set of integers, corresponding to a current emotion value selected by the user through a sequence of swipe inputs into the graphical user interface.
3. The method of claim 2:
wherein rendering the graphical object within a graphical user interface comprises rendering a virtual sphere within the graphical user interface; and
wherein updating the graphical object within the graphical user interface comprises rendering the virtual sphere in a warmer color and in an increasingly amorphous form in response to selection of emotion values corresponding to increasingly content states of the user.
4. The method of claim 2, wherein distributing the prompt to the external entity to monitor the user comprises:
communicating a first prompt to monitor the user to a computing device associated with a therapist affiliated with the user in response to the final emotion value equaling a first trigger value of “1”; and
pushing a notification to interact with the user to a computing device associated with a known contact of the user in response to the final emotion value equaling a second trigger value of “2.”
5. The method of claim 1,
wherein indexing the emotion value comprises:
pseudorandomly assigning an initial emotion value, in a spectrum of emotion values corresponding to unique contentment states of the user, to the graphical object; and
indexing through the spectrum of emotion values in a loop according to a sequence of inputs into the graphical user interface; and
further comprising characterizing an integrity of the final emotion value based on a number of inputs over the graphical object preceding selection of the final emotion value.
6. The method of claim 1:
wherein rendering the graphical object within the graphical user interface comprises accessing an alternate digital keyboard within a native textual communication application executing on a mobile computing device and rendering the graphical object within a message preview area of the alternate digital keyboard;
wherein indexing the emotion value assigned to the graphical object comprises indexing the emotion value represented on the graphical object in response to and in a direction corresponding to a swipe input over the message preview area;
further comprising transmitting the graphical object to a recipient selected in the native textual communication application in response to receipt of a command at the native textual communication application to send the graphical object to the recipient; and
withholding the final emotion value from the external entity in response to the final emotion value differing from the trigger value.
7. The method of claim 6,
further comprising, prior to rendering the graphical object within the graphical user interface, prompting the user to populate a support group with a set of contacts;
wherein accessing the alternate digital keyboard comprises accessing the alternate digital keyboard in replacement of an alphanumeric keyboard within a native text messaging application executing on the mobile computing device;
wherein distributing the prompt to the external entity comprises recording a final emotion value selected by the user within the native text messaging application to a database of emotion values entered by the user and transmitting a form of the final emotion value and an identifier of the user to the external entity comprising a mental health representative in response to the final emotion value equaling the trigger value; and
further comprising communicating a form of the final emotion value to a contact specified in the support group and distinct from the recipient in response to the final emotion value exceeding the trigger value and a threshold intervention value exceeding the final emotion value.
8. The method of claim 6,
wherein rendering the graphical object within the graphical user interface comprises accessing an alternate digital keyboard within a native online social networking application executing on a mobile computing device and rendering the graphical object within a message preview area of the alternate digital keyboard;
wherein updating the graphical object to visually correspond to the emotion value assigned to the graphical object comprises updating a visual form of the graphical object within the preview area to correspond to the emotion value in response to the input into the graphical user interface;
further comprising posting the graphical object to a social feed within an online social network in response to receipt of a command at the native online social networking application to post the graphical object to the online social network; and
wherein distributing the prompt to the external entity comprises communicating the prompt to a contact previously affiliated with the user in response to the final emotion value equaling the trigger value, the prompt directing the contact to the social feed.
9. The method of claim 1:
wherein rendering the graphical object within the graphical user interface comprises rendering the graphical object within a lock screen on a mobile computing device;
further comprising, in response to entry of a correct passcode into the lock screen on the mobile computing device, recording a final emotion value selected by the user to a database of emotion values entered by the user; and
wherein distributing the prompt to the external entity comprises distributing the prompt to contact the user to the external entity in response to a final emotion value, entered through the lock screen upon receipt of the correct passcode, equaling the trigger value.
10. The method of claim 1, further comprising gating access to an application through a computing device executing the graphical user interface until a final emotion value is selected by the user and submitted through the graphical user interface.
11. The method of claim 10:
wherein gating access to the application comprises restricting access to an employer-issued checklist until the final emotion value is selected by the user and submitted through the graphical user interface; and
wherein distributing the prompt to monitor the user to the external entity comprises communicating the final emotion value and an identifier of the user to a mental health professional affiliated with the employer in response to the final emotion value equaling the trigger value.
12. The method of claim 1,
further comprising receiving a first category label for the final emotion value and storing the final emotion value with the first category label;
wherein distributing the prompt to monitor the user to the external entity comprises prompting a member of a first support group affiliated with the user and associated with the first category label to contact the user in response to the final emotion value equaling the trigger value for the first category label;
further comprising recording submission of a second emotion value through the graphical user interface, receiving a second category label different from the first category label for the second emotion value, and storing the second emotion value with the second category label; and
prompting a member of a second support group affiliated with the user and associated with the second category label to contact the user.
13. The method of claim 12:
wherein receiving the first category label for the final emotion value comprises receiving the first category label corresponding to an overall emotional state of the user; and
wherein receiving the second category label for the second emotion value comprises receiving the second category label corresponding to one of: perceived financial comfort, perceived relationship comfort, perceived sleep quality, perceived anxiety level, perceived anger level, and perceived physical health of the user.
14. The method of claim 1:
further comprising storing a location of a computing device executing the graphical user interface at an approximate time the final emotion value was entered by the user; and
wherein distributing the prompt to monitor the user to the external entity comprises dispatching an emergency responder to the location in response to the final emotion value equaling the trigger value corresponding to a low state of contentment.
15. The method of claim 1, further comprising:
quantifying an emotional stability of the user based on the final emotion value and emotion values previously entered by the user; and
customizing the trigger value for the user based on the emotional stability of the user.
16. A method for tracking and responding to mental health changes in a user, the method comprising:
rendering a graphical object within a graphical user interface;
indexing an emotion value assigned to the graphical object according to a direction of a swipe input over the graphical object within the graphical user interface;
updating a color value and a virtual viscosity of the graphical object to correspond to the emotion value assigned to the graphical object;
distributing a dynamic visual object, representing the graphical object in a color value and a virtual viscosity corresponding to a final emotion value selected by the user, to a first recipient elected by the user;
in response to the final emotion value equaling a first trigger value, prompting a second recipient to contact the user; and
in response to the final emotion value equaling a second trigger value representing a less content state of the user than the first trigger value, distributing the final emotion value to a mental health entity.
17. The method of claim 16, wherein prompting a second recipient to contact the user comprises prompting the second recipient to send a song to a computing device affiliated with the user through a digital music service.
18. The method of claim 16, wherein prompting a second recipient to contact the user comprises sending, to the second recipient, a notification comprising a link to an online florist and a prompt to send flowers to the user.
19. The method of claim 16:
wherein indexing an emotion value assigned to the graphical object comprises indexing the emotion value forward through a set of integers from “1” through “10,” inclusive, in response to a swipe input in a first direction into the graphical user interface and indexing the emotion value backward through the set of integers in response to a swipe input in a second direction into the graphical user interface, the second direction opposite the first direction;
wherein distributing the dynamic visual object to the first recipient elected by the user comprises transmitting a digital file containing the dynamic visual object representing the graphical object to the recipient through a private messaging channel;
wherein prompting the second recipient to contact the user comprises transmitting a prompt to the second recipient comprising a member of a support group associated with the user and other than the first recipient in response to the final emotion value equaling the first trigger value contained within the set of integers; and
wherein distributing the final emotion value to the mental health entity comprises notifying a mental health professional of the emotion value entered by the user in response to the final emotion value equaling the second trigger value contained within the set of integers.
20. The method of claim 16:
wherein prompting the second recipient to contact the user comprises pushing a notification to the second recipient, specified in a support group affiliated with the user, to show concern for the user in response to the final emotion value equaling the first trigger value; and
further comprising revealing a second emotion value, previously submitted by the second recipient through a second graphical user interface executing on a second computing device, to the user at the graphical user interface in response to receipt of the final emotion value from the user.
21. A method for tracking and responding to mental health changes in a user, the method comprising:
at a computing device linked to the user, receiving an emotion value, on a spectrum of emotion values, selected by the user;
recording the emotion value in a database of emotion values entered by the user;
enabling access to a process on the computing device in response to receipt of the emotion value; and
in response to the emotion value equaling a trigger value corresponding to a low state of contentment, distributing a second prompt to a mental health representative to monitor the user.
US15/233,732 2015-08-10 2016-08-10 Methods for tracking and responding to mental health changes in a user Abandoned US20170046496A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/233,732 US20170046496A1 (en) 2015-08-10 2016-08-10 Methods for tracking and responding to mental health changes in a user
US16/424,299 US11430567B2 (en) 2015-08-10 2019-05-28 Methods for tracking and responding to mental health changes in a user
US17/864,166 US20230178229A1 (en) 2015-08-10 2022-07-13 Methods for tracking and responding to mental health changes in a user

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562203083P 2015-08-10 2015-08-10
US15/233,732 US20170046496A1 (en) 2015-08-10 2016-08-10 Methods for tracking and responding to mental health changes in a user

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/424,299 Continuation US11430567B2 (en) 2015-08-10 2019-05-28 Methods for tracking and responding to mental health changes in a user

Publications (1)

Publication Number Publication Date
US20170046496A1 true US20170046496A1 (en) 2017-02-16

Family

ID=57995396

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/233,732 Abandoned US20170046496A1 (en) 2015-08-10 2016-08-10 Methods for tracking and responding to mental health changes in a user
US16/424,299 Active 2037-02-25 US11430567B2 (en) 2015-08-10 2019-05-28 Methods for tracking and responding to mental health changes in a user
US17/864,166 Pending US20230178229A1 (en) 2015-08-10 2022-07-13 Methods for tracking and responding to mental health changes in a user

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/424,299 Active 2037-02-25 US11430567B2 (en) 2015-08-10 2019-05-28 Methods for tracking and responding to mental health changes in a user
US17/864,166 Pending US20230178229A1 (en) 2015-08-10 2022-07-13 Methods for tracking and responding to mental health changes in a user

Country Status (1)

Country Link
US (3) US20170046496A1 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170160924A1 (en) * 2015-12-08 2017-06-08 Lenovo (Beijing) Limited Information processing method and electronic device
US20170207646A1 (en) * 2016-01-15 2017-07-20 International Business Machines Corporation Alternate alarm notifications based on battery condition
US20170345424A1 (en) * 2016-05-31 2017-11-30 Toyota Jidosha Kabushiki Kaisha Voice dialog device and voice dialog method
US20180089171A1 (en) * 2016-09-26 2018-03-29 International Business Machines Corporation Automated message sentiment analysis and aggregation
US20180248819A1 (en) * 2015-10-20 2018-08-30 Sony Corporation Information processing system and information processing method
US10069842B1 (en) 2017-03-14 2018-09-04 International Business Machines Corporation Secure resource access based on psychometrics
US20180276345A1 (en) * 2017-03-24 2018-09-27 International Business Machines Corporation System and method to monitor mental health implications of unhealthy behavior and optimize mental and physical health via a mobile device
US20190013092A1 (en) * 2017-07-05 2019-01-10 Koninklijke Philips N.V. System and method for facilitating determination of a course of action for an individual
US20190102696A1 (en) * 2017-10-02 2019-04-04 International Business Machines Corporation Empathy fostering based on behavioral pattern mismatch
US20190188769A1 (en) * 2017-12-14 2019-06-20 Wells Fargo Bank, N.A. Customized predictive financial advisory for a customer
US10410017B2 (en) 2016-09-30 2019-09-10 The Toronto-Dominion Bank Device lock bypass on selectable alert
US20190289125A1 (en) * 2016-10-13 2019-09-19 The Trustees Of Princeton University System and method for tracking a mobile device user
JP2019164737A (en) * 2018-03-20 2019-09-26 ヤフー株式会社 Determination device, determination method and determination program
US10481749B1 (en) * 2014-12-01 2019-11-19 Google Llc Identifying and rendering content relevant to a user's current mental state and context
US20200064986A1 (en) * 2018-08-22 2020-02-27 Caressa Corporation Voice-enabled mood improvement system for seniors
US20200285669A1 (en) * 2019-03-06 2020-09-10 International Business Machines Corporation Emotional Experience Metadata on Recorded Images
US20200333925A1 (en) * 2019-04-19 2020-10-22 Microsoft Technology Licensing, Llc System and method for navigating interfaces using touch gesture inputs
USD900126S1 (en) * 2018-08-28 2020-10-27 Nitto Denko Corporation Display screen or portion thereof with graphical user interface
WO2020223339A1 (en) 2019-04-30 2020-11-05 Next Jump, Inc. Electronic systems and methods for the assessment of emotional state
CN111931073A (en) * 2020-10-10 2020-11-13 腾讯科技(深圳)有限公司 Content pushing method and device, electronic equipment and computer readable medium
US10990166B1 (en) * 2020-05-10 2021-04-27 Truthify, LLC Remote reaction capture and analysis system
US11023687B2 (en) * 2018-10-08 2021-06-01 Verint Americas Inc. System and method for sentiment analysis of chat ghost typing
CN112906399A (en) * 2021-02-20 2021-06-04 北京百度网讯科技有限公司 Method, device, equipment and storage medium for determining emotional state
US11049604B2 (en) * 2018-09-26 2021-06-29 International Business Machines Corporation Cognitive monitoring of online user profiles to detect changes in online behavior
US20210196169A1 (en) * 2017-11-03 2021-07-01 Sensormatic Electronics, LLC Methods and System for Monitoring and Assessing Employee Moods
US11055979B1 (en) 2020-07-29 2021-07-06 GoX Studio, Inc. Systems and methods to provide a watch as a dashboard of a plurality of modules by utilizing a mesh protocol
US11161011B2 (en) * 2019-04-29 2021-11-02 Kpn Innovations, Llc Methods and systems for an artificial intelligence fitness professional support network for vibrant constitutional guidance
US11205518B1 (en) * 2020-08-17 2021-12-21 GoX Studio, Inc. System and method to provide indications of a subject's fitness based on values of fitness metrics for the subject
US20210400356A1 (en) * 2019-08-22 2021-12-23 Juhaokan Technology Co., Ltd. Method For Displaying Message On Smart Television, And Smart Television
US11229407B2 (en) 2020-06-08 2022-01-25 GoX Studio, Inc. Systems and methods to determine a risk factor related to dehydration and thermal stress of a subject
US11322143B2 (en) * 2016-09-27 2022-05-03 Google Llc Forming chatbot output based on user state
US11354507B2 (en) * 2018-09-13 2022-06-07 International Business Machines Corporation Compared sentiment queues
US20220210102A1 (en) * 2018-03-29 2022-06-30 TipeME Holdings Pty Ltd A System and Method for Allowing Messaging Between a First Computing Device Operated by a First User and a Second Computing Device Operated by a Second User and a Structured Message Data Set for Use in that System and Method
US11410683B2 (en) * 2017-09-05 2022-08-09 Kyocera Corporation Electronic device, mobile terminal, communication system, monitoring method, and program
US11461664B2 (en) * 2019-05-07 2022-10-04 Kpn Innovations, Llc. Methods and systems for an artificial intelligence alimentary professional support network for vibrant constitutional guidance
US11533272B1 (en) * 2018-02-06 2022-12-20 Amesite Inc. Computer based education methods and apparatus
US20220415500A1 (en) * 2021-06-25 2022-12-29 Mental Health & Wellness Now Partners, LLC Computing Device Configured with User Check-In for Mental Health and Wellness
US12009087B2 (en) 2020-11-18 2024-06-11 Evernorth Strategic Development, Inc. Predictive modeling for mental health management

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11157700B2 (en) 2017-09-12 2021-10-26 AebeZe Labs Mood map for assessing a dynamic emotional or mental state (dEMS) of a user
US11362981B2 (en) 2017-09-12 2022-06-14 AebeZe Labs System and method for delivering a digital therapeutic from a parsed electronic message
US20220189623A1 (en) * 2020-12-15 2022-06-16 State Farm Mutual Automobile Insurance Company Systems and methods of guided information intake
US11823591B2 (en) 2021-01-08 2023-11-21 Microsoft Technology Licensing, Llc Emotional management system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080098074A1 (en) * 2004-11-03 2008-04-24 Robert Hurling Method and Apparatus for Motivation Enhancement
US8776108B2 (en) * 2009-01-12 2014-07-08 Disney Enterprises, Inc. System and/or method for distributing media content and providing an option to maintain an advertising experience
US8643648B2 (en) * 2009-03-31 2014-02-04 Patientslikeme, Inc. Systems, methods, and computer-readable media for context-linked importation of user information
US9123081B2 (en) * 2011-02-14 2015-09-01 Neil Young Portable device for simultaneously providing text or image data to a plurality of different social media sites based on a topic associated with a downloaded media file
US20140195255A1 (en) * 2013-01-08 2014-07-10 Robert Bosch Gmbh System And Method For Assessment Of Patient Health Using Patient Generated Data
US20140324719A1 (en) * 2013-03-15 2014-10-30 Bruce A. Canal Social media screening and alert system
US20150287403A1 (en) * 2014-04-07 2015-10-08 Neta Holzer Zaslansky Device, system, and method of automatically generating an animated content-item
US10909216B2 (en) * 2014-05-07 2021-02-02 SkyTherapist, Inc. Virtual mental health platform
US20160261675A1 (en) * 2014-08-02 2016-09-08 Apple Inc. Sharing user-configurable graphical constructs
US10466883B2 (en) * 2015-03-02 2019-11-05 Apple Inc. Screenreader user interface

Cited By (63)

Publication number Priority date Publication date Assignee Title
US11372514B1 (en) 2014-12-01 2022-06-28 Google Llc Identifying and rendering content relevant to a user's current mental state and context
US10963119B1 (en) 2014-12-01 2021-03-30 Google Llc Identifying and rendering content relevant to a user's current mental state and context
US11861132B1 (en) 2014-12-01 2024-01-02 Google Llc Identifying and rendering content relevant to a user's current mental state and context
US10481749B1 (en) * 2014-12-01 2019-11-19 Google Llc Identifying and rendering content relevant to a user's current mental state and context
US20180248819A1 (en) * 2015-10-20 2018-08-30 Sony Corporation Information processing system and information processing method
US10673788B2 (en) * 2015-10-20 2020-06-02 Sony Corporation Information processing system and information processing method
US20170160924A1 (en) * 2015-12-08 2017-06-08 Lenovo (Beijing) Limited Information processing method and electronic device
US9859731B2 (en) * 2016-01-15 2018-01-02 International Business Machines Corporation Alternate alarm notifications based on battery condition
US20180097384A1 (en) * 2016-01-15 2018-04-05 International Business Machines Corporation Alternate alarm notifications based on battery condition
US20170207646A1 (en) * 2016-01-15 2017-07-20 International Business Machines Corporation Alternate alarm notifications based on battery condition
US10097019B2 (en) * 2016-01-15 2018-10-09 International Business Machines Corporation Alternate alarm notifications based on battery condition
US10438586B2 (en) * 2016-05-31 2019-10-08 Toyota Jidosha Kabushiki Kaisha Voice dialog device and voice dialog method
US20170345424A1 (en) * 2016-05-31 2017-11-30 Toyota Jidosha Kabushiki Kaisha Voice dialog device and voice dialog method
US10642936B2 (en) * 2016-09-26 2020-05-05 International Business Machines Corporation Automated message sentiment analysis and aggregation
US20180089171A1 (en) * 2016-09-26 2018-03-29 International Business Machines Corporation Automated message sentiment analysis and aggregation
US11322143B2 (en) * 2016-09-27 2022-05-03 Google Llc Forming chatbot output based on user state
US10410017B2 (en) 2016-09-30 2019-09-10 The Toronto-Dominion Bank Device lock bypass on selectable alert
US10936755B2 (en) 2016-09-30 2021-03-02 The Toronto-Dominion Bank Device lock bypass on selectable alert
US10798238B2 (en) * 2016-10-13 2020-10-06 The Trustees Of Princeton University System and method for tracking a mobile device user
US20190289125A1 (en) * 2016-10-13 2019-09-19 The Trustees Of Princeton University System and method for tracking a mobile device user
US10069842B1 (en) 2017-03-14 2018-09-04 International Business Machines Corporation Secure resource access based on psychometrics
US20180276345A1 (en) * 2017-03-24 2018-09-27 International Business Machines Corporation System and method to monitor mental health implications of unhealthy behavior and optimize mental and physical health via a mobile device
US20190013092A1 (en) * 2017-07-05 2019-01-10 Koninklijke Philips N.V. System and method for facilitating determination of a course of action for an individual
US11410683B2 (en) * 2017-09-05 2022-08-09 Kyocera Corporation Electronic device, mobile terminal, communication system, monitoring method, and program
US20190102696A1 (en) * 2017-10-02 2019-04-04 International Business Machines Corporation Empathy fostering based on behavioral pattern mismatch
US11157831B2 (en) * 2017-10-02 2021-10-26 International Business Machines Corporation Empathy fostering based on behavioral pattern mismatch
US20210196169A1 (en) * 2017-11-03 2021-07-01 Sensormatic Electronics, LLC Methods and System for Monitoring and Assessing Employee Moods
US11238518B2 (en) * 2017-12-14 2022-02-01 Wells Fargo Bank, N.A. Customized predictive financial advisory for a customer
US20190188769A1 (en) * 2017-12-14 2019-06-20 Wells Fargo Bank, N.A. Customized predictive financial advisory for a customer
US11533272B1 (en) * 2018-02-06 2022-12-20 Amesite Inc. Computer based education methods and apparatus
JP2019164737A (en) * 2018-03-20 2019-09-26 ヤフー株式会社 Determination device, determination method and determination program
US20230370402A1 (en) * 2018-03-29 2023-11-16 TipeME Holdings Pty Ltd System and Method for Allowing Messaging Between a First Computing Device Operated by a First User and a Second Computing Device Operated by a Second User and a Structured Message Data Set for Use in that System and Method
US20220210102A1 (en) * 2018-03-29 2022-06-30 TipeME Holdings Pty Ltd A System and Method for Allowing Messaging Between a First Computing Device Operated by a First User and a Second Computing Device Operated by a Second User and a Structured Message Data Set for Use in that System and Method
US20200064986A1 (en) * 2018-08-22 2020-02-27 Caressa Corporation Voice-enabled mood improvement system for seniors
USD900126S1 (en) * 2018-08-28 2020-10-27 Nitto Denko Corporation Display screen or portion thereof with graphical user interface
US11354507B2 (en) * 2018-09-13 2022-06-07 International Business Machines Corporation Compared sentiment queues
US11049604B2 (en) * 2018-09-26 2021-06-29 International Business Machines Corporation Cognitive monitoring of online user profiles to detect changes in online behavior
US11023687B2 (en) * 2018-10-08 2021-06-01 Verint Americas Inc. System and method for sentiment analysis of chat ghost typing
US11544473B2 (en) * 2018-10-08 2023-01-03 Verint Americas Inc. System and method for sentiment analysis of chat ghost typing
US20210271825A1 (en) * 2018-10-08 2021-09-02 Verint Americas Inc. System and method for sentiment analysis of chat ghost typing
US20200285668A1 (en) * 2019-03-06 2020-09-10 International Business Machines Corporation Emotional Experience Metadata on Recorded Images
US11163822B2 (en) * 2019-03-06 2021-11-02 International Business Machines Corporation Emotional experience metadata on recorded images
US11157549B2 (en) * 2019-03-06 2021-10-26 International Business Machines Corporation Emotional experience metadata on recorded images
US20200285669A1 (en) * 2019-03-06 2020-09-10 International Business Machines Corporation Emotional Experience Metadata on Recorded Images
US20200333925A1 (en) * 2019-04-19 2020-10-22 Microsoft Technology Licensing, Llc System and method for navigating interfaces using touch gesture inputs
US11161011B2 (en) * 2019-04-29 2021-11-02 Kpn Innovations, Llc Methods and systems for an artificial intelligence fitness professional support network for vibrant constitutional guidance
WO2020223339A1 (en) 2019-04-30 2020-11-05 Next Jump, Inc. Electronic systems and methods for the assessment of emotional state
US11682490B2 (en) 2019-04-30 2023-06-20 Next Jump, Inc. Electronic systems and methods for the assessment of emotional state
EP3963427A4 (en) * 2019-04-30 2023-01-25 Next Jump, Inc. Electronic systems and methods for the assessment of emotional state
US11461664B2 (en) * 2019-05-07 2022-10-04 Kpn Innovations, Llc. Methods and systems for an artificial intelligence alimentary professional support network for vibrant constitutional guidance
US20210400356A1 (en) * 2019-08-22 2021-12-23 Juhaokan Technology Co., Ltd. Method For Displaying Message On Smart Television, And Smart Television
US11818440B2 (en) * 2019-08-22 2023-11-14 Juhaokan Technology Co., Ltd. Method for displaying message on smart television, and smart television
US10990166B1 (en) * 2020-05-10 2021-04-27 Truthify, LLC Remote reaction capture and analysis system
US11229407B2 (en) 2020-06-08 2022-01-25 GoX Studio, Inc. Systems and methods to determine a risk factor related to dehydration and thermal stress of a subject
US11701063B2 (en) 2020-06-08 2023-07-18 GoX Studio, Inc Systems and methods to determine a risk factor related to dehydration and thermal stress of a subject
US11055979B1 (en) 2020-07-29 2021-07-06 GoX Studio, Inc. Systems and methods to provide a watch as a dashboard of a plurality of modules by utilizing a mesh protocol
US11676467B2 (en) 2020-07-29 2023-06-13 GoX Studio, Inc. Systems and methods to provide a watch as a dashboard of a plurality of modules by utilizing a mesh protocol
US11205518B1 (en) * 2020-08-17 2021-12-21 GoX Studio, Inc. System and method to provide indications of a subject's fitness based on values of fitness metrics for the subject
CN111931073A (en) * 2020-10-10 2020-11-13 腾讯科技(深圳)有限公司 Content pushing method and device, electronic equipment and computer readable medium
US12009087B2 (en) 2020-11-18 2024-06-11 Evernorth Strategic Development, Inc. Predictive modeling for mental health management
CN112906399A (en) * 2021-02-20 2021-06-04 北京百度网讯科技有限公司 Method, device, equipment and storage medium for determining emotional state
US11657917B2 (en) * 2021-06-25 2023-05-23 Mental Health & Wellness Now Partners, LLC Computing device configured with user check-in for mental health and wellness
US20220415500A1 (en) * 2021-06-25 2022-12-29 Mental Health & Wellness Now Partners, LLC Computing Device Configured with User Check-In for Mental Health and Wellness

Also Published As

Publication number Publication date
US20230178229A1 (en) 2023-06-08
US20200152323A1 (en) 2020-05-14
US11430567B2 (en) 2022-08-30

Similar Documents

Publication Publication Date Title
US11430567B2 (en) Methods for tracking and responding to mental health changes in a user
JP7055838B2 (en) A data processing terminal that can display various icon badges and a method using the badge and terminal
US9002944B2 (en) Virtual badge, device and method
US20170177807A1 (en) Enhanced user interface for a system and method for optimizing surgical team composition and surgical team procedure resource management
US20170323068A1 (en) Wearable device for real-time monitoring of parameters and triggering actions
US20160048369A1 (en) Systems for Displaying Media on Display Devices
US20150120633A1 (en) Wellness information analysis system
US11087539B2 (en) Systems and methods for generating augmented reality-based profiles
US10540550B2 (en) Augmented reality systems and methods for service providers
US20220353640A1 (en) System and Method for Appointment Scheduling
US10657589B2 (en) Digital bank branch
US11368443B2 (en) Decentralized digital communication platform system and method
US20170085514A1 (en) Methods and apparatuses for using network-based devices to improve educator/parent communication
US20150046209A1 (en) System and method for providing calendar services to users
US20180139574A1 (en) Systems and Methods For Monitoring Compliance With Recovery Goals
KR102126891B1 (en) Method for providing schedule management service and schedule management service system using it
US20140101066A1 (en) Systems and methods for wellness programs
CA3098610A1 (en) Decentralized digital communication platform system and method
US20230410172A1 (en) Smart table system for document management
US20180053011A1 (en) Secure access device
Miller et al. Engaging suicidal youth in outpatient treatment: Theoretical and empirical underpinnings
US20220318930A1 (en) Customized integrated date organization and personalized dating calendar system and method
KR102223954B1 (en) A data processing terminal capable of displaying various icon badges and a method for using such badges and terminal
US11740853B1 (en) Smart table system utilizing extended reality
KR20210025038A (en) A data processing terminal capable of displaying various icon badges and a method for using such badges and terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: SOCIAL HEALTH INNOVATIONS, INC., DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JOHNSTONE, AMANDA;ROZYNSKI, OLIVER;REEL/FRAME:040075/0224

Effective date: 20161019

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION