US11944436B2 - Determining a stress level of a smartphone user based on the user's touch interactions and motion sensor data - Google Patents
- Publication number
- US11944436B2 (application No. US17/227,308)
- Authority
- US
- United States
- Prior art keywords
- touch
- motion
- user
- smartphone
- data points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1122—Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7278—Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; user input means
- A61B5/742—Details of notification to user or communication with user or patient; user input means using visual displays
- A61B5/7435—Displaying user selection data, e.g. icons in a graphical user interface
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; user input means
- A61B5/7475—User input or interface means, e.g. keyboard, pointing device, joystick
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04146—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position using pressure sensitive conductive elements delivering a boolean signal and located between crossing sensing lines, e.g. located between X and Y sensing line layers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04105—Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
Definitions
- the present invention relates to a method for determining the stress level of a person based only on how the person uses a smartphone by weighting contributions of both the person's touch interactions with the smartphone and movements of the smartphone as indicated by motion sensor data.
- a method for determining a user's stress level is performed by a mobile app running on a smartphone.
- the method includes generating touch feature values and motion feature values, weighting the feature values by regression parameters, and generating a stress score for the user based on the weighted touch feature values and weighted motion feature values.
- the touch feature values indicate how the user's finger moves over the smartphone screen and are generated from touch data points including X positions, Y positions and associated touch timestamp values.
- the motion feature values indicate the movement of the smartphone and are generated from motion data points including X movements, Y movements, Z movements and associated motion timestamp values.
- the regression parameters are generated using touch and motion data identified by other users as being acquired while those other users were experiencing various perceived levels of stress.
- the app indicates to the user whether the stress score is higher or lower than a previously generated stress score.
- a method for determining a stress score of the user of a smartphone is based on touch and motion data sensed by the smartphone and does not require inputs from wearable devices or other sensors.
- a computing system on the smartphone receives touch data points of the smartphone user that each includes an X position value, a Y position value and an associated touch timestamp value.
- the touch data points are segmented into discrete movements of the user's finger on the screen of the smartphone.
- the computing system receives motion data points of the smartphone that each includes an X movement value, a Y movement value, a Z movement value and an associated motion timestamp value.
- the motion data points are segmented into discrete time intervals including a first time interval.
- the first time interval is associated with those discrete movements of the user's finger that occur entirely within the first time interval.
- Values of touch features are calculated for each discrete movement within the first time interval.
- Values of motion features associated with the first time interval are calculated.
- the touch features and the motion features are normalized based on prior touch data points and motion data points previously acquired while the user was using the smartphone.
- a stress level value is calculated for each discrete movement within the first time interval by applying logistic regression parameters to the normalized touch features and the normalized motion features.
- the logistic regression parameters are generated using a logistic regression model trained on touch data points and motion data points acquired from other users during time intervals identified by those other users as being associated with various levels of stress.
- a stress score of the user during the first time interval is determined based on the stress level values of the discrete movements within the first time interval.
- An indication is displayed of how the stress score of the user has changed since the stress score of the user was last determined. The user is prompted to engage in a stress mitigation activity based on the determined stress score.
- FIG. 1 is a schematic diagram of a computing system that runs a smartphone app.
- FIG. 2 is a schematic diagram of the components of the smartphone app that determines the stress level of the smartphone app user.
- FIG. 3 illustrates motion data points segmented into evenly sized time intervals and touch data points segmented into discrete touch events.
- FIG. 4 is a flowchart of steps of a method by which the smartphone app uses touch and motion data to determine the current stress level of the user of the smartphone app.
- FIGS. 5A and 5B are a table listing the components of forty-six exemplary touch data points.
- FIGS. 6A, 6B, 6C, 6D, 6E, 6F, 6G, 6H and 6I are a table listing the components of 585 exemplary motion data points.
- FIG. 7 is a table listing the values of thirty-two touch features for each of five touch samples as well as the mean and standard deviation of the touch features.
- FIG. 8 shows the formula by which the value of the touch feature “distance_covered” is calculated.
- FIG. 9 shows the formula by which the value of the touch feature “distance_spanned” is calculated.
- FIG. 10 is a table listing the values of twenty-nine motion features for a motion sample as well as the mean and standard deviation of the motion features.
- FIGS. 11A and 11B are a table listing the normalized touch and motion features associated with three touch samples as well as the regression parameter values associated with the touch and motion features.
- FIG. 12 shows the formula by which the stress level value associated with a touch sample is calculated.
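- The exact formulas for the touch features "distance_covered" and "distance_spanned" are those shown in FIGS. 8 and 9, which are not reproduced here. The sketch below uses the conventional definitions those names suggest — summed length of successive swipe segments versus straight-line distance between the endpoints — and is an assumption, not a transcription of the figures:

```python
import math

def distance_covered(points):
    """Sum of the Euclidean lengths of the successive segments of a swipe.

    `points` is a list of (x, y) touch positions in timestamp order.
    Assumed to correspond to the formula of FIG. 8.
    """
    return sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))

def distance_spanned(points):
    """Straight-line distance between the first and last touch positions.

    Assumed to correspond to the formula of FIG. 9.
    """
    return math.dist(points[0], points[-1])
```

- For a swipe through (0, 0), (3, 4), (3, 10), distance_covered is 5 + 6 = 11, while distance_spanned is √109 ≈ 10.44; the two features differ whenever the swipe is curved.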
- FIG. 1 is a simplified schematic diagram of a computing system 10 on a smartphone 11, which is a mobile telecommunications device.
- System 10 can be used to implement a method for determining the stress level of a smartphone user based only on that person's touch interactions with the smartphone and the movements of the smartphone as measured by an accelerometer in the phone. Portions of the computing system 10 are implemented as software executing as a mobile App on the smartphone 11 .
- Components of the computing system 10 include, but are not limited to, a processing unit 12, a system memory 13, and a system bus 14 that couples the various system components including the system memory 13 to the processing unit 12.
- Computing system 10 also includes computing machine-readable media used for storing computer readable instructions, data structures, other executable software and other data.
- the system memory 13 includes computer storage media such as read only memory (ROM) 15 and random access memory (RAM) 16 .
- a basic input/output system 17 (BIOS), containing the basic routines that transfer information between elements of computing system 10, is stored in ROM 15.
- RAM 16 contains software that is immediately accessible to processing unit 12 .
- RAM includes portions of the operating system 18 , other executable software 19 , and program data 20 .
- Application programs 21, including smartphone "apps", are also stored in RAM 16.
- Computing system 10 employs standardized interfaces through which different system components communicate. In particular, communication between apps and other software is effected through application programming interfaces (APIs), which define the conventions and protocols for initiating and servicing function calls.
- a user of computing system 10 enters commands and information through input devices such as a touchscreen 22 , input buttons 23 and a microphone 24 .
- a display screen 26, which is physically combined with touchscreen 22, is connected via a video interface 27 to the system bus 14.
- Touchscreen 22 includes a contact intensity sensor, such as a piezoelectric force sensor, a capacitive force sensor, an electric force sensor or an optical force sensor.
- These input devices are connected to the processing unit 12 through a user input interface 25 that is coupled to the system bus 14 .
- User input interface 25 detects the contact of a finger of the user with touchscreen 22 .
- Interface 25 includes various software components for performing operations related to detecting contact with touchscreen 22, such as determining if contact (a finger-down event) has occurred, determining the pressure of the contact, determining the size of the contact, determining if there is movement of the contact and tracking the movement across touchscreen 22, and determining if the contact has ceased (a finger-up event).
- User input interface 25 generates a series of touch data points, each of which includes an X position, a Y position, a touch pressure, a touch size (by minimum and maximum radius), and an associated touch timestamp value.
- the touch timestamp value indicates the time at which the X position, Y position, touch pressure and touch size were measured.
- Computing system 10 also includes an accelerometer 28 , whose output is connected to the system bus 14 .
- Accelerometer 28 outputs motion data points indicative of the movement of smartphone 11 .
- Each motion data point includes an X movement value, a Y movement value, a Z movement value and an associated motion timestamp value.
- the motion timestamp value indicates the time at which the X movement value, Y movement value and Z movement value were measured.
- accelerometer 28 actually includes three accelerometers, one that measures X movement, one that measures Y movement, and one that measures Z movement.
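- The two kinds of data points described above can be modeled as plain records; a minimal Python sketch (the field names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class TouchDataPoint:
    x: float            # X position of the finger on touchscreen 22
    y: float            # Y position
    pressure: float     # touch pressure
    radius_min: float   # minimum radius of the touch impression
    radius_max: float   # maximum radius of the touch impression
    timestamp_ms: int   # touch timestamp: when the values were measured

@dataclass
class MotionDataPoint:
    dx: float           # X movement reported by accelerometer 28
    dy: float           # Y movement
    dz: float           # Z movement
    timestamp_ms: int   # motion timestamp: when the movements were measured
```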
- the wireless communication modules of smartphone 11 have been omitted from this description for brevity.
- FIG. 2 is a schematic diagram of the components of one of the application programs 21 running on smartphone 11 .
- This mobile application (app) 30 is part of computing system 10 .
- App 30 includes a data collection module 31 , a sample extraction and selection module 32 , a predictive modeling module 33 and a knowledge base module 34 .
- mobile app 30 is one of the application programs 21 .
- at least some of the functionality of app 30 is implemented as part of the operating system 18 itself. For example, the functionality can be integrated into the iOS mobile operating system or the Android mobile operating system.
- Data collection module 31 collects logs of user interactions with smartphone 11 , both as touch interaction logs from user input interface 25 and as motion data from accelerometer 28 .
- Software development kits (SDKs) specific to the operating system of smartphone 11 are used to enable data collection module 31 to collect the logs of the user's interactions with smartphone 11 and the motion sensor output.
- data collection module 31 collects data using layers of the operating system 18 .
- If app 30 is implemented entirely as a mobile app, then only touch interaction data based on user interactions within the app itself can be collected.
- Alternatively, user interactions are collected independently of which app or screen of the smartphone is being used. In that case, for example, touch data generated on social networking, texting or chat apps can be collected.
- FIG. 2 shows that data collection module 31 collects touch data points 35 from user input interface 25 and motion data points 36 from accelerometer 28 .
- data collection module 31 also collects reports in which users indicate their physiological and cognitive states.
- Sample extraction and selection module 32 segments the continuous stream of touch data points 35 and motion data points 36 into discrete data samples and assesses whether the characteristics of the data of each sample warrant using the data in determining the user's stress. Module 32 parses the individual streams of interaction logs and motion data into a discrete series of samples. Module 32 separately extracts the samples from interaction logs and motion data.
- the motion data points 36 of accelerometer data are segmented into evenly sized time intervals with a predefined time length starting from the first collected data point within a usage session, as illustrated in FIG. 3.
- the last (and shorter) interval of data in a usage session is discarded.
- module 32 segments the continuous stream of motion data points 36 into discrete samples each having a length of twenty seconds.
- the last group of motion data points 36 in the usage session, which is nearly always shorter than twenty seconds, is discarded.
- the daily usage of a smartphone user is one usage session. Thus, in a two-week period, there would be fourteen usage sessions.
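- Segmentation of the motion stream into fixed-length samples, with the trailing short interval discarded, can be sketched as follows. The 20-second length and the discard rule come from the description; the (timestamp, x, y, z) tuple layout is illustrative:

```python
def segment_motion(points, interval_ms=20_000):
    """Split motion data points into evenly sized time intervals.

    `points` are (timestamp_ms, x, y, z) tuples sorted by timestamp.
    Intervals start at the first collected point of the usage session;
    the trailing interval shorter than `interval_ms` is discarded, as
    the description specifies.
    """
    if not points:
        return []
    start, end = points[0][0], points[-1][0]
    n_full = (end - start) // interval_ms      # number of complete intervals
    samples = [[] for _ in range(n_full)]
    for p in points:
        idx = (p[0] - start) // interval_ms
        if idx < n_full:                       # points past the last full
            samples[idx].append(p)             # interval are dropped
    return samples
```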
- the touch data points 35 of interaction logs are segmented into discrete touch events separated by interruptions.
- the interruptions in the touch events are already designated in the output of the software development kits (SDKs) and are included as part of the touch interaction logs received by data collection module 31 .
- Touch events whose time lengths do not fall within a predefined range (e.g., 0.1 sec to 5 sec) are discarded.
- A touch event that does not fall entirely within a concurrent sample of motion data points 36 (such as a 20-second period) is also discarded.
- FIG. 3 illustrates touch data points 37 that are discarded either because the associated touch event does not fall entirely within a 20-second sample of motion data points 36 or because the touch event is too short.
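- The two discard rules can be sketched as a filter over candidate touch events, assuming each event is reduced to a (start, end) timestamp pair. The 0.1-5 second range and the 20-second motion intervals are the examples given above:

```python
def select_touch_events(events, motion_intervals, min_s=0.1, max_s=5.0):
    """Keep touch events whose duration falls within [min_s, max_s]
    seconds and that fall entirely within one motion interval.

    `events` and `motion_intervals` are both (start_ms, end_ms) pairs;
    motion intervals are the 20-second samples of motion data.
    """
    kept = []
    for start, end in events:
        duration_s = (end - start) / 1000.0
        if not (min_s <= duration_s <= max_s):
            continue                            # too short or too long
        if any(ms <= start and end <= me for ms, me in motion_intervals):
            kept.append((start, end))           # fully inside one interval
    return kept
```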
- Predictive modeling module 33 receives the segmented samples of motion data, including motion data points 36 , and interaction logs, including touch data points 35 , from sample extraction and selection module 32 .
- Module 33 includes three components: a feature computation component 38 , a feature normalization component 39 , and a machine learning model 40 .
- Feature computation component 38 computes motion features and touch features from the samples of motion and touch data that it receives in a raw format from module 32 .
- motion features include the energy of the movement of the smartphone during a motion sample and the mean x movement per time increment during a motion sample.
- the accelerometer determines the movement in the x dimension after each time increment of ten milliseconds.
- touch features include the speed, curvature and distance covered of a touch event, such as a swipe.
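- Two of these features can be sketched to make the computation concrete. The patent leaves the exact formulas to the figures, so both definitions below are assumptions: "energy" is taken here as the mean squared accelerometer magnitude over a motion sample, and speed as distance covered per unit time:

```python
import math

def motion_energy(points):
    """One plausible 'energy' of smartphone movement over a motion
    sample: the mean squared magnitude of the (x, y, z) accelerometer
    readings. The patent does not give the formula in the text, so
    this definition is an assumption.
    """
    return sum(x * x + y * y + z * z for x, y, z in points) / len(points)

def mean_speed(touch_points):
    """Mean speed of a swipe: total distance covered divided by its
    duration. `touch_points` are (x, y, timestamp_ms) tuples in
    timestamp order.
    """
    dist = sum(math.dist(touch_points[i][:2], touch_points[i + 1][:2])
               for i in range(len(touch_points) - 1))
    duration_s = (touch_points[-1][2] - touch_points[0][2]) / 1000.0
    return dist / duration_s
```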
- Feature normalization component 39 normalizes the motion features and touch features to each particular user.
- the magnitudes of the features are influenced in large part by usage patterns of individual users, which in turn depend on the type of device used. For example, the size and type of touchscreen 22 significantly influences the magnitudes of the features. Therefore, each of the many motion features and touch features is normalized based on the mean magnitude of that feature and the standard deviation of that feature for the particular user over a period of past use, such as a 2-week usage period. The user is assumed not to have changed the type of smartphone used during the normalization usage period.
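- Normalizing a feature against the user's own history over a past usage period (e.g. the 2-week period mentioned above) by its mean and standard deviation is a z-score; a minimal sketch:

```python
def normalize_feature(value, history):
    """Normalize a feature value against this user's past values for
    that feature, so device- and user-specific magnitudes cancel out.

    Uses the population standard deviation; the patent does not say
    which variant it uses, so that choice is an assumption.
    """
    n = len(history)
    mean = sum(history) / n
    std = (sum((v - mean) ** 2 for v in history) / n) ** 0.5
    if std == 0:
        return 0.0          # degenerate history: no variation observed
    return (value - mean) / std
```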
- Machine learning model 40 determines the user's physiological and cognitive states, such as the user's stress level, based on the normalized motion features, the normalized touch features, and a weight or parameter for each of the features calculated using machine learning on a knowledge base of features for which prior users have indicated their perceived stress levels during each usage session.
- the knowledge base is stored in knowledge base module 34 .
- the knowledge base was acquired from the usage patterns of 110 users each over fourteen daily usage sessions. For each usage session, sixty-one motion and touch features were calculated. The users indicated their perceived stress level on a scale between 0-10 for each usage session. Thus, there were 93,940 entries, each associated with a stress level between zero and ten.
- the entries were divided into two equal groups, half having higher associated stress levels and the other half having lower associated stress levels. Each of the 46,970 entries associated with lower stress were deemed to represent a no-stress state, and each of the other 46,970 entries associated with higher stress were deemed to represent a stress state.
- In one embodiment, the entries in knowledge base module 34 are fixed and immutable.
- In another embodiment, the knowledge base is periodically augmented with additional entries from users who rate their stress levels during each usage session. Thus, the augmented knowledge base grows over time.
- Machine learning model 40 then performs a logistic regression on the 93,940 entries from knowledge base module 34 and calculates logistic regression parameters for each of the sixty-one motion and touch features.
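- The training procedure — split the self-rated entries into a stress half and a no-stress half, then fit a logistic regression over the features — can be sketched with a plain gradient-descent fit. The patent names no library, the NumPy implementation is illustrative, and splitting at a strict median only approximates the patent's two exactly equal groups:

```python
import numpy as np

def train_stress_model(features, stress_ratings, lr=0.1, epochs=500):
    """Fit logistic-regression parameters to knowledge-base entries.

    `features`: (n_entries, n_features) array of normalized motion and
    touch features (sixty-one per entry in the patent's knowledge base).
    `stress_ratings`: self-reported stress on a 0-10 scale, per entry.
    Entries above the median rating are labeled 1 (stress state), the
    rest 0 (no-stress state). Returns (weights, bias).
    """
    X = np.asarray(features, dtype=float)
    r = np.asarray(stress_ratings, dtype=float)
    y = (r > np.median(r)).astype(float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted stress probability
        w -= lr * (X.T @ (p - y)) / len(y)      # gradient of the log loss
        b -= lr * float(np.mean(p - y))
    return w, b
```

- As in the description, a positive learned weight correlates a feature with the stress state and a negative weight with the no-stress state.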
- the feature parameter with the highest correlation to the stress state is the standard deviation of the overall motion.
- the magnitude of the parameter std_overall is 0.701.
- the feature parameter with the highest correlation to the no-stress state is the standard deviation of the energy of the overall motion.
- the magnitude of the parameter std_energy is −1.011.
- a positive parameter correlates to more stress
- a negative parameter correlates to less stress.
- the magnitudes or weights of the parameters do not necessarily fall within any range, such as an arbitrary range between −1.0 and 1.0.
- Machine learning model 40 then applies the parameters to each of the sixty-one motion and touch features associated with each touch sample (discrete touch event), and a stress level is determined for the touch sample.
- the stress level is determined for a touch sample based not only on the touch features associated with the touch sample, but also on the motion features of the motion sample to which the touch sample corresponds.
- a touch sample corresponds to a motion sample if the touch data points 35 of the touch sample fall entirely within the period of the motion sample.
- there are three touch samples 41-43 whose touch data points 35 fall entirely within the motion sample 44.
- machine learning model 40 determines a stress level for each of the three touch samples 41-43. Each determination is based on the thirty-two touch features of the corresponding touch sample as well as on the twenty-nine motion features of motion sample 44.
- the stress levels determined for each of the three touch samples are averaged to obtain a stress level for the motion sample associated with a time interval of motion data.
- machine learning model 40 outputs a stress level associated with each 20-second motion sample.
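- Applying the parameters to produce a stress level per touch sample, and averaging per motion sample, can be sketched as follows. The logistic form is the one FIG. 12 depicts schematically; the figure itself is authoritative for the exact formula:

```python
import math

def stress_level(weights, bias, normalized_features):
    """Stress level of one touch sample: the logistic function of the
    weighted sum of its normalized touch and motion features.
    """
    z = bias + sum(w * f for w, f in zip(weights, normalized_features))
    return 1.0 / (1.0 + math.exp(-z))

def motion_sample_stress(per_touch_levels):
    """Stress score for a 20-second motion sample: the average of the
    stress levels of the touch samples falling within it.
    """
    return sum(per_touch_levels) / len(per_touch_levels)
```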
- App 30 displays an indication on display screen 26 of how the stress level of the user has changed since the stress level was last determined by displaying a numerical representation of the determined stress level.
- App 30 displays an indication of how the stress level of the user has changed merely by indicating to the user whether the stress level is higher or lower than a previously determined stress level.
- the size of a green arrow or dot could correspond to a decrease in the user's stress level, and the size of a red arrow or dot could correspond to an increase in the user's stress level.
- Another alternative would be to indicate the user's measured stress level using non-verbal audio feedback, such as a higher or lower pitched tone.
- FIG. 4 is a flowchart of steps 46-56 of a method 45 by which App 30 uses data sensed by smartphone 11 to determine the current stress level of the user of App 30.
- the steps of FIG. 4 are described in relation to App 30 and computing system 10 which implement method 45 .
- In step 46, data collection module 31 receives touch interaction data from smartphone 11 indicative of a user's interactions with touchscreen 22 of smartphone 11.
- the touch data is received as touch interaction logs based on data sensed by user input interface 25 .
- the touch data comprises touch data points 35 , each of which includes an X position value, a Y position value and an associated touch timestamp value.
- FIGS. 5A-5B are a table listing forty-six exemplary touch data points 35.
- each point also includes values indicating the change in the X position (delta x), the change in the Y position (delta y), the pressure of the user's finger on touchscreen 22 , the overall size of the touch impression on the screen, the minimum radius of the touch impression, the maximum radius of the touch impression and the orientation of the touch impression.
- the size value has not been sensed, and is listed as zero.
- the timestamp values are listed in units of milliseconds.
- sample extraction and selection module 32 segments the continuous stream of touch data points 35 into discrete data samples representing distinct movements of the user's finger on touchscreen 22 . If the values for both delta x and delta y are zero for a particular timestamp value, then the touch event of the user's finger has stopped, and a new touch event is deemed to have begun. Each touch event is analyzed as a separate sample. In FIGS. 5 A- 5 B , for example, the delta x and delta y values are both zero at touch data points 14 , 22 , 30 and 38 . Thus, a first touch event sample #1 includes touch data points 1 - 14 , sample #2 includes points 15 - 22 , sample #3 includes points 23 - 30 , sample #4 includes points 31 - 38 and sample #5 includes points 39 - 46 .
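The segmentation rule just described can be sketched in Python. The field names dx and dy are illustrative stand-ins for the delta x and delta y values; per the rule above, a point whose deltas are both zero closes the current touch-event sample:

```python
def segment_touch_events(points):
    """Split a continuous stream of touch data points into discrete
    touch-event samples. A point whose deltas are both zero marks the
    end of the current touch event and is included in that event.
    Field names 'dx'/'dy' are assumptions, not from the patent."""
    samples, current = [], []
    for p in points:
        current.append(p)
        if p["dx"] == 0 and p["dy"] == 0:
            samples.append(current)
            current = []
    if current:  # trailing points not yet closed by a zero-delta point
        samples.append(current)
    return samples
```

Applied to the forty-six points of FIGS. 5A-5B, the zero-delta points 14, 22, 30 and 38 close samples #1-#4, and the remaining points 39-46 form sample #5.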
- In step 48, data collection module 31 receives motion sensor data from smartphone 11 indicative of the movement of the smartphone while the user's finger is moving over touchscreen 22. At least a portion of the sensed movement of smartphone 11 associated with the motion data has occurred concurrently with the user's interactions with touchscreen 22.
- the motion sensor data comprises motion data points 36 , each of which includes an X movement value, a Y movement value, a Z movement value and an associated motion timestamp value.
- the motion timestamp value indicates the time at which the X movement value, Y movement value and Z movement value were measured by accelerometer 28 .
- FIGS. 6 A- 6 I are a table listing 585 exemplary motion data points 36 .
- the timestamp values are listed in units of milliseconds.
- the table also lists the total time elapsed in seconds since the first motion data point was measured.
- FIGS. 6 A- 6 I include 5.855 seconds of motion data.
- the example usage session was segmented into 20-second samples of motion data points 36 .
- the last motion sample that was shorter than twenty seconds in length was discarded.
- in the example of FIGS. 6A-6I, however, the 585 motion data points 36 are segmented into five-second motion samples.
- In step 49, the motion data points 36 of accelerometer data are segmented into evenly sized time intervals of a predefined length (five seconds), starting from the first data point collected within the usage session.
- the first time interval includes motion data points 1 - 499 .
- the motion data points 500 - 585 are discarded because they include less than five seconds of data.
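A sketch of this windowing step, assuming each motion data point is a (timestamp_ms, x, y, z) tuple:

```python
def segment_motion_windows(points, window_s=5.0):
    """Split motion data points into consecutive fixed-length time
    windows measured from the first point's timestamp; any trailing
    window covering less than a full window_s is discarded.
    The (timestamp_ms, x, y, z) point layout is an assumption."""
    if not points:
        return []
    t0 = points[0][0]
    win_ms = window_s * 1000.0
    windows = {}
    for p in points:
        windows.setdefault(int((p[0] - t0) // win_ms), []).append(p)
    # Keep only windows whose full span lies within the recorded session.
    session_ms = points[-1][0] - t0
    return [windows[i] for i in sorted(windows)
            if (i + 1) * win_ms <= session_ms]
```

At the roughly 100 Hz sampling rate implied by FIGS. 6A-6I, a 5.855-second session yields one complete five-second window; the trailing 0.855 seconds are dropped, mirroring the discarding of motion data points 500-585.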
- the first time interval of motion data points 36 (motion sample #1) is associated with those discrete movements of the user's finger that occur entirely within the first time interval.
- a touch sample is associated with a motion sample if the touch data points 35 of the touch sample fall entirely within the period of the motion sample.
- touch samples #1-#3 have touch data points with touch timestamp values that fall entirely within the motion timestamp values of motion sample #1 of FIGS. 6 A- 6 I .
- FIG. 6 C shows the highlighted motion timestamp values of motion data points 160 - 177 within motion sample #1, which coincide with the touch timestamp values of touch data points 1 - 14 of touch sample #1 of FIG. 5 A .
- Touch sample #1 runs from timestamp 1583005689576 to timestamp 1583005689753.
- FIGS. 6 D- 6 E show the highlighted motion timestamp values of motion data points 277 - 291 within motion sample #1, which coincide with the touch timestamp values of touch data points 15 - 22 of touch sample #2 of FIG. 5 A .
- Touch sample #2 runs from timestamp 1583005690747 to timestamp 1583005690903.
- FIG. 6F shows the highlighted motion timestamp values of motion data points 410 - 418 within motion sample #1, which coincide with the touch timestamp values of touch data points 23 - 30 of touch sample #3 of FIG. 5 A .
- Touch sample #3 runs from timestamp 1583005692082 to timestamp 1583005692170.
- the touch timestamp values of touch data points 31 - 38 of touch sample #4 of FIG. 5 B coincide with the highlighted motion timestamp values of motion data points 517 - 524 , which occur after the end of motion sample #1 at motion data point 499 .
- touch sample #4 is not associated with motion sample #1, and the touch data points 35 of touch sample #4 are not used to calculate features in method 45 .
- touch timestamp values of touch data points 39 - 46 of touch sample #5 of FIG. 5 B coincide with the highlighted motion timestamp values of motion data points 578 - 585 , which occur after the end of motion sample #1 at motion data point 499 .
- touch sample #5 is also not associated with motion sample #1, and the touch data points 35 of touch sample #5 are not used to calculate features in method 45 .
- in some embodiments, a touch sample whose touch data points extend beyond the motion sample period by less than a predefined threshold (e.g., 0.05 sec) may still be associated with that motion sample.
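The containment rule described above (a touch sample is associated with a motion sample only if all of its touch timestamps fall within the motion sample's period) can be sketched as follows; 'ts' is an assumed timestamp field name:

```python
def touch_samples_for_window(touch_samples, win_start_ms, win_end_ms):
    """Return only the touch samples whose first and last timestamps both
    fall within [win_start_ms, win_end_ms]; samples that extend past the
    window end (like touch samples #4 and #5 above) are excluded.
    The 'ts' field name is an assumption."""
    return [s for s in touch_samples
            if s[0]["ts"] >= win_start_ms and s[-1]["ts"] <= win_end_ms]
```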
- In step 51, feature computation component 38 of predictive modeling module 33 calculates the values of the touch features for each discrete movement within the first time interval.
- thirty-two touch features are calculated for each of touch samples #1-#3 which fall within motion sample #1.
- the touch features are calculated using the touch data points 35 listed in FIGS. 5 A- 5 B .
- FIG. 7 is a table listing the values of the thirty-two touch features for each of touch samples #1-#5. Only the feature values for touch samples #1-#3 are used in further calculations.
- the thirty-two touch features include: distance_covered, distance_spanned, efficiency, duration, speed, num_points, curvature, pressure_median, pressure_mean, pressure_std, pressure_max, pressure_min, radiusMin_median, radiusMin_mean, radiusMin_std, radiusMin_max, radiusMin_min, radiusMax_median, radiusMax_mean, radiusMax_std, radiusMax_max, radiusMax_min, orientation_median, orientation_mean, orientation_std, orientation_max, orientation_min, size_median, size_mean, size_std, size_max, and size_min.
- the distance_covered is the entire distance covered by the user's finger during a touch event, whereas the distance_spanned is the straight-line distance between the beginning and the end of the touch event.
- FIG. 8 shows how the touch feature “distance_covered” is calculated for touch sample #1 using the touch data points 1 - 14 .
- the sum of the point-to-point distances between successive xy positions is calculated using the delta_x and delta_y values for touch data points 1 - 14 in FIG. 5 A .
- the result is 200.3943288, which is listed as the touch feature value for distance_covered for touch sample #1 in FIG. 7 .
- FIG. 9 shows how the touch feature "distance_spanned" is calculated for touch sample #1 using the first touch data point 1 and the last touch data point 14 of touch sample #1.
- the distance between the xy position at the beginning of touch sample #1 and the xy position at the end of touch sample #1 is calculated using the x_position and y_position values for touch data points 1 and 14 in FIG. 5 A .
- the result is 195.0025641, which is listed as the touch feature value for distance_spanned for touch sample #1 in FIG. 7 .
- the touch feature “efficiency” is calculated as the ratio of distance_spanned to distance_covered. For the touch sample #1, the efficiency is 195.0025641/200.3943288, which equals 0.973094225 as listed in FIG. 7 .
- the touch feature “duration” is an indication of the duration of a swipe of the user's finger over screen 26 of smartphone 11 .
- the feature “duration” is calculated as the difference between the final and initial timestamps of the touch sample.
- the duration is (1583005689753 − 1583005689576), which equals 177 milliseconds.
- the duration is listed in FIG. 7 as 0.177 seconds.
- the feature “pressure_mean” is an indication of the average pressure applied by the user's finger to screen 26 of smartphone 11 during a swipe or touch event.
- In step 52, feature computation component 38 calculates the values of the motion features associated with the first time interval.
- the values of the motion features are determined using motion data points 36 listed in FIGS. 6 A- 6 I whose timestamps fall within motion sample #1. In this example, twenty-nine motion features are calculated for motion sample #1.
- FIG. 10 is a table listing the values of the twenty-nine motion features, which are: mean_x, std_x, var_x, mean_y, std_y, var_y, mean_z, std_z, var_z, mean_overall, median_overall, std_overall, var_overall, amax_overall, amin_overall, mean_l2_norm, mean_l1_norm, root_mean_sq, curve_len, diff_entropy, energy, std_energy, peak_magnitude, peak_magnitude_frequency, peak_power, peak_power_frequency, magnitude_entropy, power_entropy, and energy_pct_between_3_and_7.
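A sketch of a subset of these motion features, computed from (x, y, z) acceleration tuples; the pooling of all three axes into the "overall" statistics and the use of population statistics are assumptions about the patent's exact conventions:

```python
import math
from statistics import mean, pstdev, pvariance

def motion_features(window):
    """Compute a subset of the motion features listed above for one
    motion sample. `window` is a list of (x, y, z) acceleration tuples
    (assumed layout)."""
    xs, ys, zs = zip(*window)
    overall = xs + ys + zs  # all axis values pooled together (assumed)
    feats = {}
    for name, vals in (("x", xs), ("y", ys), ("z", zs),
                       ("overall", overall)):
        feats[f"mean_{name}"] = mean(vals)
        feats[f"std_{name}"] = pstdev(vals)
        feats[f"var_{name}"] = pvariance(vals)
    feats["mean_l2_norm"] = mean(math.sqrt(x * x + y * y + z * z)
                                 for x, y, z in window)
    feats["mean_l1_norm"] = mean(abs(x) + abs(y) + abs(z)
                                 for x, y, z in window)
    feats["root_mean_sq"] = math.sqrt(mean(v * v for v in overall))
    return feats
```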
- In step 53, feature normalization component 39 of predictive modeling module 33 normalizes the touch features and the motion features based on touch data points 35 and motion data points 36 previously acquired while the user was using smartphone 11.
- the touch and motion features are normalized using the past mean of each feature and the past standard deviation of each feature for the particular user.
- touch and motion data is acquired for the user over a 2-week period, and then the mean value and the standard deviation of each touch and motion feature are determined. Note that the user must be using the same smartphone 11 over the entire 2-week period because the feature values are heavily influenced by screen size and other characteristics that vary for different smartphones.
- FIG. 7 lists the mean and the standard deviation for each touch feature for the particular user in the last two columns.
- FIG. 10 lists the mean and the standard deviation for each motion feature for the particular user in the last two columns.
- FIGS. 11 A- 11 B list the normalized touch and motion features associated with touch samples #1-#3.
- the normalized value for each feature is calculated as the difference between the feature value and the feature mean, with the difference then divided by the standard deviation of the feature.
- the normalized distance_covered for touch sample #1 is calculated as (200.3943288 − 196.713947)/48.93410519, which equals 0.075210975.
- the inputs are listed in the first row of FIG. 7
- the normalized distance_covered is listed in the first row of FIG. 11 A .
- Note that the same normalized motion feature values are used with each of the three distinct touch feature values for touch samples #1-#3, as shown in the columns of FIGS. 11 A- 11 B .
- thus, the contribution of the motion features to the stress level determined for each of the three touch samples is the same.
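The normalization step is a per-user z-score: subtract the feature's historical mean and divide by its historical standard deviation. A minimal sketch, assuming the means and standard deviations are kept in dicts keyed by feature name:

```python
def normalize_features(features, past_mean, past_std):
    """Z-score each feature against the user's historical mean and
    standard deviation (assumed to be dicts keyed by feature name)."""
    return {k: (v - past_mean[k]) / past_std[k]
            for k, v in features.items()}
```

With the values from FIG. 7 quoted above, this reproduces the normalized distance_covered of roughly 0.07521 for touch sample #1.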
- In step 54, machine learning model 40 of predictive modeling module 33 determines a stress level value for each of the three touch samples #1-#3. Each determination is based on both the thirty-two touch features of the corresponding touch sample and the twenty-nine motion features of motion sample #1.
- a stress level value is generated for each discrete movement within the first time interval of motion sample #1 by applying logistic regression parameters to weight each of the normalized touch and motion features.
- the values of the sixty-one logistic regression parameters associated with each touch and motion feature are listed in the last column of FIGS. 11 A- 11 B .
- the stress level value of each touch sample is calculated using the formula of FIG. 12 .
- the value of “t” in the formula is the sum of the products of the sixty-one feature values times their associated regression parameters; an intercept value is then added to the sum.
- the calculated stress level value falls within a range of zero to one, in which one represents the maximum possible stress.
- the sums of the products of the sixty-one feature values times their associated regression parameters for touch samples #1, #2 and #3 are −0.861821372, −0.952629425 and −0.975875892, respectively.
- the resulting stress levels for touch samples #1, #2 and #3 are 0.275536305, 0.257783613 and 0.25336094, respectively.
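Assuming the formula of FIG. 12 is the standard logistic (sigmoid) function applied to t, the listed stress levels can be reproduced. The intercept is not stated in this excerpt; the value below (≈ −0.1049) is inferred from the worked example and is only illustrative:

```python
import math

def stress_level(weighted_sum, intercept=-0.1049):
    """Map the weighted sum of the sixty-one normalized features to a
    stress level in [0, 1] via the logistic (sigmoid) function.
    The intercept value is inferred from the worked example, not taken
    from the patent text."""
    t = weighted_sum + intercept
    return 1.0 / (1.0 + math.exp(-t))
```

Applied to the three sums listed above, this yields approximately 0.2755, 0.2578 and 0.2534, matching the stress levels for touch samples #1-#3.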
- the regression parameters used in step 54 are generated using a logistic regression model trained on touch data points and motion data points acquired from other users during time intervals identified by those other users as being associated with various levels of stress.
- the touch and motion data was acquired from 110 other users over fourteen daily usage sessions.
- the users recorded their perceived stress level on a scale between 0-10 for each usage session.
- each touch data point and motion data point is associated with a stress level value corresponding to that of the usage session during which the data was acquired.
- the data from the usage sessions is divided into a first half with the highest stress rankings and a second half with the lowest stress rankings.
- the data in the first half is deemed to be associated with a stress level of one, and the data in the second half is deemed to be associated with a stress level of zero.
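This median-split labeling of the training sessions can be sketched as follows; the session identifiers and the (session_id, rating) pair layout are illustrative:

```python
def binarize_sessions(sessions):
    """Label each usage session 1 (stressed) or 0 (not stressed) by
    splitting the sessions at the median of their self-reported 0-10
    stress ratings: the half with the highest ratings gets label 1.
    `sessions` is a list of (session_id, rating) pairs (assumed layout).
    """
    ranked = sorted(sessions, key=lambda s: s[1], reverse=True)
    half = len(ranked) // 2
    return {sid: (1 if i < half else 0)
            for i, (sid, _) in enumerate(ranked)}
```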
- the feature with the largest negative correlation to the user's stress level is the standard deviation of the energy of the motion data points (std_energy), which has a contribution to stress of −1.011.
- the feature with the biggest positive correlation to the user's stress level is the overall standard deviation of the motion data points (std_overall), which has a positive contribution to stress of 0.701.
- Another motion feature with a large contribution to the user's stress level is the peak magnitude of the motion data points while the user's finger moves over the screen, which has a positive contribution to stress of 0.529.
- Touch features that make a large contribution to the user's stress level are the minimum size of the touch (size_min), the standard deviation of the touch size (size_std) and the minimum of the radius of the minimum sized touches (radiusMin_min).
- a stress score of the user during the first time interval is determined based on the stress level values calculated for the discrete touch movements during the first time interval.
- a stress score is calculated by averaging the determined stress level values of the touch samples #1-#3 that fall within the first time interval between the timestamps for motion data points 1 and 499 .
- the stress score for motion sample #1 is 0.262, which is the average of the stress level values of the touch samples #1-#3. Note, however, that both the stress level values and the stress score are influenced by both the touch data points and the motion data points that occur during the first time interval.
- the stress score is determined to be the highest stress level value calculated for a touch sample during the first time interval.
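Both aggregation variants described above (averaging the per-touch-sample stress level values, or taking the highest one) can be sketched as:

```python
def stress_score(stress_values, mode="mean"):
    """Aggregate per-touch-sample stress level values into a single
    stress score for a motion sample: either their average or, in an
    alternative embodiment, the maximum value observed."""
    if mode == "max":
        return max(stress_values)
    return sum(stress_values) / len(stress_values)
```

For touch samples #1-#3 above, the mean mode reproduces the stress score of 0.262 for motion sample #1.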
- stress level values are calculated for the touch samples (discrete finger movements) that occur during many motion samples acquired over longer periods of use of smartphone 11 , such as a work day or portions of a day. The calculated stress level values are then averaged to determine a stress score for the day or portion of a day.
- an indication is displayed on screen 26 of how the stress score of the user has changed since the stress score was last determined. For example, an image is displayed indicating to the user that the stress score is higher or lower than a previously generated stress score. A numerical representation of the stress score is displayed on display screen 26 of smartphone 11 .
- App 30 gives the user non-verbal audio feedback indicating to the user whether the stress score is higher or lower than the previously generated stress score. For example, a higher tone indicates a rising stress score, and a lower tone indicates a falling stress score.
- In step 56, the user is prompted to engage in an activity that is selected based on the determined stress score; the stress score itself need not be indicated to the user.
- App 30 prompts the user to engage in an intervention that is selected based on the determined stress score. If the user's stress score is above a predetermined threshold, the user is prompted to engage in relaxation and meditation techniques guided by App 30 .
- a company may recommend that its employees use the mobile app running on their smartphones to assist them in engaging in stress reduction techniques during the work day.
- Each employee's stress level is determined in the background as the employee uses the smartphone throughout the day without the employee having to perform any specific diagnostic procedures to measure his or her current stress level. If the employee's stress level exceeds a predetermined threshold, then App 30 notifies the employee and recommends a stress reduction activity, for example, meditation.
- An advantage of method 45 for determining a stress level using App 30 on smartphone 11 is that the method does not require other sensors or wearable devices to determine the user's stress level. Over forty percent of the world's population owns a mobile device on which method 45 can be implemented, and most of these people use the mobile device regularly throughout the day. This ubiquity of mobile devices allows the method to be used by many more people than would otherwise be possible if purchasing a wearable device or medical-grade device were required to determine the user's stress level.
- Another advantage of method 45 is that the user is not required to engage in any specific behavior during which the user's stress level is measured. Most other methods of determining a user's stress level require the user to adjust his or her patterns of behavior and to act in a way that the user would not otherwise act. However, method 45 is completely unobtrusive because the user's stress level is determined based on the user's normal usage of the mobile device or smartphone. Thus, the user of App 30 is more likely to engage in stress reduction techniques because the reluctance to perform additional steps to measure the user's current stress level is removed as an excuse for failing to implement the techniques.
Abstract
Description
Claims (15)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/227,308 US11944436B2 (en) | 2021-04-10 | 2021-04-10 | Determining a stress level of a smartphone user based on the user's touch interactions and motion sensor data |
EP21179548.9A EP4070718A1 (en) | 2021-04-10 | 2021-06-15 | Determining a stress level of a smartphone user based on the user's touch interactions and motion sensor data |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/227,308 US11944436B2 (en) | 2021-04-10 | 2021-04-10 | Determining a stress level of a smartphone user based on the user's touch interactions and motion sensor data |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220322985A1 US20220322985A1 (en) | 2022-10-13 |
US11944436B2 true US11944436B2 (en) | 2024-04-02 |
Family
ID=76796879
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/227,308 Active 2041-11-07 US11944436B2 (en) | 2021-04-10 | 2021-04-10 | Determining a stress level of a smartphone user based on the user's touch interactions and motion sensor data |
Country Status (2)
Country | Link |
---|---|
US (1) | US11944436B2 (en) |
EP (1) | EP4070718A1 (en) |
2021
- 2021-04-10 US US17/227,308 patent/US11944436B2/en active Active
- 2021-06-15 EP EP21179548.9A patent/EP4070718A1/en active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120289793A1 (en) | 2011-05-13 | 2012-11-15 | Fujitsu Limited | Continuous Monitoring of Stress Using Accelerometer Data |
US20140136450A1 (en) | 2012-11-09 | 2014-05-15 | Samsung Electronics Co., Ltd. | Apparatus and method for determining user's mental state |
JP2014094291A (en) | 2012-11-09 | 2014-05-22 | Samsung Electronics Co Ltd | Apparatus and method for determining user's mental state |
US9521973B1 (en) * | 2014-02-14 | 2016-12-20 | Ben Beiski | Method of monitoring patients with mental and/or behavioral disorders using their personal mobile devices |
US20160006941A1 (en) | 2014-03-14 | 2016-01-07 | Samsung Electronics Co., Ltd. | Electronic apparatus for providing health status information, method of controlling the same, and computer-readable storage |
US20170071551A1 (en) * | 2015-07-16 | 2017-03-16 | Samsung Electronics Company, Ltd. | Stress Detection Based on Sympathovagal Balance |
US20200060603A1 (en) * | 2016-10-24 | 2020-02-27 | Akili Interactive Labs, Inc. | Cognitive platform configured as a biomarker or other type of marker |
US20200380882A1 (en) * | 2017-08-03 | 2020-12-03 | Akili Interactive Labs, Inc. | Cognitive platform including computerized evocative elements in modes |
US20200042687A1 (en) * | 2019-08-06 | 2020-02-06 | Lg Electronics Inc. | Method and device for authenticating user using user's behavior pattern |
Non-Patent Citations (7)
Title |
---|
Ciman et al., "Individuals' stress assessment using human-smartphone interaction analysis," 2016 IEEE, DOI 10.1109/TAFFC.2016.2592504 (14 pages). |
Ciman et al., "iSenseStress: Assessing stress through human-smartphone interaction analysis," Proceedings of the 9th International Conference on Pervasive Computing Technologies for Healthcare, ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering) 2015, pp. 84-91 (8 pages). |
Extended European Search Report dated Feb. 21, 2022, from the European Patent Office in the related foreign application EP21179548.9 (14 pages). |
Joselli et al., "gRmobile: A Framework for Touch and Accelerometer Gesture Recognition for Mobile Games," Games and Digital Entertainment (SBGAMES), 2009 VIII Brazilian Symposium, IEEE Computer Society, Piscataway, NJ, Oct. 8, 2009, pp. 141-150 XP031685727 (10 pages). |
Ruensuk et al., "How Do You Feel Online," Proceedings of ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, ACMPUB27, NY, NY, vol. 4, No. 4, Dec. 17, 2020, pp. 1-32, XP058662221 (32 pages). |
Sagbas et al., "Stress Detection via Keyboard Typing Behaviors by Using Smartphone Sensors and Machine Learning Techniques," Jour. of Med. Systems, Springer US, NY, vol. 44, No. 4, Feb. 17, 2020, XP037054202 (12 pages). |
Wampfler et al., "Affective State Prediction Based on Semi-Supervised Learning from Smartphone Touch Data," Companion Pub. of 2020 ACM Designing Interactive Sys. Conf., ACMPUB27, Apr. 21, 2020, pp. 1-13, XP058616613 (13 pages). |
Also Published As
Publication number | Publication date |
---|---|
EP4070718A1 (en) | 2022-10-12 |
US20220322985A1 (en) | 2022-10-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KOA HEALTH B.V., SPAIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUERREIRO, JOAO;SKORULSKI, BARTLOMIEJ M.;MATIC, ALEKSANDAR;REEL/FRAME:055884/0349 Effective date: 20210409 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
AS | Assignment |
Owner name: KOA HEALTH DIGITAL SOLUTIONS S.L.U., SPAIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KOA HEALTH B.V.;REEL/FRAME:064106/0466 Effective date: 20230616 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |