US20150265211A1 - Device, method and application for establishing a current load level - Google Patents

Device, method and application for establishing a current load level Download PDF

Info

Publication number
US20150265211A1
Authority
US
United States
Prior art keywords
data
mobile terminal
user
biometric data
artificial neural
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/418,374
Other languages
English (en)
Inventor
Peter Schneider
Johann Huber
Christopher Lorenz
Diego Alberto Martin-Serrano Fernandez
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
V1am Ltd
Soma Analytics Ug (haftungsbeschrankt)
Original Assignee
Soma Analytics Ug (haftungsbeschrankt)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (see https://patents.darts-ip.com/?family=49944062). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Soma Analytics Ug (haftungsbeschrankt) filed Critical Soma Analytics Ug (haftungsbeschrankt)
Assigned to SOMA ANALYTICS UG reassignment SOMA ANALYTICS UG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FERNANDEZ, DIEGO ALBERTO MARTIN-SERRANO, HUBER, JOHANN, LORENZ, Christopher, SCHNEIDER, PETER
Publication of US20150265211A1 publication Critical patent/US20150265211A1/en
Assigned to PRENETICS EMEA LTD reassignment PRENETICS EMEA LTD NUNC PRO TUNC ASSIGNMENT (SEE DOCUMENT FOR DETAILS). Assignors: SOMA ANALYTICS UG (HAFUNGSBESCHRÄNKT)
Assigned to TPDM1 LTD reassignment TPDM1 LTD ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PRENETICS EMEA LTD
Assigned to V1AM LIMITED reassignment V1AM LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TPDM1 LIMITED
Abandoned legal-status Critical Current

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4884Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4803Speech analysis specially adapted for diagnostic purposes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4809Sleep detection, i.e. determining whether a subject is asleep or not
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4812Detecting sleep stages or cycles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4815Sleep quality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/088Non-supervised learning, e.g. competitive learning
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G10L25/63Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G10L25/66Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for extracting parameters related to health condition
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16BBIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
    • G16B40/00ICT specially adapted for biostatistics; ICT specially adapted for bioinformatics-related machine learning or data mining, e.g. knowledge discovery or pattern finding
    • G16B40/20Supervised data analysis
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124Determining motor skills
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/486Bio-feedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16BBIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
    • G16B40/00ICT specially adapted for biostatistics; ICT specially adapted for bioinformatics-related machine learning or data mining, e.g. knowledge discovery or pattern finding

Definitions

  • the present invention relates to an apparatus for ascertaining a current stress level for a user according to the preamble of claim 1, to a method for ascertaining a current stress level, to a computer program product and to an application for a mobile terminal.
  • a mobile terminal that has at least one sensor integrated in the mobile terminal for producing signal data and has a plurality of available applications for use by the user.
  • an evaluation unit is provided. The evaluation unit is provided particularly to evaluate signal data.
  • the mobile terminal according to the preamble is a smartphone such as an iPhone or another smartphone, which is equipped with an operating system, such as an iOS or Android operating system for a mobile terminal, and has an integrated sensor, for example a GPS sensor for ascertaining the current position.
  • the mobile terminal has a plurality of standard applications installed on it, such as a telephony application for setting up and conducting a telephone call via a mobile radio link and/or an application for writing, sending and receiving SMSs and/or a browser application for accessing web pages on the Internet.
  • the operating system allows the installation and execution of further applications, which can be downloaded from an online shop for mobile applications on the Internet, for example.
  • the further application can request and process data pertaining to the current location of the mobile terminal from the GPS sensor, for example, and can transmit said data to a central server via the mobile radio network, for example via a GPRS or UMTS connection.
  • Such tracking data can be stored on the server and evaluated in an evaluation unit arranged in the central server, for example in order to provide the user with location-based services, such as a locating service for tracking the whereabouts of a child for his parents, what is known as a Childtracker.
  • Location-based services are currently widely advertised on the market and in great demand, as a result of which smartphones have now become prevalent, particularly among younger demographic groups.
  • the Stress Monitor (see http://www.healthreviser.com/content/stress-monitor) thus uses a clip worn on the ear of the user to ascertain the heart rate of the user and uses this single indicator to determine a current stress level.
  • the cited apparatuses and methods for ascertaining a current stress level have the disadvantage that the current stress level is determined on the basis of very few biometric data (for example subjective questionnaire data or objective sensor data that measure a vital function), which means that different categories of biometric data are not combined. Therefore, the current stress level can be determined less reliably than when a large number of different categories of biometric data are used.
  • the aforementioned sensor-based apparatuses resort to special devices designed precisely for this instance of application, which are equipped with integrated sensors that capture a specific category of biometric data, such as the heart rate or the body temperature or another vital function. These devices need to be worn by the user on his body, for example, in order to determine his current stress level. This firstly means increased outlay in terms of hardware and cost, since measuring a vital function requires a dedicated device with a sensor for measuring precisely this vital function. Furthermore, fitting and wearing a special apparatus with an integrated sensor on or in direct proximity to the body is a nuisance for the wearer, particularly a restriction of comfort and wellbeing.
  • the aforementioned questionnaire-based methods require a high level of additional effort from the user.
  • the user thus needs to regularly complete questionnaires, for example using a computer via the Internet, and then autonomously manage, compare and rate the results that vary over time.
  • the present invention comprises a further application that can be installed on a mobile terminal and that ascertains a multiplicity of biometric data from a user.
  • the further application interacts with other components that are likewise arranged in the mobile terminal.
  • the other components are sensors integrated in the mobile terminal and also available applications.
  • An available application denotes an application that is available on a mobile terminal. That is to say an application that is installed and executable on the mobile terminal, such as telephony, SMS, MMS, chat applications and/or browser applications for accessing the Internet and also other applications that are suitable for extracting tactile, acoustic and/or visual biometric features of the user.
  • the mobile terminal is a smartphone or a tablet computer or a PDA or another mobile terminal that the user can use for many diverse purposes, for example for communication.
  • the mobile terminal has means for installing and executing a further application.
  • the further application can be obtained via the Internet via an online shop integrated in the operating system of the mobile terminal, such as the App store, or another online shop that supplies compatible applications for the mobile terminal, and can be installed on the mobile terminal directly.
  • the further application may be available in various versions, for example in an iOS version for installation and execution on an iPhone or an iPad and in an Android version for installation and execution on a mobile terminal that supports said Android operating system, or in a further version that is compatible with a further mobile operating system.
  • the further application may be installed, and can be executed, on an interchangeable component of the mobile terminal.
  • it may be stored as a SIM application in a memory area on a SIM card that can be operated in the mobile terminal, and can be executed by a separate execution unit integrated on the SIM card.
  • in this way, the portion of the apparatus for ascertaining a current stress level that is arranged on the mobile terminal is obtained.
  • the further application ascertains the biometric data firstly from signal data that are produced by sensors integrated in the mobile terminal, and secondly by extracting them from the use data of other applications installed on the mobile terminal.
  • Determination of a plurality of biometric data is facilitated firstly by mobile terminals, such as smartphones, being equipped with an increasing number of sensors as standard. Furthermore, determination of further biometric data is facilitated by users increasingly satisfying their interaction and communication needs using such devices, so that biometric data in the categories speech and social interaction, for example, can be derived directly from the use data of the communication applications of the mobile terminal, without additional effort or restriction of comfort for the user.
  • the further application can use voice analysis to ascertain biometric data pertaining to speech, such as volume, speech rate and/or modulation capability, from the use data from a telephony application installed on the mobile terminal, particularly from the voice data from the user that are ascertained via the microphone of the mobile handset and that are transmitted from the mobile terminal to the communication partner via a radio network, for example.
  • the voice data from the user can come from using other available applications, for example from applications that are controlled by the user using voice control.
  • the further application can determine the number of SMS messages sent and received and the number of different receivers and senders of SMS messages, and hence can determine biometric data pertaining to social interaction.
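As an illustration only (not part of the patent disclosure), the two extraction steps above can be sketched as follows. The feature definitions, thresholds and field names (`direction`, `peer`) are assumptions chosen for the example; volume is approximated by RMS energy and speech rate by a crude energy-burst count.

```python
import numpy as np

def speech_features(samples: np.ndarray, sample_rate: int) -> dict:
    """Illustrative speech features: volume (RMS energy) and a rough
    speech-rate proxy counted from energy bursts in 10 ms frames.
    These are hypothetical stand-ins for the voice analysis described."""
    rms = float(np.sqrt(np.mean(samples ** 2)))              # volume proxy
    frame = sample_rate // 100                               # 10 ms frames
    n = len(samples) // frame
    energy = np.array([np.mean(samples[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n)])
    active = energy > 0.5 * energy.mean()                    # crude voiced-frame mask
    bursts = int(np.sum(np.diff(active.astype(int)) == 1))   # burst onsets
    duration_s = len(samples) / sample_rate
    return {"volume_rms": rms, "speech_rate_proxy": bursts / duration_s}

def sms_interaction_features(messages: list) -> dict:
    """Social-interaction metrics from SMS metadata; the record layout
    (direction/peer keys) is an assumption for this sketch."""
    sent = [m for m in messages if m["direction"] == "sent"]
    received = [m for m in messages if m["direction"] == "received"]
    contacts = {m["peer"] for m in messages}
    return {"sms_sent": len(sent), "sms_received": len(received),
            "distinct_contacts": len(contacts)}
```

In a real deployment the audio would come from the telephony application's microphone stream and the SMS metadata from the messaging application's use data, as the description indicates.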
  • a sensor integrated in the mobile terminal may be a gyroscope or gyroscopic instrument.
  • a gyroscope is used for position finding in space and is increasingly widespread in smartphones currently advertised on the market, such as the iPhone 4.
  • the gyroscope and further sensors that are integrated in the mobile terminal can be used to ascertain biometric data pertaining to the sleep structure of the user.
  • the mobile terminal is positioned on the mattress at night and the movements of the user during the night are detected by the sensors.
  • the further application collects the signal data produced by gyroscope, acceleration sensor and/or light sensor and ascertains biometric data therefrom pertaining to sleep quality and sleep profile, such as time of sleep onset, sleep duration and/or sleep stages.
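A minimal actigraphy-style sketch of this sleep evaluation is shown below; it is illustrative only, not the patent's algorithm. The epoch length and movement threshold are assumed values, and sleep onset is simply the first low-movement epoch.

```python
import numpy as np

def sleep_metrics(accel_magnitude: np.ndarray, epoch_s: int = 60,
                  threshold: float = 0.02) -> dict:
    """Toy sleep estimation from per-second acceleration magnitudes
    recorded overnight with the terminal on the mattress. An epoch whose
    mean movement falls below the threshold is scored as sleep; both the
    epoch length and the threshold are illustrative assumptions."""
    n_epochs = len(accel_magnitude) // epoch_s
    epochs = accel_magnitude[:n_epochs * epoch_s].reshape(n_epochs, epoch_s)
    asleep = epochs.mean(axis=1) < threshold                 # per-epoch sleep score
    onset = int(np.argmax(asleep)) if asleep.any() else -1   # first sleep epoch
    return {"sleep_minutes": int(asleep.sum()), "sleep_onset_epoch": onset}
```

Sleep stages, as mentioned in the description, would require a richer classifier over the same movement (and light-sensor) signals; only onset and duration are sketched here.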
  • it is also possible for questionnaire data answered by the user, which are requested using a web form in a browser application or in a form provided by the further application, to be part of the biometric data ascertained by the further application. Since answering questionnaires generates additional effort for the user, these data showing the subjective stress level should be requested only once or as rarely as possible.
  • the biometric data are ascertained constantly by the further application while the mobile terminal is in operation.
  • the mobile terminal is configured such that the further application is started automatically, for example, when the mobile terminal is switched on and is operated continually in the background until the mobile terminal is switched off, without the need for further interaction with the user.
  • the mobile terminal can be configured such that the user can activate, configure and deactivate the further application autonomously and in this way controls the times at which biometric user data are meant to be tracked by the further application and provided for evaluation.
  • the signal data produced by the sensors integrated in the mobile terminal, which are tracked by the further application, are constantly received by the further application, and the biometric data are ascertained therefrom.
  • use data from standard applications used by the user, such as telephony and/or SMS applications, which are tracked by the further application, are constantly evaluated by the further application, and biometric data are determined therefrom.
  • the biometric data determined by the further application can be divided into different categories.
  • the biometric data ascertained by the further application belong to the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
  • the biometric data ascertained by the further application can be evaluated by a first evaluation apparatus, which is provided in the further application, on the mobile terminal.
  • the biometric data ascertained by the further application can alternatively be evaluated by a second evaluation apparatus on a central server.
  • the second evaluation apparatus can increase the quality of the evaluation further.
  • the biometric data ascertained by the further application are transmitted from the mobile terminal to a central server using standard means that the mobile terminal provides.
  • the transmission of the biometric data is effected in pseudonymized form.
  • the user is managed on the central server under a pseudonym, rather than under his real name or another identifier identifying the user.
  • the transmitted biometric data are associated with the user by means of the pseudonym.
  • the biometric data are transmitted in encrypted form.
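The pseudonymized transmission described above can be sketched as follows; this is an illustrative implementation choice, not the patent's. A keyed hash derives a stable pseudonym so the server can associate records with a user without ever receiving the real identifier (key management and the encryption of the transport itself, e.g. TLS, are outside this sketch).

```python
import hashlib
import hmac
import json

def pseudonymize(user_id: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from the user identifier with a keyed
    hash kept on the terminal; the truncation to 16 hex characters is an
    arbitrary choice for readability."""
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def prepare_payload(user_id: str, secret_key: bytes, biometric: dict) -> str:
    """Build the record sent to the central server: the pseudonym, never
    the real name or identifier, accompanies the biometric data."""
    return json.dumps({"pseudonym": pseudonymize(user_id, secret_key),
                       "data": biometric})
```

Because the key never leaves the terminal, the central server can group a user's records under the pseudonym but cannot invert it to the identity.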
  • the transmission unit is responsible for setting up, maintaining and clearing down a connection to a network, for example a GPRS or UMTS connection to a mobile radio network to which the data are transmitted. Transmission is effected only when the mobile terminal is in transmission mode.
  • the transmission unit can also be designed for transmission via different networks, for example one or more mobile radio network(s) and/or a wireless connection to a local area network, for example based on an IEEE-802 Standard, such as a WLAN or a WPAN connection.
  • a transmission network, for example a mobile radio network, and possibly further networks connected to it via gateways, such as the Internet, through which the central server can be reached, are used to transmit the data to the central server.
  • provision may be made for the data ascertained by the further application first of all to be stored on a local memory unit arranged on the mobile terminal before the data are transmitted to the central server. This is necessary particularly when the transmission unit is temporarily incapable of transmitting the data via a transmission network.
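A minimal store-and-forward sketch of this local buffering follows; the `send` callable stands in for the transmission unit and is an assumption of the example, not an API from the disclosure. Records stay queued while the network is unavailable and are flushed in order once transmission succeeds.

```python
from collections import deque

class StoreAndForward:
    """Minimal store-and-forward buffer: records are queued on the local
    memory unit and flushed when the (hypothetical) transmission hook
    succeeds; a ConnectionError signals a temporarily unavailable network."""
    def __init__(self, send):
        self.send = send
        self.queue = deque()

    def submit(self, record: dict) -> None:
        self.queue.append(record)
        self.flush()

    def flush(self) -> int:
        delivered = 0
        while self.queue:
            try:
                self.send(self.queue[0])   # may raise if offline
            except ConnectionError:
                break                      # keep record, retry later
            self.queue.popleft()
            delivered += 1
        return delivered
```

A production version would persist the queue across restarts and bound its size, but the retry-in-order behaviour is the essential point.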
  • the biometric data can also be analyzed and evaluated by the further application on the mobile terminal directly.
  • the further application autonomously ascertains a current stress level for the user from the biometric data.
  • the central server has a reception unit that can be used to receive data from the network, for example from the Internet.
  • the reception unit receives said data and stores them in a central memory unit.
  • the stored data are transmitted to an evaluation unit arranged on the central server and are analyzed and evaluated by the evaluation unit.
  • the transmitted and stored data comprise a plurality of biometric data pertaining to a user that are used by the evaluation unit in order to ascertain a current stress level for the user.
  • the method disclosed in this application is not a diagnostic method. Instead, it is a method that ascertains, collects and analyzes biometric data pertaining to a user and provides the user with the results of the analysis in the form of at least one current stress level.
  • the data analysis is used particularly for detecting an alteration in the at least one stress level, i.e. establishing whether the at least one current stress level has increased or decreased in comparison with a previous stress level.
  • the user is therefore provided with a tool for obtaining information, particularly about changes in his at least one current stress level in comparison with earlier stress levels over time, and, after autonomously rating the detected changes, for taking individual measures for stress reduction if need be.
  • the evaluation unit can resort to biometric data pertaining to the user that have been ascertained in the past, for example, and can take said biometric data into account when ascertaining said current stress level.
  • biometric data pertaining to the user that have been ascertained in the past can be used as reference data in order to perform user-specific calibration.
  • the current stress level is ascertained in relation to the available reference data.
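One simple way to realize the user-specific calibration described above is a z-score against the user's own historical values; the z-score formulation is an illustrative choice for this sketch, not taken from the disclosure.

```python
import numpy as np

def calibrated_stress(current: float, reference: np.ndarray) -> float:
    """Express the current value relative to the user's own reference
    data: 0 means 'at the personal baseline', positive values mean
    elevated relative to that baseline. Degenerate reference data
    (zero variance) maps to the baseline."""
    mu, sigma = reference.mean(), reference.std()
    if sigma == 0:
        return 0.0
    return float((current - mu) / sigma)
```

Calibrating against the individual rather than a population absorbs stable between-user differences (e.g. habitually short sleepers), so only deviations from the personal baseline influence the stress level.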
  • the evaluation unit can resort to not only the biometric data from the user but also to biometric data from other users when ascertaining a current stress level.
  • this allows clusters of user groups to be formed, for example according to age, sex or profession.
  • the data from other users in the same user group can be taken into account.
  • the evaluation unit ascertains a current stress level for a user using artificial neural networks.
  • the artificial neural networks are trained on the basis of the available biometric data from a multiplicity of users. As a result of the training, the artificial neural network progressively learns and can thereby further improve the quality of the ascertained current stress level.
  • the artificial neural network can be realized on the basis of a multilayer perceptron network.
  • This neural network consists of a plurality of layers: a fixed input layer, a fixed output layer and, if need be, further intermediate layers, with no feedback taking place from one layer to the layers situated before it.
  • the artificial neural network can consist of precisely three layers, for example, input layer, hidden layer and output layer.
  • the seven categories of biometric data, for example sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data, can form the neurons of the input layer.
  • alternatively, a relatively large number of neurons can be used in the input layer by using more finely granular category features as neurons of the input layer.
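The three-layer perceptron just described can be sketched as a forward pass over a length-7 feature vector, one value per category. The weights below are random placeholders purely for illustration; in the described system they would be learned, e.g. by backpropagation over the collected training data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative three-layer perceptron: 7 input neurons (one per biometric
# category), one hidden layer (size 5 chosen arbitrarily), one output
# neuron for the stress level. Random weights stand in for trained ones.
W1 = rng.normal(size=(7, 5))    # input  -> hidden
W2 = rng.normal(size=(5, 1))    # hidden -> output

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stress_level(features: np.ndarray) -> float:
    """Feed-forward pass: `features` holds one value per category
    (sleep, speech, motor functions, social interaction, economic data,
    personal data, questionnaire data); output is in (0, 1)."""
    hidden = sigmoid(features @ W1)
    return float(sigmoid(hidden @ W2))
```

Because there is no connection from a layer back to an earlier one, a single pass through the network suffices, matching the feedback-free multilayer perceptron described above.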
  • the artificial neural network may have feedback mechanisms. When it has feedback mechanisms, the network is traversed multiple times (iteratively).
  • the evaluation unit ascertains a current stress level for a user using a network of artificial neural networks, for example using a Deep Belief Network.
  • the network of artificial neural networks is made up of a plurality of neural networks that interact with one another.
  • a single neural network comprises an input layer and a hidden layer.
  • a first level of neural networks is provided.
  • the input layer is stipulated by the biometric data from the user.
  • the input layer of a first-level neural network can be stipulated by the biometric data from the user in precisely one category. From the input layer of an artificial neural network, it is possible to determine the hidden layer of the same neural network.
  • a second and possibly further level of neural networks is provided.
  • the input layer can be determined using the hidden layers of a plurality of neural networks on the preceding level.
  • the current stress level can be determined from at least one neural network on a topmost level.
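The network-of-networks structure described in the preceding items can be sketched as follows; this is an illustrative skeleton, not a trained Deep Belief Network. One first-level network per biometric category produces a hidden layer, the hidden layers are concatenated to form the second-level input, and a top-level network yields the stress level. All sizes and weights are placeholder assumptions (a DBN would learn its weights layer by layer).

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SubNetwork:
    """One small network (input layer -> hidden layer), as in the
    per-category first-level networks; weights are random placeholders."""
    def __init__(self, n_in: int, n_hidden: int):
        self.W = rng.normal(size=(n_in, n_hidden))

    def hidden(self, x: np.ndarray) -> np.ndarray:
        return sigmoid(x @ self.W)

def hierarchical_stress(category_features: dict) -> float:
    """Sketch of the hierarchy: the input layer of each first-level
    network is stipulated by the biometric data of exactly one category;
    the hidden layers of the first level jointly determine the input of
    the topmost network, from which the stress level is read."""
    names = sorted(category_features)                       # stable order
    first_level = {n: SubNetwork(len(category_features[n]), 3) for n in names}
    hiddens = [first_level[n].hidden(category_features[n]) for n in names]
    combined = np.concatenate(hiddens)                      # second-level input
    top = SubNetwork(len(combined), 1)
    return float(top.hidden(combined)[0])
```

The layered layout mirrors the description: per-category networks on the first level, their hidden layers feeding higher levels, and at least one topmost network producing the current stress level.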
  • the multiplicity of biometric data parameters, which in particular includes a combination of different categories of biometric data used for ascertaining the at least one current stress level, means that the apparatus disclosed in this application, as well as the disclosed method and the disclosed application, allow far more reliable ascertainment of the at least one current stress level than the known apparatuses and methods mentioned at the outset, which ascertain a current stress level only on the basis of a single biometric data parameter, or very few parameters, in the same category.
  • the quality of the analysis of the biometric data is increased further by the neural network method used, since, as time progresses and the database available for training the neural network becomes larger, said method can make ever more precise statements and hence further improves the reliability of the method for determining a current stress level.
  • the further application can determine a plurality of biometric data pertaining to the user that belong to different categories solely from the user-specific use data from available applications on the mobile terminal, on the one hand, and from signal data, on the other hand, which are produced by sensors integrated in the mobile terminal.
  • the apparatus according to the invention and the method according to the invention and the application according to the invention provide a user with an inexpensive and non-time-consuming solution for determining a current stress level.
  • the solution according to the invention dispenses with additional apparatuses, particularly sensors, that the user needs to fix or wear directly on his body.
  • the solution does not restrict the user in any way in terms of comfort, wellbeing or look, as is entailed by the application or wearing of specific apparatuses with sensors.
  • the at least one current stress level determined by the evaluation unit can either be made accessible to the user via the Internet or can be made accessible to the user on the mobile terminal, for example by sending an SMS to the user.
  • the analysis data consisting of the at least one current stress level and possibly further evaluation data, for example statistics pertaining to the change in a stress level over time, can alternatively be transmitted to the mobile terminal using the same transmission paths as when transmitting biometric data from the mobile terminal to the central server, but in the opposite direction.
  • a transmission unit for transmitting data from the server to the mobile terminal is provided on the central server.
  • a reception unit for receiving data from the central server is provided on the mobile terminal.
  • the analysis data can also be transmitted from the central server to the mobile terminal using a push service. The data transmission is in turn effected in encrypted form.
  • FIG. 1 shows a schematic illustration of the apparatus for ascertaining a current stress level
  • FIG. 2 shows a schematic illustration of the further application for ascertaining a current stress level
  • FIG. 3 a shows a flowchart for a first instance of application, sleep
  • FIG. 3 b shows a flowchart for a second instance of application, motor functions
  • FIG. 3 c shows a flowchart for a third instance of application, speech
  • FIG. 4 a shows a graphical user interface for starting the sleep instance of application
  • FIG. 4 b shows a further graphical user interface for a first evaluation display
  • FIG. 4 c shows a further graphical user interface for a second evaluation display
  • FIG. 4 d shows a further graphical user interface for a third evaluation display
  • FIG. 5 a shows a schematic illustration of an exemplary embodiment of the evaluation unit
  • FIG. 5 b shows a schematic illustration of an alternative exemplary embodiment of the evaluation unit
  • FIG. 5 c shows a schematic illustration of a further alternative exemplary embodiment of the evaluation unit
  • FIG. 6 shows a schematic illustration of an exemplary embodiment of the evaluation unit with a plurality of artificial neural networks.
  • FIG. 1 shows a schematic illustration of an embodiment of the apparatus for ascertaining a current stress level 36 , 36 A, 36 B, 36 C, 36 D.
  • the apparatus comprises a mobile terminal 1 and a central server 10 .
  • the mobile terminal 1 contains a plurality of sensors 2 , for example a gyroscope 21 , an acceleration sensor 22 , a light sensor 23 and/or a microphone 24 .
  • the signal data 31 produced by the sensors 2 can be accessed via an operating system 4 .
  • the operating system 4 is executed within an execution unit 3 and manages the access to the hardware components of the mobile terminal 1 , for example the sensors 2 .
  • different applications, for example a plurality of available applications 5 and a further application 6 , are executed in the execution unit 3 .
  • the further application 6 ascertains a plurality of biometric data 33 pertaining to a user of the mobile terminal 1 .
  • the further application 6 is implemented in the programming language Java.
  • the further application 6 uses the MVC (model view controller) design pattern as a basic design pattern.
  • the use of the MVC design pattern structures the further application 6 such that this facilitates the comprehensibility and also the extendability and adjustability of the further application 6 to new and/or altered hardware components and operating systems 4 .
  • the further application 6 obtains the biometric data 33 from signal data 31 that are produced by the sensors 2 and that can be accessed by means of the operating system 4 .
  • the access to the signal data 31 is realized by the further application 6 , for example through the use of the observer design pattern.
  • the observer design pattern provides the further application 6 with simplified and standardized access to the signal data 31 .
  • the further application 6 can extract a plurality of further biometric data 33 from the use data 32 from available applications 5 too.
  • the use data 32 produced by the available applications 5 are accessible via the operating system 4 .
  • the access to the use data 32 is realized by the further application 6 , for example through the use of the observer design pattern.
  • the observer design pattern provides the further application 6 with simplified and standardized access to the use data 32 .
  • An observer is informed about status changes on the object that it is observing, for example an available application 5 . If the available application 5 is an SMS application, for example, and the user calls the SMS application in order to write a new SMS, then the observer observing the SMS application is informed about this status change.
  • the further application 6 reacts to the writing of a new SMS that is observed by the observer by recording the characters input by the user, for example using a keypad, providing them with a timestamp and storing them in the local memory unit 7 as use data 32 for the SMS application.
  • via the sensor keypad 25 it is also possible for all keypad inputs by the user to be recorded regardless of their use in a specific application.
  • an observer or a plurality of observers is implemented for the sensor keypad 25 , for example one observer for each key on the keypad.
  • when the user presses a key, the observer observing that key is informed of said pressing of the key.
  • the further application 6 reacts to the pressing of the key that is observed by this observer by virtue of the further application 6 checking whether the user has pressed a delete key or another key.
  • the ‘delete key’ or ‘other key’ information is recorded by the further application 6 , provided with a timestamp, and these data are stored in the local memory unit 7 as signal data 31 .
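The observer-based recording of keypad events described above can be sketched as follows; the class and method names are illustrative, not from the disclosure:

```python
import time

class KeyObserver:
    """Observer attached to a single key; notified on each press."""
    def __init__(self, key, store):
        self.key = key
        self.store = store

    def notify(self):
        # classify the press as 'delete key' or 'other key', add a
        # timestamp, and store the record as signal data
        kind = "delete key" if self.key == "DEL" else "other key"
        self.store.append({"event": kind, "timestamp": time.time()})

class Keypad:
    """Observable keypad: each key press is forwarded to the observer
    registered for that key."""
    def __init__(self, keys, store):
        self.observers = {k: KeyObserver(k, store) for k in keys}

    def press(self, key):
        self.observers[key].notify()
```

One observer per key, as in the embodiment above, keeps the recording logic out of the keypad itself and lets the further application react only to the presses it observes.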
  • the further application 6 extracts a plurality of biometric data 33 .
  • the biometric data 33 are subdivided into categories, for example into the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
  • a category-specific ascertainment time interval is defined, for example 30 seconds for the sleep category and 20 milliseconds for the speech category.
  • the signal data 31 and/or use data 32 that are relevant to a category are processed in a first pre-processing step using category-specific time intervals to produce conditioned signal data 31 A and/or conditioned use data 32 A.
  • the timestamps stored for the signal data 31 and/or use data 32 are evaluated.
  • the conditioned signal data 31 A and/or conditioned use data 32 A are in turn provided with a timestamp.
  • the biometric data 33 are extracted from a sequence of conditioned signal data 31 A and/or conditioned use data 32 A.
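The category-specific conditioning step can be sketched as a simple windowing function, assuming the conditioned record for each interval is the mean of the samples falling into it (the disclosure does not fix the aggregation):

```python
def condition(samples, interval):
    """Group timestamped samples into fixed-length intervals and emit one
    conditioned record (here: the mean value) per interval, stamped with
    the interval start. `samples` is a list of (timestamp, value) pairs;
    `interval` is the category-specific length, e.g. 30 for sleep."""
    buckets = {}
    for t, v in samples:
        start = (t // interval) * interval   # interval the sample falls into
        buckets.setdefault(start, []).append(v)
    return [(start, sum(vs) / len(vs)) for start, vs in sorted(buckets.items())]
```

For the sleep category this would be called with `interval=30` (seconds), for speech with a 20-millisecond interval, as stated above.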
  • biometric data 33 in the motor functions category are ascertained from the conditioned use data 32 A pertaining to the SMS written.
  • the biometric data 33 pertaining to a category that are ascertained in an instance of application are also referred to as a feature vector for this category.
  • the biometric data 33 comprise the feature vectors ascertained for the various categories, with the respective timestamps of said feature vectors.
  • the biometric data 33 ascertained by the further application 6 are stored in a local memory unit 7 of the mobile terminal 1 .
  • the mobile terminal 1 has a transmission unit 8 A and a reception unit 8 B.
  • the transmission unit 8 A transmits data 34 from the mobile terminal 1 to an external node, for example the central server 10 .
  • the transmission is effected via the air interface, for example.
  • the reception unit 8 B receives data from an external node, for example the central server 10 .
  • the transmission unit 8 A is used to transmit data 34 , for example the biometric data 33 from the user, to the central server 10 for the purpose of evaluation.
  • the reception unit 8 B is used to receive data 34 coming from the central server 10 , for example evaluations 35 created by the central server. Each evaluation 35 is provided with a timestamp that stipulates the time interval for which the evaluation is valid.
  • An evaluation 35 , for example a current stress level 36 , 36 A, 36 B, 36 C, 36 D of a user of the mobile terminal 1 , is transferred to the further application 6 and displayed to the user on the display 9 of the mobile terminal 1 by means of the operating system 4 .
  • the central server 10 has a transmission unit 18 A and a reception unit 18 B.
  • the reception unit 18 B is used to receive data 34 from another node, for example the mobile terminal 1 .
  • the received data 34 are biometric data 33 from the user of the mobile terminal 1 .
  • the received data 34 are stored in a central memory unit 17 .
  • an evaluation unit 13 is provided on the central server 10 .
  • the evaluation unit 13 evaluates the received biometric data 33 .
  • the evaluation unit 13 determines the at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D at an instant t by evaluating those feature vectors for the received biometric data 33 whose timestamps are valid at the instant t.
  • the current stress level 36 A denotes a first current stress level of the user for a first category of biometric data 33 , for example the sleep category.
  • the current stress level 36 C denotes a second current stress level of the user for a second category of biometric data 33 , for example the motor functions category.
  • the current stress level 36 B denotes a third current stress level of the user for a third category of biometric data 33 , for example the speech category.
  • the current stress level 36 D denotes a fourth current stress level of the user for a fourth category of biometric data 33 , for example the social interaction category, or for a combination of categories of biometric data, for example the social interaction, economic data, personal data and/or questionnaire data categories.
  • further current stress levels can be determined for further categories and/or combinations of categories.
  • the current stress level 36 denotes a consolidated current stress level of the user that is obtained from a combination of the category-specific stress levels 36 A, 36 B, 36 C, 36 D and, if need be, of available further category-specific stress levels, for example by forming the arithmetic mean of the category-specific stress levels.
  • the at least one evaluation 35 determined by the evaluation unit 13 for example the at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D, comprises, for each evaluation 35 , a timestamp that stipulates the time interval for which the evaluation 35 is valid.
  • the at least one evaluation 35 for example the at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D is stored in the central memory unit 17 and transmitted to the mobile terminal 1 via the transmission unit 18 A.
  • FIG. 2 shows a schematic illustration of an embodiment of the further application 6 for ascertaining at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D.
  • the further application 6 comprises a plurality of components, for example a data manager 61 , a data preprocessor 62 and a data analyzer 63 .
  • the signal data 31 and/or use data 32 made available via the operating system 4 are loaded into the data manager 61 and managed thereby.
  • the data manager 61 transfers the signal data 31 and/or use data 32 to the data preprocessor 62 .
  • the data preprocessor 62 conditions the signal data 31 and/or use data 32 and transfers the conditioned signal data 31 A and/or conditioned use data 32 A back to the data manager 61 .
  • the data manager 61 stores the conditioned signal data 31 A and/or conditioned use data 32 A in the local memory unit 7 .
  • the data manager 61 transfers the conditioned signal data 31 A and/or conditioned use data 32 A to the data analyzer 63 .
  • the data analyzer 63 analyzes the conditioned signal data 31 A and/or conditioned use data 32 A and determines the biometric data 33 therefrom.
  • the data analyzer 63 creates at least one evaluation 35 , for example in the form of at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D.
  • the data analyzer 63 transfers the biometric data 33 and if need be the at least one evaluation 35 to the data manager 61 .
  • the data manager 61 visualizes the at least one evaluation 35 for the user of the mobile terminal 1 by displaying it on the display 9 .
  • the data manager 61 transfers the biometric data 33 to the transmission unit 8 A for transmission to the central server 10 , insofar as the biometric data 33 are evaluated centrally.
  • That evaluation 35 that is provided in the form of the consolidated current stress level 36 can be visualized on the display 9 continuously, for example, as a traffic light icon.
  • the traffic light icon can display the colors green, amber or red on the basis of the consolidated current stress level 36 . If the consolidated current stress level 36 is normalized to an integer value in the value range [0,10], for example, then the traffic light color is chosen on the basis of the current value of the consolidated current stress level 36 . A high value corresponds to a high consolidated current stress level 36 . A low value corresponds to a low consolidated current stress level 36 . If the consolidated current stress level 36 is low, for example in the value range [0,3], the color green is displayed.
  • If the consolidated current stress level 36 is elevated, for example in the value range [4,6], the color amber is displayed. If the consolidated current stress level 36 of the user is high, for example in the value range [7,10], the color red is displayed.
  • the display of the consolidated current stress level 36 is updated as soon as a consolidated current stress level 36 is available with a timestamp that is more recent than the timestamp of the previously displayed consolidated stress level.
  • the consolidated current stress level 36 is visualized as a bar chart having 10 bars. Each bar in the bar chart has an associated integer value from the value range [0,10], to which the consolidated current stress level 36 is normalized.
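The mapping from the normalized consolidated stress level to the traffic light color, as described above, can be sketched as:

```python
def traffic_light(level):
    """Map a consolidated stress level, normalized to an integer in the
    value range [0, 10], to the traffic light color: low values [0, 3]
    are green, increased values [4, 6] amber, high values [7, 10] red."""
    if not 0 <= level <= 10:
        raise ValueError("stress level must lie in [0, 10]")
    if level <= 3:
        return "green"
    if level <= 6:
        return "amber"
    return "red"
```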
  • the data manager 61 receives at least one evaluation 35 , for example in the form of a third current stress level 36 B for the speech category, pertaining to the biometric data 33 for the speech category that are evaluated on the server.
  • When the data manager 61 receives a new evaluation 35 , for example in the form of a third current stress level 36 B for the speech category, it ascertains a new consolidated current stress level 36 from the category-specific current stress levels, known to the data manager 61 , whose timestamps are currently still valid.
  • the consolidated current stress level 36 is obtained by means of the arithmetic mean or by means of a weighted mean of the category-specific current stress levels 36 A, 36 B, 36 C, 36 D that are still valid.
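The consolidation from the still-valid category-specific levels can be sketched as follows; the (value, valid_from, valid_until) representation of a timestamped level is an illustrative assumption:

```python
def consolidate(levels, now, weights=None):
    """Combine the category-specific stress levels whose timestamps are
    still valid at `now` into one consolidated level, by arithmetic mean
    or, if `weights` is given, by a weighted mean. Each entry in `levels`
    maps a category to (value, valid_from, valid_until)."""
    valid = {c: v for c, (v, t0, t1) in levels.items() if t0 <= now <= t1}
    if not valid:
        return None  # no currently valid category-specific level
    if weights is None:
        return round(sum(valid.values()) / len(valid))
    total = sum(weights[c] for c in valid)
    return round(sum(weights[c] * v for c, v in valid.items()) / total)
```

An expired category level (timestamp no longer valid at `now`) simply drops out of the mean, matching the description above.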
  • the data manager 61 visualizes the consolidated current stress level 36 on the display 9 , for example by updating the traffic light icon.
  • the consolidated current stress level 36 of the user is an individual variable.
  • user-specific calibration can be performed. To this end, the user is asked to record biometric data 33 in the personal data category, for example via a form integrated in the further application 6 .
  • an individual current stress level of the user is determined, which stipulates a calibration factor, for example.
  • the individual current stress level for example in its manifestation as a calibration factor, is taken into account for determining the current stress level 36 , 36 A, 36 B, 36 C, 36 D for the user.
  • FIG. 3A shows a flowchart for a first instance of application, sleep.
  • the first instance of application, sleep, ascertains biometric data 33 in the sleep category for the purpose of ascertaining a first current stress level 36 A of a user.
  • the first instance of application describes a first method for ascertaining said first current stress level 36 A.
  • Prior to first use of the sleep instance of application, the user allows the mobile terminal 1 to fall onto his mattress from a height of approximately 30 centimeters.
  • the further application 6 computes the spring constant and the damping constant of the mattress, which are stored as calibration data pertaining to the sleep instance of application.
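A minimal sketch of one possible calibration computation, assuming the drop produces a damped oscillation whose peaks the acceleration sensor records. The disclosure does not state the actual formula; the logarithmic-decrement method used here is one standard choice, and the parameter names are illustrative.

```python
import math

def mattress_calibration(peak_times, peak_amps, mass):
    """Estimate spring constant and damping ratio from successive
    oscillation peaks recorded after the drop. `peak_times` and
    `peak_amps` list the times (s) and amplitudes of consecutive
    positive peaks; `mass` is the mass of the terminal in kg."""
    # average period between consecutive positive peaks
    period = (peak_times[-1] - peak_times[0]) / (len(peak_times) - 1)
    omega = 2 * math.pi / period                      # damped angular frequency
    delta = math.log(peak_amps[0] / peak_amps[1])     # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)  # damping ratio
    k = mass * omega ** 2                             # spring constant (approx.)
    return k, zeta
```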
  • the sleep instance of application ascertains motion data during the rest phase of the user and evaluates said data.
  • the user of the mobile terminal 1 calls the sleep mode of the further application 6 .
  • calling the sleep mode automatically prompts the mobile terminal 1 to be put into flight mode in order to minimize emissions of electromagnetic radiation by the mobile terminal 1 .
  • the user positions the mobile terminal 1 on the mattress during his rest phase.
  • (A3) The signal data 31 produced by the sensors 2 , for example the gyroscope 21 , the acceleration sensor 22 and the light sensor 23 , during the rest phase are collected by the further application 6 and stored in the local memory unit 7 .
  • the data manager 61 of the further application 6 loads the sensor data ascertained during the sleep mode in the further application 6 and transfers these signal data 31 to the data preprocessor 62 .
  • the data preprocessor 62 divides the ascertained signal data 31 into time intervals, for example into time intervals having a length of 30 seconds. For the signal data 31 in each time interval, conditioned signal data 31 A that are characteristic of the time interval are determined and are provided with a timestamp.
  • the data preprocessor 62 transfers the conditioned signal data 31 A with their timestamps to the data manager 61 .
  • the data manager 61 stores the conditioned signal data 31 A with their timestamps in the local memory unit 7 .
  • the data manager 61 transfers the conditioned signal data 31 A with their timestamps to the data analyzer 63 for the purpose of evaluation.
  • the data analyzer 63 analyzes the conditioned signal data 31 A and determines therefrom a feature vector with biometric data 33 in the sleep category.
  • the feature vector is determined by means of a statistical regression model for modeling a binary target variable, for example a logit or probit model.
  • the sequence of conditioned signal data 31 A that is obtained by arranging the conditioned signal data 31 A according to ascending timestamps is evaluated and each element in the sequence is classified as “awake” or “asleep” for the sleep state.
  • the classification takes account of the sleep states of the preceding elements in the sequence, that is to say the sleep states in the preceding time intervals.
  • If the statistical regression model classifies the conditioned signal data 31 A in a time interval as sleep, the time interval is classified with the state “asleep”, otherwise with the state “awake”.
  • the sequence of sleep states over all time intervals is given as a basis for determining the feature vector.
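The interval-by-interval classification with a logit model that takes the preceding sleep state into account, as described above, can be sketched as follows; the weights and bias are illustrative placeholders, not trained values:

```python
import math

def classify_sleep(motion, w_motion=4.0, w_prev=-2.0, bias=-1.0):
    """Classify a sequence of per-interval motion intensities as 'awake'
    or 'asleep' with a logistic (logit) model whose inputs are the
    current interval's motion and the previous interval's state."""
    states, prev_asleep = [], 0.0
    for m in motion:
        # P(awake) rises with motion and falls if the previous interval
        # was already classified as asleep
        z = bias + w_motion * m + w_prev * prev_asleep
        p_awake = 1.0 / (1.0 + math.exp(-z))
        asleep = p_awake < 0.5
        states.append("asleep" if asleep else "awake")
        prev_asleep = 1.0 if asleep else 0.0
    return states
```

The resulting sequence of sleep states over all time intervals is then the basis for the sleep-category feature vector, as stated above.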
  • the feature vector of the biometric data 33 pertaining to the sleep category comprises the following features:
  • the data analyzer 63 determines an evaluation 35 that comprises particularly the first current stress level 36 A for the sleep category.
  • all features of the feature vector are rated with an integer value from the value range [0,10], for example, and the individual values are used to form a mean value, for example an arithmetic mean or a weighted mean.
  • the rating is influenced to some extent by the user-specific calibration, for example as a result of a calibration factor that needs to be taken into account.
  • the first current stress level 36 A for the sleep category is obtained as an integer value in the value range [0,10].
  • the first current stress level 36 A comprises a timestamp that stipulates the period for which the first current stress level 36 A for the sleep category is valid.
  • the data manager 61 stores the feature vector of the biometric data 33 pertaining to the sleep category and also the evaluation 35 , particularly the first current stress level 36 A for the sleep category, in the local memory unit 7 .
  • the data manager 61 visualizes the evaluation 35 , particularly the first current stress level 36 A for the sleep category, on the display 9 . From the first current stress level 36 A for the sleep category and if need be further available, valid current stress levels for further categories, for example the current stress levels 36 B, 36 C, 36 D, the data manager 61 determines a consolidated current stress level 36 and visualizes the consolidated current stress level 36 , for example by updating the traffic light icon.
  • FIG. 4A shows an exemplary graphical user interface for the start of the sleep instance of application of the further application 6 .
  • the exemplary graphical user interface contains a tip for successful measurement of the biometric data 33 pertaining to the sleep category. By selecting the OK button, the user can start the instance of application.
  • FIG. 4B shows a further exemplary graphical user interface of a first evaluation display for the sleep instance of application.
  • the first evaluation display visualizes an evaluation 35 for the sleep category in the form of an overview evaluation.
  • the sleep quality parameter is used to display a first current stress level 36 A of the user for the sleep category.
  • the sleep quality is indicated by the numerical value 2.0 within a scale from 0 to 10.
  • the first evaluation display comprises further elements, for example the last sleep pattern as a function of time.
  • FIG. 4C shows a further exemplary graphical user interface of a second evaluation display for the sleep instance of application.
  • the second evaluation display visualizes an evaluation 35 for the sleep category in the form of a detail display.
  • the detail display comprises the ascertained biometric data 33 pertaining to the sleep category. For each feature of the biometric data 33 in the sleep category, the ascertained value is indicated.
  • FIG. 4D shows a further graphical user interface of a third evaluation display for the sleep instance of application.
  • the third evaluation display visualizes the consolidated current stress level 36 in a bar chart.
  • the consolidated stress level 36 and the current stress levels 36 A, 36 B, 36 C, 36 D for the individual categories are displayed as numerical values. Each numerical value is displayed in a color that is specific to the value. The choice of color visualizes the current stress levels 36 , 36 A, 36 B, 36 C, 36 D in color.
  • FIG. 3B shows a flowchart for a second instance of application, motor functions.
  • the second instance of application, motor functions, ascertains biometric data 33 in a motor functions category for the purpose of ascertaining a second current stress level 36 C of a user.
  • the second instance of application describes a second method for ascertaining the second current stress level 36 C. This instance of application requires only indirect interaction with the user.
  • (B3) The user uses the keypad 25 of the mobile terminal 1 to type an SMS, for example.
  • the data manager 61 transfers the collected and stored keypad data to the data preprocessor 62 .
  • the data preprocessor 62 performs pre-evaluation of the keypad data. To this end, the data preprocessor 62 divides the ascertained keypad data into time intervals, for example into time intervals with a length of 15 seconds. For the keypad data 32 in each time interval, conditioned use data 32 A that are characteristic of the time interval are determined and are provided with a timestamp.
  • the data manager 61 stores the conditioned use data 32 A provided with timestamps in the local memory unit 7 .
  • the data manager 61 transfers the conditioned use data 32 A provided with timestamps to the data analyzer 63 .
  • the data analyzer 63 analyzes the conditioned use data 32 A provided with timestamps and determines a feature vector therefrom with biometric data 33 in the motor functions category.
  • the data analyzer 63 determines the error rate from the frequency of keypad input errors, particularly from the number of times the user operates a delete key in the time interval under consideration.
  • the error rate determined is a measure of the hand/eye coordination of the user.
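The delete-key-based error rate for a time interval under consideration can be sketched as:

```python
def error_rate(events, t0, t1):
    """Share of key presses in the interval [t0, t1) that were delete-key
    operations, used as a simple measure of keypad input errors.
    `events` is a list of (timestamp, kind) pairs with kind either
    'delete key' or 'other key'."""
    window = [kind for t, kind in events if t0 <= t < t1]
    if not window:
        return 0.0  # no key presses in the interval under consideration
    return sum(1 for kind in window if kind == "delete key") / len(window)
```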
  • the feature vector of the biometric data pertaining to the motor functions category comprises the following features:
  • the data analyzer 63 determines an evaluation 35 , particularly the second current stress level 36 C for the motor functions category.
  • all features of the feature vector are rated with an integer value from the value range [0,10], for example, and a mean value, for example an arithmetic mean or a weighted mean, is formed from the individual values.
  • the rating is influenced to some extent by the user-specific calibration, for example as a result of a calibration factor that needs to be taken into account.
  • the second current stress level 36 C for the motor functions category is obtained as an integer value in the value range [0,10], for example.
  • the second current stress level 36 C comprises a timestamp that stipulates the period for which the second current stress level 36 C for the motor functions category is valid.
  • the data analyzer 63 transfers the feature vector of the biometric data 33 pertaining to the motor functions category and also the evaluation 35 , particularly the second current stress level 36 C for the motor functions category, with its timestamp, to the data manager 61 .
  • the data manager 61 stores the feature vector of the biometric data 33 pertaining to the motor functions category and also the evaluation 35 , particularly the second current stress level 36 C for the motor functions category, with its timestamp, in the local memory unit 7 .
  • the data manager 61 visualizes the evaluation 35 , particularly the second current stress level 36 C for the motor functions category, on the display 9 . From the second current stress level 36 C for the motor functions category and if need be further available valid current stress levels for further categories, the data manager 61 determines the consolidated current stress level 36 and visualizes it, for example by updating the traffic light icon.
  • the biometric data 33 , for example the biometric data 33 pertaining to the sleep and/or motor functions categories, are transmitted to the central server 10 , stored in the central memory unit 17 and evaluated by the evaluation unit 13 arranged on the server.
  • FIG. 3C shows a flowchart for a third instance of application, speech.
  • the third instance of application, speech, ascertains biometric data 33 in the speech category, for example the speech parameters speech rate and/or modulation capability, in order to ascertain a third current stress level 36 B of a user.
  • the third instance of application describes a third method for ascertaining the third current stress level 36 B. This instance of application requires only indirect interaction with the user.
  • the speech instance of application comprises voice analysis of voice data from the user, for example voice data from telephone calls conducted by the user using the mobile terminal 1 .
  • (C1) The further application 6 has been loaded into the execution unit 3 of the mobile terminal 1 and has been started.
  • the further application 6 runs as a background process in the execution unit 3 .
  • (C2) The speech instance of application is started by an incoming call to the mobile terminal 1 , for example.
  • (C6) The data manager 61 transfers the voice data 31 stored with a timestamp to the data preprocessor 62 .
  • the data preprocessor 62 performs pre-evaluation of the voice data 31 . To this end, the data preprocessor 62 divides the captured voice data 31 into time intervals, for example into time intervals with a length of 20 milliseconds. For the voice data 31 in each time interval, conditioned voice data 31 A that are characteristic of the time interval are determined and are provided with a timestamp.
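The division of captured voice data into 20-millisecond intervals can be sketched as a framing function; the aggregation into characteristic conditioned values per frame is left open here, as the disclosure does not fix it:

```python
def frame_voice(samples, rate, frame_ms=20):
    """Split raw voice samples (sampled at `rate` Hz) into consecutive
    frames of `frame_ms` milliseconds and stamp each frame with its
    start time in seconds. Trailing samples that do not fill a whole
    frame are dropped."""
    n = int(rate * frame_ms / 1000)          # samples per frame
    frames = []
    for i in range(0, len(samples) - n + 1, n):
        frames.append((i / rate, samples[i:i + n]))
    return frames
```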
  • the data preprocessor 62 transfers the conditioned voice data 31 A with their timestamps to the data manager 61 .
  • the data manager 61 stores the conditioned voice data 31 A with their timestamps in the local memory unit 7 .
  • the data manager 61 transfers the conditioned voice data 31 A with their timestamps to the data analyzer 63 for the purpose of evaluation.
  • the data analyzer 63 analyzes the conditioned voice data 31 A and determines from them a feature vector with biometric data 33 in the speech category.
  • the feature vector of the biometric data 33 for the speech category comprises the following features:
  • the feature vector is provided with a timestamp and these data are transferred from the data analyzer 63 to the data manager 61 as biometric data 33 in the speech category.
  • the data manager 61 stores the feature vector provided with a timestamp in the local memory unit 7 as biometric data 33 in the speech category.
  • the data manager 61 transfers the biometric data 33 pertaining to the speech category to the transmission unit 8 A for the purpose of transmission to the central server 10 .
  • the reception unit 18 B of the central server 10 receives the transmitted data in the form of the biometric data 33 pertaining to the speech category.
  • the central server 10 stores the biometric data 33 in the central memory unit 17 and evaluates the biometric data 33 in the evaluation unit 13 .
  • a neural network method is used, for example.
  • the evaluation unit 13 determines an evaluation 35 .
  • the evaluation 35 particularly comprises the third current stress level 36 B in the speech category.
  • the third current stress level 36 B for the speech category is determined as an integer value in the value range [0,10], for example.
  • the third current stress level 36 B comprises a timestamp that stipulates the period for which the third current stress level 36 B for the speech category is valid.
  • the central server 10 transmits the evaluation 35 , particularly the third current stress level 36 B for the speech category, with its timestamp, to the mobile terminal 1 by means of the transmission unit 18 A.
  • the transmitted evaluation 35 is received by the reception unit 8 B of the mobile terminal 1 and transferred to the data manager 61 of the further application 6 .
  • the data manager 61 stores the evaluation 35 , particularly the third current stress level 36 B for the speech category, with its timestamp, in the local memory unit 7 .
  • the data manager 61 visualizes the evaluation 35 , particularly the third current stress level 36 B for the speech category, on the display 9 . From the third current stress level 36 B for the speech category and, if need be, further available valid current stress levels for further categories, the data manager 61 determines the consolidated current stress level 36 and visualizes it, for example by updating the traffic light icon.
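The consolidation step described above can be sketched in a few lines. The averaging rule and the traffic-light thresholds below are illustrative assumptions, since the description does not fix them:

```python
def consolidate(levels):
    """Sketch of the consolidation of valid category stress levels
    (integers in [0, 10]) into one consolidated current stress level,
    mapped to a traffic-light colour. Thresholds are assumptions."""
    consolidated = round(sum(levels) / len(levels))
    if consolidated <= 3:
        colour = "green"
    elif consolidated <= 6:
        colour = "amber"
    else:
        colour = "red"
    return consolidated, colour

# e.g. current stress levels for the speech, sleep and motor categories
level, colour = consolidate([7, 4, 5])
```

A real implementation might weight the categories differently or ignore stale levels; the simple mean is used here only to make the data flow concrete.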
  • in the social interaction instance of application, the further application 6 evaluates use data 32 from the user pertaining to those available applications 5 that are used for social interaction.
  • available applications 5 that are used for social interaction are SMS applications, e-mail applications or social network applications, such as an instant messaging application or a Facebook application. From the use data 32 pertaining to the available applications 5 that are used for social interaction, it is possible to ascertain, by way of example, the number of contacts in social networks or the frequency with which contact is made, for example the frequency with which an SMS is sent.
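The aggregation of use data 32 into social-interaction features could look as follows; the event format and the feature names are assumptions made for illustration only:

```python
from datetime import datetime, timedelta

def social_interaction_features(events, now, window_days=7):
    """Aggregate hypothetical use data 32 into a feature vector for the
    social interaction category (feature names are illustrative only)."""
    window_start = now - timedelta(days=window_days)
    recent = [e for e in events if e["timestamp"] >= window_start]
    sms_sent = sum(1 for e in recent if e["app"] == "sms" and e["action"] == "send")
    contacts = {e["contact"] for e in recent if "contact" in e}
    return {
        "sms_per_day": sms_sent / window_days,   # frequency of SMS sending
        "distinct_contacts": len(contacts),      # number of contacts reached
        "timestamp": now,                        # validity timestamp
    }

now = datetime(2013, 8, 1)
events = [
    {"app": "sms", "action": "send", "contact": "A", "timestamp": now - timedelta(days=1)},
    {"app": "sms", "action": "send", "contact": "B", "timestamp": now - timedelta(days=2)},
    {"app": "email", "action": "send", "contact": "A", "timestamp": now - timedelta(days=10)},
]
fv = social_interaction_features(events, now)
```

The third event falls outside the seven-day window and is therefore not counted.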
  • the feature vector of the biometric data 33 pertaining to the social interaction category comprises the following features:
  • biometric data 33 in further categories can be taken into account, for example biometric data 33 in the economic data category, in the personal data category and/or in the questionnaire data category.
  • the economic data category relates to aggregate rather than user-specific data, for example data pertaining to the general sickness absence rate or to job security.
  • the feature vector of the biometric data 33 pertaining to the economic data category comprises the following features:
  • the personal data category comprises data pertaining to age and family status and also pertaining to occupation group and pertaining to education level.
  • the feature vector of the personal data category is used particularly for individual calibration of the current stress levels 36 , 36 A, 36 B, 36 C, 36 D.
  • the personal data are recorded by the user using a form within the further application 6 , for example.
  • the feature vector of the biometric data 33 pertaining to the personal data category comprises the following features:
  • the questionnaire data comprise individual self-assessments by the user pertaining to stress-related questions.
  • the questionnaire data are recorded by the user using a form within the further application 6 , for example.
  • the biometric data 33 pertaining to the cited further categories can additionally be used for evaluation and particularly for ascertaining the consolidated current stress level 36 of the user.
  • whereas the biometric data 33 can be evaluated by the further application 6 directly as an evaluation unit on the mobile terminal 1 , a different approach has been chosen for the exemplary speech instance of application.
  • the evaluation of the biometric data 33 pertaining to the speech category is effected in the evaluation unit 13 that is arranged on the central server 10 .
  • the evaluation unit 13 contains an evaluation method, for example a method based on artificial neural networks that resorts to biometric data 33 from other users and to earlier biometric data 33 from the user.
  • the biometric data 33 from other categories are also evaluated in the evaluation unit 13 arranged on the central server 10 in order to increase the quality of the evaluation further.
  • the evaluation method obtained on the central server 10 by training the artificial neural network method is implemented in the further application 6 , for example by means of an update in the further application 6 .
  • alternatively, the evaluation unit is provided for all categories by the further application 6 on the mobile terminal 1 . Evaluation of the biometric data 33 pertaining to all categories is then effected on the mobile terminal 1 rather than on the central server 10 .
  • FIG. 5A shows a schematic illustration of an exemplary embodiment of the evaluation unit 13 on the central server 10 .
  • a current stress level 36 , 36 A, 36 B, 36 C, 36 D is determined for a user by the evaluation unit 13 on the central server 10 .
  • instead of all biometric data 33 pertaining to the user being evaluated in one place, it is alternatively possible for a first portion of the biometric data 33 pertaining to the user to be analyzed and evaluated on the mobile terminal 1 directly and for at least one current stress level 36 A, 36 B, 36 C, 36 D ascertained on the terminal to be determined.
  • a second portion of the biometric data 33 pertaining to the user is analyzed and evaluated on the central server 10 by the evaluation unit 13 and at least one current stress level 36 A, 36 B, 36 C, 36 D on the server is determined.
  • the biometric data 33 analyzed and evaluated on the server can comprise biometric data 33 that are also taken into account for the analysis and evaluation on the mobile terminal 1 .
  • a consolidated stress level 36 that takes account both of the at least one current stress level 36 A, 36 B, 36 C, 36 D ascertained on the terminal and of the at least one current stress level 36 A, 36 B, 36 C, 36 D ascertained on the server is determined by the data manager 61 of the further application 6 .
  • the evaluation unit 13 comprises a server-end data manager 14 and a server-end data analyzer 15 .
  • the server-end data analyzer 15 is in the form of an artificial neural network 40 , specifically a multilayer perceptron network.
  • the neural network consists of three layers: the input layer 43 , the hidden layer 44 and the output layer 45 . Each layer is constructed from neurons 46 .
  • the input layer 43 contains a plurality of input neurons 46 A.
  • the hidden layer 44 contains a plurality of hidden neurons 46 B and the output layer 45 contains precisely one output neuron 46 C.
  • each input neuron 46 A of the input layer 43 has, as an associated input value, the value of a feature from a feature vector in a category of biometric data 33 that have been transmitted to the central server 10 , for example in the speech category, following suitable normalization, for example to the value range [0,10].
  • each input neuron 46 A of the input layer 43 has, as an associated input value, the current stress level for a category of biometric data 33 .
  • the input layer 43 consists of seven input neurons 46 A, each input neuron 46 A having the associated current stress level of one of the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
  • alternatively, the features of the category-specific feature vectors of the biometric data 33 available to the central server 10 are combined and evaluated in another way in order to determine the input values of the input neurons 46 A.
  • the multilayer perceptron network is in the form of a feed forward network, i.e. the connections between the neurons 46 always point from one layer, for example the input layer 43 , to the next layer, for example the hidden layer 44 .
  • the input neurons 46 A of the input layer have connections to the hidden neurons 46 B of the hidden layer.
  • each input neuron 46 A of the input layer can have one connection to each hidden neuron 46 B of the hidden layer.
  • the hidden layer 44 has a greater number of neurons 46 than the input layer 43 .
  • the output layer 45 contains precisely one neuron 46 , the output neuron 46 C.
  • the neurons 46 B of the hidden layer 44 have connections to the output neuron 46 C of the output layer 45 .
  • each hidden neuron 46 B of the hidden layer 44 is connected to the output neuron 46 C.
  • the output neuron 46 C represents a current stress level 36 , 36 A, 36 B, 36 C, 36 D of a user.
  • the artificial neural network 40 computes a current stress level 36 , 36 A, 36 B, 36 C, 36 D of the user.
  • the server-end data manager 14 retrieves the biometric data 33 pertaining to a user, in the form of the feature vectors transmitted to the central server 10 for the ascertained categories of biometric data 33 , from the central memory unit 17 .
  • the feature vectors suitable for computing the current stress level 36 , 36 A, 36 B, 36 C, 36 D are taken into account, for example the feature vectors with the most recent timestamp.
  • a feature vector in a category is taken into account only if the instant for which the current stress level 36 , 36 A, 36 B, 36 C, 36 D is computed lies in the validity range defined by the timestamp.
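The timestamp-based selection rule above can be sketched as follows; the field names `valid_from` and `valid_to` are assumptions standing in for the validity range that the timestamp defines:

```python
from datetime import datetime

def select_valid_vector(vectors, instant):
    """Pick the feature vector whose validity range contains the instant
    for which the current stress level is computed; prefer the most
    recent one (a sketch of the selection rule described above)."""
    valid = [v for v in vectors if v["valid_from"] <= instant < v["valid_to"]]
    return max(valid, key=lambda v: v["valid_from"], default=None)

vectors = [
    {"valid_from": datetime(2013, 8, 1, 0), "valid_to": datetime(2013, 8, 1, 12),
     "features": [0.2, 0.5]},
    {"valid_from": datetime(2013, 8, 1, 12), "valid_to": datetime(2013, 8, 2, 0),
     "features": [0.4, 0.1]},
]
chosen = select_valid_vector(vectors, datetime(2013, 8, 1, 15))
```

Returning `None` when no vector is valid mirrors the rule that a category is simply not taken into account in that case.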
  • the server-end data manager 14 provides the biometric data 33 for the data analyzer 15 , which is in the form of an artificial neural network 40 .
  • the biometric data 33 are read into the input layer 43 of the neural network 40 and forwarded to the next layers of the neural network 40 via the connections.
  • Each connection has a connection weight that has either a boosting or inhibiting effect.
  • Each neuron 46 B of the hidden layer 44 has an activation function, for example the hyperbolic tangent activation function, which maps an arbitrary input value onto the value range [−1, 1].
  • the input value for a neuron 46 B of the hidden layer 44 is obtained as a sum of the values transmitted via the weighted connections.
  • a neuron-specific threshold value is stipulated.
  • if the input value exceeds the threshold value of the neuron 46 B, the computed activation value is forwarded from the hidden neuron 46 B to its outgoing connections and hence to the output neuron 46 C in the output layer 45 .
  • the output neuron 46 C determines its output value using the same method as has been described for a hidden neuron 46 B of the hidden layer 44 .
  • the artificial neural network 40 determines the value of the one output neuron 46 C in a deterministic fashion from the biometric data 33 that are associated with the input neurons 46 A.
  • the value of the output neuron 46 C provides the current stress level 36 , 36 A, 36 B, 36 C, 36 D.
  • the value of the output neuron 46 C is transferred from the server-end data analyzer 15 in the form of an artificial neural network 40 to the server-end data manager 14 .
  • the server-end data manager 14 stores the output value as a current stress level 36 , 36 A, 36 B, 36 C, 36 D for the categories relevant to determination thereof, with a timestamp, in the central memory unit 17 .
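The forward computation described in the bullets above (weighted input sums, a tanh activation on [−1, 1], neuron-specific thresholds) can be sketched in a few lines. The network size, weights and thresholds below are illustrative assumptions, not values from the description:

```python
import math

def mlp_forward(x, w_hidden, theta_hidden, w_out, theta_out):
    """Minimal sketch of the forward pass: each neuron sums its weighted
    inputs, and forwards the tanh-activated value only if the input sum
    exceeds the neuron's threshold (otherwise it stays inactive)."""
    hidden = []
    for w_j, theta_j in zip(w_hidden, theta_hidden):
        s = sum(wi * xi for wi, xi in zip(w_j, x))   # weighted input sum
        hidden.append(math.tanh(s) if s > theta_j else 0.0)
    s_out = sum(wi * hi for wi, hi in zip(w_out, hidden))
    return math.tanh(s_out) if s_out > theta_out else 0.0

# two inputs, three hidden neurons, one output neuron (all values assumed)
x = [0.6, 0.3]
w_hidden = [[0.4, -0.2], [0.1, 0.5], [-0.3, 0.2]]
theta_hidden = [0.0, 0.0, 0.0]
w_out = [0.5, -0.4, 0.3]
theta_out = -1.0
y = mlp_forward(x, w_hidden, theta_hidden, w_out, theta_out)
```

The single return value plays the role of the output neuron 46C, whose value would then be rescaled to the stress-level range.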
  • initially, the connection weights of each connection and the threshold values of each neuron 46 are stipulated.
  • the connection weight for a connection is stipulated by a random value from the range [−0.5, 0.5], for example, the value 0 being excluded.
  • the threshold value for a neuron 46 is stipulated by a random value from the range [−0.5, 0.5], for example.
  • during training, the connection weights of each connection of the neural network 40 and the threshold values for each neuron 46 are adjusted.
  • a supervised learning method, preferably a backpropagation method, is used for this purpose.
  • for the input values applied to the neural network 40 , the desired output value from the output neuron 46 C is available.
  • the desired output value from the output neuron 46 C is obtained from the current stress level for the questionnaire data category, which level has been ascertained exclusively from the questionnaire data answered by the user.
  • the connection weights of all connections and the threshold values of all neurons 46 are trained until the output value that the neural network 40 provides for the output neuron 46 C matches the desired output value with sufficient accuracy.
  • repeating the training with a multiplicity of biometric data 33 from a multiplicity of users allows the analysis and evaluation method provided by the artificial neural network 40 for ascertaining the current stress level 36 , 36 A, 36 B, 36 C, 36 D to be constantly improved and further adjusted.
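A supervised backpropagation step of the kind described above can be sketched as follows. For differentiability the sketch treats the neuron thresholds as trainable biases; the actual training procedure may differ in detail, and all initial values and the training target are assumptions:

```python
import math, random

def train_step(x, target, w_hidden, b_hidden, w_out, b_out, lr=0.1):
    """One backpropagation step for a tiny tanh MLP (sketch only)."""
    # forward pass
    s_h = [sum(w * xi for w, xi in zip(w_j, x)) + b
           for w_j, b in zip(w_hidden, b_hidden)]
    h = [math.tanh(s) for s in s_h]
    s_o = sum(w * hj for w, hj in zip(w_out, h)) + b_out
    y = math.tanh(s_o)
    # backward pass for the squared-error loss 0.5 * (y - target)**2
    d_o = (y - target) * (1 - y * y)
    d_h = [d_o * w_out[j] * (1 - h[j] * h[j]) for j in range(len(h))]
    # gradient-descent updates
    for j in range(len(h)):
        for i in range(len(x)):
            w_hidden[j][i] -= lr * d_h[j] * x[i]
        b_hidden[j] -= lr * d_h[j]
        w_out[j] -= lr * d_o * h[j]
    b_out -= lr * d_o
    return y, b_out

random.seed(0)
# weights and thresholds initialised randomly in [-0.5, 0.5], as above
w_hidden = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(3)]
b_hidden = [random.uniform(-0.5, 0.5) for _ in range(3)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(3)]
b_out = random.uniform(-0.5, 0.5)

# target: desired normalized output, e.g. derived from questionnaire data
x, target = [0.4, 0.9], 0.7
errors = []
for _ in range(500):
    y, b_out = train_step(x, target, w_hidden, b_hidden, w_out, b_out)
    errors.append(abs(y - target))
```

After a few hundred steps the output matches the desired value with the "sufficient accuracy" the description calls for.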
  • FIG. 5B shows a schematic illustration of an alternative exemplary embodiment of the evaluation unit 13 on the central server 10 .
  • the artificial neural network 40 is in the form of a feedback neural network. Besides the connections that point from a neuron 46 of an upstream layer to a neuron 46 of a downstream layer, for example from an input neuron 46 A of the input layer 43 to a hidden neuron 46 B of the hidden layer 44 , this embodiment has connections that run in the opposite direction, for example from a hidden neuron 46 B of the hidden layer 44 to an input neuron 46 A of the input layer 43 or from the output neuron 46 C of the output layer 45 to a hidden neuron 46 B of the hidden layer 44 .
  • An artificial neural network 40 of this kind has a higher level of complexity than the previously described feed forward network, which forwards data only in a distinguished forward direction.
  • A development of the feedback artificial neural network is shown in FIG. 5C . Accordingly, lateral feedback loops are also possible, that is to say connections between neurons 46 that are arranged in the same layer. In a further development of the feedback artificial neural network, there is also provision for direct feedback. Direct feedback is a connection from a neuron 46 to itself. Direct feedback means that neurons 46 inhibit or boost themselves in order to arrive at their activation limits.
  • a feedback artificial neural network is provided particularly in order to take account of the “memory” of biometric data 33 pertaining to a user when determining the current stress level 36 , 36 A, 36 B, 36 C, 36 D.
  • the memory of biometric data 33 for a category and a user is the sequence, ordered by timestamp, of feature vectors for this category and this user; in particular, the sequence comprises older feature vectors from earlier analyses.
  • a suitable subsequence is selected and the artificial neural network method is started with the first feature vector in this subsequence, that is to say the feature vector with the oldest timestamp.
  • the values of the first feature vector are applied to the artificial neural network as input values and the neural network is traversed once.
  • the built-in feedback loops mean that the values from the first time step have a further effect on the subsequent time step.
  • the values of the second feature vector are applied to the artificial neural network 40 as input values.
  • the values generated in feedback connections from the previous time step are taken into account as new input values.
  • the method continues in this manner until the complete subsequence of feature vectors has been traversed.
  • the value of the output neuron 46 C provides the current stress level 36 , 36 A, 36 B, 36 C, 36 D of the user.
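The stepwise processing of a timestamp-ordered subsequence of feature vectors can be sketched with a simple recurrent (Elman-style) network, in which the hidden values of one time step re-enter as inputs at the next. The specific feedback topology of FIGS. 5B/5C may differ; all weights here are illustrative:

```python
import math

def recurrent_stress(sequence, w_in, w_rec, w_out):
    """Sketch of the feedback pass: feed the feature vectors in, oldest
    first; the hidden state carries the 'memory' between time steps."""
    h = [0.0] * len(w_rec)            # hidden state starts empty
    for x in sequence:                # oldest feature vector first
        h = [math.tanh(sum(w * xi for w, xi in zip(w_in[j], x))
                       + sum(r * hk for r, hk in zip(w_rec[j], h)))
             for j in range(len(h))]
    # the final output value plays the role of the output neuron 46C
    return math.tanh(sum(w * hj for w, hj in zip(w_out, h)))

# subsequence of three feature vectors, ordered by timestamp (assumed)
sequence = [[0.2, 0.1], [0.5, 0.4], [0.9, 0.7]]
w_in = [[0.3, 0.1], [-0.2, 0.4]]      # input weights (illustrative)
w_rec = [[0.1, -0.1], [0.2, 0.1]]     # feedback weights (illustrative)
w_out = [0.6, 0.5]                    # output weights (illustrative)
level = recurrent_stress(sequence, w_in, w_rec, w_out)
```

Because the hidden state is carried forward, earlier feature vectors influence the final output, which is exactly the "memory" effect described above.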
  • FIG. 6 shows a schematic illustration of an embodiment of the invention that has been developed further.
  • the evaluation unit 6 , 13 ascertains a current stress level 36 , 36 A, 36 B, 36 C, 36 D of a user using a network of artificial neural networks.
  • the network of neural networks may be in the form of a deep belief network or a convolutional deep belief network 50 .
  • a single artificial neural network 40 which is part of the network of artificial neural networks, may be embodied according to one of the embodiments cited previously for artificial neural networks 40 , for example.
  • the network of artificial neural networks comprises a plurality of artificial neural networks 40 that interact with one another.
  • the plurality of artificial neural networks 40 may be embodied as a restricted Boltzmann machine or as a convolutional restricted Boltzmann machine.
  • a single neural network 40 comprises an input layer 43 and a hidden layer 44 .
  • the input layer comprises a plurality of input neurons 46 A.
  • the hidden layer comprises a plurality of hidden neurons 46 B. From the input layer of an artificial neural network, it is possible to determine the hidden layer of the same neural network, as explained in the preceding embodiments, for example.
  • the network of artificial neural networks contains a first level of artificial neural networks 40 , which are referred to as first neural networks.
  • the input layer 43 of the first neural networks is stipulated by the biometric data 33 from the user.
  • a component of a feature vector in a category can be associated with an input neuron 46 A.
  • at least one further level of artificial neural networks 40 is provided, which are referred to as further neural networks.
  • the input layer 43 can be determined from the hidden layers 44 of a plurality of artificial neural networks 40 on the preceding level.
  • an input neuron 46 A of the input layer 43 is stipulated by precisely one hidden neuron 46 B of an artificial neural network 40 from the preceding level.
  • alternatively, an input neuron 46 A of the input layer 43 is stipulated by a plurality of hidden neurons 46 B of one or more artificial neural networks 40 from the preceding level.
  • the network of artificial neural networks 40 contains a topmost level that comprises at least one artificial neural network 40 .
  • the at least one artificial neural network 40 on the topmost level is referred to as the topmost neural network.
  • the at least one topmost neural network has an output layer 45 .
  • the hidden layer 44 of a topmost neural network is identified with the output layer 45 .
  • the at least one artificial neural network 40 of the topmost level comprises three layers, the input layer, the hidden layer and the output layer 45 .
  • the output layer 45 comprises precisely one output neuron 46 C.
  • the current stress level 36 , 36 A, 36 B, 36 C, 36 D can be determined from the output layer 45 of the at least one topmost neural network.
  • a classifier is provided that classifies the output layer 45 and determines the current stress level 36 , 36 A, 36 B, 36 C, 36 D therefrom.
  • the classifier may be designed as a support vector machine.
  • the current stress level 36 , 36 A, 36 B, 36 C, 36 D is stipulated by the output neuron 46 C.
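The network-of-networks idea above can be sketched as a stack of small networks in which the hidden layer of each level becomes the input layer of the next, with the topmost output neuron yielding the stress level. This sketch omits the restricted-Boltzmann-machine training of a real deep belief network, and all weights are illustrative:

```python
import math

def layer_forward(x, weights):
    """One network's hidden layer computed from its input layer (tanh units)."""
    return [math.tanh(sum(w * xi for w, xi in zip(w_row, x)))
            for w_row in weights]

def stacked_stress(x, levels, w_out):
    """Feed the hidden layer of each level in as the input layer of the
    next level; the topmost output neuron yields the stress level."""
    h = x
    for weights in levels:
        h = layer_forward(h, weights)
    return math.tanh(sum(w * hj for w, hj in zip(w_out, h)))

x = [0.3, 0.8, 0.5]                            # biometric feature vector (assumed)
level1 = [[0.2, -0.1, 0.4], [0.3, 0.2, -0.2]]  # first-level network
level2 = [[0.5, -0.3], [0.1, 0.6]]             # further-level network
w_out = [0.7, 0.4]                             # topmost output neuron
score = stacked_stress(x, [level1, level2], w_out)
```

Instead of the single output neuron, the topmost hidden layer could equally be passed to a separate classifier such as a support vector machine, as the description also envisages.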
  • the evaluation unit 6 , 13 which comprises a network of a plurality of artificial neural networks 40 , is designed such that the computation of the network can be parallelized.
  • the evaluation unit 6 , 13 interacts with at least one processor, the processor being designed and provided to compute neurons 46 , 46 B, 46 C for at least one artificial neural network 40 .
  • the processor may be arranged on the mobile terminal 1 .
  • the processor may also be provided on a central server.
  • a plurality of processors are provided.
  • the plurality of processors may be provided on the mobile terminal or the central server or on both.
  • the evaluation unit 6 , 13 is designed and provided to have the plurality of artificial neural networks 40 computed by the plurality of processors in parallel.
  • the parallel computation optimizes the computation time.
  • the method for determining a current stress level that is based on a network of artificial neural networks can be executed more quickly by parallelizing the computation of neural networks. Similarly, the parallelization allows power to be saved.
  • At least one graphics card with at least one graphics card processor can be incorporated for executing the method, the at least one graphics card being arranged on the mobile terminal or on the central server.
  • the graphics card processor can support computation of the artificial neural networks, in particular. This approach allows the computation time to be optimized even further.
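The parallel computation of a plurality of artificial neural networks described above can be sketched with a thread pool; in practice this could equally be spread over processors on the mobile terminal, the central server or a graphics card. The per-category networks and their weights below are assumptions:

```python
import math
from concurrent.futures import ThreadPoolExecutor

def network_forward(args):
    """Forward pass of one small category-specific network (illustrative)."""
    x, weights = args
    return math.tanh(sum(w * xi for w, xi in zip(weights, x)))

# one hypothetical network per category, computed in parallel
inputs = [
    ([0.2, 0.4], [0.5, -0.1]),   # e.g. sleep category
    ([0.7, 0.1], [0.3, 0.3]),    # e.g. speech category
    ([0.5, 0.5], [-0.2, 0.6]),   # e.g. motor functions category
]
with ThreadPoolExecutor(max_workers=3) as pool:
    partial_levels = list(pool.map(network_forward, inputs))
```

Each worker computes one network independently, which is what makes the computation time and power savings mentioned above possible.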

US14/418,374 2012-08-01 2013-08-01 Device, method and application for establishing a current load level Abandoned US20150265211A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
DE102012213618 2012-08-01
DE102012213618.5 2012-08-01
DE102012214697.0 2012-08-17
DE201210214697 DE102012214697A1 (de) 2012-08-01 2012-08-17 Vorrichtung, Verfahren und Applikation zur Ermittlung eines aktuellenBelastungsniveaus
PCT/EP2013/066241 WO2014020134A1 (de) 2012-08-01 2013-08-01 Vorrichtung, verfahren und applikation zur ermittlung eines aktuellen belastungsniveaus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/066241 A-371-Of-International WO2014020134A1 (de) 2012-08-01 2013-08-01 Vorrichtung, verfahren und applikation zur ermittlung eines aktuellen belastungsniveaus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/850,984 Continuation US11468984B2 (en) 2012-08-01 2020-04-16 Device, method and application for establishing a current load level

Publications (1)

Publication Number Publication Date
US20150265211A1 true US20150265211A1 (en) 2015-09-24

Family

ID=49944062

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/418,374 Abandoned US20150265211A1 (en) 2012-08-01 2013-08-01 Device, method and application for establishing a current load level
US16/850,984 Active US11468984B2 (en) 2012-08-01 2020-04-16 Device, method and application for establishing a current load level

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/850,984 Active US11468984B2 (en) 2012-08-01 2020-04-16 Device, method and application for establishing a current load level

Country Status (4)

Country Link
US (2) US20150265211A1 (de)
EP (1) EP2879582B1 (de)
DE (1) DE102012214697A1 (de)
WO (1) WO2014020134A1 (de)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014223366A1 (de) * 2014-11-17 2016-05-19 BSH Hausgeräte GmbH Hausgerät mit einer berührungsempfindlichen Bedieneinrichtung sowie Verfahren zu seinem Betrieb
DE102016216950A1 (de) * 2016-09-07 2018-03-08 Robert Bosch Gmbh Modellberechnungseinheit und Steuergerät zur Berechnung eines mehrschichtigen Perzeptronenmodells mit Vorwärts- und Rückkopplung
US20220044101A1 (en) * 2020-08-06 2022-02-10 Micron Technology, Inc. Collaborative sensor data processing by deep learning accelerators with integrated random access memory


Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3310498B2 (ja) * 1994-09-02 2002-08-05 独立行政法人産業技術総合研究所 生体情報解析装置および生体情報解析方法
US5568126A (en) 1995-07-10 1996-10-22 Andersen; Stig L. Providing an alarm in response to a determination that a person may have suddenly experienced fear
US8364136B2 (en) * 1999-02-01 2013-01-29 Steven M Hoffberg Mobile system, a method of operating mobile system and a non-transitory computer readable medium for a programmable control of a mobile system
CA2413148C (en) * 2000-06-23 2010-08-24 Bodymedia, Inc. System for monitoring health, wellness and fitness
US7409373B2 (en) 2001-12-28 2008-08-05 Concepta Ab Pattern analysis system and method
US7152051B1 (en) * 2002-09-30 2006-12-19 Michael Lamport Commons Intelligent control with hierarchical stacked neural networks
SE0202948D0 (sv) 2002-10-04 2002-10-04 Bergfalk & Knagenhjelm Ab Sätt att påvisa aktivitetsmönster som indikerar psykisk sjukdom, och motsvarande arrangemang
US20040176991A1 (en) * 2003-03-05 2004-09-09 Mckennan Carol System, method and apparatus using biometrics to communicate dissatisfaction via stress level
US8002553B2 (en) 2003-08-18 2011-08-23 Cardiac Pacemakers, Inc. Sleep quality data collection and evaluation
EP1861014A1 (de) 2004-10-04 2007-12-05 Statchip ApS Tragbare netzwerkvorrichtung mit sensoren zur überwachung zu hause
US8758019B2 (en) 2006-08-03 2014-06-24 James W. Suzansky Multimedia game based system and process for medical, safety and health improvements
WO2008099288A2 (en) * 2007-02-16 2008-08-21 Vyro Games Ltd. Biosensor device and method
US7720696B1 (en) 2007-02-26 2010-05-18 Mk3Sd, Ltd Computerized system for tracking health conditions of users
DE102007032610A1 (de) * 2007-07-11 2009-01-15 Deutsche Telekom Ag Verfahren zur Fernüberwachung des medizinischen Zustandes eines Nutzers, System und Vorrichtung dafür
WO2009104127A1 (en) * 2008-02-22 2009-08-27 Koninklijke Philips Electronics N.V. A system and kit for stress and relaxation management
US20090254369A1 (en) * 2008-04-08 2009-10-08 Mohaideen A Hassan System and method for providing health care services using smart health cards
WO2009149126A2 (en) * 2008-06-02 2009-12-10 New York University Method, system, and computer-accessible medium for classification of at least one ictal state
US20090326981A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Universal health data collector and advisor for people
TWI405559B (zh) 2008-08-14 2013-08-21 Univ Nat Taiwan 手持式睡眠輔助裝置及其輔助方法
US8004391B2 (en) * 2008-11-19 2011-08-23 Immersion Corporation Method and apparatus for generating mood-based haptic feedback
TWI355260B (en) 2008-11-21 2012-01-01 Univ Yuan Ze Remote sleeping quality detecting system and metho
TWM363295U (en) 2009-03-27 2009-08-21 Platinum Team Co Ltd A device for analyzing quality of sleep
EP2435944A2 (de) * 2009-05-27 2012-04-04 University Of Abertay Dundee A biometric security method, system and computer program
WO2011009085A2 (en) 2009-07-17 2011-01-20 Oregon Health & Science University Method and apparatus for assessment of sleep disorders
GB2471902A (en) * 2009-07-17 2011-01-19 Sharp Kk Sleep management system which correlates sleep and performance data
WO2011011413A2 (en) * 2009-07-20 2011-01-27 University Of Florida Research Foundation, Inc. Method and apparatus for evaluation of a subject's emotional, physiological and/or physical state with the subject's physiological and/or acoustic data
US8527213B2 (en) * 2009-07-21 2013-09-03 Ntt Docomo, Inc. Monitoring wellness using a wireless handheld device
JP4519193B1 (ja) 2009-07-27 2010-08-04 エンパイア テクノロジー ディベロップメント エルエルシー 情報処理システム、情報処理方法
DE102009043775A1 (de) 2009-09-30 2011-04-07 Siemens Medical Instruments Pte. Ltd. Verfahren zum Einstellen einer Hörvorrichtung anhand eines emotionalen Zustandes und entsprechende Hörvorrichtung
US8666672B2 (en) * 2009-11-21 2014-03-04 Radial Comm Research L.L.C. System and method for interpreting a user's psychological state from sensed biometric information and communicating that state to a social networking site
US9592005B2 (en) 2010-01-29 2017-03-14 Dreamwell, Ltd. Systems and methods for bedding with sleep diagnostics
US9634855B2 (en) * 2010-05-13 2017-04-25 Alexander Poltorak Electronic personal interactive device that determines topics of interest using a conversational agent
WO2012008961A1 (en) * 2010-07-15 2012-01-19 Sony Ericsson Mobile Communications Ab A user analytics engine for detecting and mitigating stress and/or concentration lapses for an associated user of an electronic device
WO2012085687A2 (en) * 2010-12-23 2012-06-28 France Telecom Medical record retrieval system based on sensor information and a method of operation thereof
US20120197622A1 (en) * 2011-01-31 2012-08-02 Fujitsu Limited Monitoring Insulin Resistance

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120289789A1 (en) * 2011-05-13 2012-11-15 Fujitsu Limited Continuous Monitoring of Stress Using Environmental Data
US20130262096A1 (en) * 2011-09-23 2013-10-03 Lessac Technologies, Inc. Methods for aligning expressive speech utterances with text and systems therefor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GEORGIEV et al., Low-resource Multi-task Audio Sensing for Mobile and Embedded Devices via Shared Deep Neural Network Representations, March 2010, Proceedings of the ACM on Interactive, Mobile,Wearable and Ubiquitous Technologies, Vol. 9, No. 4, Article 39:1-19 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140200468A1 (en) * 2007-09-11 2014-07-17 Korea Advanced Institute Of Science And Technology (Kaist) Method for analyzing stress based on multi-measured bio-signals
US10130292B2 (en) * 2007-09-11 2018-11-20 Samsung Electronics Co., Ltd. Method for analyzing stress based on multi-measured bio-signals
US11301671B1 (en) * 2015-04-20 2022-04-12 Snap Inc. Determining a mood for a group
US11710323B2 (en) 2015-04-20 2023-07-25 Snap Inc. Determining a mood for a group
EP3501385A1 (de) * 2017-12-21 2019-06-26 IMEC vzw System und verfahren zur bestimmung des stresszustands einer person
US11160500B2 (en) 2017-12-21 2021-11-02 Imec Vzw System and method for determining a subject's stress condition
US11717217B2 (en) * 2019-03-25 2023-08-08 Steffen Wirth Stress monitor and stress-monitoring method
WO2022038776A1 (ja) * 2020-08-21 2022-02-24 日本電気株式会社 ストレス推定装置、推定方法、プログラム及び記憶媒体

Also Published As

Publication number Publication date
US20200237302A1 (en) 2020-07-30
WO2014020134A1 (de) 2014-02-06
EP2879582B1 (de) 2020-03-25
US11468984B2 (en) 2022-10-11
DE102012214697A1 (de) 2014-02-06
EP2879582A1 (de) 2015-06-10

Similar Documents

Publication Publication Date Title
US11468984B2 (en) Device, method and application for establishing a current load level
US11363953B2 (en) Methods and systems for managing medical anomalies
EP2730223B1 (de) Vorrichtung und Verfahren zur Bestimmung des mentalen Zustands eines Benutzers
US20230142766A1 (en) System and method for fleet driver biometric tracking
US10620593B2 (en) Electronic device and control method thereof
US11455522B2 (en) Detecting personal danger using a deep learning system
LiKamWa et al. MoodScope: Building a Mood Sensor from Smartphone Usage Patterns
US10872354B2 (en) System and method for personalized preference optimization
JP6246789B2 (ja) Biometric attribute anomaly detection system with notification adjustment
US20160180222A1 (en) Intelligent Personal Agent Platform and System and Methods for Using Same
CN109460752B (zh) Emotion analysis method and apparatus, electronic device, and storage medium
WO2018044632A1 (en) Providing insights based on health-related information
US20150161738A1 (en) Method of determining a risk score or insurance cost using risk-related decision-making processes and decision outcomes
EP3638108B1 (de) Sleep monitoring from implicitly sensed computer interactions
JP2018506773A (ja) Method and system for monitoring and influencing gesture-based behavior
KR20140111959A (ko) Automatic haptic effect adjustment system
US20180107943A1 (en) Periodic stress tracking
US20170316463A1 (en) Method, Apparatus and System for Monitoring Attention Level of a User of a Communications Device
CN115038377A (zh) Multi-state engagement with continuous glucose monitoring systems
Shapsough et al. Emotion recognition using mobile phones
US20240028967A1 (en) Systems and methods for automatic decision-making with user-configured criteria using multi-channel data inputs
US20190193280A1 (en) Method for personalized social robot interaction
US20220108775A1 (en) Techniques for updating a health-related record of a user of an input/output device
US20200064986A1 (en) Voice-enabled mood improvement system for seniors
WO2024023897A1 (ja) Information processing device, information processing method, and recording medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SOMA ANALYTICS UG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHNEIDER, PETER;HUBER, JOHANN;LORENZ, CHRISTOPHER;AND OTHERS;REEL/FRAME:035776/0121

Effective date: 20150211

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: PRENETICS EMEA LTD, ENGLAND

Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:SOMA ANALYTICS UG (HAFTUNGSBESCHRAENKT);REEL/FRAME:062448/0496

Effective date: 20230123

AS Assignment

Owner name: TPDM1 LTD, ENGLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRENETICS EMEA LTD;REEL/FRAME:062539/0950

Effective date: 20230130

AS Assignment

Owner name: V1AM LIMITED, ENGLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TPDM1 LIMITED;REEL/FRAME:062560/0112

Effective date: 20230201