WO2019086861A1 - Systems and methods for estimating human states - Google Patents

Systems and methods for estimating human states Download PDF

Info

Publication number
WO2019086861A1
WO2019086861A1 PCT/GB2018/053144 GB2018053144W WO2019086861A1 WO 2019086861 A1 WO2019086861 A1 WO 2019086861A1 GB 2018053144 W GB2018053144 W GB 2018053144W WO 2019086861 A1 WO2019086861 A1 WO 2019086861A1
Authority
WO
WIPO (PCT)
Prior art keywords
devices
data
experience
signal
timer
Prior art date
Application number
PCT/GB2018/053144
Other languages
French (fr)
Inventor
Gawain MORRISON
Shane MCCOURT
Gary John MCKEOWN
Cavan FYANS
Damien DUPRE
Daniel Stephen MOORE
Original Assignee
Sensumco Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sensumco Limited filed Critical Sensumco Limited
Publication of WO2019086861A1 publication Critical patent/WO2019086861A1/en

Links

Classifications

    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • aspects of the present invention relate generally to systems and methods for estimating affective, physiological or behavioural states ("human states").
  • Human-machine interfacing is an area of increased interest.
  • situations where the human body generates biometric and emotional data which can be identified and then introduced to computer software and hardware, such that it can dynamically respond, and are of increasing, wide-ranging relevance to the technological community and beyond.
  • Selected wider applications include sports and performance, health and safety, automotive and mobility, gaming and entertainment, and well-being, to name a few examples.
  • Bio-emotional or human states monitoring can be used for enhancing user experiences, personalising engagement and feedback, as well as performance monitoring.
  • Human states refers in this context to biological, physiological, emotional, or behavioural characteristics (or a combination of any of these).
  • the at least one human state may include at least one physiological or behavioural characteristic such as stress, fatigue, intoxication, distraction, positivity. This could be for health purposes, emotional understanding, or for scenarios where fatigue, stress or intoxication could be life threatening.
  • performance monitoring is an increasingly important factor for competing in elite sports or motor sports such as formula 1 (F1) driving.
  • Smart phones and other devices incorporate sensors, transducers and other components for sensing and monitoring a number of parameters including motion, location, acceleration, orientation, temperature, pressure, acoustic and optical conditions etc.
  • smartphones there are a growing number of "wearable devices” that can be incorporated into personal clothing or other objects such as jewellery, watches, ear buds etc. and which may also detect, record and transmit data about the environment or person wearing them.
  • a system comprising a plurality of devices for sensing, detecting or measuring data associated with a person's experience over time, the system comprising a controller having a timer, the controller configured to: repeatedly send queries, at times recorded by the controller's timer, to each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience; record, for each of the plurality of devices, the time according to the timer at which the device sent the signal and signal value received from the queried device; and
  • data streams may be compiled in real time, even though there may be different inputs of different formats.
  • the timer imposes a meta-time clock or the devices which otherwise would not be synchronised given; the different times at which they start recording, parameters associated with the experience, and different clock speeds.
  • the plurality of devices comprises a plurality of wearable devices, and/or smart textiles (such as smart weave) for example.
  • wearable devices such as smart weave
  • smart textiles such as smart weave
  • Such devices may also take chemical measurements, and even provide indications of potential genetic damage.
  • advances in nano-technology for internal medicines and communication are to enable bridging or block electric signals through the brain and nervous system.
  • Such technologies are envisaged to constitute other means of body data measurement for use in systems and methods as described and claimed herewith.
  • Each of the plurality of devices may have an internal clock running at different speeds, and the devices may start sensing, detecting or measuring data associated with the experience at a plurality of different times according to their respective internal clocks.
  • the system allows for the synchronisation of a range of biometric sensors, cameras and microphones for example, triggering them simultaneously and capturing their data.
  • This enables the estimation of emotions by integrating data from third-party wearables such as smartwatches and fitness bands, medical devices like skin conductance and heart rate sensors, as well as cameras, microphones and other devices and methods.
  • Synchronicity of recordings from multiple devices refers to synchronising recording points and automatic time aligning. Synchronisation of multiple data streams to make them more manageable and legible may be thought of as data "capture and optimisation”.
  • This present system is also advantageous over some prior art systems which perform data streams alignment requiring a "sync point" (clapperboard with a visual queue which may be computational or manual).
  • the system performs the alignment automatically, shifting the recordings into alignment with the ones previously detected.
  • data associated with a person's experience over time comprises physiological data and experience context data.
  • the compiled data forms a single or unitary library.
  • the provision of a single or unitary library allows for cross- comparison across both context-generated data and biometric data harvested by the various devices.
  • the present systems therefore represent a unified means of structuring and comparing the different recordings. This is in contrast to existing systems which allow comparisons for context only data or biometric only data, for example.
  • the system further comprises a processor for estimating a human state or changes to an human state based on the synchronised data.
  • the processor is further configured to provide feedback of the human state or of the changes to an human state.
  • estimating a human state or changes to a human state comprises using the circumplex model. This allows for an estimation of human responses in an enriched manner.
  • compiling signal values comprises aligning signal values received from one of the plurality of devices based on the controller's timer. In essence, compiling the signal values refers to gathering data streams from any of the plurality of devices that the system records from, and storing the data for processing and analysis.
  • the system may be used in a wide range of applications, such as automotive applications for measuring responses of drivers and passengers in a vehicle.
  • the system may also be used for interactive gaming and bio-enhanced AR & VR experiences such as concerts, media, retail or gaming.
  • system may be applied to experiential marketing solutions which can be personalised by feeding bio-emotional data back into a mobile, console or site-specific application.
  • An application of this is to monitor live emotions of large crowds of people during a major multi-national sporting event, for example.
  • Further examples include use relative to fleets (including truck, flight and shipping) and as a method of interrogating bio-emotional diary data for health & safety, identifying fatigue, intoxication and stress. Further applications of the system include managing stress in the workplace to minimise impact to business; in sport and relative to performance space, which also includes extreme conditions research for space and deep-sea travel.
  • a method of synchronising data which has been sensed, detected or measured by a plurality of devices, the data being associated with a person's experience over time comprising the steps of: repeatedly querying, at times recorded by a controller's timer, each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience; recording, for each of the plurality of devices, the time according to the timer at which the device sent the signal and signal value received from the queried device; and compiling signal values received from the plurality of devices based on the recorded times at which signal values were received.
  • Figure 1 schematically shows a system and process flow according to an embodiment of the invention
  • Figure 2 schematically shows the 'mobile' part of the system
  • FIG. 3 schematically shows an optional web/desktop data flow
  • Figure 4 schematically shows third party tools part of the system
  • Figure 5 schematically shows a process flow for "data humanisation"
  • FIG. 6 schematically shows the functionality of Software Development Kits (SDK).
  • a system device 100 (in this example, a PC 100) has storage means 50 and a processor 51 (for processing data such as video, gaze, face recognition, biometric data, context data, audio data, thermal imaging data, infrared imaging data and biometric radar data).
  • the PC 100 supports a programming application interface (API) and can receive external API requests based on API access, software development kits (SDK) and frameworks.
  • API programming application interface
  • SDK software development kits
  • the PC 100 is in communication with a 'mobile' part of the system 101, an optional web/desktop user interface 102, and third-party tools 103.
  • the 'mobile' part of the system 101 incudes a mobile device 1 onto which a "sync app" is loaded, the device receiving multiple streams of data from a capture module / data inputs 2.
  • the device 1 may be any other suitable computing device such as a tablet etc.
  • Data inputs 2 may be received from wearable devices in what is referred to as a "Body Area Network” (BAN) and additionally from a user "experience” 3 module which records user feedback for example including video playback, audio playback and survey responses.
  • Experience data may be input from the system device 100 or the setup experience 80 as shown in Figure 1.
  • the "experience" is what all of the data streams correspond to.
  • the sync app has a user login and a session login for example, to records data for a particular "session" or
  • the system device 100 is in communication with a user and session login of the 'mobile' part of the system 101.
  • a bike rider can wear a sensor belt, a portable camera and a microphone with inbuilt digital signal converter.
  • the rider can also wear a smart phone on one of their arms onto which the sync app is loaded to control sensors and also to record the signals 2 received from the wearable devices.
  • the mobile device itself also provides GPS coordinates and three-axis accelerometer/magnetometer data.
  • the onset of the experience and thus recording of data streams may be associated with one or more triggers 4 such as a GPS trigger, a near field communication (NFC) trigger, a barcode scanner, an audio trigger or a time trigger for example.
  • triggers 4 such as a GPS trigger, a near field communication (NFC) trigger, a barcode scanner, an audio trigger or a time trigger for example.
  • the different data streams received from the inputs 2 have different clock speeds, for example running at different hertz rates.
  • the different inputs are represented in Figure 2 within the "capture" 2 area.
  • Preferably, as many streams of data as possible are included from various devices, including wearable devices.
  • capture data includes text reports (diary reporting, event tagging, emoji response, survey response), audio data from audio devices (audio recording, audio tagging), video or visual data (video stream/recording, video tagging), biometric capture data which may be physiological: (heart rate (HR), skin conductiveness level and response/galvanic skin response (GSR), skin temperature (ST), PPG, breathing rate (BR), heart rate variability (ECG)), or may be contextual data including pressure (Bar/Alt), GPS location, acceleration (Acc), magnetic field (Mag) etc).
  • HR heart rate
  • GSR skin conductiveness level and response/galvanic skin response
  • ST skin temperature
  • PPG breathing rate
  • BR breathing rate variability
  • ECG heart rate variability
  • pressure Bar/Alt
  • GPS location GPS location
  • acceleration acceleration
  • Magn magnetic field
  • Contextual data may include 'perceived difficulty' of bike trails for example, evaluated afterwards by an experience rider using a dynamic scale from 0 (no difficulty) to 100 (hardest difficulty) whilst watching on his/her point of view video recording.
  • the context measures may be used as predictors for physiological changes.
  • Other data also included is photographic data (photos, barcode scans, NFC scans) for example.
  • the data inputs 2 represent parameters for a user which are measured, sensed, detected or received via signals and/or data which may be further analysed to estimate the emotional state affect, mood, the onset of such or change thereto. It will be appreciated that the described parameters are not an exhaustive list.
  • the recorded data may be compiled as a "batch", e.g. a list of times at which events (audio or video for example) happened or not, forming a batch upload 9. Further, recorded data may be uploaded to the cloud in real time, including data such as event tags, diary reports, survey response, video tags or audio tags. With reference to Figure 1, the PC 100 receives batch and live data uploads from the 'mobile' part of the system 101.
  • the way in which events timings are recorded and compiled is important and will be described here. Different time points include event time and an associated value, for multiple events being recorded.
  • the "sync app" configures the device 1 as a 'controller' which records the start time (starting points) and associated values of the data inputs. For example, a start command or message is sent from the controller 1 to an audio input device to receive audio data. There is a lag time between the time at which the
  • command/message is sent and the controller sends further queries to check where the audio device started recording.
  • the controller asks repeatedly whether the audio recording has started before eventually getting a response. This repeated querying is preferably periodical.
  • the controller 1 records not what time the events happen according to their clock but when they actually happen according to its own clock at which it sends the query (taking lag time into account).
  • the controller 1 may be linked ("hooked") to a wider broadcast network - the external production sync module (timecode) 1 1 in Figure 2.
  • the controller 1 may be linked ("hooked") to other synchronisation systems representing an external timing sync module 12.
  • a start command may be sent by the controller checking whether the audio device has started recording. In effect the controller 1 sends repeated commands, asking periodically whether the device has started and eventually getting a response.
  • Watching an event therefore leads to recording the event, marking the data, storing data and compiling data from the various data streams.
  • the recording of metadata can take the form of a CSV file with two or more columns to record the event which occurred and the time at which the event occurred.
  • the compiling step can be either done on the controller 1 or externally in a data processing module 5 which may be internal or external to the device 1 (i.e. on the API (PC) device 100).
  • Biometric data is slightly different to textual data because the biometric sensors are continuously recording data - still encoding drop-in points.
  • the system effectively synchronises the various data streams by starting and stopping all the data streams simultaneously. Further, the issue of each sensor having a different recording frequency, which ads further difficulty to data analysis, is overcome as the system extrapolates lowest frequency data recordings with statistical methods to be compared with the highest frequency data recording.
  • Processed data i.e. outputs or output data
  • Feedback may be audio, visual or haptic feedback example.
  • Data outputs may also be sent to a 'beacons framework' 8 which detects an emotional peak. When an emotional peak is detected this may be sent as a notification 80 to the user via the mobile device 1.
  • the beacons framework 8 may also receive processed data output from the device 100 as shown in Figure 1.
  • Valence refers to how positive or negative a user feels
  • arousal refers to how stimulated the user is
  • dominance refers to how in control the user feels.
  • an optional Web/Desktop part of the system 102 comprises a Web/Desktop device 52 with a user login and a session login.
  • the Web/Desktop device 52 provides live data uploads 60 to the PC 100 (as shown in Figure 1).
  • a setup experience module 80 and tracker 81 in this example an emoji tracker, also send data to the PC 100.
  • the setup experience module 80 comprises survey, audio and video experience and can also feed data to the mobile experience module 3.
  • Data from the setup experience module 80 is sent to a capture (webapp) module 82, which captures survey responses, audio recording, video recording and biometric data capture in this example. The data capture is then communicated to the Web/Desktop device 52.
  • third party tools 103 may include a number of plugins, in this example: 'Game Engine Plugins' 160, 'Mobile SDKs 170, iOS SDKs 180, Native Libraries 182 (local and live compiled libraries), and Embedded Computing 184 and 'chips'.
  • Chips' refers to the incorporation of all states of a pipeline running on a microprocessor specifically designed for said pipeline.
  • the data flow is from the mobile capture module 2 feeding data streams to the mobile device 1, whilst the synchronising and data processing is carried out by the mobile device 1 and data processing module (e.g. cloud API) as described above.
  • 'Data humanisation' refers to the process whereby data can be fed back to the user via various sensors feedback, from any synchronised data including visuals, sounds or vibrations for example.
  • Continuous playback or skipping functions may be available such that the CSV data set and the synchronicity data which comprises it are aligned.
  • the functionality of mobile device SDK 170, 180 in connection with the API 190 and algorithm for data processing of emotions (module 200) is illustrated in Figure 6.
  • Data streams are fed into the mobile device 1 as described above.
  • 'live data' is provided by a periodical 'data dump', with data being uploaded every 1 minute for example. This data could be gathered in particular measures of time and evaluated for signal to noise. Benchmark checking and pattern recognition may be performed for an individual for example.
  • the user has SDK authorisation control.
  • API commands include authentication and data upload, leading to an SDK notification. Emotions processing in this example is based on the VAD model. Notifications and arousal values are collected periodically by a SDK call and response from the API.
  • the system In performance-critical environments it is valuable to measure the situational data and emotional data in particular, and make them applicable to the athlete's performance and training programs.
  • the system records multiple data streams in an effective manner to analyse emotions, also supporting interpretation and communication of that data, transforming it into visual and concise information to help athletes understand their emotions.
  • Applications of the present invention are wide and include consumer car market, as well as motor sports such as F1 in so called 'human reactive cockpits' to enhance safety and comfort.
  • Other applications which could benefit from human state monitoring include ride- share applications (to assess and improve customer experience), e-sports and affective gaming (that is, gaming which responds to the feelings of a player) involving dynamic play/sensitive content, as well as consumer robotics (e.g. emphatics robots in the household industries).

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A system (100) comprising a plurality of devices for sensing, detecting or measuring data associated with a person's experience over time, the system comprising a controller (1) having a timer, the controller configured to: repeatedly send queries, at times recorded by the controller's timer, to each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience; record, for each of the plurality of devices, the time according to the timer at which the device sent the signal and signal value received from the queried device; and compile signal values received from the plurality of devices based on the recorded times at which signal values were received, to thereby synchronise data which has been sensed, detected or measured by the plurality of devices.

Description

SYSTEMS AND METHODS FOR ESTIMATING HUMAN STATES
Field of the Invention
Aspects of the present invention relate generally to systems and methods for estimating affective, physiological or behavioural states ("human states").
Background
Human-machine interfacing is an area of increased interest. In particular, situations where the human body generates biometric and emotional data which can be identified and then introduced to computer software and hardware, such that it can dynamically respond, and are of increasing, wide-ranging relevance to the technological community and beyond. Selected wider applications include sports and performance, health and safety, automotive and mobility, gaming and entertainment, and well-being, to name a few examples.
Bio-emotional or human states monitoring can be used for enhancing user experiences, personalising engagement and feedback, as well as performance monitoring. "Human states" refers in this context to biological, physiological, emotional, or behavioural characteristics (or a combination of any of these). The at least one human state may include at least one physiological or behavioural characteristic such as stress, fatigue, intoxication, distraction, positivity. This could be for health purposes, emotional understanding, or for scenarios where fatigue, stress or intoxication could be life threatening. For example, performance monitoring is an increasingly important factor for competing in elite sports or motor sports such as formula 1 (F1) driving. It is essential not only from an individual perspective - supporting an athlete's autonomy in devising individualised training programs - but also at a collective level in helping the athlete's support network to understand the athlete's performance. Over longer periods, athletes' performance in sports and activities is not stable. In Formula 1 (F1) driving and the like, even marginal gains in performance e.g. due to different levels of tiredness can provide a crucial advantage as well as enhance safety and comfort.
Performance increases and decreases according to both physical (body) and psychological states. Psychological performance indicators have typically been underestimated or overlooked, often due to their lack of reliability, even though the importance of mental conditioning is commonly acknowledged. However, psychological assessment is
increasingly taken more seriously for sports and performance, health and safety, automotive and mobility, gaming and entertainment, and well-being.
Even if measuring emotions is possible in lab conditions, it remains a challenge in elite or extreme environments such as F1 for example. Data signals can be disrupted by added noise of exertion and movement for example, such as when an individual already has elevated breathing and heart rate. There are also additional problems coming from the athlete's environment, with extreme conditions for example regarding temperature, vibration, speed, or G-forces.
In order to evaluate emotions 'in the wild' (for example, in the context of actually playing sport), it is desirable to evaluate data from a range of biometric sensors and media sensors for context. Smart phones and other devices incorporate sensors, transducers and other components for sensing and monitoring a number of parameters including motion, location, acceleration, orientation, temperature, pressure, acoustic and optical conditions etc. In addition to smartphones, there are a growing number of "wearable devices" that can be incorporated into personal clothing or other objects such as jewellery, watches, ear buds etc. and which may also detect, record and transmit data about the environment or person wearing them.
Processing and analysing different strands of data from such devices, however, presents a number of challenges, since wearable-derived data tends to comprise a number of different inputs and different formats. Formats which pertain to individual types of sensors are difficult to compare and thus it is difficult to extract data and make it agnostic of commercial origin. This begets a lack of efficiency and cross comprehensibility. Further problems in processing such data in real time arise from the fact that the different data streams start recording at different times and the time clocks on different devices run at different speeds (in Hz / Mhz) according to multiple protocols (Bluetooth, WIFI etc). It is thus difficult to know precisely what device started when and what frame relates to which frame for example, so that the various data streams tend to drift (known as the "Clock speed" problem).
It is to these problems, amongst others, that aspects according to the present invention attempt to offer a solution. Summary of invention
In an independent aspect, there is provided a system comprising a plurality of devices for sensing, detecting or measuring data associated with a person's experience over time, the system comprising a controller having a timer, the controller configured to: repeatedly send queries, at times recorded by the controller's timer, to each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience; record, for each of the plurality of devices, the time according to the timer at which the device sent the signal and signal value received from the queried device; and
compile signal values received from the plurality of devices based on the recorded times at which signal values were received, to thereby synchronise data which has been sensed, detected or measured by the plurality of devices. Advantageously, data streams may be compiled in real time, even though there may be different inputs of different formats.
In effect, the timer imposes a meta-time clock or the devices which otherwise would not be synchronised given; the different times at which they start recording, parameters associated with the experience, and different clock speeds.
In preferred, dependent aspects, the plurality of devices comprises a plurality of wearable devices, and/or smart textiles (such as smart weave) for example. In this manner, clothing, jewellery and 'wearables' of this kind can pick up responses from the body, wherever they are placed. Such devices may also take chemical measurements, and even provide indications of potential genetic damage. Further, advances in nano-technology for internal medicines and communication are to enable bridging or block electric signals through the brain and nervous system. Such technologies are envisaged to constitute other means of body data measurement for use in systems and methods as described and claimed herewith.
Each of the plurality of devices may have an internal clock running at different speeds, and the devices may start sensing, detecting or measuring data associated with the experience at a plurality of different times according to their respective internal clocks.
Advantageously, the system allows for the synchronisation of a range of biometric sensors, cameras and microphones for example, triggering them simultaneously and capturing their data. This enables the estimation of emotions by integrating data from third-party wearables such as smartwatches and fitness bands, medical devices like skin conductance and heart rate sensors, as well as cameras, microphones and other devices and methods.
Understanding, influencing and regulating emotions allows individuals and teams to aim for better results and performance. Evaluating physiological responses may be used not only to assess athletic performance but also to inform of health issues, for example.
Signals may be received from a variety of sensors to compile and process the data so that afterwards features may be extracted from the trends observed in the data so compiled. "Synchronisation" is a solution to the problem of communicating and capturing multiple data streams. The communication/collection process is of media, biometric and/or contextual data for example. Synchronicity of recordings from multiple devices refers to synchronising recording points and automatic time aligning. Synchronisation of multiple data streams to make them more manageable and legible may be thought of as data "capture and optimisation".
This present system is also advantageous over some prior art systems which perform data streams alignment requiring a "sync point" (clapperboard with a visual queue which may be computational or manual). In particular, the system performs the alignment automatically, shifting the recordings into alignment with the ones previously detected.
In preferred dependent aspects, data associated with a person's experience over time comprises physiological data and experience context data. The compiled data forms a single or unitary library. The provision of a single or unitary library allows for cross-comparison across both context-generated data and biometric data harvested by the various devices. The present systems therefore represent a unified means of structuring and comparing the different recordings. This is in contrast to existing systems which allow comparisons for context-only data or biometric-only data, for example.
In a dependent aspect, the system further comprises a processor for estimating a human state or changes to a human state based on the synchronised data. This enables accurate bio-emotional monitoring. Preferably, the processor is further configured to provide feedback of the human state or of the changes to a human state.
In a dependent aspect, estimating a human state or changes to a human state comprises using the circumplex model. This allows for an estimation of human responses in an enriched manner. In a dependent aspect, compiling signal values comprises aligning signal values received from one of the plurality of devices based on the controller's timer. In essence, compiling the signal values refers to gathering data streams from any of the plurality of devices that the system records from, and storing the data for processing and analysis. The system may be used in a wide range of applications, such as automotive applications for measuring responses of drivers and passengers in a vehicle. The system may also be used for interactive gaming and bio-enhanced AR & VR experiences such as concerts, media, retail or gaming.
In addition, the system may be applied to experiential marketing solutions which can be personalised by feeding bio-emotional data back into a mobile, console or site-specific application. An application of this is to monitor live emotions of large crowds of people during a major multi-national sporting event, for example.
Further examples include use relative to fleets (including truck, flight and shipping) and as a method of interrogating bio-emotional diary data for health & safety, identifying fatigue, intoxication and stress. Further applications of the system include managing stress in the workplace to minimise impact to business, as well as in sport and performance environments, which also include extreme-conditions research for space and deep-sea travel.
In a further independent aspect, there is provided a method of synchronising data which has been sensed, detected or measured by a plurality of devices, the data being associated with a person's experience over time, the method comprising the steps of: repeatedly querying, at times recorded by a controller's timer, each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience; recording, for each of the plurality of devices, the time according to the timer at which the device sent the signal and the signal value received from the queried device; and compiling signal values received from the plurality of devices based on the recorded times at which signal values were received.

Brief description of figures
Examples of the present invention will now be described with reference to the accompanying drawings, where:
Figure 1 schematically shows a system and process flow according to an embodiment of the invention;
Figure 2 schematically shows the 'mobile' part of the system;
Figure 3 schematically shows an optional web/desktop data flow;
Figure 4 schematically shows third party tools part of the system;
Figure 5 schematically shows a process flow for "data humanisation"; and
Figure 6 schematically shows the functionality of Software Development Kits (SDK).
Detailed Description
Turning first to Figure 1, a system device 100 (in this example, a PC 100) has storage means 50 and a processor 51 (for processing data such as video, gaze, face recognition, biometric data, context data, audio data, thermal imaging data, infrared imaging data and biometric radar data). The PC 100 supports an application programming interface (API) and can receive external API requests based on API access, software development kits (SDK) and frameworks. As will be described in more detail with reference to Figures 2, 3 and 4, respectively, the PC 100 is in communication with a 'mobile' part of the system 101, an optional web/desktop user interface 102, and third-party tools 103. Turning now to Figure 2, the 'mobile' part of the system 101 includes a mobile device 1 onto which a "sync app" is loaded, the device receiving multiple streams of data from a capture module / data inputs 2. It will be appreciated that the device 1 may be any other suitable computing device, such as a tablet. Data inputs 2 may be received from wearable devices in what is referred to as a "Body Area Network" (BAN) and additionally from a user "experience" module 3, which records user feedback including, for example, video playback, audio playback and survey responses. Experience data may be input from the system device 100 or the setup experience 80 as shown in Figure 1. The "experience" is what all of the data streams correspond to. The sync app has a user login and a session login, for example, to record data for a particular "session" or "experience" such as a bike ride. With reference to Figure 1, the system device 100 is in communication with a user and session login of the 'mobile' part of the system 101.
For example, physiological changes in mountain bike riders during their runs on downhill trails can be evaluated by the system, wherein a BAN is created for each of the riders. A bike rider can wear a sensor belt, a portable camera and a microphone with an inbuilt digital signal converter. The rider can also wear, on one of their arms, a smart phone onto which the sync app is loaded to control sensors and also to record the signals 2 received from the wearable devices. The mobile device itself also provides GPS coordinates and three-axis accelerometer/magnetometer data.
The onset of the experience and thus recording of data streams may be associated with one or more triggers 4 such as a GPS trigger, a near field communication (NFC) trigger, a barcode scanner, an audio trigger or a time trigger for example.
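As an illustration of one such trigger (the function, parameter names and radius below are assumptions for this sketch, not part of the application), a GPS trigger may be implemented as a simple geofence check that fires when the device comes within a given distance of a target point:

```python
import math

def gps_trigger(lat, lon, target_lat, target_lon, radius_m=50.0):
    """Return True when the device at (lat, lon) is within radius_m
    metres of the target point, i.e. the experience should start.
    Uses an equirectangular approximation, adequate for the small
    distances involved in a geofence."""
    R = 6371000.0  # mean Earth radius in metres
    x = math.radians(lon - target_lon) * math.cos(math.radians((lat + target_lat) / 2))
    y = math.radians(lat - target_lat)
    return math.hypot(x, y) * R <= radius_m
```

An NFC, barcode, audio or time trigger would follow the same pattern: a predicate that, once satisfied, marks the onset of the experience and starts the recordings.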
The different data streams received from the inputs 2 have different clock speeds, for example running at different hertz rates. The different inputs are represented in Figure 2 within the "capture" area 2. Preferably, as many streams of data as possible are included from various devices, including wearable devices. In this example, capture data includes: text reports (diary reporting, event tagging, emoji response, survey response); audio data from audio devices (audio recording, audio tagging); video or visual data (video stream/recording, video tagging); and biometric capture data, which may be physiological (heart rate (HR), skin conductance level and response/galvanic skin response (GSR), skin temperature (ST), PPG, breathing rate (BR), heart rate variability (HRV, e.g. from ECG)) or contextual, including pressure (Bar/Alt), GPS location, acceleration (Acc), magnetic field (Mag), etc. Contextual data may include 'perceived difficulty' of bike trails, for example, evaluated afterwards by an experienced rider using a dynamic scale from 0 (no difficulty) to 100 (hardest difficulty) whilst watching his/her point-of-view video recording. The context measures may be used as predictors for physiological changes. Other data also included is photographic data (photos, barcode scans, NFC scans), for example. Accordingly, the data inputs 2 represent parameters for a user which are measured, sensed, detected or received via signals and/or data, and which may be further analysed to estimate the emotional state, affect or mood, or the onset of or change to such a state. It will be appreciated that the described parameters are not an exhaustive list.
The recorded data may be compiled as a "batch", e.g. a list of times at which events (audio or video for example) happened or not, forming a batch upload 9. Further, recorded data may be uploaded to the cloud in real time, including data such as event tags, diary reports, survey response, video tags or audio tags. With reference to Figure 1, the PC 100 receives batch and live data uploads from the 'mobile' part of the system 101.
The way in which event timings are recorded and compiled is important and will be described here. Different time points include an event time and an associated value, for multiple events being recorded. The "sync app" configures the device 1 as a 'controller' which records the start times (starting points) and associated values of the data inputs. For example, a start command or message is sent from the controller 1 to an audio input device to receive audio data. There is a lag time after the command/message is sent, so the controller sends further queries to check when the audio device started recording. In effect, the controller asks repeatedly whether the audio recording has started before eventually getting a response. This repeated querying is preferably periodical.
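The repeated-querying step can be sketched as a polling loop (a minimal sketch under stated assumptions: `query_started()` is an assumed device interface returning None until the device is recording, and the names here are illustrative, not from the application):

```python
import time

def await_start_signals(devices, poll_interval=0.01, timeout=5.0):
    """Repeatedly query every device until each reports that it has
    started recording, and log the controller-clock time of each
    positive reply together with the signal value received."""
    records = {}
    deadline = time.monotonic() + timeout
    pending = set(devices)
    while pending and time.monotonic() < deadline:
        for dev in list(pending):
            signal = dev.query_started()  # assumed device API
            if signal is not None:
                # time according to the controller's own timer
                records[dev.name] = (time.monotonic(), signal)
                pending.discard(dev)
        time.sleep(poll_interval)
    return records
```

A monotonic clock is used deliberately, since the controller's timer must not jump if the wall clock is adjusted mid-session.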
Accordingly, the controller 1 records not the times at which events happen according to the devices' own clocks, but when they actually happen according to its own clock at which it sends the query (taking lag time into account). For example, the controller 1 may be linked ("hooked") to a wider broadcast network - the external production sync module (timecode) 11 in Figure 2. Further, the controller 1 may be linked ("hooked") to other synchronisation systems representing an external timing sync module 12. When 'watching' an event, such as the audio event (audio data stream), the start times and start values of the event are recorded and compiled as metadata. A start command may be sent by the controller checking whether the audio device has started recording. In effect, the controller 1 sends repeated commands, asking periodically whether the device has started and eventually getting a response. Watching an event therefore leads to recording the event, marking the data, storing the data and compiling data from the various data streams. For example, the recording of metadata can take the form of a CSV file with two or more columns to record the event which occurred and the time at which the event occurred. The compiling step can be done either on the controller 1 or externally in a data processing module 5, which may be internal or external to the device 1 (i.e. on the API (PC) device 100).
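The two-column CSV metadata described above can be sketched as follows (an illustrative sketch; the column names and fixed-point formatting are assumptions, not from the application):

```python
import csv

def write_event_metadata(events, fileobj):
    """Write (event, controller_time) pairs as a two-column CSV file,
    one row per watched event; further columns could hold signal
    values."""
    writer = csv.writer(fileobj)
    writer.writerow(["event", "time"])
    for name, t in events:
        writer.writerow([name, f"{t:.3f}"])
```

The resulting file is what the compiling step, on the controller or in the data processing module 5, would later merge across streams.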
Biometric data differs slightly from textual data because the biometric sensors record data continuously; drop-in points are nevertheless still encoded.
Advantageously, the system effectively synchronises the various data streams by starting and stopping all the data streams simultaneously. Further, the issue of each sensor having a different recording frequency, which adds further difficulty to data analysis, is overcome: the system extrapolates the lowest-frequency data recordings with statistical methods so that they can be compared with the highest-frequency data recording.
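One minimal statistical method for this (a sketch only; the application does not specify the method, and linear interpolation is an assumption here) is to resample the low-frequency stream onto the timestamps of the highest-frequency stream:

```python
def upsample(samples, target_times):
    """Linearly interpolate a low-frequency stream of (time, value)
    pairs onto the timestamps of a higher-frequency stream, so the two
    can be compared sample-for-sample. Values outside the sampled
    range are clamped to the nearest endpoint."""
    out = []
    i = 0
    for t in target_times:
        # advance to the last sample at or before t
        while i + 1 < len(samples) and samples[i + 1][0] <= t:
            i += 1
        t0, v0 = samples[i]
        if i + 1 < len(samples):
            t1, v1 = samples[i + 1]
            if t1 > t0:
                frac = min(max((t - t0) / (t1 - t0), 0.0), 1.0)
                out.append(v0 + frac * (v1 - v0))
                continue
        out.append(v0)
    return out
```

More sophisticated alternatives (spline interpolation, low-pass filtering) follow the same shape: one common time axis for all streams.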
The processed data is displayed on a display 6 of the mobile device 1 (or external to the mobile device 1). Processed data (i.e. outputs or output data) may also be used to provide real-time feedback 7 (such as visualisations) to the user, or sent to offsite data storage (such as the cloud). Feedback may be audio, visual or haptic feedback, for example.
Data outputs may also be sent to a 'beacons framework' 8 which detects an emotional peak. When an emotional peak is detected this may be sent as a notification 80 to the user via the mobile device 1. The beacons framework 8 may also receive processed data output from the device 100 as shown in Figure 1.
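A minimal sketch of such emotional peak detection (the threshold value and names here are illustrative assumptions; the beacons framework is not specified at this level of detail) could flag rising-edge threshold crossings in an arousal signal:

```python
def detect_peaks(arousal, threshold=0.8):
    """Return the indices at which the arousal signal rises above the
    threshold (rising edges only), each of which could trigger a
    notification to the user."""
    peaks = []
    for i in range(1, len(arousal)):
        if arousal[i] >= threshold and arousal[i - 1] < threshold:
            peaks.append(i)
    return peaks
```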
Emotions processing in this example is based on a Valence, Arousal and Dominance (VAD) model. Valence refers to how positive or negative a user feels, arousal refers to how stimulated the user is, and dominance refers to how in control the user feels. These three parameters of the model are used in what is called the 'circumplex model' (http://medical-dictionary.thefreedictionary.com/circumplex+model and https://encyclopedia.thefreedictionary.com/Interpersonal+Circumplex). Estimating emotions is possible by combining these autonomic responses - i.e. the purely physical - with the deliberate, be they "social" or "communicative". This allows for an estimation of emotional responses in an enriched manner.
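As an illustration (the quadrant labels below are common circumplex-style examples chosen for this sketch, not terms from the application), a (valence, arousal) pair can be placed on the circumplex as an angle, an intensity and a quadrant:

```python
import math

def circumplex_position(valence, arousal):
    """Map a (valence, arousal) pair, each assumed in [-1, 1], to an
    angle (degrees, anticlockwise from positive valence), an intensity
    (distance from neutral) and an illustrative quadrant label."""
    angle = math.degrees(math.atan2(arousal, valence)) % 360
    intensity = math.hypot(valence, arousal)
    if valence >= 0 and arousal >= 0:
        label = "excited/happy"       # positive valence, high arousal
    elif valence < 0 <= arousal:
        label = "tense/angry"         # negative valence, high arousal
    elif valence < 0:
        label = "sad/depressed"       # negative valence, low arousal
    else:
        label = "calm/relaxed"        # positive valence, low arousal
    return angle, intensity, label
```

The dominance dimension of the VAD model would add a third axis to this picture.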
Turning now to Figure 3, an optional Web/Desktop part of the system 102 comprises a Web/Desktop device 52 with a user login and a session login. The Web/Desktop device 52 provides live data uploads 60 to the PC 100 (as shown in Figure 1). A setup experience module 80 and tracker 81, in this example an emoji tracker, also send data to the PC 100. The setup experience module 80 comprises survey, audio and video experience and can also feed data to the mobile experience module 3. Data from the setup experience module 80 is sent to a capture (webapp) module 82, which captures survey responses, audio recording, video recording and biometric data capture in this example. The data capture is then communicated to the Web/Desktop device 52.
With reference to Figure 4, third party tools 103 may include a number of plugins, in this example: 'Game Engine Plugins' 160, 'Mobile SDKs' 170, iOS SDKs 180, Native Libraries 182 (local and live compiled libraries), and Embedded Computing 184 and 'chips'. 'Chips' refers to the incorporation of all stages of a pipeline running on a microprocessor specifically designed for that pipeline. With reference to Figure 5, the data flow is from the mobile capture module 2 feeding data streams to the mobile device 1, whilst the synchronising and data processing are carried out by the mobile device 1 and the data processing module (e.g. cloud API) as described above. 'Data humanisation' refers to the process whereby data can be fed back to the user as various forms of sensory feedback, from any synchronised data, including visuals, sounds or vibrations for example.
Continuous playback or skipping functions may be available such that the CSV data set and the synchronicity data which comprises it are aligned. The functionality of the mobile device SDKs 170, 180 in connection with the API 190 and the algorithm for data processing of emotions (module 200) is illustrated in Figure 6. Data streams are fed into the mobile device 1 as described above. In particular, 'live data' is provided by a periodical 'data dump', with data being uploaded every minute, for example. This data could be gathered over particular measures of time and evaluated for signal-to-noise ratio. Benchmark checking and pattern recognition may be performed for an individual, for example. The user has SDK authorisation control. API commands include authentication and data upload, leading to an SDK notification. Emotions processing in this example is based on the VAD model. Notifications and arousal values are collected periodically by an SDK call and response from the API.
In performance-critical environments it is valuable to measure the situational data and emotional data in particular, and make them applicable to the athlete's performance and training programs. Advantageously, the system according to embodiments of the invention records multiple data streams in an effective manner to analyse emotions, also supporting interpretation and communication of that data, transforming it into visual and concise information to help athletes understand their emotions.
Applications of the present invention are wide and include the consumer car market, as well as motor sports such as F1, in so-called 'human reactive cockpits' to enhance safety and comfort. Other applications which could benefit from human state monitoring include ride-share applications (to assess and improve customer experience), e-sports and affective gaming (that is, gaming which responds to the feelings of a player) involving dynamic play/sensitive content, as well as consumer robotics (e.g. empathic robots in household industries).
It will be appreciated that the order of performance of the steps in any of the embodiments in the present description is not essential, unless required by context or otherwise specified. Thus, most steps may be performed in any order. In addition, any of the embodiments may include more or fewer steps than those disclosed.
Additionally, it will be appreciated that the term "comprising" and its grammatical variants must be interpreted inclusively, unless the context requires otherwise. That is, "comprising" should be interpreted as meaning "including but not limited to".
Moreover, the invention has been described in terms of various specific embodiments. However, it will be appreciated that these are only examples which are used to illustrate the invention without limitation to those specific embodiments.

Claims

1. A system comprising a plurality of devices for sensing, detecting or measuring data associated with a person's experience over time, the system comprising a controller having a timer, the controller configured to:
repeatedly send queries, at times recorded by the controller's timer, to each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience;
record, for each of the plurality of devices, the time according to the timer at which the device sent the signal and signal value received from the queried device; and compile signal values received from the plurality of devices based on the recorded times at which signal values were received, to thereby synchronise data which has been sensed, detected or measured by the plurality of devices.
2. A system according to claim 1, wherein the data associated with a person's
experience over time comprises physiological data and experience context data.
3. A system according to claim 1 or claim 2, wherein the plurality of devices
comprises wearable devices.
4. A system according to any one of the preceding claims, wherein the plurality of devices comprises smart textiles.
5. A system according to any one of the preceding claims, wherein the system further comprises a processor for estimating a human state or changes to a human state based on the synchronised data.
6. A system according to claim 5, wherein the processor is further configured to
provide feedback of the human state or of the changes to a human state.
7. A system according to claim 5 or claim 6, wherein estimating a human state or changes to a human state comprises using the circumplex model.
8. A system according to any one of the preceding claims, wherein each one of the plurality of devices has an internal clock and each internal clock runs at different speeds.
9. A system according to claim 8, wherein the plurality of devices start sensing,
detecting or measuring data associated with the experience at a plurality of different times according to their respective internal clocks.
10. A system according to any one of the preceding claims wherein compiling signal values comprises aligning signal values received from one of the plurality of devices based on the controller's timer.
11. An automotive system comprising a system according to any one of the preceding claims.
12. A gaming system comprising a system according to any of claims 1 to 10.
13. A method of synchronising data which has been sensed, detected or measured by a plurality of devices, the data being associated with a person's experience over time, the method comprising the steps of:
repeatedly querying, at times recorded by a controller's timer, each of the plurality of devices, wherein each query requests a signal from the queried device that the device has started to sense, detect or measure data associated with the experience; recording, for each of the plurality of devices, the time according to the timer at which the device sent the signal and signal value received from the queried device; and
compiling signal values received from the plurality of devices based on the recorded times at which signal values were received.
PCT/GB2018/053144 2017-11-03 2018-10-31 Systems and methods for estimating human states WO2019086861A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1718236.1A GB201718236D0 (en) 2017-11-03 2017-11-03 Systems and methods for estimating emotional states
GB1718236.1 2017-11-03

Publications (1)

Publication Number Publication Date
WO2019086861A1 true WO2019086861A1 (en) 2019-05-09

Family

ID=60664749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/053144 WO2019086861A1 (en) 2017-11-03 2018-10-31 Systems and methods for estimating human states

Country Status (2)

Country Link
GB (1) GB201718236D0 (en)
WO (1) WO2019086861A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111939556A (en) * 2019-05-15 2020-11-17 腾讯科技(深圳)有限公司 Method, device and system for detecting abnormal operation of game
US11496653B2 (en) * 2020-07-02 2022-11-08 Shimadzu Corporation Measurement recording system and measurement recording method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110152632A1 (en) * 2008-08-06 2011-06-23 E-Vitae Pte. Ltd. Universal Body Sensor Network
US20120290266A1 (en) * 2011-05-13 2012-11-15 Fujitsu Limited Data Aggregation Platform
WO2016150981A1 (en) * 2015-03-23 2016-09-29 Koninklijke Philips N.V. Smart plurality of sensors for power management
WO2016193438A1 (en) * 2015-06-03 2016-12-08 Cortec Gmbh Method and system for processing data streams
WO2017050951A1 (en) * 2015-09-25 2017-03-30 Continental Automotive Gmbh Active motor vehicle instrument cluster system with integrated wearable device



Also Published As

Publication number Publication date
GB201718236D0 (en) 2017-12-20

Similar Documents

Publication Publication Date Title
US20210005224A1 (en) System and Method for Determining a State of a User
US10901509B2 (en) Wearable computing apparatus and method
US20210059591A1 (en) Systems and methods for estimating and predicting emotional states and affects and providing real time feedback
CN108574701B (en) System and method for determining user status
US9723992B2 (en) Mental state analysis using blink rate
US20170143246A1 (en) Systems and methods for estimating and predicting emotional states and affects and providing real time feedback
US20130245396A1 (en) Mental state analysis using wearable-camera devices
EP3164802B1 (en) Method of collecting and processing computer user data during interaction with web-based content
US9204836B2 (en) Sporadic collection of mobile affect data
US20180107793A1 (en) Health activity monitoring and work scheduling
US9934425B2 (en) Collection of affect data from multiple mobile devices
US9646046B2 (en) Mental state data tagging for data collected from multiple sources
US20110301433A1 (en) Mental state analysis using web services
WO2019136485A1 (en) Content generation and control using sensor data for detection of neurological state
US20210401338A1 (en) Systems and methods for estimating and predicting emotional states and affects and providing real time feedback
KR20200020014A (en) Consumption of content with reactions of an individual
WO2019086856A1 (en) Systems and methods for combining and analysing human states
JP2016115057A (en) Biological information processing system, server system, biological information processing apparatus, biological information processing method, and program
WO2019086861A1 (en) Systems and methods for estimating human states
US20130052621A1 (en) Mental state analysis of voters
CN103186701A (en) Method, system and equipment for analyzing eating habits
JP7257381B2 (en) Judgment system and judgment method
WO2014106216A1 (en) Collection of affect data from multiple mobile devices
US20230032290A1 (en) Immersion assessment system and associated methods
EP3503565B1 (en) Method for determining of at least one content parameter of video data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18800278

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18800278

Country of ref document: EP

Kind code of ref document: A1