WO2019048883A1 - Wearable device for collective emotion expression - Google Patents

Wearable device for collective emotion expression Download PDF

Info

Publication number
WO2019048883A1
Authority
WO
WIPO (PCT)
Prior art keywords
emotional state
user
data
emotional
determined
Prior art date
Application number
PCT/GB2018/052557
Other languages
French (fr)
Inventor
Simon Woollard
Steve Micheal MADINCEA
Original Assignee
Fantastec Sports Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fantastec Sports Technology Ltd filed Critical Fantastec Sports Technology Ltd
Publication of WO2019048883A1 publication Critical patent/WO2019048883A1/en

Links

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 — Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0022 — Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/01 — Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B 5/11 — Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/14517 — Measuring characteristics of body fluids other than blood, specially adapted for sweat
    • A61B 5/165 — Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/318 — Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • G — PHYSICS
    • G06F 1/163 — Wearable computers, e.g. on a belt
    • G16H 40/63 — ICT specially adapted for the operation of medical equipment or devices, for local operation
    • G16H 40/67 — ICT specially adapted for the operation of medical equipment or devices, for remote operation
    • G16H 50/20 — ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • This invention relates to wearable devices, in particular to wearable devices incorporating one or more sensors for providing user-related data.
  • Wearable devices including biometric sensors, electrodes, accelerometers and gyroscopes for collecting user data are known. However, there remains a need for further solutions for using data obtained from such devices.
  • the present invention provides a wearable device (such as a wrist band, although alternatives are possible, such as an in-ear wearable device, an adhesive flexible unit (like a silicone or graphene plaster that contains microconductor circuitry), a micro unit surgically implanted under the skin or in the ear, or other surface wearable devices like a hat or a head band) comprising: one or more sensors for determining user-related data (such as ECG heart rate, skin temperature and/or EDA activity (e.g. skin salinity) sensors); a processor for using the determined user-related data to determine an emotional state of the user; and an output display providing a visual indication (e.g. using light-emitting diodes) of a first emotional state (e.g. the determined emotional state of the user or an aggregated emotional state derived from emotional states of multiple users).
  • the present invention also provides a method comprising: determining user-related data (such as heart rate, skin temperature and/or skin salinity and motion); using the determined user- related data to determine an emotional state of the user; and displaying a visual indication of a first emotional state (e.g. the determined emotional state of the user or an aggregated emotional state derived from emotional states of multiple users).
  • aggregation may be derived from an emotional state of multiple users that share a defined attribute with the user (e.g. they support the same team or athlete).
  • the output display provides: a rapid pulsing of a first colour when the first emotional state has a first value (e.g. elated); a slow pulsing of the first colour when the first emotional state has a second value (e.g. excited); an unpulsed output (e.g. of the first colour) when the first emotional state has a third value (e.g. passive or neutral); and an output of a second colour (e.g. unpulsed) when the first emotional state has a fourth value (e.g. sad).
  • a data transmission module may be provided for providing data related to the emotional state of the user to a server or a mobile device.
  • a user interface is provided. This could be part of the device, but could be provided as part of a mobile phone (or similar) application.
  • the first emotional state indicated by the output display may be selectable using the user interface from (at least) the following options: the determined emotional state of the user; and an aggregated emotional state derived from emotional states of multiple users.
  • the present invention also provides a system comprising a plurality of wearable devices as described herein and further comprising a server, wherein the server comprises: an input for receiving data from each of said wearable devices; a processor for
  • the present invention also provides a computer program or a computer program product having computer code (or some other means) configured to: determine user-related data (such as heart rate, skin temperature and/or skin salinity and motion); use the determined user-related data to determine an emotional state of the user; and display a visual indication of a first emotional state.
  • Figure 1 is a block diagram of a system in accordance with an aspect of the present invention.
  • Figure 2 shows further detail of part of the system of Figure 1;
  • FIG. 3 shows further detail of part of the system of Figure 1;
  • Figure 4 is a block diagram of a system in accordance with an aspect of the present invention.
  • FIG. 5 shows further details of a remote system in accordance with an aspect of the present invention
  • Figure 6 is a flow chart demonstrating an aspect of the present invention
  • Figure 7 is a flow chart demonstrating a further aspect of the present invention.
  • FIG. 1 is a block diagram of a system, indicated generally by the reference numeral 1, in accordance with an aspect of the present invention.
  • the exemplary system 1 comprises sensors 2, a processor 4, display 6, user interface 8, a remote system 10 and a mobile device 11.
  • the sensors 2, processor 4, display 6 and user interface 8 may be incorporated into a wearable device, such as a wrist band, indicated by the reference numeral 12.
  • the sensors 2 are used to collect data concerning a user of the device 12, which data is used by the processor 4 to provide an indication of the emotion of the user (e.g. happy, sad, frustrated, elated).
  • the processor 4 is able to translate the detected emotion (typically in real time) into a light signature that can be displayed using the display 6. In this way, the user's emotion state can be displayed.
  • the processor 4 can transmit and time-stamp the user's emotional status to the remote system 10 (such as a cloud database).
  • the remote system can therefore curate and store data relating to the emotional status of multiple users (such as multiple spectators at a sporting event).
  • the remote system 10 may include a processor 13 and a memory 14.
  • communications between the remote system 10 and the processor 4 take place via a mobile communication device 11 (e.g. a so-called smart phone), generally referred to herein as a mobile device.
  • the processor and the mobile device may communicate via a local communication system, such as Bluetooth.
  • the mobile device can then communicate with the remote system 10 via any suitable means.
  • the mobile device 11 may therefore be used to collate emotional data before
  • the sensors 2 include first, second, third and fourth sensors 22 to 28.
  • the first sensor 22 is an electrocardiography (ECG) sensor that is used to measure heart rate (amongst other things)
  • the second sensor 24 is a skin temperature sensor
  • the third sensor 26 is an electrodermal activity (EDA) sensor used to measure skin salinity (amongst other things)
  • the fourth sensor 28 is an accelerometer that is used to track motion.
  • the sensors shown in Figure 2 are provided by way of example only; many different configurations are possible (for example, a gyroscopic sensor could be provided and more or fewer sensors could readily be provided).
  • each of the sensors 22 to 28 are connected to the input of a data analysis module 30.
  • the data analysis module typically forms part of the processor 4 (although it would be possible to provide the data analysis module remote from the device 12, such as within the remote system 10 or within the mobile device 11).
  • the data analysis module 30 takes the data provided by the various sensors and determines an emotional state of the user.
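The patent does not disclose the classification logic used by the data analysis module 30. As an illustrative sketch only, a minimal version could apply hand-tuned thresholds to the four sensor channels (all function names and threshold values below are hypothetical assumptions):

```python
# Illustrative only: the patent does not specify an algorithm.
# Thresholds and feature choices are hypothetical.

def classify_emotion(heart_rate_bpm, skin_temp_c, skin_conductance_us, motion_g):
    """Map raw sensor readings to one of the four emotional states
    named in the patent: elated, excited, passive-neutral, sad."""
    # Count high-arousal indicators: heart rate, electrodermal
    # activity and motion. Three indicators -> elated, two -> excited.
    arousal = (heart_rate_bpm > 100) + (skin_conductance_us > 10) + (motion_g > 1.5)
    if arousal >= 3:
        return "elated"
    if arousal == 2:
        return "excited"
    # Low arousal: use skin temperature as a crude valence proxy.
    return "passive-neutral" if skin_temp_c >= 33.0 else "sad"
```

A production device would more plausibly use a trained model over windowed features (e.g. heart rate variability, EDA peak counts), but the threshold sketch shows the shape of the sensor-to-state mapping.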
  • the body's 'fight or flight' response, which prepares a person to fight back or run away from danger, is driven by the sympathetic branch of the Autonomic Nervous System (ANS).
  • to recover, the body relaxes muscles, calms the heartbeat, and generally slows the body's processes down. This happens through the activation of another branch of the ANS called the parasympathetic nervous system (PNS).
  • Activation of both the PNS and the sympathetic branches of the ANS can be measured through electrodermal (e.g. sweat gland) and cardiovascular (i.e., blood circulatory system) variations.
  • Parasympathetic activation occurs when your body needs to slow down and relax. It can be stimulated by the consumption of a large meal or deep breathing.
  • ECG analysis is substantially more accurate than the simple pulse-rate measurements used by many existing fitness trackers.
  • Electrodermal activity (EDA) sensors can be used to monitor a user's physiological status as a consequence of their external stimuli.
  • Sympathetic activation also increases with stresses (e.g. physical, emotional, or cognitive stresses). Stress can be detected as changes in heart rate variability (the variation in the time between the peaks of successive heart beats), which occur as a result of both sympathetic and parasympathetic activation.
  • an estimate of vagal tone can be made by extracting the high-frequency component of this heart rate variability.
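The patent does not give a formula for the vagal-tone estimate. A common time-domain proxy for the high-frequency component of heart rate variability is RMSSD (root mean square of successive differences between beat intervals); the sketch below is an illustrative assumption, not the patent's method:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR (beat-to-beat)
    intervals, in milliseconds. RMSSD reflects short-term, high-frequency
    heart rate variability and is a standard proxy for vagal
    (parasympathetic) tone."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A relaxed user typically shows larger beat-to-beat variation
# than a stressed one, so their RMSSD is higher.
relaxed = [820, 870, 810, 880, 815, 875]
stressed = [600, 605, 598, 602, 601, 599]
assert rmssd(relaxed) > rmssd(stressed)
```

A frequency-domain implementation would instead band-pass the RR series around 0.15-0.4 Hz, but RMSSD needs no spectral analysis and runs comfortably on a wearable's microcontroller.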
  • the skin is the only organ that is purely innervated by the sympathetic nervous system (and not affected by parasympathetic activation). We can observe increases in sympathetic activation by monitoring subtle electrical changes across the surface of the skin.
  • the emotions are categorised and labelled as: elation, excitement, passive- neutral and sad.
  • different emotions may be detected, for example: excitement, arousal, apathy and stress.
  • Figure 3 shows further detail of part of the system of Figure 1.
  • the data analysis module 30 provides emotion data to a data transmission module 32. Both the data analysis module 30 and the data transmission module 32 are part of the processor 4.
  • the data transmission module 32 provides information to the display 6 and the remote system 10.
  • the data transmission module 32 may also receive information from the remote system 10 (e.g. via the mobile device 11).
  • the data analysis module 30 uses sensor data to determine whether the user is elated, excited, passive-neutral or sad.
  • the data transmission module 32 converts these sensed emotions into information for display using the display device 6.
  • the following displays may be provided using the display device 6: in the event that the user is elated, a rapid pulsing of a first colour is displayed; in the event that the user is excited, a slow pulsing of the first colour is displayed; in the event that the user is passive-neutral, an unpulsed light (possibly of the first colour) is displayed; and in the event that the user is sad, a second colour is displayed.
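The emotion-to-display mapping above can be sketched as a lookup table. The pulse rates in Hz are illustrative assumptions; the patent gives no timings:

```python
# Display signatures for the four emotional states. "first" and
# "second" stand for the two colours described in the text; the
# pulse frequencies are hypothetical.
SIGNATURES = {
    "elated":          {"colour": "first",  "pulse_hz": 4.0},  # rapid pulsing
    "excited":         {"colour": "first",  "pulse_hz": 1.0},  # slow pulsing
    "passive-neutral": {"colour": "first",  "pulse_hz": 0.0},  # unpulsed
    "sad":             {"colour": "second", "pulse_hz": 0.0},  # unpulsed, 2nd colour
}

def light_signature(emotional_state):
    """Return the LED signature for a determined emotional state."""
    return SIGNATURES[emotional_state]
```

Keeping the mapping in data rather than code also fits the later description of the team shirt colour selection module, which overrides colours per team.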
  • the display 6 may be implemented using a bank of light-emitting diodes that form part of the wearable device 12. Clearly, many alternatives to such an arrangement are possible. For example, a liquid crystal output or a plasma output may be provided.
  • the invention enables the emotions of a single user to be displayed using the wearable device.
  • a visual display of multiple users all wearing devices such as the device 12 can provide an interesting visual indication of the emotions of the group. This is particularly interesting in the context of a sporting event in which different groups within a stadium will often experience contrasting emotions.
  • the data transmission module 32 can also provide data to a remote system 10 and/or the mobile device 11.
  • the data transmitted may, for example, be the raw data output by the sensors 2, but it is more likely that the data transmission module will provide the processed emotion data (e.g. elated, excited, passive-neutral, sad) to the remote system 10 or the mobile device 11 (typically together with a time-stamp).
  • Data transmitted to the mobile device 11 could be further transmitted, if desired as an option by the relevant user, to the remote system 10, such as a cloud database, where it could, for example, be curated and further analysed for the purpose of providing aggregated statistics of a community's emotional data.
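The text describes transmitting processed emotion data together with a time-stamp, but specifies no wire format. A minimal sketch of such a payload, with hypothetical field names, might be:

```python
import json
import time

def emotion_payload(user_id, emotion):
    """Package a processed emotional state with a time-stamp, as
    described for the data transmission module 32. The JSON field
    names are assumptions; the patent defines no format."""
    return json.dumps({
        "user": user_id,
        "emotion": emotion,              # elated/excited/passive-neutral/sad
        "timestamp": int(time.time()),   # seconds since epoch
    })
```

Sending the small processed label rather than raw sensor streams keeps the Bluetooth link to the mobile device 11 and the onward upload to the remote system 10 lightweight.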
  • Figure 4 is a block diagram of a system, indicated generally by the reference numeral 40, comprising multiple wearable devices 12a, 12b and 12c (each similar to the device 12 described above) and the remote system 10.
  • Each of the wearable devices 12a, 12b and 12c is in two-way communication with the remote system 10 (often using a respective mobile device 11).
  • the two-way communication may take many forms, but is typically implemented using cloud-technology. Note that not all of the devices 12a, 12b and 12c need to be in the same location. For example, in the context of a stadium event, some of the users of the devices may be at the stadium and some of the users of the devices may be watching the event on television (e.g. at home or in a bar).
  • the remote system 10 can receive emotion data from the users of the devices 12a, 12b and 12c (typically from the respective data transmission modules 32 or from respective mobile devices 11). By receiving emotion data from multiple users attending the same event, the remote system is able to perform steps such as aggregating emotional responses amongst large groups of users.
  • the system 40 also includes a further user 42.
  • the user 42 is able to access data collected by the remote system 10, but is not a provider of data (e.g. does not have a wearable device 12).
  • Figure 5 shows further details of the remote system 10 in accordance with an aspect of the present invention.
  • the remote system 10 is being used as part of a stadium sports application.
  • the remote system 10 includes an emotional data curation module 52, a team shirt colour selection module 54, an aggregated emotional status of a fan community module 56 and a live emotional analysis and reporting module 58.
  • the emotional data curation module 52 is used to collate and store emotional data from multiple users, for example as received from the devices 12a, 12b and 12c.
  • the team shirt colour selection module 54 can be used to set the colours used by the displays 6 of the various devices (such as devices 12a, 12b and 12c). For example, a particular user may be registered as a fan of a particular team. When attending an event in which that team is playing, the team shirt colours database may be
  • the team shirt colour selection module may provide a default colour and lighting signature for each respective team based on home and away team colours (users may have the option to change these if desired).
  • the "elation" and "excitement" emotional statuses may be indicated using a red output of the display.
  • the "sad" emotional status may be indicated by a contrasting colour (e.g. blue, but not a colour represented by the opposition in that particular match).
  • the aggregated emotional status of a fan community module 56 can be used to generate averages of emotional response from multiple users.
  • averages may be generated for users supporting each team.
  • the averages may be generated in many ways (e.g. mean, mode or median averages).
  • the averages may be weighted in some way (e.g. with "elated" being a much more significant measure than simply "excited").
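The weighted aggregation hinted at above can be sketched by scoring each state numerically and returning the state nearest the weighted mean. The weights are hypothetical assumptions chosen only to make "elated" count for more than "excited":

```python
# Hypothetical per-state weights: "elated" deliberately outweighs
# "excited", as the text suggests.
WEIGHTS = {"sad": -1.0, "passive-neutral": 0.0, "excited": 1.0, "elated": 3.0}

def aggregate(emotions):
    """Return a community-level emotional state for a list of
    per-user states (e.g. all fans of one team at an event)."""
    mean = sum(WEIGHTS[e] for e in emotions) / len(emotions)
    # Report the discrete state whose weight is closest to the mean.
    return min(WEIGHTS, key=lambda state: abs(WEIGHTS[state] - mean))
```

The same structure accommodates the mean, mode or median variants mentioned above: only the reduction over the scored list changes.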
  • the live emotional analysis and reporting module 58 may be provided to allow users (whether one of the users of the devices 12a, 12b or 12c, or the further user 42) to access data concerning the event under consideration.
  • the module 58 could be used to provide standard reports and analysis in addition to raw data.
  • the analysis and reporting module 58 could be extended to provide many forms of information. Examples include the emotional high-points and low-points of an event, an indication of the relative happiness of opposing fans at a sporting event (perhaps plotted over time), differences in reactions between fans at a stadium and fans remote from the stadium, most exciting players etc.
  • the device 12 includes a user interface 8.
  • the user interface may be provided as part of the display 6, but many alternatives are possible.
  • One example is to provide the user interface 8 as part of a mobile phone (or similar) application.
  • the user interface 8 enables a user to tailor the settings of their own wearable device.
  • Figure 6 shows an exemplary application programming interface (API), indicated generally by the reference numeral 60, that enables a user to select preferences regarding how they allow emotional data to be collected and shared with a wide audience, such as a team-follower community.
  • the user selects a sport or community type (such as football).
  • the user selects a team or community.
  • light preferences are set. These may be predefined (by virtue of the selection made at step 62), but even if pre-defined, the user may have the option of over-riding such selections.
  • preferences selected at step 63 are matched with fixture data.
  • By way of example, home and away colours (or light signatures) may be defined at step 63, with the match fixture data 64 identifying whether a team selected at step 62 is playing at home or away on that particular day.
  • emotional preferences can be set. For example, a user may want to enable the detection of all possible emotional states, or may wish to enable only a subset of those.
  • light preferences can be set.
  • the lighting of the device 12 may be turned on or off.
  • the lighting option may be tailored in some way. For example, the user may select between displaying personal emotions and community emotions (e.g. the community defined at step 62).
  • transmit preferences can be set.
  • the user may enable or disable the transmission of emotion data from the user to the remote system 10.
  • the user may be able to decide whether or not to receive aggregated emotional data concerning a particular community (e.g. the community selected at step 62).
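The preference choices described above (sport/community, team, emotional, light and transmit preferences) could be collected in a single settings object. This is a hypothetical sketch; all field names and defaults are assumptions, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical container for the preferences gathered by the API 60.
@dataclass
class UserPreferences:
    sport: str = "football"            # sport or community type
    team: str = ""                     # team or community
    detect_all_emotions: bool = True   # or enable only a subset of states
    lighting_enabled: bool = True      # device lighting on/off
    show_community_mood: bool = False  # personal vs community emotions
    transmit_enabled: bool = True      # share emotion data with the server
    receive_aggregated: bool = True    # receive community aggregates

# Example: a fan who wants the wrist band to show the crowd's mood.
prefs = UserPreferences(team="Home FC", show_community_mood=True)
```

Storing the preferences as one serialisable object would let the mobile application push them to the wearable in a single update.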
  • the API 60 is provided by way of example only. Many variants and alternatives will be readily apparent to the person skilled in the art. For example, some of the steps of the API 60 may be omitted or may only be available to some users. As described above, the display 6 may be used to display emotions of the user of the wearable device 12. An alternative is for the display 6 to display an average emotion of multiple users, such as the users of the devices 12a, 12b and 12c shown in Figure 4.
  • the user interface 8 could offer the following options for the display 6 (e.g. using the API 60 described above):
  • the colours used output by a particular device 12 may be configurable (for example to those of the user's team or to any selected colour).
  • the user setting may allow a user to turn off the emotional display entirely.
  • Figure 7 is a flow chart showing an algorithm, indicated generally by the reference numeral 70, in accordance with an aspect of the present invention.
  • the algorithm starts at step 72 where user-related data is determined (using the sensors of the wearable device).
  • at step 74, the emotional state of the user of the wearable device is determined.
  • at step 76, an emotional state is displayed using the display 6.
  • the emotional state displayed at step 76 may be the emotional state determined at step 74, but alternatives are possible.
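The three steps of algorithm 70 reduce to a simple sense-classify-display loop iteration. The helper callables below are placeholders standing in for the sensors 2, the data analysis module 30 and the display 6:

```python
# One iteration of algorithm 70 (Figure 7). The callables are
# hypothetical stand-ins for the real hardware and analysis modules.
def step(read_sensors, determine_emotion, display):
    data = read_sensors()              # step 72: determine user-related data
    emotion = determine_emotion(data)  # step 74: determine emotional state
    display(emotion)                   # step 76: display an emotional state
    return emotion

# Example with stubbed modules:
shown = []
result = step(lambda: {"heart_rate": 120},
              lambda data: "excited",
              shown.append)
assert result == "excited" and shown == ["excited"]
```

The final bullet's point is captured by the `display` argument: the state passed to it need not be the one determined at step 74 (it could instead be a community aggregate received from the remote system 10).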

Abstract

A wearable device is provided including one or more sensors for providing user-related data (such as heart rate, skin temperature, skin salinity and/or motion). The wearable device also includes a processor for using the user-related data to determine an emotional state of the user and an output display providing a visual indication of a first emotional state.

Description

WEARABLE DEVICE FOR COLLECTIVE EMOTION EXPRESSION
FIELD OF THE INVENTION
This invention relates to wearable devices, in particular to wearable devices incorporating one or more sensors for providing user-related data.
BACKGROUND
Wearable devices including biometric sensors, electrodes, accelerometers and gyroscopes for collecting user data are known. However, there remains a need for further solutions for using data obtained from such devices.
SUMMARY OF THE INVENTION
The present invention provides a wearable device (such as a wrist band, although alternatives are possible, such as an in-ear wearable device, an adhesive flexible unit (like a silicone or graphene plaster that contains microconductor circuitry), a micro unit surgically implanted under the skin or in the ear, or other surface wearable devices like a hat or a head band) comprising: one or more sensors for determining user-related data (such as ECG heart rate, skin temperature and/or EDA activity (e.g. skin salinity) sensors); a processor for using the determined user-related data to determine an emotional state of the user; and an output display providing a visual indication (e.g. using light-emitting diodes) of a first emotional state (e.g. the determined emotional state of the user or an aggregated emotional state derived from emotional states of multiple users). The present invention also provides a method comprising: determining user-related data (such as heart rate, skin temperature and/or skin salinity and motion); using the determined user-related data to determine an emotional state of the user; and displaying a visual indication of a first emotional state (e.g. the determined emotional state of the user or an aggregated emotional state derived from emotional states of multiple users).
Where a visual indication of an aggregated emotional state is displayed, that aggregation may be derived from an emotional state of multiple users that share a defined attribute with the user (e.g. they support the same team or athlete).
Alternatively, or in addition, that aggregation may be derived from an emotional state of multiple users that are present at an event together with the said user. In one form of the invention, the output display provides: a rapid pulsing of a first colour when the first emotional state has a first value (e.g. elated); a slow pulsing of the first colour when the first emotional state has a second value (e.g. excited); an unpulsed output (e.g. of the first colour) when the first emotional state has a third value (e.g. passive or neutral); and an output of a second colour (e.g. unpulsed) when the first emotional state has a fourth value (e.g. sad). These options are not essential to all forms of the invention. Any particular embodiment might include a combination of one or more of the outputs set out above, possibly in combination with further options not set out above.
A data transmission module may be provided for providing data related to the emotional state of the user to a server or a mobile device. In some forms of the invention, a user interface is provided. This could be part of the device, but could be provided as part of a mobile phone (or similar) application. In forms of the invention including a user interface, the first emotional state indicated by the output display may be selectable using the user interface from (at least) the following options: the determined emotional state of the user; and an aggregated emotional state derived from emotional states of multiple users.
The present invention also provides a system comprising a plurality of wearable devices as described herein and further comprising a server, wherein the server comprises: an input for receiving data from each of said wearable devices; a processor for
manipulating the data from each of said wearable devices; and an output for providing data to each of said wearable devices.
The present invention also provides a computer program or a computer program product having computer code (or some other means) configured to: determine user-related data (such as heart rate, skin temperature and/or skin salinity and motion); use the determined user-related data to determine an emotional state of the user; and display a visual indication of a first emotional state.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in further detail with reference to the following schematic drawings, in which: Figure 1 is a block diagram of a system in accordance with an aspect of the present invention;
Figure 2 shows further detail of part of the system of Figure 1;
Figure 3 shows further detail of part of the system of Figure 1;
Figure 4 is a block diagram of a system in accordance with an aspect of the present invention;
Figure 5 shows further details of a remote system in accordance with an aspect of the present invention;
Figure 6 is a flow chart demonstrating an aspect of the present invention; and Figure 7 is a flow chart demonstrating a further aspect of the present invention.
DETAILED DESCRIPTION
There are many environments in which multiple people experience strong emotions. One example is stadium sports events, in which large groups of people experience emotions that are strong and may in some cases be shared, whilst in other cases may be opposed. The present invention allows such groups to display such emotions.
Figure 1 is a block diagram of a system, indicated generally by the reference numeral 1, in accordance with an aspect of the present invention. The exemplary system 1 comprises sensors 2, a processor 4, a display 6, a user interface 8, a remote system 10 and a mobile device 11. The sensors 2, processor 4, display 6 and user interface 8 may be incorporated into a wearable device, such as a wrist band, indicated by the reference numeral 12.
The sensors 2 are used to collect data concerning a user of the device 12, which data is used by the processor 4 to provide an indication of the emotion of the user (e.g. happy, sad, frustrated, elated). As described further below, the processor 4 is able to translate the detected emotion (typically in real time) into a light signature that can be displayed using the display 6. In this way, the user's emotional state can be displayed.
At the same time, the processor 4 can time-stamp the user's emotional status and transmit it to the remote system 10 (such as a cloud database). The remote system can therefore curate and store data relating to the emotional status of multiple users (such as multiple spectators at a sporting event). As shown in Figure 1, the remote system 10 may include a processor 13 and a memory 14. In one form of the invention, communications between the remote system 10 and the processor 4 take place via a mobile communication device 11 (e.g. a so-called smart phone), generally referred to herein as a mobile device. The processor and the mobile device may communicate via a local communication system, such as Bluetooth. The mobile device can then communicate with the remote system 10 via any suitable means. The mobile device 11 may therefore be used to collate emotional data before transmitting that data to the remote system. This may have the advantage that the limited power resources of the device 12 are not needed to communicate with the remote system (the short-range transmission to the mobile device 11 generally requiring much less power than the longer-range transmission to the remote system 10). This may be significant where the device has relatively limited power, perhaps provided by a lithium battery (or similar). It should be noted, however, that communications could be made directly between the processor and the remote system such that the mobile device 11 could be omitted.
Figure 2 shows further detail of part of the system 1. As shown in Figure 2, the sensors 2 include first, second, third and fourth sensors 22 to 28. In the example shown in Figure 2, the first sensor 22 is an electrocardiography (ECG) sensor that is used to measure heart rate (amongst other things), the second sensor 24 is a skin temperature sensor, the third sensor 26 is an electrodermal activity (EDA) sensor used to measure skin salinity (amongst other things) and the fourth sensor 28 is an accelerometer that is used to track motion. Of course, the sensors shown in Figure 2 are provided by way of example only; many different configurations are possible (for example, a gyroscopic sensor could be provided, and more or fewer sensors could readily be provided).
As shown in Figure 2, each of the sensors 22 to 28 are connected to the input of a data analysis module 30. The data analysis module typically forms part of the processor 4 (although it would be possible to provide the data analysis module remote from the device 12, such as within the remote system 10 or within the mobile device 11). The data analysis module 30 takes the data provided by the various sensors and determines an emotional state of the user.
The body's 'fight or flight' response, which prepares a person to fight back or run away from danger, is governed by the sympathetic branch of the Autonomic Nervous System (ANS). The counterbalancing 'rest and digest' response takes action by relaxing muscles, calming the heartbeat, and generally slowing the body's processes down. This happens through the activation of another branch of the ANS called the parasympathetic nervous system (PNS). The balance between these branches corresponds to changes in heartbeat patterns (Heart Rate Variability - HRV) and various indicators in the electrical properties of a user's skin. Activation of both the PNS and the sympathetic branches of the ANS can be measured through electrodermal (e.g. sweat gland) and cardiovascular (i.e. blood circulatory system) variations.
Parasympathetic activation occurs when your body needs to slow down and relax. It can be stimulated by the consumption of a large meal or deep breathing.
All our emotions leave a unique fingerprint in this battle between sympathetic and parasympathetic control. Every emotion causes the heart to beat in a slightly different way. A human heart beats with measurably different characteristics when it experiences different emotional stimuli, such as when one is in a rage or in a calm state of serene happiness.
These fingerprints can be read using heart rate variability (HRV) analysis of ECG data provided by the sensors 2 (particularly the sensor 22). ECG analysis is substantially more accurate than many existing fitness trackers that simply use pulse rates.
Furthermore, it is possible to filter out electrical noise due, for example, to muscle contractions, leaving only the electromagnetic feed from the heart (ECG). This means that tracking the R-peaks that identify the unique signals associated with each emotion is possible and is becoming increasingly accurate. Using ECG and R-peak analysis, up to 32 emotions can be tracked, as defined in Plutchik's wheel of emotions, a standard emotion reference tool in medicine and psychology.
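By way of a hedged illustration only, R-peak extraction and the derivation of R-R intervals from sampled ECG data might be sketched as follows. The simple threshold rule, the function names and the sampling rate are illustrative assumptions, not part of the disclosure; a practical device would use a validated detector (such as the Pan-Tompkins algorithm):

```python
# Illustrative sketch only: naive threshold-crossing R-peak detection.
# A real device would use a clinically validated algorithm; this version
# merely shows the idea of extracting R-R intervals for HRV analysis.

def detect_r_peaks(samples, threshold=0.6):
    """Return indices of local maxima above `threshold` (assumed R-peaks)."""
    peaks = []
    for i in range(1, len(samples) - 1):
        if (samples[i] > threshold
                and samples[i] >= samples[i - 1]
                and samples[i] > samples[i + 1]):
            peaks.append(i)
    return peaks

def rr_intervals_ms(peaks, fs):
    """Convert successive peak sample indices to R-R intervals in ms."""
    return [(b - a) * 1000.0 / fs for a, b in zip(peaks, peaks[1:])]
```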
Electrodermal activity (EDA) sensors (such as the sensor 26) can be used to monitor a user's physiological status as a consequence of external stimuli. Sympathetic activation (also sympathetic arousal) increases when a user experiences excitement, or when something important is happening or about to happen. Sympathetic activation also increases with stresses (e.g. physical, emotional, or cognitive stresses). Stress can be detected as changes in heart rate variability (the time between the peaks of successive heart beats), which occur as a result of both sympathetic and parasympathetic activation. Estimates of parasympathetic nervous system activation, or vagal tone, can be made by extracting the high-frequency component of this heart rate variability.
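Extracting the high-frequency component of heart rate variability is typically done spectrally; as a simpler stand-in, the time-domain RMSSD statistic (a widely used proxy for vagal, i.e. parasympathetic, activity) can be computed directly from the R-R intervals. This sketch is illustrative only and is not taken from the disclosure:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive R-R differences, in milliseconds.

    RMSSD is a standard time-domain HRV statistic that tracks
    parasympathetic (vagal) activity; it stands in here for the
    high-frequency spectral extraction described in the text.
    """
    if len(rr_ms) < 2:
        raise ValueError("need at least two R-R intervals")
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```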
The skin is the only organ that is purely innervated by the sympathetic nervous system (and not affected by parasympathetic activation). We can observe increases in sympathetic activation by monitoring subtle electrical changes across the surface of the skin.
Given sufficient data from sensors such as the sensors 22 to 28, many different emotional states can be recognised. These include some of the following: ecstasy, joy, serenity, love, acceptance, trust, admiration, terror, fear, apprehension, awe, submission, amazement, surprise, distraction, grief, sadness, disapproval, pensiveness, loathing, disgust, boredom, remorse, rage, anger, annoyance, contempt, aggressiveness, vigilance, anticipation, interest and optimism.
In the use of the principles of the invention for identifying emotions of a user of a wearable device (e.g. within a stadium sports environment), a subset of possible emotions can be readily identified by the data analysis module 30. In one form of the invention, the emotions are categorised and labelled as: elation, excitement, passive-neutral and sad. In an alternative form of the invention, different emotions may be detected, for example: excitement, arousal, apathy and stress.
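A minimal sketch of how the data analysis module 30 might map sensor features onto the four labelled categories is given below. The feature names and all thresholds are invented for illustration; a practical classifier would be calibrated against physiological data rather than hard-coded rules:

```python
def classify_emotion(heart_rate_bpm, eda_microsiemens, hrv_rmssd_ms):
    """Toy rule-based classifier for {elated, excited, passive-neutral, sad}.

    All thresholds are illustrative placeholders, not clinical values.
    """
    if heart_rate_bpm > 110 and eda_microsiemens > 8.0:
        return "elated"            # high arousal on both channels
    if heart_rate_bpm > 90 or eda_microsiemens > 5.0:
        return "excited"           # moderate sympathetic arousal
    if hrv_rmssd_ms > 60 and heart_rate_bpm < 70:
        return "sad"               # low arousal, high vagal tone (placeholder rule)
    return "passive-neutral"
```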
Figure 3 shows further detail of part of the system of Figure 1. As shown in Figure 3, the data analysis module 30 provides emotion data to a data transmission module 32. Both the data analysis module 30 and the data transmission module 32 are part of the processor 4. The data transmission module 32 provides information to the display 6 and the remote system 10. The data transmission module 32 may also receive information from the remote system 10 (e.g. via the mobile device 11). As noted above, in one form of the invention, the data analysis module 30 uses sensor data to determine whether the user is elated, excited, passive-neutral or sad. The data transmission module 32 converts these sensed emotions into information for display using the display device 6. By way of example, the following displays may be provided using the display device 6:
• In the event that the user is elated, a rapid pulsing of a first colour is displayed;
• In the event that the user is excited, a slow pulsing of the first colour is displayed;
• In the event that the user is passive-neutral, an unpulsed colour light (possibly of the first colour) is displayed; and
• In the event that the user is sad, an unpulsed (i.e. constant) colour light of a second colour (contrasting with the first) is displayed.
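The four display behaviours listed above amount to a lookup from emotional state to a (colour, pulse rate) light signature. A sketch follows, in which the pulse rates and default colours are illustrative assumptions (the actual colours would come from the team shirt colour selection described later):

```python
# Maps each detected emotional state to a (colour slot, pulse_hz) signature.
# Pulse rates are invented for illustration; 0.0 means a constant light.
LIGHT_SIGNATURES = {
    "elated":          ("first",  4.0),   # rapid pulsing of the first colour
    "excited":         ("first",  1.0),   # slow pulsing of the first colour
    "passive-neutral": ("first",  0.0),   # unpulsed (constant) first colour
    "sad":             ("second", 0.0),   # unpulsed contrasting second colour
}

def light_signature(state, first_colour="red", second_colour="blue"):
    """Resolve an emotional state to a concrete (colour, pulse_hz) pair."""
    colour_slot, pulse_hz = LIGHT_SIGNATURES[state]
    colour = first_colour if colour_slot == "first" else second_colour
    return colour, pulse_hz
```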
The display 6 may be implemented using a bank of light-emitting diodes that form part of the wearable device 12. Clearly, many alternatives to such an arrangement are possible. For example, a liquid crystal output or a plasma output may be provided.
In this way, the invention enables a visual display of the emotions of a single user to be displayed using the wearable device. In the context of an event (such as a stadium sports event) in which lots of people participate, a visual display of multiple users all wearing devices such as the device 12 can provide an interesting visual indication of the emotions of the group. This is particularly interesting in the context of a sporting event in which different groups within a stadium will often experience contrasting emotions.
As shown in Figure 3, the data transmission module 32 can also provide data to a remote system 10 and/or the mobile device 11. The data transmitted may, for example, be the raw data output by the sensors 2, but it is more likely that the data transmission module will provide the processed emotion data (e.g. elated, excited, passive-neutral, sad) to the remote system 10 or the mobile device 11 (typically together with a time-stamp). Data transmitted to the mobile device 11 could be further transmitted, if desired as an option by the relevant user, to the remote system 10, such as a cloud database, where it could, for example, be curated and further analysed for the purpose of providing aggregated statistics of a community's emotional data.
Figure 4 is a block diagram of a system, indicated generally by the reference numeral 40, comprising multiple wearable devices 12a, 12b and 12c (each similar to the device 12 described above) and the remote system 10. Each of the wearable devices 12a, 12b and 12c is in two-way communication with the remote system 10 (often using a respective mobile device 11). The two-way communication may take many forms, but is typically implemented using cloud technology. Note that not all of the devices 12a, 12b and 12c need to be in the same location. For example, in the context of a stadium event, some of the users of the devices may be at the stadium and some of the users of the devices may be watching the event on television (e.g. at home or in a bar). Using the system 40, the remote system 10 can receive emotion data from the users of the devices 12a, 12b and 12c (typically from the respective data transmission modules 32 or from respective mobile devices 11). By receiving emotion data from multiple users attending the same event, the remote system is able to perform steps such as aggregating emotional responses amongst large groups of users.
As shown in Figure 4, the system 40 also includes a further user 42. The user 42 is able to access data collected by the remote system 10, but is not a provider of data (e.g. does not have a wearable device 12).
Figure 5 shows further details of the remote system 10 in accordance with an aspect of the present invention. In the example of Figure 5, the remote system 10 is being used as part of a stadium sports application.
As shown in Figure 5, the remote system 10 includes an emotional data curation module 52, a team shirt colour selection module 54, an aggregated emotional status of a fan community module 56 and a live emotional analysis and reporting module 58. The emotional data curation module 52 is used to collate and store emotional data from multiple users, for example as received from the devices 12a, 12b and 12c.
The team shirt colour selection module 54 can be used to set the colours used by the displays 6 of the various devices (such as devices 12a, 12b and 12c). For example, a particular user may be registered as a fan of a particular team. When attending an event in which that team is playing, the team shirt colours database may be interrogated so that the display 6 of that user displays colours appropriate to the shirts that the relevant team is wearing on that day. Indeed, the team shirt colour selection module may provide a default colour and lighting signature for each respective team based on home and away team colours (users may have the option to change these if desired). Thus, if the fan is supporting a team that is playing in red, then the "elation" and "excitement" emotional statuses may be indicated using a red output of the display. Similarly, the "sad" emotional status may be indicated by a contrasting colour (e.g. blue, but not a colour represented by the opposition in that particular match).

The aggregated emotional status of a fan community module 56 can be used to generate averages of emotional response from multiple users. In the context of a sporting event, separate averages may be generated for users supporting each team. The averages may be generated in many ways (e.g. mean, mode or median averages). The averages may be weighted in some way (e.g. with "elated" being a much more significant measure than simply "excited").
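The weighted averaging performed by the module 56 might, for example, be implemented as a weighted mode over the per-user states. The weights below are illustrative assumptions only (they simply make "elated" count for more than "excited", as the text suggests):

```python
from collections import Counter

# Illustrative weights: "elated" is a more significant vote than "excited".
WEIGHTS = {"elated": 3, "excited": 2, "passive-neutral": 1, "sad": 1}

def aggregate_community_state(states):
    """Return the weighted modal emotional state of a community.

    `states` is a list of per-user emotional state labels; each vote is
    multiplied by its weight before the most common state is taken.
    """
    if not states:
        return "passive-neutral"
    tally = Counter()
    for s in states:
        tally[s] += WEIGHTS.get(s, 1)
    return tally.most_common(1)[0][0]
```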
The live emotional analysis and reporting module 58 may be provided to allow users (whether one of the users 12a, 12b or 12c, or the further user 42) to access data concerning the event under consideration. The module 58 could be used to provide standard reports and analysis in addition to raw data.
The analysis and reporting module 58 could be extended to provide many forms of information. Examples include the emotional high-points and low-points of an event, an indication of the relative happiness of opposing fans at a sporting event (perhaps plotted over time), differences in reactions between fans at a stadium and fans remote from the stadium, most exciting players etc.
As shown in Figure 1, the device 12 includes a user interface 8. The user interface may be provided as part of the display 6, but many alternatives are possible. One example is to provide the user interface 8 as part of a mobile phone (or similar) application. The user interface 8 enables a user to tailor the settings of their own wearable device.
Figure 6 shows an exemplary application programming interface (API), indicated generally by the reference numeral 60, that enables a user to select preferences regarding how they allow emotional data to be collected and shared with a wider audience, such as a team-follower community.
At step 61, the user selects a sport or community type (such as football). At step 62, the user selects a team or community. Next, at step 63, light preferences are set. These may be predefined (by virtue of the selection made at step 62), but even if pre-defined, the user may have the option to over-ride such selections. At step 64, preferences selected at step 63 are matched with fixture data. By way of example, home and away colours (or light signatures) may be defined at step 63, with the match fixture data 64 identifying whether a team selected at step 62 is playing at home or away on that particular day. At step 65, emotional preferences can be set. For example, a user may want to enable the detection of all possible emotional states, or may wish to enable only a subset of those.
At step 66 light preferences can be set. At this step, the lighting of the device 12 may be turned on or off. Alternatively, the lighting option may be tailored in some way. For example, the user may select between displaying personal emotions and community emotions (e.g. the community defined at step 62).
At step 67, transmit preferences can be set. For example, the user may enable or disable the transmission of emotion data from the user to the remote device 10.
Further, the user may be able to decide whether or not to receive aggregated emotional data concerning a particular community (e.g. the community selected at step 62).
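The preference choices made at steps 61 to 67 could be captured in a simple settings structure such as the sketch below; all field names and default values are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class UserPreferences:
    """Hypothetical container for the choices made in steps 61 to 67."""
    sport: str = "football"                  # step 61: sport/community type
    team: str = ""                           # step 62: team or community
    light_colours: tuple = ("red", "blue")   # step 63: (home, away) colours
    enabled_emotions: tuple = (              # step 65: detectable states
        "elated", "excited", "passive-neutral", "sad")
    display_mode: str = "personal"           # step 66: "personal" | "community" | "off"
    transmit_enabled: bool = True            # step 67: send emotion data upstream
    receive_aggregates: bool = True          # step 67: receive community data

def resolve_colour(prefs, playing_at_home):
    """Step 64 (sketch): pick the home or away colour from fixture data."""
    return prefs.light_colours[0] if playing_at_home else prefs.light_colours[1]
```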
The API 60 is provided by way of example only. Many variants and alternatives will be readily apparent to the person skilled in the art. For example, some of the steps of the API 60 may be omitted or may only be available to some users. As described above, the display 6 may be used to display emotions of the user of the wearable device 12. An alternative is for the display 6 to display an average emotion of multiple users, such as the users of the devices 12a, 12b and 12c shown in Figure 4.
In the context of a televised stadium sports event, the user interface 8 could offer the following options for the display 6 (e.g. using the API 60 described above):
• The user's personal emotions;
• The average (or some other aggregate) of emotions of all people attending the event (provided by the module 56);
• The average emotions of all people attending the event that are registered as supporting the same team as the relevant user (provided by the module 56);
• The average emotions of all people that are registered as supporting the same team as the relevant user, whether at the event or not (e.g. including those watching the event on television).

The colours output by a particular device 12 may be configurable (for example to those of the user's team or to any selected colour).
Furthermore, the user setting may allow a user to turn off the emotional display entirely.
Figure 7 is a flow chart showing an algorithm, indicated generally by the reference numeral 70, in accordance with an aspect of the present invention. The algorithm starts at step 72, where user-related data is determined (using the sensors of the wearable device). Next, at step 74, the emotional state of the user of the wearable device is determined. Finally, at step 76, an emotional state is displayed using the display 6. As noted above, the emotional state displayed at step 76 may be the emotional state determined at step 74, but alternatives are possible.
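The three steps of the algorithm 70 can be sketched as a single pipeline in which sensing, classification and display are injected as callables (so the same loop applies whether classification runs on the device 12, the mobile device 11 or the remote system 10). This is an illustrative sketch only:

```python
def run_emotion_pipeline(read_sensors, classify, display):
    """One pass of algorithm 70: sense (72) -> classify (74) -> display (76).

    The three callables are supplied by the caller, so the same loop works
    wherever the classification actually runs.
    """
    user_data = read_sensors()     # step 72: determine user-related data
    state = classify(user_data)    # step 74: determine the emotional state
    display(state)                 # step 76: show a visual indication
    return state
```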
The embodiments of the invention described above are provided by way of example only. The skilled person will be aware of many modifications, changes and
substitutions that could be made without departing from the scope of the present invention. For example, although the invention has generally been described with reference to sporting events, the invention could be deployed at other events (e.g. music concerts or a dating event where attraction and emotions could be shared through light signatures). It should be noted that the range of emotions displayed could be selected depending on the nature of the event (for example, music concerts, dating events and sporting events would result in different ranges of emotions). The claims of the present invention are intended to cover all such modifications, changes and substitutions as fall within the spirit and scope of the invention.

Claims

1. A wearable device comprising:
one or more sensors for determining user-related data;
a processor for using the determined user-related data to determine an emotional state of the user; and
an output display providing a visual indication of a first emotional state.
2. A device as claimed in claim 1, wherein the first emotional state is the determined emotional state of the user.
3. A device as claimed in claim 1 or claim 2, wherein the first emotional state is an aggregated emotional state derived from emotional states of multiple users.
4. A device as claimed in claim 3, wherein said aggregated emotional state is derived from an emotional state of multiple users that share a defined attribute with the user.
5. A device as claimed in claim 3 or claim 4, wherein said aggregated emotional state is derived from emotional states of multiple users that are present at an event together with the said user.
6. A device as claimed in any preceding claim, wherein said output display comprises a plurality of light-emitting diodes.
7. A device as claimed in any preceding claim, wherein the output display provides:
a rapid pulsing of a first colour when the first emotional state has a first value; a slow pulsing of the first colour when the first emotional state has a second value;
an unpulsed output when the first emotional state has a third value; and an output of a second colour when the first emotional state has a fourth value.
8. A device as claimed in any preceding claim, further comprising a data transmission module for providing data related to the emotional state of the user to a server or a mobile device.
9. A device as claimed in any preceding claim, further comprising a user interface.
10. A device as claimed in claim 9, wherein the first emotional state to be indicated by said output display is selectable using the user interface from the following options:
the determined emotional state of the user; and
an aggregated emotional state derived from emotional states of multiple users.
11. A system comprising a plurality of wearable devices and a server, wherein each wearable device is a device as claimed in any preceding claim, wherein the server comprises:
an input for receiving data from each of said wearable devices;
a processor for manipulating the data from each of said wearable devices; and an output for providing data to each of said wearable devices.
12. A system as claimed in claim 11, wherein said processor is configured to generate an aggregated emotional state derived from determined emotional states of multiple users.
13. A method comprising:
determining user-related data;
using the determined user-related data to determine an emotional state of the user; and
displaying a visual indication of a first emotional state.
14. A method as claimed in claim 13, wherein the first emotional state is the determined emotional state of the user.
15. A computer program comprising instructions that when executed by a computer control it to perform the method of claim 13 or claim 14.
PCT/GB2018/052557 2017-09-11 2018-09-10 Wearable device for collective emotion expression WO2019048883A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1714560.8 2017-09-11
GB1714560.8A GB2566318A (en) 2017-09-11 2017-09-11 Wearable device

Publications (1)

Publication Number Publication Date
WO2019048883A1 true WO2019048883A1 (en) 2019-03-14

Family

ID=60117338

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/052557 WO2019048883A1 (en) 2017-09-11 2018-09-10 Wearable device for collective emotion expression

Country Status (2)

Country Link
GB (1) GB2566318A (en)
WO (1) WO2019048883A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016199938A1 (en) * 2015-06-12 2016-12-15 株式会社イヌパシー Sound collector, animal emotion estimation device, and animal emotion estimation method
KR20170027910A (en) * 2015-09-02 2017-03-13 신정아 Emotional Illumination System Using Bluetooth Signal

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RS51589B (en) * 2007-07-17 2011-08-31 Vladimir RANĐELOVIĆ Apparatus for analysis of relaxation in bathtubs
WO2009059248A1 (en) * 2007-10-31 2009-05-07 Emsense Corporation Systems and methods providing distributed collection and centralized processing of physiological responses from viewers
US20090143695A1 (en) * 2007-11-30 2009-06-04 Palo Alto Research Center Incorporated Brainwave-facilitated presenter feedback mechanism
US20130218663A1 (en) * 2010-06-07 2013-08-22 Affectiva, Inc. Affect based political advertisement analysis
GB201211703D0 (en) * 2012-07-02 2012-08-15 Charles Nduka Plastic Surgery Ltd Biofeedback system
RU2506631C1 (en) * 2012-07-26 2014-02-10 Юрий Геннадьевич Чирков Detection method and apparatus
US20150297140A1 (en) * 2014-04-21 2015-10-22 Microsoft Corporation User stress detection and mitigation
KR101864142B1 (en) * 2015-11-10 2018-06-05 (주)이산로봇 System for controling smart robot using smart terminal


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BING LI ET AL: "LightingHair Slice", HUMAN FACTORS IN COMPUTING SYSTEMS, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 6 May 2017 (2017-05-06), pages 1824 - 1828, XP058338661, ISBN: 978-1-4503-4656-6, DOI: 10.1145/3027063.3053093 *

Also Published As

Publication number Publication date
GB2566318A (en) 2019-03-13
GB201714560D0 (en) 2017-10-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18769776

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 23.07.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18769776

Country of ref document: EP

Kind code of ref document: A1