WO2014145204A4 - Mental state analysis using heart rate collection based video imagery - Google Patents
- Publication number
- WO2014145204A4 (PCT/US2014/029926)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- heart rate
- video
- individual
- rate information
- analyzing
- Prior art date
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02405—Determining heart rate variability
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Cardiology (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Psychiatry (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Primary Health Care (AREA)
- Educational Technology (AREA)
- Databases & Information Systems (AREA)
- Epidemiology (AREA)
- Data Mining & Analysis (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Physiology (AREA)
- Hospice & Palliative Care (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
Abstract
Video of one or more people is obtained and analyzed. Heart rate information is determined from the video and used to infer mental states. The heart rate information and the resulting mental state analysis are correlated to stimuli, such as digital media that a person consumes or interacts with. The mental state analysis, based on the heart rate information, can be used to optimize digital media or modify a digital game.
Claims
1. A computer-implemented method for mental state analysis comprising:
obtaining video of an individual;
analyzing the video to determine heart rate information wherein the analyzing includes evaluation of phasic and tonic heart rate responses; and
inferring mental states of the individual based on the heart rate information.
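Claim 1 distinguishes phasic (transient) from tonic (baseline) heart rate responses. The patent does not prescribe a decomposition method; one common approach is to treat a slow moving average as the tonic level and the residual as the phasic response. The function name and window length below are illustrative assumptions, not from the patent:

```python
import numpy as np

def split_phasic_tonic(hr_series, fps, window_s=10.0):
    """Split a heart-rate time series into a slow tonic baseline
    (moving average) and the fast phasic response around it.
    This is a hypothetical decomposition; the claims name the two
    components but do not fix an algorithm."""
    hr_series = np.asarray(hr_series, dtype=float)
    win = max(1, int(window_s * fps))
    kernel = np.ones(win) / win
    # 'same'-mode moving average serves as the tonic level
    tonic = np.convolve(hr_series, kernel, mode="same")
    # what remains after removing the baseline is the phasic response
    phasic = hr_series - tonic
    return tonic, phasic
```

With heart-rate samples at 1 Hz and a 10 s window, a steady 70 BPM baseline plus a fast oscillation separates cleanly away from the window edges.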
2. The method of claim 1 further comprising analyzing a media presentation based on the mental states, which were inferred.
3. The method of claim 2 wherein the analyzing the media presentation includes evaluating advertisement effectiveness.
4. The method of claim 2 wherein the analyzing the media presentation includes optimizing the media presentation.
5. The method of claim 2 wherein the media presentation includes one or more of an advertisement, a movie, a television show, a web series, a webisode, a video, a video clip, an electronic game, a concept presentation, an e-book, an e-magazine, or an app.
6. The method of claim 1 wherein the heart rate information is correlated to a stimulus that the individual is encountering.
7. The method of claim 6 wherein the stimulus pertains to a media presentation or is based on a game.
8. The method of claim 7 wherein the game is modified based on the heart rate information.
9. The method of claim 8 wherein a modification to the game includes modifying an avatar.
10. (Cancelled)
11. The method of claim 1 further comprising aggregating the heart rate information for the individual with other people.
12. The method of claim 1 further comprising aggregating the mental states for the individual with other people.
13. The method of claim 1 wherein learning about heart rate information is included as part of the analyzing.
14. The method of claim 1 wherein the inferring includes determining arousal, attention, or valence.
15. The method of claim 1 wherein the analyzing includes calculating blood volume pulse.
16. The method of claim 1 wherein the inferring factors in a time lag between a stimulus and the heart rate information.
17. The method of claim 1 wherein the analyzing factors in an occlusion of part of a face for the individual.
18. The method of claim 1 wherein the video has a variable frame rate.
19. The method of claim 1 further comprising determining contextual information.
20. The method of claim 1 wherein the obtaining the video of the individual comprises capturing the video with a webcam.
21. The method of claim 1 wherein the obtaining the video of the individual comprises receiving the video from another computer.
22. The method of claim 1 wherein the obtaining the video of the individual comprises receiving the video over the Internet.
23. The method of claim 1 wherein the heart rate information includes heart rate or heart rate variability.
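Claim 23 covers both heart rate and heart rate variability. Given a series of detected beat timestamps, both are short computations; RMSSD is used here as one standard variability measure, though the claim does not name a specific metric, and the function name is illustrative:

```python
import numpy as np

def hr_and_hrv(beat_times):
    """From beat timestamps (seconds), return mean heart rate (BPM)
    and heart rate variability as RMSSD (milliseconds).
    RMSSD is one common HRV metric; the claim does not specify one."""
    ibi = np.diff(np.asarray(beat_times, dtype=float))  # inter-beat intervals, s
    hr = 60.0 / ibi.mean()
    # root mean square of successive interval differences, in ms
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2)) * 1000.0
    return hr, rmssd
```

For perfectly regular beats every 0.8 s this yields 75 BPM and zero variability.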
24. The method of claim 1 wherein the analyzing includes identifying a location of a face of the individual in a portion of the video.
25. The method of claim 24 further comprising establishing a region of interest including the face, separating pixels in the region of interest into at least two channel values and combining to form raw traces, transforming and decomposing the raw traces into at least one independent source signal, and processing the at least one independent source signal to obtain the heart rate information.
26. The method of claim 25 wherein the heart rate information includes heart rate and the heart rate is determined based on changes in an amount of reflected light.
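Claims 24-26 describe the core remote-photoplethysmography pipeline: locate a face region, average its pixel channels into raw traces, decompose the traces into source signals, and read the pulse from reflected-light changes. The sketch below follows that outline in numpy, using PCA (via SVD) as a simple stand-in for the independent component analysis the claims imply; the function name, band limits, and spectral peak-picking are illustrative assumptions:

```python
import numpy as np

def estimate_heart_rate(frames, fps):
    """Estimate heart rate (BPM) from a stack of face-ROI frames.

    frames: array of shape (n_frames, h, w, 3), RGB pixel values.
    Rough sketch of the claimed pipeline: spatially average each
    colour channel into a raw trace, decompose the traces into
    source signals (PCA here as a stand-in for ICA), and take the
    dominant in-band frequency of the most periodic source.
    """
    # Spatially average each channel -> raw traces, shape (n_frames, 3)
    traces = frames.reshape(frames.shape[0], -1, 3).mean(axis=1)
    # Detrend and normalise each trace
    traces = traces - traces.mean(axis=0)
    std = traces.std(axis=0)
    std[std == 0] = 1.0
    traces = traces / std
    # Decompose into uncorrelated source signals via SVD (PCA)
    _, _, vt = np.linalg.svd(traces, full_matrices=False)
    sources = traces @ vt.T
    # Pick the source with the strongest spectral peak in a plausible
    # pulse band (0.75-4 Hz, i.e. 45-240 BPM)
    freqs = np.fft.rfftfreq(len(sources), d=1.0 / fps)
    band = (freqs >= 0.75) & (freqs <= 4.0)
    best_bpm, best_power = 0.0, -1.0
    for k in range(sources.shape[1]):
        spectrum = np.abs(np.fft.rfft(sources[:, k])) ** 2
        peak = spectrum[band].argmax()
        if spectrum[band][peak] > best_power:
            best_power = spectrum[band][peak]
            best_bpm = 60.0 * freqs[band][peak]
    return best_bpm
```

On synthetic frames whose green channel (the channel claim 41 singles out as carrying the strongest pulse signal) oscillates at 1.2 Hz, this recovers roughly 72 BPM.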
27. The method of claim 1 wherein the video includes a plurality of other people.
28. The method of claim 27 further comprising identifying locations for faces of the plurality of other people and analyzing the video to determine heart rate information on the plurality of other people.
29. The method of claim 28 further comprising inferring mental states of the plurality of other people based on the heart rate information on the plurality of other people.
30. The method of claim 1 further comprising obtaining biosensor data for the individual.
31. The method of claim 30 wherein the biosensor data augments the heart rate information.
32. The method of claim 30 wherein the biosensor data includes one or more of electrodermal activity, heart rate, heart rate variability, skin temperature, or respiration.
33. The method of claim 1 wherein the video includes a series of images of the individual.
34. The method of claim 1 further comprising collecting facial data based on the video.
35. The method of claim 34 wherein the facial data includes facial movements.
36. The method of claim 35 wherein the inferring is based on the facial data.
37. The method of claim 35 wherein the facial data is used in combination with the heart rate information.
38. The method of claim 1 wherein the mental states include one or more of frustration, confusion, disappointment, hesitation, cognitive overload, focusing, engagement, attention, boredom, exploration, confidence, trust, delight, disgust, skepticism, doubt, satisfaction, excitement, laughter, calmness, sadness, stress, happiness, anger, sentimentality, and curiosity.
39. The method of claim 1 wherein the analyzing includes extracting a heart rate from evaluation of a face of the individual in the video.
40. The method of claim 39 wherein the heart rate is an equivalent to a blood volume pulse value.
41. The method of claim 1 wherein the analyzing uses a green channel from the video.
42. The method of claim 1 further comprising converting the video to a constant frame rate and performing filtering on the video to facilitate the analyzing.
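Claim 42 pairs conversion to a constant frame rate (needed because claim 18 allows variable-frame-rate video) with filtering to aid the analysis. A minimal sketch: interpolate the irregular samples onto a uniform grid, then band-limit to a plausible pulse range with an FFT mask. The function name, interpolation choice, and band edges are assumptions, not specified by the patent:

```python
import numpy as np

def to_constant_rate(timestamps, values, fps):
    """Resample an irregularly sampled trace onto a uniform grid
    (claim 42's constant frame rate) and band-pass it to a plausible
    pulse band, 0.75-4 Hz, via a simple FFT mask."""
    timestamps = np.asarray(timestamps, dtype=float)
    values = np.asarray(values, dtype=float)
    # Linear interpolation onto a constant-rate time grid
    t_uniform = np.arange(timestamps[0], timestamps[-1], 1.0 / fps)
    v = np.interp(t_uniform, timestamps, values)
    # Zero out spectral bins outside the pulse band (also removes DC)
    spectrum = np.fft.rfft(v - v.mean())
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fps)
    spectrum[(freqs < 0.75) | (freqs > 4.0)] = 0.0
    return t_uniform, np.fft.irfft(spectrum, n=len(v))
```

A production pipeline would likely prefer a proper FIR/IIR band-pass over a hard FFT mask to avoid edge ringing, but the resampling step is the essential part of the claim.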
43. The method of claim 1 further comprising interpreting physiological arousal from the heart rate information.
44. A computer program product embodied in a computer readable medium for mental state analysis, the computer program product comprising:
code for obtaining video of an individual;
code for analyzing the video to determine heart rate information wherein the analyzing includes evaluation of phasic and tonic heart rate responses; and
code for inferring mental states of the individual based on the heart rate information.
45. The computer program product of claim 44 further comprising code for analyzing a media presentation based on the mental states, which were inferred.
46. The computer program product of claim 44 further comprising code for aggregating the heart rate information for the individual with other people.
47. The computer program product of claim 44 further comprising code for aggregating the mental states for the individual with other people.
48. The computer program product of claim 44 wherein the inferring includes determining arousal, attention, or valence.
49. The computer program product of claim 44 wherein the analyzing includes identifying a location of a face of the individual in a portion of the video.
50. The computer program product of claim 49 further comprising code for establishing a region of interest including the face, separating pixels in the region of interest into at least two channel values and combining to form raw traces, transforming and decomposing the raw traces into at least one independent source signal, and processing the at least one independent source signal to obtain the heart rate information.
51. The computer program product of claim 44 wherein the mental states include one or more of frustration, confusion, disappointment, hesitation, cognitive overload, focusing, engagement, attention, boredom, exploration, confidence, trust, delight, disgust, skepticism,
doubt, satisfaction, excitement, laughter, calmness, sadness, stress, happiness, anger, sentimentality, and curiosity.
52. The computer program product of claim 44 wherein the analyzing includes extracting a heart rate from evaluation of a face of the individual in the video.
53. The computer program product of claim 44 further comprising code for converting the video to a constant frame rate and performing filtering on the video to facilitate the analyzing.
54. The computer program product of claim 44 further comprising code for interpreting physiological arousal from the heart rate information.
55. A computer system for mental state analysis comprising:
a memory which stores instructions;
one or more processors attached to the memory wherein the one or more processors, when executing the instructions which are stored, are configured to:
obtain video of an individual;
analyze the video to determine heart rate information wherein analysis includes evaluation of phasic and tonic heart rate responses; and
infer mental states of the individual based on the heart rate information.
56. The system of claim 55 wherein the one or more processors are further configured to analyze a media presentation based on the mental states, which were inferred.
57. The system of claim 55 wherein the one or more processors are further configured to aggregate the heart rate information for the individual with other people.
58. The system of claim 55 wherein the one or more processors are further configured to aggregate the mental states for the individual with other people.
59. The system of claim 55 wherein inferring includes determining arousal, attention, or valence.
60. The system of claim 55 wherein analyzing includes identifying a location of a face of the individual in a portion of the video.
61. The system of claim 60 wherein the one or more processors are further configured to establish a region of interest including the face, separating pixels in the region of interest into at least two channel values and combining to form raw traces, transforming and decomposing the raw traces into at least one independent source signal, and processing the at least one independent source signal to obtain the heart rate information.
62. The system of claim 55 wherein the mental states include one or more of frustration, confusion, disappointment, hesitation, cognitive overload, focusing, engagement, attention, boredom, exploration, confidence, trust, delight, disgust, skepticism, doubt, satisfaction, excitement, laughter, calmness, sadness, stress, happiness, anger, sentimentality, and curiosity.
63. The system of claim 55 wherein analyzing includes extracting a heart rate from evaluation of a face of the individual in the video.
64. The system of claim 55 wherein the one or more processors are further configured to convert the video to a constant frame rate and performing filtering on the video to facilitate the analyzing.
65. The system of claim 55 wherein the one or more processors are further configured to interpret physiological arousal from the heart rate information.
Applications Claiming Priority (16)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361798731P | 2013-03-15 | 2013-03-15 | |
US201361789038P | 2013-03-15 | 2013-03-15 | |
US201361790461P | 2013-03-15 | 2013-03-15 | |
US201361793761P | 2013-03-15 | 2013-03-15 | |
US61/790,461 | 2013-03-15 | ||
US61/798,731 | 2013-03-15 | ||
US61/793,761 | 2013-03-15 | ||
US61/789,038 | 2013-03-15 | ||
US201361844478P | 2013-07-10 | 2013-07-10 | |
US61/844,478 | 2013-07-10 | ||
US201361916190P | 2013-12-14 | 2013-12-14 | |
US61/916,190 | 2013-12-14 | ||
US201461924252P | 2014-01-07 | 2014-01-07 | |
US61/924,252 | 2014-01-07 | ||
US201461927481P | 2014-01-15 | 2014-01-15 | |
US61/927,481 | 2014-01-15 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2014145204A1 (en) | 2014-09-18 |
WO2014145204A4 (en) | 2014-11-27 |
Family
ID=51537904
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2014/029926 WO2014145204A1 (en) | 2013-03-15 | 2014-03-15 | Mental state analysis using heart rate collection based video imagery |
PCT/US2014/029951 WO2014145228A1 (en) | 2013-03-15 | 2014-03-15 | Mental state well being monitoring |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2014/029951 WO2014145228A1 (en) | 2013-03-15 | 2014-03-15 | Mental state well being monitoring |
Country Status (1)
Country | Link |
---|---|
WO (2) | WO2014145204A1 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11253155B2 (en) | 2015-06-12 | 2022-02-22 | Daikin Industries, Ltd. | Brain activity estimation device |
EP3207868A1 (en) | 2016-02-19 | 2017-08-23 | Patonomics AB | Method and apparatus for identifying a transitory emotional state of a living mammal |
US9814420B2 (en) | 2016-03-09 | 2017-11-14 | International Business Machines Corporation | Burnout symptoms detection and prediction |
US10755044B2 (en) | 2016-05-04 | 2020-08-25 | International Business Machines Corporation | Estimating document reading and comprehension time for use in time management systems |
CN109688909B (en) * | 2016-05-27 | 2022-07-01 | 詹森药业有限公司 | System and method for assessing cognitive and emotional states of real-world users based on virtual world activity |
US10586257B2 (en) | 2016-06-07 | 2020-03-10 | At&T Mobility Ii Llc | Facilitation of real-time interactive feedback |
JP6665696B2 (en) * | 2016-06-09 | 2020-03-13 | 株式会社デンソー | In-vehicle equipment |
US11157880B2 (en) | 2017-01-09 | 2021-10-26 | International Business Machines Corporation | Enforcement of services agreement and management of emotional state |
JP6325154B1 (en) | 2017-06-07 | 2018-05-16 | スマート ビート プロフィッツ リミテッド | Information processing system |
US10956831B2 (en) | 2017-11-13 | 2021-03-23 | International Business Machines Corporation | Detecting interaction during meetings |
CN114450756A (en) * | 2019-08-22 | 2022-05-06 | 株式会社闪灵 | Medical device, system and method |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4284538B2 (en) * | 2004-10-19 | 2009-06-24 | ソニー株式会社 | Playback apparatus and playback method |
US20060183980A1 (en) * | 2005-02-14 | 2006-08-17 | Chang-Ming Yang | Mental and physical health status monitoring, analyze and automatic follow up methods and its application on clothing |
US9833184B2 (en) * | 2006-10-27 | 2017-12-05 | Adidas Ag | Identification of emotional states using physiological responses |
WO2009059246A1 (en) * | 2007-10-31 | 2009-05-07 | Emsense Corporation | Systems and methods providing en mass collection and centralized processing of physiological responses from viewers |
US8462996B2 (en) * | 2008-05-19 | 2013-06-11 | Videomining Corporation | Method and system for measuring human response to visual stimulus based on changes in facial expression |
KR101119867B1 (en) * | 2009-01-22 | 2012-03-14 | 한국산업기술대학교산학협력단 | Apparatus for providing information of user emotion using multiple sensors |
US20110251493A1 (en) * | 2010-03-22 | 2011-10-13 | Massachusetts Institute Of Technology | Method and system for measurement of physiological parameters |
CN102933136A (en) * | 2010-06-07 | 2013-02-13 | 阿弗科迪瓦公司 | Mental state analysis using web services |
BR112013007260A2 (en) * | 2010-09-30 | 2019-09-24 | Affectiva Inc | "computer-implemented method, computer program product incorporated into a computer readable medium, and web state-enabled application traffic analysis system" |
BR112013011819A2 (en) * | 2010-11-17 | 2019-09-24 | Affectiva Inc | "Computer-implemented method for communicating mental states, computer program product incorporated in computer readable medium for communicating mental states, system for sharing mental states, and computer method for sharing mental states." |
-
2014
- 2014-03-15 WO PCT/US2014/029926 patent/WO2014145204A1/en active Application Filing
- 2014-03-15 WO PCT/US2014/029951 patent/WO2014145228A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2014145228A1 (en) | 2014-09-18 |
WO2014145228A4 (en) | 2014-11-27 |
WO2014145204A1 (en) | 2014-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014145204A4 (en) | Mental state analysis using heart rate collection based video imagery | |
US9642536B2 (en) | Mental state analysis using heart rate collection based on video imagery | |
Bulagang et al. | A review of recent approaches for emotion classification using electrocardiography and electrodermography signals | |
Dzedzickis et al. | Human emotion recognition: Review of sensors and methods | |
Gonzalez Viejo et al. | Non-contact heart rate and blood pressure estimations from video analysis and machine learning modelling applied to food sensory responses: A case study for chocolate | |
US10517521B2 (en) | Mental state mood analysis using heart rate collection based on video imagery | |
Hui et al. | Coverage of emotion recognition for common wearable biosensors | |
Gjoreski et al. | Datasets for cognitive load inference using wearable sensors and psychological traits | |
US20150099987A1 (en) | Heart rate variability evaluation for mental state analysis | |
Messinger et al. | The eyes have it: making positive expressions more positive and negative expressions more negative. | |
US20120083675A1 (en) | Measuring affective data for web-enabled applications | |
Jerritta et al. | Emotion recognition from facial EMG signals using higher order statistics and principal component analysis | |
US20120124122A1 (en) | Sharing affect across a social network | |
Abadi et al. | Inference of personality traits and affect schedule by analysis of spontaneous reactions to affective videos | |
Boccignone et al. | Amhuse: a multimodal dataset for humour sensing | |
US20190313966A1 (en) | Pain level determination method, apparatus, and system | |
Danner et al. | Automatic facial expressions analysis in consumer science | |
Lanata et al. | Quantitative heartbeat coupling measures in human-horse interaction | |
Andreu-Cabedo et al. | Mirror mirror on the wall… An intelligent multisensory mirror for well-being self-assessment | |
Molinaro et al. | Multi-roi spectral approach for the continuous remote cardio-respiratory monitoring from mobile device built-in cameras | |
D'Arcey | Assessing the validity of FaceReader using facial EMG | |
Khanal et al. | A review on computer vision technology for physical exercise monitoring | |
Billeci et al. | Wearable sensors to evaluate autonomic response to olfactory stimulation: the influence of short, intensive sensory training | |
JP6201520B2 (en) | Gaze analysis system and method using physiological indices | |
Jáuregui et al. | Toward automatic detection of acute stress: Relevant nonverbal behaviors and impact of personality traits |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14763176; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 14763176; Country of ref document: EP; Kind code of ref document: A1 |