WO2014044503A1 - Method and device of detecting the emotional impact of an audience watching a time-stamped movie
- Publication number
- WO2014044503A1 (PCT/EP2013/067717)
- Authority
- WO
- WIPO (PCT)
Classifications
- A61B5/0533—Measuring galvanic skin response
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/7239—Details of waveform analysis using differentiation including higher order derivatives
- A61B5/02405—Determining heart rate variability
- A61B5/02416—Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
- A61B5/389—Electromyography [EMG]
- A61B5/7221—Determining signal validity, reliability or quality
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
- H04H60/33—Arrangements for monitoring the users' behaviour or opinions
- H04H60/40—Arrangements for identifying broadcast time
Abstract
A method of detecting the emotional impact of an audience watching a time-stamped movie comprises the following steps:
- detecting the positive variations of a captured biological signal, and
- averaging said detected positive variations over a temporal window.
Description
Method and device of detecting the emotional impact of an audience watching a time-stamped movie.
1. Field of invention.
The present invention generally relates to the technical domain of the physiological response of an audience watching a movie. More precisely, it relates to a method of detecting the emotional impact of an audience watching a movie and of reporting a corresponding emotional profile for this movie.
2. Technical background.
Catching the emotion of an audience member watching a movie clip may have many potential applications in a video content creation and distribution context. An advertising agency, a studio or a movie director trying to evaluate the emotional impact of its creation may be interested in such information. The creator may be interested in validating his artistic choices and/or adapting the editing to minimize/maximize the emotional "level". A precise affective profile of a movie may be useful for predicting the potential impact of the movie and the associated potential business. It may also help select the appropriate countries or populations in which to promote and distribute the movie.
Many biological signals are known to vary with the emotional state of a subject: Skin Conductivity (SC) measured through Galvanic Skin Response (GSR), Skin Temperature (SKT), Heart Rate Variability (HRV) measured by means of a photoplethysmography (PPG) device, or facial surface electromyogram (EMG) are some exemplary signals linked to emotional states. The link of such signals with the emotional state is described, for instance, in P. Lang, "The emotion probe", American Psychologist, vol. 50, no. 5, pp. 372-385, 1995; J. Wagner and E. Andre.
Nevertheless, obtaining the emotional state of an audience in an objective and low-intrusive manner is not an easy task.
One direct method would be to collect a direct self-assessment from each viewer minute by minute. This approach is clearly too intrusive, quite subjective, and may bias the reporting by distracting the subject from the movie.
Face analysis through an adapted camera could be an alternative, but the associated algorithms may be very sensitive to the user environment.
Correlating the user's emotion with the movie is another approach which tries to classify each segment of a movie into an emotional class.
None of these methods provides a signal that reflects the significant excitation variations of a whole audience over time. Nor do they provide any information to quantify the significance of the highlights of a movie, or any means to compare the highlights of two movies with each other.
3. Summary of the invention.
The present invention aims at alleviating some of the inconveniences of prior art.
More precisely, the invention relates to a method of detecting the emotional impact of an audience watching a time-stamped movie. The method comprises the following steps:
- detecting the positive variations of a captured biological signal, and
- averaging or cumulating said detected positive variations over a temporal window.
Thanks to the timestamps, the movie and the captured biological signal may be temporally synchronized. Then, by selecting multiple temporal windows that cover the whole duration of the movie, the method obtains the mean reaction of a whole audience over time. This mean reaction is closely related to the arousal fluctuations of this same audience during the show, when considering a bi-dimensional "Valence/Arousal" representation of emotion. Thus, the method yields a time-variant affective signal (profile) which reflects the significant excitation variations of the whole audience over time.
4. List of figures.
More advantages of the invention will appear through the description of particular, non-restricting embodiments of the invention. The embodiments will be described with reference to the following figures:
- Figure 1 shows a diagram of the steps of the method of detecting the emotional impact of an audience watching a time-stamped movie according to the invention.
- Figure 2 shows an example of a temporal window which gathers samples of a biological signal captured from a user watching a movie.
- Figure 3 shows an example of two overlapped temporal windows.
- Figure 4 illustrates a method according to the present invention.
- Figure 5 shows a device that comprises means configured to implement the method of the invention.
5. Detailed description of the invention.
Figure 1 shows a diagram of the steps of the method of detecting the emotional impact of an audience watching a time-stamped movie according to the invention.
Capturing a biological signal and synchronizing it with the timestamps of a movie are well-known techniques which are not detailed further here. For more details, refer to Julien Fleureau, Philippe Guillotel, Quan Huynh-Thu, "Physiological-Based Affect Event Detector for Entertainment Video Applications," IEEE Transactions on Affective Computing, Vol. 3, No. 2, 2012.
N samples s[i]u are obtained from a biological signal captured from a user u.
According to an embodiment, the samples s[i]u are obtained from a raw Electro Dermal Activity (EDA) signal sampled at a predefined frequency, for example 32 Hz.
An EDA signal measures the local variations of the skin's electrical conductance in the palm area, which is known to be highly correlated with the user's affective arousal. This signal embeds high-level human-centered information and might be used to provide a continuous and unconscious feedback about the level of excitation of the end-user. More specifically, each shift in the arousal flow of a given user is correlated with slow (around 2 s) phasic changes in his/her EDA, consisting of a fast rise to a maximum value (the higher the emotional impact, the higher the peak), followed by a slow return to the initial state.
The detecting method comprises a step 1 of detecting the positive variations of the captured biological signal and a step 2 of averaging or cumulating said detected positive variations over a temporal window T[j].
Then, by selecting multiple temporal windows that cover the whole duration of the movie, the highlights of the movie are temporally located in the windows with the highest mean values of the detected positive variations.
According to an embodiment of the detecting step, the first derivative of the biological signal is computed and truncated to positive values in order to highlight the relevant phasic changes. The truncation, for example, removes the detected negative variations.
Such an embodiment is illustrated by Figure 2, which is given as an illustrative example only and does not limit the scope of the invention.
Figure 2 shows an example of a temporal window T[j] which gathers N samples s[i]u of the biological signal captured from a user u watching a movie. The first derivative of the biological signal quantifies the variation between the amplitudes of two consecutive samples s[i-1]u and s[i]u. Next, only variations greater than zero are considered. Here, the variations d[1] and d[2], which have positive values, are kept, while the variation d[3], which has a negative value, is truncated (not considered in the following). In this way, K positive variations d[k], k ∈ {1, ..., K}, are obtained for each temporal window T[j]. Each positive variation d[k] corresponds to a sample s[k]u of the biological signal. For instance, the sample s[1]u corresponds to the positive variation d[1] in Figure 2. This sample defines the temporal instant of the beginning of a highlight of the movie.
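As an illustrative sketch (not taken from the patent itself), the first-differencing and truncation step can be written as follows; the function name and sample values are hypothetical:

```python
def positive_variations(samples):
    """First-difference a list of biological-signal samples and keep only
    the strictly positive variations d[k]; negative ones are truncated.
    Returns (d[k], sample index) pairs so each variation stays linked to
    the sample that starts it."""
    out = []
    for i in range(1, len(samples)):
        d = samples[i] - samples[i - 1]
        if d > 0:  # truncation: negative variations are discarded
            out.append((d, i))
    return out

# Hypothetical EDA samples: two rises, one dip, one rise
print(positive_variations([0.10, 0.12, 0.15, 0.14, 0.20]))
```

Keeping the index alongside each variation is what lets the method point back to the temporal instant where a highlight begins.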
According to the invention, and as illustrated in Figure 4, a mean value p[j]u of the detected positive variations over a temporal window T[j] is computed by equation (1):

p[j]u = (1/K) · Σk=1..K d[k]u (1)

with K the number of positive variations detected in the temporal window T[j].
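Equation (1) amounts to averaging the positive first differences inside one window. A minimal sketch, with hypothetical names:

```python
def window_mean_positive(samples):
    """Mean p[j]_u of the positive variations within one temporal window,
    per equation (1); returns 0.0 when the window holds no positive variation."""
    diffs = [samples[i] - samples[i - 1] for i in range(1, len(samples))]
    pos = [d for d in diffs if d > 0]
    return sum(pos) / len(pos) if pos else 0.0

print(window_mean_positive([0.0, 0.2, 0.1, 0.5]))  # positive diffs are about 0.2 and 0.4
```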
M mean values p[j]u are then obtained for the total duration T of the movie, where M is the number of temporal windows which cover the total duration of the movie.
According to a variant, each mean value p[j]u is normalized (step 4) by a scaling factor xu in order to remove the user-dependent part related to the amplitude of the biological signal, which may vary from one subject to another.

The normalized mean value Np[j]u is computed, for example, by equation (2):

Np[j]u = xu · p[j]u (2)

The normalized mean value Np[j]u may then be interpreted as a probability of reaction over time of a current audience member. More precisely, the user-dependent part related to the amplitude of p[j]u has been removed, which allows a comparison between multiple users.
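The patent leaves the exact choice of xu open at this point; as an illustrative assumption only, taking xu as the reciprocal of the user's largest windowed mean maps the values into [0, 1]:

```python
def normalize_user(p_values):
    """Normalized values Np[j]_u = x_u * p[j]_u for one user, with the
    assumed choice x_u = 1 / max(p[j]_u) (hypothetical, not from the patent)."""
    peak = max(p_values)
    x_u = 1.0 / peak if peak > 0 else 0.0
    return [x_u * p for p in p_values]

print(normalize_user([1.0, 2.0, 4.0]))  # -> [0.25, 0.5, 1.0]
```

Any strictly positive per-user factor would serve the same purpose: making amplitudes comparable across subjects.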
According to an embodiment, two consecutive temporal windows overlap each other.
As illustrated in Figure 3, a temporal window T[j+1] starts at the midpoint of the previous temporal window T[j]. If the duration of a temporal window equals 60 s, then the next window starts at 30 s. This is sufficient to analyze the emotional flow in a movie. Other overlaps can also be used (not necessarily at the midpoint).
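The half-overlapping windows described above can be sketched as follows (times in seconds; names are illustrative):

```python
def window_starts(duration_s, window_s=60, hop_s=30):
    """Start times of the temporal windows T[j] covering the movie, with the
    next window starting at the midpoint of the previous one by default."""
    last_start = max(duration_s - window_s, 0)
    return list(range(0, last_start + 1, hop_s))

print(window_starts(180))  # -> [0, 30, 60, 90, 120]
```

Choosing a different hop simply changes the overlap; hop_s equal to window_s gives non-overlapping windows.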
According to an embodiment, the method further comprises a step 3 of low-pass filtering the captured biological signal to remove possible artifacts. The cut-off frequency of the low-pass filter is 1 Hz, for example. The positive variations of the biological signal are thus detected from samples of a low-pass filtered version of the raw biological signal.
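The patent does not specify the filter design; as one plausible sketch, a single-pole low-pass filter with a 1 Hz cut-off at the 32 Hz sampling rate mentioned earlier could be:

```python
import math

def lowpass(samples, fs=32.0, fc=1.0):
    """Single-pole IIR low-pass filter (an illustrative design, not the
    patent's): y[i] = y[i-1] + a * (x[i] - y[i-1])."""
    a = 1.0 - math.exp(-2.0 * math.pi * fc / fs)
    y = samples[0]
    out = []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out
```

A constant input passes through unchanged, while fluctuations faster than the cut-off (typical motion artifacts) are attenuated.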
The mean values p[j]u (or their normalized versions Np[j]u) only reflect the reaction of one given user u. Thus, these mean values are closely related to the user's sensitivity and may also be affected by user-specific noise such as motion artifacts or the user's external reactions, for instance.
To obtain more relevant information concerning the arousal variations of a global audience during a movie, the method further comprises a step 5 of averaging the normalized mean values Np[j]u over all the users of an audience.
Mathematically speaking, the average value Np[j] of the normalized mean values Np[j]u is given by equation (3):

Np[j] = (1/U) · Σu=1..U Np[j]u (3)

with U the number of users of the audience and Np[j]u the normalized mean value of the detected positive variations over the same temporal window T[j].
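Equation (3) is a per-window average across the audience; it can be sketched with hypothetical names as:

```python
def audience_average(np_per_user):
    """Average value Np[j] over U users, per equation (3).
    np_per_user: one list of M windowed values Np[j]_u per user."""
    U = len(np_per_user)
    M = len(np_per_user[0])
    return [sum(user[j] for user in np_per_user) / U for j in range(M)]

# Two users, two windows each
print(audience_average([[1.0, 3.0], [3.0, 5.0]]))  # -> [2.0, 4.0]
```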
According to a variant, the average value Np[j] is scaled (step 6) by a factor Xj according to equation (4):

NP[j] = Xj · Np[j] (4)

with Xj the scaling factor, obtained for example by averaging the individual scaling factors.
The normalized average value NP[j] takes into account the arousal fluctuations of the whole audience over time.

Thus, by observing the variations of the normalized average values NP[j] computed for temporal windows which cover the whole duration T of the movie, one can affectively compare different time instants of a movie and thus identify its highlights. It is also possible to compare time instants of different movies, in order to compare their highlights quantitatively.
When the number of users in the audience is small, strong inter-subject variability makes the interpretation of the quantity NP[j] by an expert more difficult.
According to an embodiment, a criterion is defined to quantify the significance of the quantity NP[j] in terms of affective reaction.
According to a variant, such a criterion is mathematically defined as follows: for each temporal window T[j], a random variable XN[j] is considered, distributed according to a probability density whose samples are given by the normalized mean values Np[j]u of the audience members.
A subset B of these M random variables is considered as background noise in terms of affective reaction. This subset B may be obtained by applying a selection criterion.
According to a variant, the selection criterion is to keep the variables with the lowest average values, for example the first decile.
For each XN [j] a relevancy value E[j] is computed by averaging the p- value of the bilateral Mann-Whitney-Wilcoxon test (also known as the U-test) performed between xN [j] and each element b of the subset B. A bilateral
Mann-Whitney-Wilcoxon test is defined, for example by Kruskal, William H. (September 1957), "Historical Notes on the Wilcoxon Unpaired Two-Sample Test", Journal of the American Statistical Association 52 (279): 356-360.
Each XN[j] with an associated E[j] lower than a given threshold is considered significantly different from the background noise in terms of affective reaction.
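A dependency-free sketch of this relevancy computation, using the normal approximation of the U statistic (simplified: no tie correction; the sample values below are hypothetical):

```python
import math

def mann_whitney_p(x, y):
    """Two-sided Mann-Whitney-Wilcoxon p-value via the normal
    approximation of the U statistic (no tie correction)."""
    # U by direct pair counting (adequate for small windows)
    u = sum(1.0 if xi > yj else 0.5 if xi == yj else 0.0
            for xi in x for yj in y)
    n1, n2 = len(x), len(y)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    # two-sided p-value from the standard normal CDF
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def relevancy(sample, background):
    """E[j]: average p-value of the window sample against every
    element b of the background-noise subset B."""
    return sum(mann_whitney_p(sample, b) for b in background) / len(background)

# hypothetical data: two low-activity windows form the noise subset B
noise = [[0.01, 0.02, 0.0, 0.01], [0.0, 0.01, 0.02, 0.01]]
strong = relevancy([0.8, 0.9, 0.7, 0.95], noise)  # reactive window: low E[j]
weak = relevancy([0.01, 0.0, 0.02, 0.01], noise)  # quiet window: high E[j]
```

Windows whose E[j] falls below the chosen threshold would then be flagged as significant affective reactions.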
Figure 5 shows a device 500 that comprises means configured to implement the method of the invention. The device comprises the following components, interconnected by a digital data and address bus 501:
- a processing unit 503 (or CPU, for Central Processing Unit);
- a memory 505;
- a network interface 504, for interconnection of device 500 to other devices connected in a network via connection 502.
Processing unit 503 can be implemented as a microprocessor, a custom chip, a dedicated (micro-) controller, and so on. Memory 505 can be implemented in any form of volatile and/or non-volatile memory, such as a RAM (Random Access Memory), hard disk drive, non-volatile random-access memory, EPROM (Erasable Programmable ROM), and so on. Device 500 is suited for implementing a data processing device according to the method of the invention. The data processing device 500 and the memory 505 are
configured to detect the positive variations of a captured biological signal and to average or cumulate said detected positive variations over a temporal window.
According to an embodiment, the data processing device 500 and the memory 505 are also configured to first compute the derivative of the biological signal and to truncate this first derivative to positive values in order to highlight the relevant phasic changes.
According to an embodiment, the data processing device 500 and the memory 505 are also configured to compute a mean value p[j]u of positive detected variations over a temporal window T[j].
According to a variant, the data processing device 500 and the memory 505 are also configured to normalize a mean value p[j]u by a scaling factor x .
According to an embodiment, the data processing device 500 and the memory 505 are also configured to low-pass filter the captured biological signal to remove possible artifacts.
According to an embodiment, the data processing device 500 and the memory 505 are also configured to average the normalized mean values Np[j]u for all the users of an audience.
According to a variant, the data processing device 500 and the memory 505 are also configured to scale the average value Np[j] by a factor Xj.
According to an embodiment, the data processing device 500 and the memory 505 are also configured to define a criterion to quantify the significance of the quantity NP[j] in terms of affective reaction.
According to a particular embodiment, the invention is entirely implemented in hardware, for example as a dedicated component (for example an ASIC, FPGA or VLSI, respectively « Application Specific Integrated Circuit », « Field-Programmable Gate Array » and « Very Large Scale Integration »), or, according to another variant embodiment, as distinct electronic components integrated in a device, or, according to yet another embodiment, in the form of a mix of hardware and software.
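The per-user processing chain described above (low-pass filtering, first derivative, truncation to positive values, averaging over temporal windows) can be sketched as follows. This is a sketch, not the claimed implementation: the function name, the moving-average stand-in for the low-pass filter, and the toy signal are assumptions.

```python
import numpy as np

def positive_variation_profile(signal, fs, win_s, smooth_n=5):
    """Per-user chain: smooth the captured biological signal (e.g. GSR),
    take its first derivative, keep only positive values (phasic
    increases), then average them over consecutive temporal windows."""
    # crude low-pass: moving average (stand-in for a proper filter)
    kernel = np.ones(smooth_n) / smooth_n
    smoothed = np.convolve(signal, kernel, mode="same")
    deriv = np.diff(smoothed) * fs          # first derivative
    positive = np.clip(deriv, 0.0, None)    # truncate to positive values
    win = int(win_s * fs)
    n_win = len(positive) // win
    # p[j]u: mean positive variation in each temporal window T[j]
    return positive[:n_win * win].reshape(n_win, win).mean(axis=1)

# toy run: a signal that rises for 5 s then stays flat (fs = 10 Hz)
sig = np.concatenate([np.linspace(0.0, 1.0, 50), np.ones(50)])
prof = positive_variation_profile(sig, fs=10, win_s=2.5)
```

The early windows, where the signal rises, yield larger p[j]u values than the flat tail, matching the intuition that phasic increases mark arousal.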
Claims
1. Method of detecting the emotional impact of an audience watching a time-stamped movie, the method comprising the following steps:
- detecting the positive variations of a captured biological signal, and
- averaging or cumulating said detected positive variations over a temporal window.
2. Method according to claim 1, wherein multiple temporal windows are considered to cover the whole duration of the movie and said detected positive variations are averaged or cumulated over each of them.
3. Method according to claim 2, wherein two consecutive temporal windows overlap each other.
4. Method according to any one of the preceding claims, wherein the captured biological signal is first differentiated and truncated to positive values in order to highlight the relevant phasic changes.
5. Method according to any one of the preceding claims, wherein the captured biological signal is low-pass filtered.
6. Method according to any one of the preceding claims, wherein the averaged or cumulative value of the detected positive variations computed over a temporal window for each user of the audience is averaged for all the users of the audience.
7. Method according to any one of the preceding claims, wherein a criterion is defined to quantify the significance of the averaged or cumulative value of the detected positive variations computed over a temporal window in terms of affective reaction.
8. Apparatus comprising means configured for implementing a method according to one of the claims 1-7.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12306134 | 2012-09-19 | ||
EP12306134.3 | 2012-09-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014044503A1 true WO2014044503A1 (en) | 2014-03-27 |
Family
ID=47115669
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2013/067717 WO2014044503A1 (en) | 2012-09-19 | 2013-08-27 | Method and device of detecting the emotional impact of an audience watching a time-stamped movie |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014044503A1 (en) |
Non-Patent Citations (7)
Title |
---|
JULIEN FLEUREAU ET AL: "Physiological-Based Affect Event Detector for Entertainment Video Applications", IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, IEEE, USA, vol. 3, no. 3, 1 July 2012 (2012-07-01), pages 379 - 385, XP011466977, ISSN: 1949-3045, DOI: 10.1109/T-AFFC.2012.2 * |
JULIEN FLEUREAU; PHILIPPE GUILLOTEL; QUAN HUYNH-THU: "Physiological-Based Affect Event Detector for Entertainment Video Applications", IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, vol. 3, no. 2, 2012 |
KRUSKAL, WILLIAM H.: "Historical Notes on the Wilcoxon Unpaired Two-Sample Test", JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, vol. 52, no. 279, September 1957 (1957-09-01), pages 356 - 360 |
MOHAMMAD SOLEYMANI ET AL: "Continuous emotion detection in response to music videos", AUTOMATIC FACE&GESTURE RECOGNITION AND WORKSHOPS (FG 2011), 2011 IEEE INTERNATIONAL CONFERENCE ON, IEEE, 21 March 2011 (2011-03-21), pages 803 - 808, XP031869355, ISBN: 978-1-4244-9140-7, DOI: 10.1109/FG.2011.5771352 * |
P. LANG; J. WAGNER; E. ANDRE: "The emotion probe", AMERICAN PSYCHOLOGIST, vol. 50, no. 5, 1995, pages 372 - 385 |
SANDER KOELSTRA ET AL: "Single Trial Classification of EEG and Peripheral Physiological Signals for Recognition of Emotions Induced by Music Videos", 28 August 2010, BRAIN INFORMATICS, SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 89 - 100, ISBN: 978-3-642-15313-6, XP019149400 * |
SOLEYMANI M ET AL: "Affective Characterization of Movie Scenes Based on Multimedia Content Analysis and User's Physiological Emotional Responses", MULTIMEDIA, 2008. ISM 2008. TENTH IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, PISCATAWAY, NJ, USA, 15 December 2008 (2008-12-15), pages 228 - 235, XP031403541, ISBN: 978-0-7695-3454-1 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017044139A1 (en) * | 2015-09-09 | 2017-03-16 | Thomson Licensing | A fast group-wise technique for decomposing gsr signals across groups of individuals |
US20190038172A1 (en) * | 2015-09-09 | 2019-02-07 | Thomson Licensing | Fast group-wise technique for decomposing gsr signals across groups of individuals |
WO2017048304A1 (en) * | 2015-09-16 | 2017-03-23 | Thomson Licensing | Determining fine-grain responses in gsr signals |
US10636449B2 (en) | 2017-11-06 | 2020-04-28 | International Business Machines Corporation | Dynamic generation of videos based on emotion and sentiment recognition |
US11315600B2 (en) | 2017-11-06 | 2022-04-26 | International Business Machines Corporation | Dynamic generation of videos based on emotion and sentiment recognition |
CN110013261A (en) * | 2019-05-24 | 2019-07-16 | 京东方科技集团股份有限公司 | Method, apparatus, electronic equipment and the storage medium of mood monitoring |
CN110013261B (en) * | 2019-05-24 | 2022-03-08 | 京东方科技集团股份有限公司 | Emotion monitoring method and device, electronic equipment and storage medium |
RU2723732C1 (en) * | 2019-10-23 | 2020-06-17 | Акционерное общество «Нейротренд» | Method of analysing emotional perception of audiovisual content in group of users |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014044503A1 (en) | Method and device of detecting the emotional impact of an audience watching a time-stamped movie | |
US9258607B2 (en) | Methods and apparatus to determine locations of audience members | |
US11213219B2 (en) | Determining intensity of a biological response to a presentation | |
CA2886597C (en) | Predicting response to stimulus | |
US10448829B2 (en) | Biological rhythm disturbance degree calculating device, biological rhythm disturbance degree calculating system, and biological rhythm disturbance degree calculating method | |
US5775330A (en) | Neurometric assessment of intraoperative anesthetic | |
Chowdhury | Using Wi-Fi channel state information (CSI) for human activity recognition and fall detection | |
US20110046503A1 (en) | Dry electrodes for electroencephalography | |
US20090171240A1 (en) | Fusion-based spatio-temporal feature detection for robust classification of instantaneous changes in pupil response as a correlate of cognitive response | |
US20090024475A1 (en) | Neuro-feedback based stimulus compression device | |
Moctezuma et al. | Classification of low-density EEG for epileptic seizures by energy and fractal features based on EMD | |
CN108416367A (en) | Sleep stage method based on multi-sensor data decision level fusion | |
US20160043819A1 (en) | System and method for predicting audience responses to content from electro-dermal activity signals | |
Sejdić et al. | Baseline characteristics of dual-axis cervical accelerometry signals | |
Wache et al. | Implicit user-centric personality recognition based on physiological responses to emotional videos | |
Li et al. | Continuous arousal self-assessments validation using real-time physiological responses | |
CN107545904B (en) | Audio detection method and device | |
Alam et al. | Detection of epileptic seizures using chaotic and statistical features in the EMD domain | |
Prado et al. | Optimizing the detection of nonstationary signals by using recurrence analysis | |
US20160232317A1 (en) | Apparatus for and method of providing biological information | |
Zanotelli et al. | Faster automatic ASSR detection using sequential tests | |
Papapanagiotou et al. | Fractal nature of chewing sounds | |
Mooney et al. | Investigating biometric response for information retrieval applications | |
Sarkar et al. | A simultaneous EEG and EMG study to quantify emotions from hindustani classical music | |
Choudhury et al. | Heartsense: Estimating heart rate from smartphone photoplethysmogram using adaptive filter and interpolation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13756413 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13756413 Country of ref document: EP Kind code of ref document: A1 |