WO2008108799A1 - A method and system for rating media and events in media based on physiological data - Google Patents

A method and system for rating media and events in media based on physiological data

Info

Publication number
WO2008108799A1
WO2008108799A1 (PCT/US2007/014955)
Authority
WO
WIPO (PCT)
Prior art keywords
event
media
specific
instances
rating
Prior art date
Application number
PCT/US2007/014955
Other languages
French (fr)
Other versions
WO2008108799A8 (en)
Inventor
Hans Lee
Timmie Hong
William Williams
Michael Fettiplace
Original Assignee
Emsense Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Emsense Corporation filed Critical Emsense Corporation
Priority to CN200780052879A priority Critical patent/CN101755406A/en
Priority to JP2009552656A priority patent/JP5746472B2/en
Priority to EP07796518A priority patent/EP2135370A4/en
Publication of WO2008108799A1 publication Critical patent/WO2008108799A1/en
Publication of WO2008108799A8 publication Critical patent/WO2008108799A8/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/29Arrangements for monitoring broadcast services or broadcast-related services
    • H04H60/33Arrangements for monitoring the users' behaviour or opinions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/29Arrangements for monitoring broadcast services or broadcast-related services
    • H04H60/31Arrangements for monitoring the use made of the broadcast services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/35Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
    • H04H60/37Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying segments of broadcast information, e.g. scenes or extracting programme ID
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/35Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
    • H04H60/46Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for recognising users' preferences
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/61Arrangements for services using the result of monitoring, identification or recognition covered by groups H04H60/29-H04H60/54
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/251Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H04N21/252Processing of multiple end-users' preferences to derive collaborative data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42201Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/466Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/466Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H04N21/4667Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/475End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4756End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data for rating content, e.g. scoring a recommended movie
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17318Direct or substantially direct transmission and handling of requests
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/20ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6582Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number

Definitions

  • This invention relates to the field of media and event rating based on physiological response from viewers.
  • Prior approaches to analyzing viewers' responses to a media focus on a top-down view, which is based on an averaged response to a survey, viewer "knobs", physiological data or other rating schemes.
  • This view limits the accuracy of the analysis due to the cognitive bias of each individual viewer, as the viewer usually remembers only a small number of key events and forgets others. Consequently, one or two negative events in the media can dominate what the viewer thinks of the media afterwards, even if other positive events happened during the viewer's experience of the media.
  • The physiological data, which includes but is not limited to heart rate, brain waves, motion, muscle movement, galvanic skin response, and other responses of the viewer of the media, can give a trace of the viewer's emotional changes while he/she is watching the media.
  • Such data by itself, however, does not create an objective measure of the media that allows the media or its events to be benchmarked and/or compared objectively to other instances of media or events.
  • Various embodiments of the present invention enable a bottom-up analysis approach that derives physiological responses from measured physiological data of viewers of a media, and calculates scores of instances of an event type based on the physiological responses. The scores are then aggregated to rate the event type in addition to scoring the individual event instances. The approach can further form an overall rating of the media by aggregating the ratings of a set of event types within the media.
  • Figure 1 is an illustration of an exemplary system to support media and events rating in accordance with one embodiment of the present invention.
  • Figure 2 (a)-(b) are flow charts illustrating exemplary processes to support media and events rating in accordance with one embodiment of the present invention.
  • Figure 3 shows an exemplary integrated headset used with one embodiment of the present invention.
  • Figure 4(a)-(c) show exemplary traces of physiological responses measured and exemplary dividing lines of a media in accordance with one embodiment of the present invention.
  • Figure 5 shows an exemplary profile of a joke in an advertisement as generated in accordance with one embodiment of the present invention.
  • Figure 6 shows overall event ratings for three exemplary types of events in two movies calculated in accordance with one embodiment of the present invention.
  • An effective media that connects with its audience/viewers is able to elicit the desired emotional response, and it is well established that physiological response is a valid measurement of viewers' changes in emotion.
  • Various embodiments of the present invention enable a bottom-up analysis approach that derives physiological responses from measured physiological data of viewers of a media, and calculates scores of instances of an event type based on the physiological responses. The scores are then aggregated to rate the event type in addition to scoring the individual event instances. The approach can further form an overall rating of the media by aggregating the ratings of a set of event types within the media. Such an approach allows instances of an event type to be objectively measured against prior instances of the same event type in the media, and the current media to be objectively measured against another media.
  • A slice-and-combine approach can be adopted, which divides the media into one or more key and repeatable event types, each having a plurality of event instances, based on the physiological data measured, and then aggregates the score of every event instance and every event type to measure the viewers' overall responses to the media.
  • The entire approach can also be automated, as each step can be processed by a computing device, allowing for an objective measure of a media without much human input or intervention.
  • a key principle behind the present invention is that one cannot look at an individual response to a media and make a judgment about the media.
  • a movie having multiple key scenes can only be ranked accurately via a full analysis of viewers' responses to each of the scenes. Such analysis includes but is not limited to whether the intense scenes were actually intense, whether the love scenes turn viewers off or engage them, whether the viewers thought the jokes were funny, etc. Only when these individual scene types are aggregated and evaluated as a whole can the movie be compared objectively and automatically to other movies in the same genre. In addition, only by knowing the typical response to be expected for a certain scene (event type) can a new instance of the scene be rated objectively.
  • FIG. 1 is an illustration of an exemplary system to support media and events rating in accordance with one embodiment of the present invention.
  • Although this diagram depicts components as functionally separate, such depiction is merely for illustrative purposes. It will be apparent to those skilled in the art that the components portrayed in this figure can be arbitrarily combined or divided into separate software, firmware and/or hardware components. Furthermore, it will also be apparent to those skilled in the art that such components, regardless of how they are combined or divided, can execute on the same computing device or on multiple computing devices, and the multiple computing devices can be connected by one or more networks.
  • one or more sensors 103 are utilized to measure and record physiological data from each of a plurality of viewers 102 who are watching a media 101.
  • the media can be one or more of a movie, a video, a television program, a television commercial, an advertisement, a video game, an interactive online media, a print, and any other media from which a viewer can learn information or be emotionally impacted.
  • the physiological data measured can include but is not limited to, heart rate, brain waves, electroencephalogram (EEG) signals, blink rate, breathing, motion, muscle movement, galvanic skin response and any other response correlated with changes in emotion.
  • a receiving module 104 is operable to accept and/or record the physiological data of each of a plurality of viewers watching the media, wherein the physiological data may be measured and/or retrieved via other means and/or from storage.
  • a defining module 105 is operable to define and mark occurrences and durations of one or more event types, each having a plurality of event instances happening in the media. The duration of each event instance in the media can be constant, non-linear, or semi-linear in time.
  • an integrated physiological sensing headset capable of sensing a plurality of measures of biological response can be placed on a viewer's head for measurement of his/her physiological data while the viewer is watching an event of the media.
  • the data can be recorded in a program on a computer that allows viewers to interact with media while wearing the headset.
  • Figure 3 shows an exemplary integrated headset used with one embodiment of the present invention.
  • Processing unit 301 is a microprocessor that digitizes physiological data and then processes it into physiological responses that include but are not limited to thought, engagement, immersion, physical engagement, valence, vigor and others.
  • a three axis accelerometer 302 senses movement of the head.
  • a silicon stabilization strip 303 allows for more robust sensing through stabilization of the headset that minimizes movement.
  • the integrated headset can be turned on with a push button and the viewer's physiological data is measured and recorded instantly.
  • the data transmission can be handled wirelessly through a computer interface that the headset links to. No skin preparation or gels are needed on the viewer to obtain an accurate measurement, and the headset can be removed from the viewer easily and instantly used by another viewer, allowing measurement to be done on many participants in a short amount of time and at low cost. No degradation of the headset occurs during use, and the headset can be reused thousands of times.
  • the viewers' physiological responses can be derived via a plurality of formulas, which use the physiological data of the viewers as inputs. Facial expression recognition, "knob" and other measures of emotion can also be used as inputs with comparable validity.
  • Each of the derived physiological responses which can include but are not limited to, “Engagement,” “Adrenaline,” “Thought,” and “Valence,” combines physiological data from multiple sensors into a multi-dimensional, simple-to-understand, representation of viewers' emotional response.
  • Figure 4(a) shows an exemplary trace of "Engagement” for a player playing Call of Duty 3 on the Xbox 360 measured in accordance with one embodiment of the present invention.
  • an event type in a video game may be defined as occurring every time a "battle tank" appears in the player's screen and lasting as long as it remains on the screen.
  • an event in a movie may be defined as occurring every time a joke is made.
  • An event type may be defined in such a way that an instance of the event type occurs only once for each piece of media. Alternatively, the event type may also be defined in such a way that many instances of the event type occur in a media.
  • the formula can be defined by a weighted sum of multiple inputs from each of the physiological responses over time (vectors), wherein each of the vectors may rise, fall, peak, reach high, reach low, or have a distinct profile.
  • There can be two aspects of the profile: the first is that the "Thought" vector of physiological response must increase, showing that the viewer thought about what was happening directly before the punch line and during the first part of the punch line; the second is that the "Valence" or reward feeling for viewers must increase once the punch line is given, indicating that the viewers liked the punch line after engaging in thinking about it.
  • a mathematical profile of rise in Thought and Valence at specific times is created for the event type of a joke.
  • Such profile can then be applied to each instance of a joke to assess the effectiveness of the punch line of the joke. Punch lines that do not fit this response profile will not create a good experience in viewers.
  • Figure 5 shows an exemplary profile of Joke Punchlines in an advertisement as generated in accordance with one embodiment of the present invention.
  • the profile can be created either through expert knowledge of the subject matter or through a mathematical formula. If the physiological response of an instance of this event type matches this profile, the event instance is considered a success.
  • scores of event instances in the media can be used to pinpoint whether and/or which of the event instances need to be improved or changed, and which of the event instances should be kept intact based on the physiological responses from the viewers.
  • a punch line that does not fit its response profile and thus does not create a good experience in viewers should be improved.
  • a set of logical rules can be adopted, which define an event instance as "good” or “successful” as having a score above a predetermined number, whereby the formula outputs a score reflecting how engaging the event instance is.
  • Other embodiments that use the score for rankings or ratings are also possible. Referring back to the non-limiting example shown in Figure 4(b) where a weapon event type is tagged, the simple explanation of the profile for this event type is that if the physiological response of an event instance matches the profile, the event instance is good; ideally, all event instances are good, which would lead to a high score for the event type.
  • This calculation scheme can be done over hundreds of instances of multiple key event types.
  • Such measures may include but are not limited to: average, 1st-order derivative, 2nd-order derivative, polynomial approximations, (standard) deviations from the mean, (standard) deviations of derivatives from the mean, and profiles of the physiological responses, which can be implemented with convolution or other methods that take into account one or more of: peaking in the middle, spiking in the beginning, being flat, etc.
  • 4. Repeating the calculation at step 3 for all physiological responses.
  • 5. Transforming the large number of measures into a defined outcome (score) for the event instance. Such transformation can be done via one or more of: convolution, weighted sum, positive or negative slope, a polynomial formula, least squares, support vector machines, neural networks, and other machine learning approaches.
  • 6. Repeating the transformation at step 5 for all instances of the event type.
  • a ranking, rating or score for each individual event instance can be calculated via this or any other similar approaches, allowing instances of an event type to be compared objectively.
  • a subset of the overall population of viewers can be aggregated to differentiate the responses for the subset, which groups the viewers by one or more of: race, gender, age, buying habits, demographics, and income.
  • the averaged rating of the event type can then be associated and/or compared to such grouping of the viewers.
  • a simple average may not always give an accurate view of the overall performance of the media. For a non-limiting example, if 75% of event instances are very highly rated while the rest are mediocre, the overall rating of the event type may be around 75-80%, which could seem good. In reality, however, viewers who watch the media will not like one quarter of it, which leads to a very low score.
  • a performance metric takes into account the distribution of scores of instances of the event type. Using prior distributions of the instances as a reference, the performance metric can define how good the overall distribution of the instances is and if viewers will like the event type/media or not.
  • One non-limiting implementation of the performance metric can be done through histogram matching for the event type.
  • the formulas output a "success" ratio in percentage points, or another score, for each event type. More specifically, a "success" ratio of an event type can be defined via an aggregated view of the performance of the event type/media in a specific aspect characterized by the definition of the event type, and an event type with a "success" ratio above a predetermined number is defined to be successful.
  • the rating process described above can be repeated for different types of events within the same media.
  • the media itself can be rated with the ratings of different event types used as inputs to a rating formula. This rating formula or rule can also be based on expert knowledge or previous physiological response to the media as described above.
  • a media can be rated as "success” if a majority of its event types are rated as “success.”
  • Alternatively, a media can be rated as "success" if its event types are rated as "success" more often than those of other comparable media.
  • Other rules, or linear or other rating scales, can also be used.
  • Figure 6 shows overall event ratings for three exemplary types of events in two movies calculated in accordance with one embodiment of the present invention. The first movie is an action movie, and the response to the "Action Packed" and "Suspenseful" event types is very strong, while the "Funniness" event type, such as jokes, does not create a strong response.
  • the second movie is a comedy, which creates the opposite responses compared to the action movie. Both movies are given a high rating because the event types have the correct profile for a successful movie in their genre.
  • One embodiment may be implemented using a conventional general-purpose or a specialized digital computer or microprocessor(s) programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art.
  • Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art.
  • the invention may also be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be readily apparent to those skilled in the art.
  • One embodiment includes a computer program product which is a machine readable medium (media) having instructions stored thereon/in which can be used to program one or more computing devices to perform any of the features presented herein.
  • the machine readable medium can include, but is not limited to, one or more types of disks including floppy disks, optical discs, DVD, CD-ROMs, micro drive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
  • the present invention includes software for controlling both the hardware of the general purpose/specialized computer or microprocessor, and for enabling the computer or microprocessor to interact with a human viewer or other mechanism utilizing the results of the present invention.
  • software may include, but is not limited to, device drivers, operating systems, execution environments/containers, and applications.
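As a non-limiting illustration, the two-part joke profile described above (a rise in "Thought" leading into the punch line, followed by a rise in "Valence" after it) might be checked as in the following Python sketch; the window size and minimum-rise threshold are assumed values, not taken from the disclosure:

```python
def matches_joke_profile(thought, valence, punchline_idx,
                         min_rise=0.1, pre_window=5):
    """Hypothetical check of the two-part joke profile.

    thought, valence: per-sample response traces for one event instance.
    punchline_idx:    sample index where the punch line is delivered.
    """
    # Aspect 1: "Thought" must rise directly before/into the punch line.
    pre = max(0, punchline_idx - pre_window)
    thought_rise = thought[punchline_idx] - thought[pre]
    # Aspect 2: "Valence" (reward) must rise once the punch line is given.
    valence_rise = valence[-1] - valence[punchline_idx]
    return thought_rise >= min_rise and valence_rise >= min_rise
```

An instance whose traces fit both rises would be counted as a successful punch line; one whose traces stay flat would not.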
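The per-instance calculation described above (computing measures such as the average, derivatives, and deviations for each physiological response, then transforming them into a single score via a weighted sum) could be sketched as follows; the particular set of measures and the weights are illustrative assumptions:

```python
import numpy as np

def instance_measures(response):
    """A few of the per-response measures named in the disclosure:
    average, 1st/2nd-order derivative means, and standard deviations."""
    r = np.asarray(response, dtype=float)
    d1 = np.diff(r)        # 1st-order derivative
    d2 = np.diff(r, n=2)   # 2nd-order derivative
    return np.array([r.mean(), d1.mean(), d2.mean(), r.std(), d1.std()])

def instance_score(responses, weights):
    """Transform the measures of all physiological responses for one
    event instance into a single score via a weighted sum (step 5)."""
    feats = np.concatenate([instance_measures(r) for r in responses])
    return float(feats @ np.asarray(weights, dtype=float))
```

Repeating `instance_score` over all instances of an event type yields the per-instance scores that are later aggregated.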
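The "success" ratio and the distribution-aware performance metric discussed above might be sketched as follows; the thresholds are assumptions, and a fuller implementation could instead use histogram matching against prior distributions of instance scores:

```python
def success_ratio(instance_scores, good_threshold=0.7):
    """Fraction of event instances whose score exceeds the threshold."""
    good = sum(1 for s in instance_scores if s > good_threshold)
    return good / len(instance_scores)

def event_type_ok(instance_scores, good_threshold=0.7, min_ratio=0.9):
    """Distribution-aware check: a 75% success ratio can look good as an
    average, yet still mean viewers dislike a quarter of the media."""
    return success_ratio(instance_scores, good_threshold) >= min_ratio
```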
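The majority rule for rating a whole media from its event-type ratings could be expressed as the short sketch below; the dictionary of per-event-type ratings is a hypothetical input format:

```python
def rate_media(event_type_ratings):
    """Rate a media "success" if a majority of its event types are
    themselves rated "success", per the rule described above."""
    successes = sum(1 for r in event_type_ratings.values()
                    if r == "success")
    return "success" if successes > len(event_type_ratings) / 2 else "failure"
```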
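Aggregating responses over a subset of the viewer population (grouping viewers by race, gender, age, income, and so on) might look like the following sketch; the group labels and input format are illustrative assumptions:

```python
from collections import defaultdict

def ratings_by_group(viewer_ratings, viewer_groups):
    """Average per-viewer ratings of an event type within each
    demographic group, so the event type's rating can be associated
    with and compared across groups."""
    buckets = defaultdict(list)
    for viewer_id, rating in viewer_ratings.items():
        buckets[viewer_groups[viewer_id]].append(rating)
    return {group: sum(vals) / len(vals) for group, vals in buckets.items()}
```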

Abstract

Various embodiments of the present invention enable a bottom-up analysis approach that derives physiological responses from measured physiological data of viewers of a media, and calculates scores of instances of an event type based on the physiological responses. The scores are then aggregated to rate the event type in addition to scoring the individual event instances. The approach can also form an overall rating of the media by aggregating the event ratings of a set of event types within the media.

Description

A METHOD AND SYSTEM FOR RATING MEDIA AND EVENTS IN MEDIA BASED
ON PHYSIOLOGICAL DATA
FIELD OF INVENTION
This invention relates to the field of media and event rating based on physiological response from viewers.
BACKGROUND OF THE INVENTION
Prior approaches to analyzing viewers' responses to a media focus on a top-down view, which is based on an averaged response to a survey, viewer "knobs", physiological data or other rating schemes. This view limits the accuracy of the analysis due to the cognitive bias of each individual viewer, as the viewer usually remembers only a small number of key events and forgets others. Consequently, one or two negative events in the media can dominate what the viewer thinks of the media afterwards, even if other positive events happened during the viewer's experience of the media.
The physiological data, which includes but is not limited to heart rate, brain waves, motion, muscle movement, galvanic skin response, and other responses of the viewer of the media, can give a trace of the viewer's emotional changes while he/she is watching the media. However, such data by itself does not create an objective measure of the media that allows the media or its events to be benchmarked and/or compared objectively to other instances of media or events.
SUMMARY OF INVENTION
Various embodiments of the present invention enable a bottom-up analysis approach that derives physiological responses from measured physiological data of viewers of a media, and calculates scores of instances of an event type based on the physiological responses. The scores are then aggregated to rate the event type in addition to scoring the individual event instances. The approach can further form an overall rating of the media by aggregating the ratings of a set of event types within the media.
BRIEF DESCRIPTION OF THE DRAWINGS Figure 1 is an illustration of an exemplary system to support media and events rating in accordance with one embodiment of the present invention.
Figure 2 (a)-(b) are flow charts illustrating exemplary processes to support media and events rating in accordance with one embodiment of the present invention.
Figure 3 shows an exemplary integrated headset used with one embodiment of the present invention.
Figure 4(a)-(c) show exemplary traces of physiological responses measured and exemplary dividing lines of a media in accordance with one embodiment of the present invention.
Figure 5 shows an exemplary profile of a joke in an advertisement as generated in accordance with one embodiment of the present invention.
Figure 6 shows overall event ratings for three exemplary types of events in two movies calculated in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
The invention is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to "an" or "one" or "some" embodiment(s) in this disclosure are not necessarily to the same embodiment, and such references mean at least one.
An effective media that connects with its audience/viewers is able to elicit the desired emotional response, and it is well established that physiological response is a valid measurement for viewers' changes in emotions. Various embodiments of the present invention enable a bottom up analysis approach that derives physiological responses from measured physiological data of viewers of a media, and calculates scores of instances of an event type based on the physiological responses. The scores are then aggregated to rate the event type in addition to scoring the individual event instances. The approach can further form an overall rating of the media by aggregating the ratings of a set of event types within the media. Such an approach allows instances of an event type to be objectively measured against prior instances of the same event type in the media, and the current media to be objectively measured against another media. In addition, a slice and combine approach can be adopted, which defines the media into one or more key and repeatable event types each having a plurality of event instances based on the physiological data measured, and then aggregates the score of every event instance and every event type to measure the viewers' overall responses to the media. The entire approach can also be automated, as each step of the approach can be processed by a computing device, allowing for an objective measure of a media without much human input or intervention.
A key principle behind the present invention is that one cannot look at an individual response to a media and make a judgment about the media. For a non-limiting example, a movie having multiple key scenes (events) can only be ranked accurately via a full analysis of viewers' responses to each of the scenes. Such analysis includes but is not limited to whether the intense scenes were actually intense, whether the love scenes turned viewers off or engaged them, whether the viewers thought the jokes were funny, etc. Only when these individual scene types are aggregated and evaluated as a whole can the movie be compared objectively and automatically to other movies in the same genre. In addition, only by knowing the typical response to be expected to a certain scene (event type) can a new instance of the scene be rated objectively.
Figure 1 is an illustration of an exemplary system to support media and events rating in accordance with one embodiment of the present invention. Although this diagram depicts components as functionally separate, such depiction is merely for illustrative purposes. It will be apparent to those skilled in the art that the components portrayed in this figure can be arbitrarily combined or divided into separate software, firmware and/or hardware components. Furthermore, it will also be apparent to those skilled in the art that such components, regardless of how they are combined or divided, can execute on the same computing device or multiple computing devices, wherein the multiple computing devices can be connected by one or more networks.
Referring to Figure 1, one or more sensors 103 are utilized to measure and record physiological data from each of a plurality of viewers 102 who are watching a media 101. Alternatively, an integrated sensor headset can be adopted as discussed in detail later. Here, the media can be one or more of a movie, a video, a television program, a television commercial, an advertisement, a video game, an interactive online media, a print, and any other media from which a viewer can learn information or be emotionally impacted. The physiological data measured can include but is not limited to, heart rate, brain waves, electroencephalogram (EEG) signals, blink rate, breathing, motion, muscle movement, galvanic skin response and any other response correlated with changes in emotion. Each of the one or more sensors can be one of: an electroencephalogram, an accelerometer, a blood oxygen sensor, a galvanometer, an electromyograph, and any other physiological sensor. Physiological data in the body have been shown to correlate with emotional changes in humans. By sensing these exact changes instead of using surveys, knobs or other easily biased measures of response, the present invention improves both the data that is recorded and the granularity of such data, as physiological responses can be recorded many times per second.
In some embodiments, a receiving module 104 is operable to accept and/or record the physiological data of each of a plurality of viewers watching the media, wherein the physiological data may be measured and/or retrieved via other means and/or from a storage. In some embodiments, a defining module 105 is operable to define and mark occurrences and durations of one or more event types each having a plurality of event instances happening in the media. The duration of each of the event instances in the media can be constant, non-linear, or semi-linear in time. In some embodiments, such event definition may happen after the physiological data of the viewers has been measured, where the defining module 105 can define the media into one or more event types each having a plurality of event instances in the media based on the physiological data measured from the plurality of viewers.
In some embodiments, a rating module 106 is operable to derive a plurality of physiological responses from the physiological data measured from the plurality of viewers and calculate a score for each of the plurality of event instances of one of the event types in the media based on the plurality of physiological responses. Here, the physiological response can be one or more of: thought, liking, engagement, immersion, physical engagement, valence, and vigor, wherein thought and liking can be calculated from EEG. The rating module is further operable to rate the specific event type by aggregating the scores of instances of the event type once such scores are calculated. In addition, the rating module may optionally rate the media by aggregating ratings of the event types in the media.
Figure 2 (a)-(b) are flow charts illustrating exemplary processes to support media and events rating in accordance with one embodiment of the present invention. Although this figure depicts functional steps in a particular order for purposes of illustration, the process is not limited to any particular order or arrangement of steps. One skilled in the art will appreciate that the various steps portrayed in this figure could be omitted, rearranged, combined and/or adapted in various ways.
Referring to Figure 2 (a), physiological data from each of a plurality of viewers watching a media is received or measured at step 201. The media can then be defined into a plurality of event types each having a plurality of event instances in the media based on the physiological data measured from the plurality of viewers at step 202. A plurality of physiological responses can be derived from the physiological data at step 203. At step 204, a score for each of the plurality of event instances of a specific event type is calculated based on the plurality of physiological responses, and step 205 rates the specific event type by aggregating the scores of the plurality of event instances of the specific event type. Steps 204 and 205 can be repeated for each event type in the media, and the media itself can be rated based on aggregated ratings of the event types in the media at step 206.
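As an illustration, the bottom up flow of steps 203-206 can be sketched in a few lines of Python. The scoring and aggregation rules used here (simple means) are hypothetical stand-ins for the many formulas the embodiments allow, and the trace values are made up.

```python
# Minimal sketch of the rating pipeline of Figure 2(a), under the
# assumption that each score/rating is a plain average. Real embodiments
# may use weighted sums, profiles, or machine-learning transformations.

def score_instance(responses):
    """Score one event instance as the mean of its response samples."""
    return sum(responses) / len(responses)

def rate_event_type(instance_scores):
    """Rate an event type by aggregating (averaging) its instance scores."""
    return sum(instance_scores) / len(instance_scores)

def rate_media(event_type_ratings):
    """Rate the media by aggregating the ratings of its event types."""
    return sum(event_type_ratings) / len(event_type_ratings)

# Made-up "Engagement" traces for three instances of one event type.
instances = [[0.2, 0.6, 0.8], [0.5, 0.7, 0.9], [0.1, 0.3, 0.5]]
scores = [score_instance(r) for r in instances]   # step 204
rating = rate_event_type(scores)                  # step 205
media_rating = rate_media([rating])               # step 206, one event type
```

Each of the three functions is a placeholder for the richer scoring, event-type rating, and media rating formulas discussed later in the description.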
In some embodiments, the physiological data can be used to define the media into key, repeated event instances/types that are comparable across other media in the same genre. Referring to Figure 2 (b), occurrences and durations of a plurality of event types each having a plurality of event instances in a media can be defined and/or marked at step 201, before physiological data from each of a plurality of viewers watching the media is received or measured at step 202. Steps 203-206 are identical to those shown in Figure 2 (a).
In some embodiments, an integrated physiological sensing headset capable of sensing a plurality of measures of biological response can be placed on a viewer's head for measurement of his/her physiological data while the viewer is watching an event of the media. The data can be recorded in a program on a computer that allows viewers to interact with media while wearing the headset. Figure 3 shows an exemplary integrated headset used with one embodiment of the present invention. Processing unit 301 is a microprocessor that digitizes physiological data and then processes it into physiological responses that include but are not limited to thought, engagement, immersion, physical engagement, valence, vigor and others. A three axis accelerometer 302 senses movement of the head. A silicon stabilization strip 303 allows for more robust sensing through stabilization of the headset that minimizes movement. The right EEG electrode 304 and left EEG electrode 306 are prefrontal dry electrodes that do not need preparation to be used. Contact is needed between the electrodes and skin but without excessive pressure. The heart rate sensor 305 is a robust blood volume pulse sensor positioned about the center of the forehead and a rechargeable or replaceable battery module 307 is located over one of the ears. The adjustable strap 308 in the rear is used to adjust the headset to a comfortable tension setting for many different head sizes.
In some embodiments, the integrated headset can be turned on with a push button and the viewer's physiological data is measured and recorded instantly. The data transmission can be handled wirelessly through a computer interface that the headset links to. No skin preparation or gels are needed on the viewer to obtain an accurate measurement, and the headset can be removed from the viewer easily and can be instantly used by another viewer, allowing measurement to be done on many participants in a short amount of time and at low cost. No degradation of the headset occurs during use and the headset can be reused thousands of times.
In some embodiments, the viewers' physiological responses can be derived via a plurality of formulas, which use the physiological data of the viewers as inputs. Facial expression recognition, "knob" and other measures of emotion can also be used as inputs with comparable validity. Each of the derived physiological responses, which can include but are not limited to, "Engagement," "Adrenaline," "Thought," and "Valence," combines physiological data from multiple sensors into a multi-dimensional, simple-to-understand representation of viewers' emotional response. Figure 4(a) shows an exemplary trace of "Engagement" for a player playing Call of Duty 3 on the Xbox 360 measured in accordance with one embodiment of the present invention. The trace is a time based graph, with the beginning of the session on the left and the end on the right. Two sections (event instances) 401 and 402 are circled, where 401 on the left shows low "Engagement" during a game play that happens during a boring tutorial section. 402 shows a high "Engagement" section that has been recorded when the player experiences the first battle of the game. In some embodiments, the viewers' physiological responses (e.g., strong, boring, funny, engaging, etc.) over a large number of instances of each event type can be calculated, analyzed and correlated with the measures of such responses to individual event types (e.g., branding moment, product introduction, video game cut scene, fight, level restart, etc.) to create a context-specific event profile for each event type. These profiles can then be used to rate each event instance that happens in a new piece of media, creating, in some cases, thousands to millions of individual bottom up measures of the media over up to hundreds to thousands of participating viewers.
Combining this many individual scores greatly increases the accuracy of measurement of the overall media compared to asking survey questions and getting one hundred responses to 1 to 10 dimensions of subjective measures. For a non-limiting example of 100 viewers watching a movie having 1000 key event instances, 5 dimensions of physiological responses (e.g., thought, engagement, valence, immersion, physical engagement) are calculated for each of the viewers, and 6 math permutations (e.g., average value, deviation from mean, 1st order trend, 2nd order trend, positive response, negative response, etc.) are calculated for each event instance based on the physiological responses of the viewers. Consequently, 3,000,000 (100 x 1000 x 5 x 6) scores are available to rate the movie vs. 100-1000 measures from surveys. These scores are then combined to create an overall rating for the movie. Such a rating may include but is not limited to: individual event (scene) strength (did players like cut scenes, were jokes funny), overall quality of the events, strong and weak types of events, and strong and weak events of the movie (e.g., the events in the middle had a strong response but the beginning ones did not).
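The count in this example follows directly from the figures given in the text and can be verified with one line of arithmetic:

```python
# Verifying the example: 100 viewers, 1000 key event instances,
# 5 physiological response dimensions, 6 math permutations each.
viewers = 100
event_instances = 1000
responses = 5
permutations = 6

total_scores = viewers * event_instances * responses * permutations
print(total_scores)  # 3000000
```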
In some embodiments, the occurrence and duration of an event instance or type can be defined and recorded. For a non-limiting example, an event type in a video game may be defined as occurring every time a "battle tank" appears in the player's screen and lasting as long as it remains on the screen. For another non-limiting example, an event in a movie may be defined as occurring every time a joke is made. An event type may be defined in such a way that an instance of the event type occurs only once for each piece of media. Alternatively, the event type may also be defined in such a way that many instances of the event type occur in a media.
In some embodiments, event instances can be tagged for each recorded piece of media, allowing for efficient and accurate conclusions to be made. For a non-limiting example, Figure 4(b) shows two exemplary traces of the "Engagement" data of a video game player measured in accordance with one embodiment of the present invention. The boxes 403, 404, and 405 in the pictures correspond to a specific "weapon use" event type that has been tagged. At each point where the event instance appears, "Engagement" rises sharply. The picture in Figure 4(b) shows one type of event being tagged, but the approach can be extended to many event instances being tagged with different event types, allowing the media to be sliced into pieces. For another non-limiting example, Figure 4(c) shows exemplary vertical lines that divide the piece of media into event instances in accordance with one embodiment of the present invention. Key event types define every important thing that a player of the video game or other media may encounter and/or interact with. Here, the physiological data/response is overlaid with the tags of the event instances, and both can be toggled on and off.

In some embodiments, a score can be generated for an event instance based on the physiological data or response of the viewer during the event instance using a formula or rule. In one embodiment, the formula can be based on expert knowledge of the desired response to the event type of the instance, and the formula can take the form of a weighted sum of the changes in each of the derived physiological responses across the event instance. For a non-limiting example, the formula can be defined by a weighted sum of multiple inputs from each of the physiological responses over time (vector), wherein each of the vectors may rise, fall, peak, reach high, reach low, or have a distinct profile.
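A weighted sum of the changes in each derived response, as described above, can be sketched as follows. The response names, weights, and change values are hypothetical inputs chosen for illustration, not values from the disclosure.

```python
# Illustrative sketch of scoring one event instance as a weighted sum
# of per-response changes. Weights would come from expert knowledge of
# the desired response to the event type (assumed values here).

def score_event(changes, weights):
    """Weighted sum of per-response changes across one event instance.

    changes: dict mapping response name -> (end value - start value)
    weights: dict mapping response name -> weight for that response
    """
    return sum(weights[name] * delta for name, delta in changes.items())

changes = {"Engagement": 0.4, "Thought": 0.2, "Valence": -0.1}
weights = {"Engagement": 0.5, "Thought": 0.3, "Valence": 0.2}
score = score_event(changes, weights)  # 0.5*0.4 + 0.3*0.2 + 0.2*(-0.1)
```

A negative change in a response (here, Valence) simply pulls the instance score down in proportion to its weight.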
For a non-limiting example, there can be a profile of the physiological response to punch lines in jokes that correlates with how good the joke is based on the analysis of many advertisements. There can be two aspects of the profile: the first is that the "Thought" vector of physiological response must increase, showing that the viewer thought about what was happening directly before the punch line and during the first part of the punch line; the second is that the "Valence" or reward feeling for viewers must increase once the punch line is given, indicating that the viewers liked the punch line after engaging in thinking about the punch line. Thus, a mathematical profile of rise in Thought and Valence at specific times is created for the event type of a joke. Such profile can then be applied to each instance of a joke to assess the effectiveness of the punch line of the joke. Punch lines that do not fit this response profile will not create a good experience in viewers.
Figure 5 shows an exemplary profile of joke punch lines in an advertisement as generated in accordance with one embodiment of the present invention. The profile can be created either through expert knowledge of the subject matter or through a mathematical formula. If the physiological response of an instance of this event type matches this profile, the event instance is considered a success. For a non-limiting example, the following formula can be used to calculate the score of an instance of the joke: Score = .25 x (Thought rose during punch line) + .75 x (Valence rose after punch line)
where the weights are 25% based on Thought rising during the punch line and 75% based on Valence rising after the punch line.
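The punch-line formula above translates directly into code. The two boolean inputs (whether Thought rose during, and Valence rose after, the punch line) would in practice be derived from the measured traces; here they are passed in directly.

```python
# Direct implementation of the punch-line scoring formula from the text:
# Score = .25 x (Thought rose during punch line)
#       + .75 x (Valence rose after punch line)

def punch_line_score(thought_rose: bool, valence_rose: bool) -> float:
    return 0.25 * thought_rose + 0.75 * valence_rose

punch_line_score(True, True)    # 1.0  - both criteria of the profile met
punch_line_score(False, True)   # 0.75 - only Valence rose after the punch line
punch_line_score(True, False)   # 0.25 - only Thought rose during the punch line
```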
In some embodiments, scores of event instances in the media can be used to pinpoint whether and/or which of the event instances need to be improved or changed, and which of the event instances should be kept intact based on the physiological responses from the viewers. In the non-limiting example above, a punch line that does not fit its response profile and thus does not create a good experience in viewers should be improved.
In some embodiments, the number of variables used in the scoring formula can be very large and the formula can also be a higher ordered polynomial to account for non-linear scoring if need be. For a non-limiting example, a more complex version of the formula shown above would calculate the score based on how much thought and how much positive valence there is at each point during the joke. This would penalize small increases in Thought and Valence where people did not strongly engage in the joke, while rewarding punch lines which had very large rises in Thought and Valence, corresponding to strong engagement in the joke.
In some embodiments, a set of logical rules can be adopted, which define an event instance as "good" or "successful" as having a score above a predetermined number, whereby the formula outputs a score reflecting how engaging the event instance is. Other embodiments that use the score for rankings or ratings are also possible. Referring back to the non-limiting example shown in Figure 4(b) where a weapon event type is tagged, the simple explanation of the profile for this event type is that if Engagement rises strongly, an event instance of the type is good. In this example, all event instances are good, which would lead to a high score for the event type. This calculation scheme can be done over hundreds of instances of multiple key event types.
In some embodiments, the formula can utilize prior instances of similar event types in the current and/or other pieces of media to calculate the score of the instance. A set of rules can be created with viewers' physiological responses across those other similar events as inputs. For a non-limiting example, a score of 1 (representing "good" or "successful") could be given to an instance of an event type if the slope of Engagement over the event exceeds the average slope of Engagement over other event instances of similar event types in other pieces of media. The exact implementation of this approach can be done in many ways, and the following process is a non-limiting example of such an implementation:
1. Tagging a large set of instances of an event type in the media along with the physiological responses from the viewers of these instances.
2. Choosing a rating mechanism to allow for each instance of the event type to be rated.
3. Calculating various mathematical measures of the physiological responses from the viewers over an event instance. Such measures may include but are not limited to: average, 1st order derivative, 2nd order derivative, polynomial approximations, (standard) deviations from the mean, (standard) deviations of derivatives from the mean, and profiles of the physiological responses, which can be implemented with convolution or other methods that take into account one or more of: peaking in the middle, spiking in the beginning, being flat, etc.
4. Repeating the calculation at step 3 for all physiological responses.
5. Transforming the large number of measures into a defined outcome (score) for the event instance. Such transformation can be done via one or more of: convolution, weighted sum, positive or negative slope, a polynomial formula, least squares, support vector machines, neural networks and other machine learning approaches.
6. Repeating the transformation at step 5 for all instances of the event type.
A ranking, rating or score for each individual event instance can be calculated via this or any other similar approach, allowing instances of an event type to be compared objectively.
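One possible reading of steps 3 and 5 in Python is sketched below. The three measures (average, first-order trend, deviation from the mean) are only a few of the options listed above, the weighted sum is just one of the allowed transformations, and the weights and trace values are illustrative.

```python
# Hypothetical sketch of steps 3-5: compute simple measures of one
# response trace, then combine them into an instance score via a
# weighted sum (one of the listed transformation options).

def measures(trace):
    """Step 3: average, first-order trend (end - start),
    and maximum deviation from the mean."""
    mean = sum(trace) / len(trace)
    trend = trace[-1] - trace[0]
    deviation = max(abs(x - mean) for x in trace)
    return [mean, trend, deviation]

def instance_score(trace, weights):
    """Step 5: transform the measures into a score with a weighted sum."""
    return sum(w * m for w, m in zip(weights, measures(trace)))

engagement = [0.2, 0.4, 0.6, 0.8]  # one instance's "Engagement" trace
score = instance_score(engagement, [0.5, 0.4, 0.1])
```

Step 4 would repeat `measures` for each derived response, and step 6 would repeat `instance_score` for every instance of the event type.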
In some embodiments, a subset of the overall population of viewers can be aggregated to differentiate the responses for the subset, which groups the viewers by one or more of: race, gender, age, buying habits, demographics, and income. The averaged rating of the event type can then be associated and/or compared to such grouping of the viewers.
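Grouping viewers before aggregation, as described above, might look like the following sketch. The field name and rating values are made up for illustration.

```python
# Illustrative grouping of per-viewer ratings by a demographic field,
# then averaging within each group (data and field names are assumed).
from collections import defaultdict

viewer_ratings = [
    {"gender": "F", "rating": 0.8},
    {"gender": "M", "rating": 0.4},
    {"gender": "F", "rating": 0.6},
]

groups = defaultdict(list)
for v in viewer_ratings:
    groups[v["gender"]].append(v["rating"])

# Averaged rating of the event type per subset of the viewer population.
group_ratings = {g: sum(r) / len(r) for g, r in groups.items()}
```

The same pattern applies to any of the listed groupings (race, age, buying habits, demographics, income); only the key changes.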
In some embodiments, the scores of event instances of an event type can be aggregated as input to a formula used to rate the event type, where the formula may be mathematical or may be logical; it may also be designed by expert knowledge or by previous ratings of similar event types. Such aggregation can be done via one or more of:
• Averaging for each event type. This approach averages the scores of all event instances of the event type throughout the media and also over many viewers.
• Performance. A simple average may not always give an accurate view of the overall performance of the media. For a non-limiting example, if 75% of event instances are very highly rated while the rest are mediocre, the overall rating of the event type may be around 75-80%, which could seem to be good. In reality, however, the viewers who watch the media will not like one quarter of the media, which leads to a very low score. A performance metric takes into account the distribution of scores of instances of the event type. Using prior distributions of the instances as a reference, the performance metric can define how good the overall distribution of the instances is and whether viewers will like the event type/media or not. One non-limiting implementation of the performance metric can be done through histogram matching for the event type.
• "Success" ratio. The formulas output a "success" ratio in percentage points, or another score for each event type. More specifically, a "success" ratio of an event type can be defined via an aggregated view of the performance of the event type/media in a specific aspect characterized by the definition of the event type, and a "success" ratio above a predetermined number is defined to be a successful event type.

In some embodiments, the rating process described above can be repeated for different types of events within the same media. In addition, the media itself can be rated with the ratings of different event types used as inputs to a rating formula. This rating formula or rule can also be based on expert knowledge or previous physiological responses to the media as described above. For a non-limiting example, a media can be rated as a "success" if a majority of its event types are rated as "success." Alternatively, a media can be rated as a "success" if its event types are rated as "success" more often than those of other comparable media. Other rules, linear or other rating scales can also be used.

Figure 6 shows overall event ratings for three exemplary types of events in two movies calculated in accordance with one embodiment of the present invention. The first movie is an action movie, and the responses to the "Action Packed" and "Suspenseful" event types are very strong while the "Funniness" event type, such as jokes, does not create a strong response. The second movie is a comedy, which creates the opposite responses compared to the action movie. Both movies are given a high rating because the event types have the correct profile for a successful movie in their genre.
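Two of the aggregation rules above, averaging and a "success" ratio, can be sketched as follows. The 0.7 success threshold and the instance scores are hypothetical choices for illustration.

```python
# Sketch of two aggregation rules for rating an event type from its
# instance scores: a plain average, and a "success" ratio against a
# predetermined threshold (assumed to be 0.7 here).

def average_rating(scores):
    """Average the scores of all event instances of the event type."""
    return sum(scores) / len(scores)

def success_ratio(scores, threshold=0.7):
    """Fraction of event instances scoring at or above the threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

scores = [0.9, 0.8, 0.75, 0.2]  # made-up instance scores for one event type
avg = average_rating(scores)    # roughly 0.66
ratio = success_ratio(scores)   # three of four instances succeed
```

As the text notes, the average alone can mislead when the distribution is skewed; the success ratio (or a histogram-matching performance metric) exposes the weak quarter of instances that the average hides.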
One embodiment may be implemented using a conventional general purpose or a specialized digital computer or microprocessor(s) programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art. The invention may also be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be readily apparent to those skilled in the art.
One embodiment includes a computer program product which is a machine readable medium (media) having instructions stored thereon/in which can be used to program one or more computing devices to perform any of the features presented herein. The machine readable medium can include, but is not limited to, one or more types of disks including floppy disks, optical discs, DVD, CD-ROMs, micro drive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data. Stored on any one of the computer readable medium (media), the present invention includes software for controlling both the hardware of the general purpose/specialized computer or microprocessor, and for enabling the computer or microprocessor to interact with a human viewer or other mechanism utilizing the results of the present invention. Such software may include, but is not limited to, device drivers, operating systems, execution environments/containers, and applications.
The foregoing description of the preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner skilled in the art. Particularly, while the concept "module" is used in the embodiments of the systems and methods described above, it will be evident that such concept can be interchangeably used with equivalent concepts such as, class, method, type, interface, bean, component, object model, and other suitable concepts. Embodiments were chosen and described in order to best describe the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention, its various embodiments, and the various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims

What is claimed is:
1. A system to support media and events rating, comprising: one or more sensors operable to measure physiological data from each of a plurality of viewers watching a media; a defining module operable to define the media into one or more event types each having a plurality of event instances in the media based on the physiological data measured from the plurality of viewers; and a rating module operable to: derive a plurality of physiological responses from the physiological data measured from the plurality of viewers; calculate a score for each of the plurality of event instances of a specific one of the one or more event types in the media based on the plurality of physiological responses; and rate the specific one event type by aggregating the scores of the plurality of event instances of the specific one event type.
2. A system to support media and events rating, comprising: a defining module operable to define and mark occurrences and durations of one or more event types each having a plurality of event instances in a media; and a receiving module operable to accept physiological data from each of a plurality of viewers watching the media; a rating module operable to: derive a plurality of physiological responses from the physiological data; calculate a score for each of the plurality of event instances of a specific one of the one or more event types in the media based on the plurality of physiological responses; and rate the specific one event type by aggregating the scores of the plurality of event instances of the specific one event type.
3. The system of claim 2, wherein: the receiving module is further operable to record the physiological data measured from each of the plurality of viewers watching the media.
4. The system of claim 1, wherein: the media is one of: a movie, a video, a television program, a television commercial, an advertisement, a video game, an interactive online media, a print, and any other media from which a viewer can learn information or be emotionally impacted.
5. The system of claim 1, wherein: duration of each of the one or more event types in the media is constant, non-linear, or semi-linear in time.
6. The system of claim 1, wherein: each of the one or more sensors is one of: an electroencephalogram, an accelerometer, a blood oxygen sensor, a galvanometer, an electromyograph, and any other physiological sensor.
7. The system of claim 1, wherein: the one or more sensors include an integrated sensor headset comprising one or more of: one or more axis accelerometers; one or more EEG electrodes; one or more heart rate sensors; and a processing unit.
8. The system of claim 1, wherein: the physiological data is one or more of: heart rate, brain waves, EEG signals, blink rate, breathing, motion, muscle movement, galvanic skin response and any other response correlated with changes in emotion.
9. The system of claim 1, wherein: the physiological responses are one or more of: thought, liking, engagement, immersion, physical engagement, valence, and vigor in the media.
10. The system of claim 1 , wherein: the rating module is further operable to: calculate one of more of the following mathematical measures of the plurality of physiological responses: average, derivative, polynomial approximation, deviation from the mean, and deviation of the derivative from the mean; and transform the mathematical measures into score for each of the plurality of event instances of the specific one event type via one or more of: convolution, weighted sum, positive or negative slope, a polynomial formula, a least squares formula, a support vector machine, and a neural network.
11. The system of claim 1, wherein: the rating module is further operable to calculate the score for each of the plurality of event instances of the specific one event type using one or more of: a formula, a rule, prior instances of the specific one event type in this or other media of the same genre, or a profile of one or more of the plurality of physiological responses to the specific one event type.
12. The system of claim 11, wherein: the rating module is further operable to calculate a good score for an event instance of the specific one event type if the physiological responses to the event instance match the profile.
13. The system of claim 11, wherein: the rating module is further operable to pinpoint one or more of the plurality of event instances of the specific one event type that need to be improved or changed based on their scores.
14. The system of claim 11, wherein: the rating module is further operable to define the profile based on one or more of: expert knowledge, physiological responses to similar event types, and data from other media.
15. The system of claim 1, wherein: the rating module is further operable to rate the specific one event type by one or more of: averaging, performance, a mathematical formula, and a logic formula.
16. The system of claim 1, wherein: the rating module is further operable to rate the media by aggregating ratings of the one or more event types in the media.
17. The system of claim 1, wherein: the rating module is further operable to display the score/rating of one or more of: the media, the specific one event type, and the plurality of event instances of the specific one event type.
18. The system of claim 1, wherein: the rating module is further operable to: group the plurality of viewers by one or more of race, gender, age, demographics, income, habits, and interests; and associate and/or compare the score/rating of one or more of the media, the specific one event type, and the plurality of event instances of the specific one event type according to such grouping of the plurality of viewers.
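The grouping-and-comparison step in claim 18 can be sketched as follows. The record layout (a per-viewer dict with a `score` plus demographic fields) and the per-group averaging are illustrative assumptions, not from the claims.

```python
from collections import defaultdict
from statistics import mean

def scores_by_group(viewers, key):
    """Average per-viewer scores within each demographic group.

    Hypothetical sketch of claim 18: `viewers` is a list of dicts,
    each holding a 'score' and demographic attributes; `key` names
    the attribute to group on (e.g. 'gender', 'age').
    """
    groups = defaultdict(list)
    for viewer in viewers:
        groups[viewer[key]].append(viewer["score"])
    # Collapse each group to a single comparable score.
    return {group: mean(scores) for group, scores in groups.items()}
```

The resulting per-group averages can then be compared side by side to see how, say, different age bands responded to the same event type.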
19. A method to support media and events rating, comprising: receiving and/or measuring physiological data from each of a plurality of viewers watching a media; defining the media into one or more event types each having a plurality of event instances in the media based on the physiological data measured from the plurality of viewers; deriving a plurality of physiological responses from the physiological data; calculating a score for each of the plurality of event instances of a specific one of the one or more event types in the media based on the plurality of physiological responses; and rating the specific one event type by aggregating the scores of the plurality of event instances of the specific one event type.
20. A method to support media and events rating, comprising: defining and marking occurrences and durations of one or more event types each having a plurality of event instances in a media; receiving and/or measuring physiological data from each of a plurality of viewers watching the media; deriving a plurality of physiological responses from the physiological data; calculating a score for each of the plurality of event instances of a specific one of the one or more event types in the media based on the plurality of physiological responses; and rating the specific one event type by aggregating the scores of the plurality of event instances of the specific one event type.
21. The method of claim 19, further comprising: calculating one or more of the following mathematical measures of the physiological data: average, derivative, polynomial approximation, deviation from the mean, and deviation of the derivative from the mean; and transforming the mathematical measures into a score for each of the plurality of event instances of the specific one event type via one or more of: convolution, weighted sum, positive or negative slope, a polynomial formula, a least squares formula, a support vector machine, and a neural network.
22. The method of claim 19, further comprising: calculating the score for each of the plurality of event instances of the specific one event type using one or more of: a formula, a rule, prior instances of the specific one event type in this or other media of the same genre, or a profile of one or more of the plurality of physiological responses to the specific one event type.
23. The method of claim 22, further comprising: calculating a good score for an event instance of the specific one event type if the physiological responses to the event instance match the profile.
24. The method of claim 22, further comprising: pinpointing one or more of the plurality of event instances that need to be improved or changed based on their scores.
25. The method of claim 22, further comprising: defining the profile based on one or more of: expert knowledge, physiological responses to similar event types, and data from other media.
26. The method of claim 19, further comprising: rating the specific one event type by one or more of: averaging, performance, a mathematical formula, and a logic formula.
27. The method of claim 19, further comprising: rating the media by aggregating ratings of the one or more event types in the media.
28. The method of claim 19, further comprising: recording the physiological data measured from each of the plurality of viewers watching the media.
29. The method of claim 19, further comprising: displaying the score/rating of one or more of: the media, the specific one event type, and the plurality of event instances of the specific one event type.
30. The method of claim 19, further comprising: grouping the plurality of viewers by one or more of race, gender, age, demographics, income, habits, and interests; and associating and/or comparing the score/rating of one or more of the media, the specific one event type, and the plurality of event instances of the specific one event type according to such grouping of the plurality of viewers.
31. A machine readable medium having instructions stored thereon that when executed cause a system to: receive and/or measure physiological data from each of a plurality of viewers watching a media; define the media into one or more event types each having a plurality of event instances in the media based on the physiological data measured from the plurality of viewers; derive a plurality of physiological responses from the physiological data; calculate a score for each of the plurality of event instances of a specific one of the one or more event types in the media based on the plurality of physiological responses; and rate the specific one event type by aggregating the scores of the plurality of event instances of the specific one event type.
32. A system to support media and events rating, comprising: means for defining and marking occurrences and durations of one or more event types each having a plurality of event instances in a media; means for receiving and/or measuring physiological data from each of a plurality of viewers watching the media; means for deriving a plurality of physiological responses from the physiological data; means for calculating a score for each of the plurality of event instances of a specific one of the one or more event types in the media based on the plurality of physiological responses; and means for rating the specific one event type by aggregating the scores of the plurality of event instances of the specific one event type.
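The aggregation hierarchy running through the claims — instance scores roll up into an event-type rating (claims 19 and 26), and event-type ratings roll up into a media rating (claims 16 and 27) — can be sketched with plain averaging as the aggregation function. The data values and the choice of averaging are illustrative assumptions; the claims also allow performance measures, mathematical formulas, and logic formulas as aggregators.

```python
from statistics import mean

def rate_event_type(instance_scores):
    # Claims 19/26: rate an event type by aggregating (here, averaging)
    # the scores of its event instances.
    return mean(instance_scores)

def rate_media(event_type_ratings):
    # Claims 16/27: rate the media by aggregating the ratings of its
    # event types.
    return mean(event_type_ratings.values())

# Hypothetical per-instance scores for two event types in one media.
scores_by_type = {
    "joke": [0.8, 0.6, 0.9],
    "action_scene": [0.4, 0.5],
}
type_ratings = {t: rate_event_type(s) for t, s in scores_by_type.items()}
media_rating = rate_media(type_ratings)
```

Swapping `mean` for a weighted or rule-based aggregator changes the rating policy without changing the two-level roll-up structure.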
PCT/US2007/014955 2007-03-08 2007-06-27 A method and system for rating media and events in media based on physiological data WO2008108799A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN200780052879A CN101755406A (en) 2007-03-08 2007-06-27 A method and system for rating media and events in media based on physiological data
JP2009552656A JP5746472B2 (en) 2007-03-08 2007-06-27 Method and system for evaluating media and media events based on physiological data
EP07796518A EP2135370A4 (en) 2007-03-08 2007-06-27 A method and system for rating media and events in media based on physiological data

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US90561607P 2007-03-08 2007-03-08
US60/905,616 2007-03-08
US11/804,555 US8782681B2 (en) 2007-03-08 2007-05-17 Method and system for rating media and events in media based on physiological data
US11/805,555 2007-05-17
US11/804,555 2007-05-17

Publications (2)

Publication Number Publication Date
WO2008108799A1 true WO2008108799A1 (en) 2008-09-12
WO2008108799A8 WO2008108799A8 (en) 2009-12-23

Family

ID=39742969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/014955 WO2008108799A1 (en) 2007-03-08 2007-06-27 A method and system for rating media and events in media based on physiological data

Country Status (5)

Country Link
US (1) US8782681B2 (en)
EP (1) EP2135370A4 (en)
JP (1) JP5746472B2 (en)
CN (1) CN101755406A (en)
WO (1) WO2008108799A1 (en)

Families Citing this family (113)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090030717A1 (en) 2007-03-29 2009-01-29 Neurofocus, Inc. Intra-modality synthesis of central nervous system, autonomic nervous system, and effector data
JP4281819B2 (en) * 2007-04-02 2009-06-17 ソニー株式会社 Captured image data processing device, viewing information generation device, viewing information generation system, captured image data processing method, viewing information generation method
US9886981B2 (en) 2007-05-01 2018-02-06 The Nielsen Company (Us), Llc Neuro-feedback based stimulus compression device
JP5361868B2 (en) 2007-05-01 2013-12-04 ニューロフォーカス・インコーポレーテッド Neural information storage system
US8392253B2 (en) 2007-05-16 2013-03-05 The Nielsen Company (Us), Llc Neuro-physiology and neuro-behavioral based stimulus targeting system
EP2152155A4 (en) * 2007-06-06 2013-03-06 Neurofocus Inc Multi-market program and commercial response monitoring system using neuro-response measurements
US8494905B2 (en) 2007-06-06 2013-07-23 The Nielsen Company (Us), Llc Audience response analysis using simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI)
JP5542051B2 (en) 2007-07-30 2014-07-09 ニューロフォーカス・インコーポレーテッド System, method, and apparatus for performing neural response stimulation and stimulation attribute resonance estimation
US8386313B2 (en) 2007-08-28 2013-02-26 The Nielsen Company (Us), Llc Stimulus placement system using subject neuro-response measurements
US8635105B2 (en) 2007-08-28 2014-01-21 The Nielsen Company (Us), Llc Consumer experience portrayal effectiveness assessment system
KR20100047865A (en) 2007-08-28 2010-05-10 뉴로포커스, 인크. Consumer experience assessment system
US8392255B2 (en) 2007-08-29 2013-03-05 The Nielsen Company (Us), Llc Content based selection and meta tagging of advertisement breaks
US20090083129A1 (en) * 2007-09-20 2009-03-26 Neurofocus, Inc. Personalized content delivery using neuro-response priming data
US9191450B2 (en) * 2007-09-20 2015-11-17 Disney Enterprises, Inc. Measuring user engagement during presentation of media content
US8494610B2 (en) 2007-09-20 2013-07-23 The Nielsen Company (Us), Llc Analysis of marketing and entertainment effectiveness using magnetoencephalography
US8332883B2 (en) 2007-10-02 2012-12-11 The Nielsen Company (Us), Llc Providing actionable insights based on physiological responses from viewers of media
US8776102B2 (en) * 2007-10-09 2014-07-08 At&T Intellectual Property I, Lp System and method for evaluating audience reaction to a data stream
US9513699B2 (en) 2007-10-24 2016-12-06 Invention Science Fund I, LLC Method of selecting a second content based on a user's reaction to a first content
US9582805B2 (en) * 2007-10-24 2017-02-28 Invention Science Fund I, Llc Returning a personalized advertisement
WO2009059246A1 (en) 2007-10-31 2009-05-07 Emsense Corporation Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US8487772B1 (en) 2008-12-14 2013-07-16 Brian William Higgins System and method for communicating information
US9357240B2 (en) 2009-01-21 2016-05-31 The Nielsen Company (Us), Llc Methods and apparatus for providing alternate media for video decoders
US8270814B2 (en) * 2009-01-21 2012-09-18 The Nielsen Company (Us), Llc Methods and apparatus for providing video with embedded media
US8464288B2 (en) * 2009-01-21 2013-06-11 The Nielsen Company (Us), Llc Methods and apparatus for providing personalized media in video
US20100250325A1 (en) 2009-03-24 2010-09-30 Neurofocus, Inc. Neurological profiles for market matching and stimulus presentation
CN101853259A (en) * 2009-03-31 2010-10-06 国际商业机器公司 Methods and device for adding and processing label with emotional data
US8655437B2 (en) 2009-08-21 2014-02-18 The Nielsen Company (Us), Llc Analysis of the mirror neuron system for evaluation of stimulus
US10987015B2 (en) 2009-08-24 2021-04-27 Nielsen Consumer Llc Dry electrodes for electroencephalography
US8209224B2 (en) 2009-10-29 2012-06-26 The Nielsen Company (Us), Llc Intracluster content management using neuro-response priming data
US20110106750A1 (en) * 2009-10-29 2011-05-05 Neurofocus, Inc. Generating ratings predictions using neuro-response data
US9560984B2 (en) 2009-10-29 2017-02-07 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US8335715B2 (en) 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Advertisement exchange using neuro-response data
US8335716B2 (en) 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Multimedia advertisement exchange
WO2011133548A2 (en) 2010-04-19 2011-10-27 Innerscope Research, Inc. Short imagery task (sit) research method
US8655428B2 (en) 2010-05-12 2014-02-18 The Nielsen Company (Us), Llc Neuro-response data synchronization
US10289898B2 (en) 2010-06-07 2019-05-14 Affectiva, Inc. Video recommendation via affect
KR20130122535A (en) 2010-06-07 2013-11-07 어펙티바,아이엔씨. Mental state analysis using web services
US9247903B2 (en) 2010-06-07 2016-02-02 Affectiva, Inc. Using affect within a gaming context
CN101883292A (en) * 2010-06-30 2010-11-10 中山大学 System and method for testing digital television interactive service availability
US9532734B2 (en) * 2010-08-09 2017-01-03 Nike, Inc. Monitoring fitness using a mobile device
US8392251B2 (en) 2010-08-09 2013-03-05 The Nielsen Company (Us), Llc Location aware presentation of stimulus material
US8392250B2 (en) 2010-08-09 2013-03-05 The Nielsen Company (Us), Llc Neuro-response evaluated stimulus in virtual reality environments
US10572721B2 (en) 2010-08-09 2020-02-25 Nike, Inc. Monitoring fitness using a mobile device
US8396744B2 (en) 2010-08-25 2013-03-12 The Nielsen Company (Us), Llc Effective virtual reality environments for presentation of marketing materials
WO2012052559A1 (en) * 2010-10-21 2012-04-26 Holybrain Bvba Method and apparatus for neuropsychological modeling of human experience and purchasing behavior
JP5649425B2 (en) * 2010-12-06 2015-01-07 株式会社東芝 Video search device
AU2012256402A1 (en) * 2011-02-27 2013-07-11 Affectiva, Inc, Video recommendation based on affect
US9141982B2 (en) 2011-04-27 2015-09-22 Right Brain Interface Nv Method and apparatus for collaborative upload of content
AU2012258732A1 (en) * 2011-05-24 2013-12-12 WebTuner, Corporation System and method to increase efficiency and speed of analytics report generation in Audience Measurement Systems
US9015746B2 (en) * 2011-06-17 2015-04-21 Microsoft Technology Licensing, Llc Interest-based video streams
US8433815B2 (en) 2011-09-28 2013-04-30 Right Brain Interface Nv Method and apparatus for collaborative upload of content
US20140317647A1 (en) * 2011-10-27 2014-10-23 Yuichiro Itakura Content evaluation/playback device
US11064257B2 (en) 2011-11-07 2021-07-13 Monet Networks, Inc. System and method for segment relevance detection for digital content
US10638197B2 (en) 2011-11-07 2020-04-28 Monet Networks, Inc. System and method for segment relevance detection for digital content using multimodal correlations
CN102497518A (en) * 2011-11-18 2012-06-13 Tcl王牌电器(惠州)有限公司 Method and device for step control on TV programs
CN102523493A (en) * 2011-12-09 2012-06-27 深圳Tcl新技术有限公司 Method and system for grading television program according to mood
US9292858B2 (en) 2012-02-27 2016-03-22 The Nielsen Company (Us), Llc Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments
US9451303B2 (en) 2012-02-27 2016-09-20 The Nielsen Company (Us), Llc Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing
US9569986B2 (en) 2012-02-27 2017-02-14 The Nielsen Company (Us), Llc System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
US20130288212A1 (en) * 2012-03-09 2013-10-31 Anurag Bist System and A Method for Analyzing Non-verbal Cues and Rating a Digital Content
US20140303450A1 (en) * 2013-04-03 2014-10-09 Dylan Caponi System and method for stimulus optimization through closed loop iterative biological sensor feedback
CN104769954A (en) * 2012-06-14 2015-07-08 汤姆逊许可公司 Method, apparatus and system for determining viewer reaction to content elements
US20140026156A1 (en) * 2012-07-18 2014-01-23 David Deephanphongs Determining User Interest Through Detected Physical Indicia
US10034049B1 (en) 2012-07-18 2018-07-24 Google Llc Audience attendance monitoring through facial recognition
US20140047316A1 (en) * 2012-08-10 2014-02-13 Vimbli, Inc. Method and system to create a personal priority graph
US8989835B2 (en) 2012-08-17 2015-03-24 The Nielsen Company (Us), Llc Systems and methods to gather and analyze electroencephalographic data
US9734730B2 (en) * 2013-01-31 2017-08-15 Sri International Multi-modal modeling of temporal interaction sequences
US8769557B1 (en) 2012-12-27 2014-07-01 The Nielsen Company (Us), Llc Methods and apparatus to determine engagement levels of audience members
US9320450B2 (en) 2013-03-14 2016-04-26 The Nielsen Company (Us), Llc Methods and apparatus to gather and analyze electroencephalographic data
US9100694B1 (en) 2013-03-14 2015-08-04 Google Inc. TV mode change in accordance with number of viewers present
CN103268560B (en) * 2013-04-19 2017-02-08 杭州电子科技大学 Before-release advertising effect evaluation method based on electroencephalogram indexes
US11090003B2 (en) * 2013-09-09 2021-08-17 Healthy.Io Ltd. Systems for personal portable wireless vital signs scanner
KR101535432B1 (en) * 2013-09-13 2015-07-13 엔에이치엔엔터테인먼트 주식회사 Contents valuation system and contents valuating method using the system
CN103657056A (en) * 2013-12-18 2014-03-26 北京东方之星幼儿教育科技有限公司 Sensory integration evaluation system and method
CN103702146B (en) * 2013-12-27 2017-09-19 中国标准化研究院 Digital television service performance regulation and control method, device and digital TV terminal
US10311095B2 (en) * 2014-01-17 2019-06-04 Renée BUNNELL Method and system for qualitatively and quantitatively analyzing experiences for recommendation profiles
US10448075B2 (en) 2014-03-06 2019-10-15 Cox Communications, Inc. Content conditioning and distribution of conditioned media assets at a content platform
US9622702B2 (en) 2014-04-03 2017-04-18 The Nielsen Company (Us), Llc Methods and apparatus to gather and analyze electroencephalographic data
US11494390B2 (en) 2014-08-21 2022-11-08 Affectomatics Ltd. Crowd-based scores for hotels from measurements of affective response
US10198505B2 (en) 2014-08-21 2019-02-05 Affectomatics Ltd. Personalized experience scores based on measurements of affective response
US11269891B2 (en) 2014-08-21 2022-03-08 Affectomatics Ltd. Crowd-based scores for experiences from measurements of affective response
US9805381B2 (en) 2014-08-21 2017-10-31 Affectomatics Ltd. Crowd-based scores for food from measurements of affective response
CN104349206A (en) * 2014-11-26 2015-02-11 乐视致新电子科技(天津)有限公司 Method, device and system for processing television information
US10873777B2 (en) * 2014-12-18 2020-12-22 Sony Corporation Information processing device and information processing method to calculate score for evaluation of action
DE102016101650A1 (en) 2015-01-29 2016-08-04 Affectomatics Ltd. CORRECTION OF BIAS IN MEASURES OF THE AFFECTIVE RESPONSE
US11232466B2 (en) 2015-01-29 2022-01-25 Affectomatics Ltd. Recommendation for experiences based on measurements of affective response that are backed by assurances
US9467718B1 (en) 2015-05-06 2016-10-11 Echostar Broadcasting Corporation Apparatus, systems and methods for a content commentary community
US9936250B2 (en) 2015-05-19 2018-04-03 The Nielsen Company (Us), Llc Methods and apparatus to adjust content presented to an individual
CN104899309B (en) * 2015-06-12 2019-04-30 百度在线网络技术(北京)有限公司 The method and apparatus of displaying event comment viewpoint
US10484439B2 (en) * 2015-06-30 2019-11-19 Amazon Technologies, Inc. Spectating data service for a spectating system
CN104994409A (en) * 2015-06-30 2015-10-21 北京奇艺世纪科技有限公司 Media data editing method and device
CN105159990B (en) * 2015-08-31 2019-02-01 北京奇艺世纪科技有限公司 A kind of method and apparatus of media data grading control
CN105204626B (en) * 2015-08-31 2018-05-04 北京奇艺世纪科技有限公司 A kind of method and apparatus to user's grading control
CN106658202A (en) * 2015-10-30 2017-05-10 中国移动通信集团公司 Method and equipment for triggering interaction application
US9525912B1 (en) 2015-11-20 2016-12-20 Rovi Guides, Inc. Systems and methods for selectively triggering a biometric instrument to take measurements relevant to presently consumed media
US10268689B2 (en) 2016-01-28 2019-04-23 DISH Technologies L.L.C. Providing media content based on user state detection
US10568572B2 (en) 2016-03-14 2020-02-25 The Nielsen Company (Us), Llc Headsets and electrodes for gathering electroencephalographic data
US10187694B2 (en) 2016-04-07 2019-01-22 At&T Intellectual Property I, L.P. Method and apparatus for enhancing audience engagement via a communication network
US10984036B2 (en) 2016-05-03 2021-04-20 DISH Technologies L.L.C. Providing media content based on media element preferences
US10045076B2 (en) 2016-11-22 2018-08-07 International Business Machines Corporation Entertainment content ratings system based on physical expressions of a spectator to scenes of the content
CN106778539A (en) * 2016-11-25 2017-05-31 鲁东大学 Teaching effect information acquisition methods and device
JP2018093350A (en) * 2016-12-01 2018-06-14 有限会社曽根地所 Attention degree evaluation system
US11205103B2 (en) 2016-12-09 2021-12-21 The Research Foundation for the State University Semisupervised autoencoder for sentiment analysis
US10390084B2 (en) 2016-12-23 2019-08-20 DISH Technologies L.L.C. Communications channels in media systems
US11196826B2 (en) 2016-12-23 2021-12-07 DISH Technologies L.L.C. Communications channels in media systems
US10764381B2 (en) 2016-12-23 2020-09-01 Echostar Technologies L.L.C. Communications channels in media systems
CN109961303B (en) * 2017-12-22 2021-09-21 新华网股份有限公司 Method and device for comparing audience reaction
CN108416542A (en) * 2018-05-11 2018-08-17 新华网股份有限公司 Experience Degree assessment system and method and computer readable storage medium based on physiology sensing technology
CN110019853A (en) * 2018-06-20 2019-07-16 新华网股份有限公司 Scene of interest recognition methods and system
US11037550B2 (en) 2018-11-30 2021-06-15 Dish Network L.L.C. Audio-based link generation
US11553871B2 (en) 2019-06-04 2023-01-17 Lab NINE, Inc. System and apparatus for non-invasive measurement of transcranial electrical signals, and method of calibrating and/or using same for various applications
DE102020123554A1 (en) * 2020-09-09 2022-03-10 Imagine AG Examination of a preliminary film product
CN112887771A (en) * 2021-01-28 2021-06-01 Oppo广东移动通信有限公司 Video evaluation method and device, computer readable medium and electronic equipment

Citations (2)

Publication number Priority date Publication date Assignee Title
US6292688B1 (en) * 1996-02-28 2001-09-18 Advanced Neurotechnologies, Inc. Method and apparatus for analyzing neurological response to emotion-inducing stimuli
US7113916B1 (en) * 2001-09-07 2006-09-26 Hill Daniel A Method of facial coding monitoring for the purpose of gauging the impact and appeal of commercially-related stimuli

Family Cites Families (166)

Publication number Priority date Publication date Assignee Title
US4846190A (en) 1983-08-23 1989-07-11 John Erwin R Electroencephalographic system data display
US4695879A (en) * 1986-02-07 1987-09-22 Weinblatt Lee S Television viewer meter
US4755045A (en) 1986-04-04 1988-07-05 Applied Science Group, Inc. Method and system for generating a synchronous display of a visual presentation and the looking response of many viewers
US4931934A (en) * 1988-06-27 1990-06-05 Snyder Thomas E Method and system for measuring clarified intensity of emotion
US5243517A (en) 1988-08-03 1993-09-07 Westinghouse Electric Corp. Method and apparatus for physiological evaluation of short films and entertainment materials
EP0355506B1 (en) 1988-08-16 1994-12-14 Siemens Aktiengesellschaft Arrangement for measuring local bioelectric currents in biological tissue
US5024235A (en) 1990-02-26 1991-06-18 Ayers Margaret A Electroencephalic neurofeedback apparatus and method for bioelectrical frequency inhibition and facilitation
WO1995018565A1 (en) 1991-09-26 1995-07-13 Sam Technology, Inc. Non-invasive neurocognitive testing method and system
US5724987A (en) 1991-09-26 1998-03-10 Sam Technology, Inc. Neurocognitive adaptive computer-aided training method and system
US6850252B1 (en) 1999-10-05 2005-02-01 Steven M. Hoffberg Intelligent electronic appliance system and method
US5406957A (en) * 1992-02-05 1995-04-18 Tansey; Michael A. Electroencephalic neurofeedback apparatus for training and tracking of cognitive states
US5692906A (en) 1992-04-01 1997-12-02 Corder; Paul R. Method of diagnosing and remediating a deficiency in communications skills
US5450855A (en) 1992-05-13 1995-09-19 Rosenfeld; J. Peter Method and system for modification of condition with neural biofeedback using left-right brain wave asymmetry
US6785568B2 (en) 1992-05-18 2004-08-31 Non-Invasive Technology Inc. Transcranial examination of the brain
US5405957A (en) 1992-10-30 1995-04-11 The University Of British Columbia Wavelength-specific photosensitive compounds and expanded porphyrin-like compounds and methods of use
CN1273077C (en) 1993-01-07 2006-09-06 精工爱普生株式会社 Pulse wave analyser and diagnostic device using the same
US6206829B1 (en) 1996-07-12 2001-03-27 First Opinion Corporation Computerized medical diagnostic and treatment advice system including network access
US6349231B1 (en) 1994-01-12 2002-02-19 Brain Functions Laboratory, Inc. Method and apparatus for will determination and bio-signal control
US5601090A (en) 1994-07-12 1997-02-11 Brain Functions Laboratory, Inc. Method and apparatus for automatically determining somatic state
US5579774A (en) 1994-03-07 1996-12-03 Camino Neurocare, Inc. Method and apparatus for monitoring local cerebral physiology
US5513649A (en) 1994-03-22 1996-05-07 Sam Technology, Inc. Adaptive interference canceler for EEG movement and eye artifacts
US5649061A (en) 1995-05-11 1997-07-15 The United States Of America As Represented By The Secretary Of The Army Device and method for estimating a mental decision
US6001065A (en) 1995-08-02 1999-12-14 Ibva Technologies, Inc. Method and apparatus for measuring and analyzing physiological signals for active or passive control of physical and virtual spaces and the contents therein
US5774591A (en) * 1995-12-15 1998-06-30 Xerox Corporation Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images
US5740812A (en) 1996-01-25 1998-04-21 Mindwaves, Ltd. Apparatus for and method of providing brainwave biofeedback
US5676138A (en) * 1996-03-15 1997-10-14 Zawilinski; Kenneth Michael Emotional response analyzer system with multimedia display
US5867799A (en) 1996-04-04 1999-02-02 Lang; Andrew K. Information system and method for filtering a massive flow of information entities to meet user information classification needs
US20050097594A1 (en) * 1997-03-24 2005-05-05 O'donnell Frank Systems and methods for awarding affinity points based upon remote control usage
US6402520B1 (en) 1997-04-30 2002-06-11 Unique Logic And Technology, Inc. Electroencephalograph based biofeedback system for improving learning skills
US6425764B1 (en) 1997-06-09 2002-07-30 Ralph J. Lamson Virtual reality immersion therapy for treating psychological, psychiatric, medical, educational and self-help problems
US6097927A (en) 1998-01-27 2000-08-01 Symbix, Incorporated Active symbolic self design method and apparatus
US5983129A (en) 1998-02-19 1999-11-09 Cowan; Jonathan D. Method for determining an individual's intensity of focused attention and integrating same into computer program
US6099319A (en) 1998-02-24 2000-08-08 Zaltman; Gerald Neuroimaging as a marketing tool
US6102846A (en) 1998-02-26 2000-08-15 Eastman Kodak Company System and method of managing a psychological state of an individual using images
AUPP354898A0 (en) 1998-05-15 1998-06-11 Swinburne Limited Mass communication assessment system
JP3511029B2 (en) 1998-06-30 2004-03-29 株式会社博報堂 Notification information display device, notification information display system, and recording medium
US6322368B1 (en) 1998-07-21 2001-11-27 Cy Research, Inc. Training and testing human judgment of advertising materials
US6481013B1 (en) 1998-11-09 2002-11-12 Peracom Networks, Inc. Entertainment and computer coaxial network and method of distributing signals therethrough
KR100291596B1 (en) 1998-11-12 2001-06-01 정선종 Emotional Positive / Negative State Discrimination Method Using Asymmetry of Left / Right Brain Activity
US20020140675A1 (en) 1999-01-25 2002-10-03 Ali Ammar Al System and method for altering a display mode based on a gravity-responsive sensor
US6430539B1 (en) 1999-05-06 2002-08-06 Hnc Software Predictive modeling of consumer financial behavior
US7006999B1 (en) 1999-05-13 2006-02-28 Xerox Corporation Method for enabling privacy and trust in electronic communities
IL130818A (en) 1999-07-06 2005-07-25 Intercure Ltd Interventive-diagnostic device
WO2001039664A1 (en) 1999-12-02 2001-06-07 The General Hospital Corporation Method and apparatus for measuring indices of brain activity
US6652283B1 (en) 1999-12-30 2003-11-25 Cerego, Llc System apparatus and method for maximizing effectiveness and efficiency of learning retaining and retrieving knowledge and skills
US7146329B2 (en) 2000-01-13 2006-12-05 Erinmedia, Llc Privacy compliant multiple dataset correlation and content delivery system and methods
GB0003853D0 (en) 2000-02-19 2000-04-05 Diagnostic Potentials Limited Method for investigating neurological function
JP3350656B2 (en) 2000-02-21 2002-11-25 株式会社博報堂 URL notification device for mobile phones
US7194186B1 (en) 2000-04-21 2007-03-20 Vulcan Patents Llc Flexible marking of recording data by a recording unit
US7050753B2 (en) 2000-04-24 2006-05-23 Knutson Roger C System and method for providing learning material
US6606102B1 (en) * 2000-06-02 2003-08-12 Gary Odom Optimizing interest potential
US6699188B2 (en) 2000-06-22 2004-03-02 Guidance Interactive Technologies Interactive reward devices and methods
JP2002000577A (en) 2000-06-23 2002-01-08 Canon Inc Method of analyzing brain wave
US6434419B1 (en) 2000-06-26 2002-08-13 Sam Technology, Inc. Neurocognitive ability EEG measurement method and system
JP3824848B2 (en) 2000-07-24 2006-09-20 シャープ株式会社 Communication apparatus and communication method
JP2002056500A (en) 2000-08-11 2002-02-22 Denso Corp On-vehicle device coping with occupant and recording medium
JP2002112969A (en) 2000-09-02 2002-04-16 Samsung Electronics Co Ltd Device and method for recognizing physical and emotional conditions
US6801803B2 (en) 2000-10-16 2004-10-05 Instrumentarium Corp. Method and apparatus for determining the cerebral state of a patient with fast response
US6904408B1 (en) 2000-10-19 2005-06-07 Mccarthy John Bionet method, system and personalized web content manager responsive to browser viewers' psychological preferences, behavioral responses and physiological stress indicators
US7150715B2 (en) 2001-02-05 2006-12-19 Collura Thomas F Network enabled biofeedback administration
JP3644502B2 (en) 2001-02-06 2005-04-27 ソニー株式会社 Content receiving apparatus and content presentation control method
US20020154833A1 (en) 2001-03-08 2002-10-24 Christof Koch Computation of intrinsic perceptual saliency in visual environments, and applications
US7593618B2 (en) * 2001-03-29 2009-09-22 British Telecommunications Plc Image processing for analyzing video content
US6978115B2 (en) 2001-03-29 2005-12-20 Pointecast Corporation Method and system for training in an adaptive manner
US20020188216A1 (en) 2001-05-03 2002-12-12 Kayyali Hani Akram Head mounted medical device
KR20040019013A (en) 2001-06-07 2004-03-04 로렌스 파웰 Method and apparatus for brain fingerprinting, measurement, assessment and analysis of brain function
JP4537703B2 (en) 2001-06-13 2010-09-08 コンピュメディクス・リミテッド Device for monitoring consciousness
JP2003016095A (en) 2001-06-28 2003-01-17 Sony Corp Apparatus for information processing, method therefor, network system, recording medium and program
US20030003433A1 (en) 2001-06-29 2003-01-02 Ignite, Inc. Method and system for constructive, modality focused learning
US6644976B2 (en) 2001-09-10 2003-11-11 Epoch Innovations Ltd Apparatus, method and computer program product to produce or direct movements in synergic timed correlation with physiological activity
WO2003026252A2 (en) 2001-09-19 2003-03-27 Ambient Devices Inc. System and method for presentation of remote information in ambient form
JP2003111106A (en) 2001-09-28 2003-04-11 Toshiba Corp Apparatus for acquiring degree of concentration and apparatus and system utilizing degree of concentration
US7308133B2 (en) 2001-09-28 2007-12-11 Koninklijke Philips Electronics N.V. System and method of face recognition using proportions of learned model
US20030066071A1 (en) 2001-10-03 2003-04-03 Koninklijke Philips Electronics N.V. Program recommendation method and system utilizing a viewing history of commercials
KR100624403B1 (en) * 2001-10-06 2006-09-15 삼성전자주식회사 Human nervous-system-based emotion synthesizing device and method for the same
US6623428B2 (en) 2001-10-11 2003-09-23 Eastman Kodak Company Digital image sequence display system and method
US20030081834A1 (en) * 2001-10-31 2003-05-01 Vasanth Philomin Intelligent TV room
US8561095B2 (en) * 2001-11-13 2013-10-15 Koninklijke Philips N.V. Affective television monitoring and control in response to physiological data
JP2003178078A (en) 2001-12-12 2003-06-27 Matsushita Electric Ind Co Ltd Additional indicator data to image and voice data, and its adding method
US6585521B1 (en) * 2001-12-21 2003-07-01 Hewlett-Packard Development Company, L.P. Video indexing based on viewers' behavior and emotion feedback
KR100450758B1 (en) 2002-01-22 2004-10-01 한국전자통신연구원 Apparatus and method for measuring electroencephalogram
EP1492453A4 (en) 2002-04-06 2009-03-04 Randall L Barbour A system and method for quantifying the dynamic response of a target system
US7079888B2 (en) 2002-04-11 2006-07-18 Ansar, Inc. Method and apparatus for monitoring the autonomic nervous system using non-stationary spectral analysis of heart rate and respiratory activity
JP2004021844A (en) * 2002-06-19 2004-01-22 Sony Corp Method for preparing data base, equipment for preparing data base, program for preparing data base, and method for regenerating data base, recording medium, contents, method, device, program reproducing contents
JP4359810B2 (en) * 2002-10-01 2009-11-11 ソニー株式会社 User terminal, data processing method, program, and data processing system
JP3993069B2 (en) 2002-10-30 2007-10-17 三菱電機株式会社 Control device using EEG signals
US20030126593A1 (en) * 2002-11-04 2003-07-03 Mault James R. Interactive physiological monitoring system
US7931028B2 (en) 2003-08-26 2011-04-26 Jay Harvey H Skin injury or damage prevention method using optical radiation
US7396330B2 (en) 2003-01-07 2008-07-08 Triage Data Networks Wireless, internet-based medical-diagnostic system
US20040161730A1 (en) 2003-02-19 2004-08-19 Urman John F. Device and method for designated hemispheric programming
GB2400667B (en) 2003-04-15 2006-05-31 Hewlett Packard Development Co Attention detection
BRPI0410296A (en) 2003-05-06 2006-05-16 Aspect Medical Systems Inc system and method for determining the efficacy of treatment of neurological disorders using electroencephalogram
WO2005113099A2 (en) 2003-05-30 2005-12-01 America Online, Inc. Personalizing content
GB2421329B (en) 2003-06-20 2007-10-24 Brain Fingerprinting Lab Inc Apparatus for a classification guilty knowledge test and integrated system for detection of deception and information
US7367949B2 (en) 2003-07-07 2008-05-06 Instrumentarium Corp. Method and apparatus based on combination of physiological parameters for assessment of analgesia during anesthesia or sedation
US8200775B2 (en) 2005-02-01 2012-06-12 Newsilike Media Group, Inc Enhanced syndication
JP4200370B2 (en) * 2003-08-12 2008-12-24 ソニー株式会社 Recording apparatus, recording / reproducing apparatus, reproducing apparatus, recording method, recording / reproducing method, and reproducing method
JP3931889B2 (en) 2003-08-19 2007-06-20 ソニー株式会社 Image display system, image display apparatus, and image display method
JP2005084770A (en) 2003-09-05 2005-03-31 Sony Corp Content providing system and method, providing device and method, reproducing device and method, and program
US7305654B2 (en) 2003-09-19 2007-12-04 Lsi Corporation Test schedule estimator for legacy builds
US20050071865A1 (en) * 2003-09-30 2005-03-31 Martins Fernando C. M. Annotating meta-data with user responses to digital content
JP2005124909A (en) * 2003-10-24 2005-05-19 Sony Corp Method for presenting emotional information, emotional information display device, and method for retrieving information content
US20050096311A1 (en) 2003-10-30 2005-05-05 Cns Response Compositions and methods for treatment of nervous system disorders
KR20050072965A (en) 2004-01-08 2005-07-13 림스테크널러지주식회사 Active dry sensor module for measurement of bioelectricity
US8301218B2 (en) 2004-01-08 2012-10-30 Neurosky, Inc. Contoured electrode
US20080177197A1 (en) 2007-01-22 2008-07-24 Lee Koohyoung Method and apparatus for quantitatively evaluating mental states based on brain wave signal processing system
JP4604494B2 (en) * 2004-01-15 2011-01-05 セイコーエプソン株式会社 Biological information analysis system
US20050172311A1 (en) 2004-01-31 2005-08-04 Nokia Corporation Terminal and associated method and computer program product for monitoring at least one activity of a user
WO2005084538A1 (en) 2004-02-27 2005-09-15 Axon Sleep Research Laboratories, Inc. Device for and method of predicting a user’s sleep state
EP1582965A1 (en) * 2004-04-01 2005-10-05 Sony Deutschland Gmbh Emotion controlled system for processing multimedia data
US7543330B2 (en) 2004-04-08 2009-06-02 International Business Machines Corporation Method and apparatus for governing the transfer of physiological and emotional user data
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
DE102004034266A1 (en) 2004-07-15 2006-02-09 Ge Bayer Silicones Gmbh & Co. Kg Phyllosilicate-containing polysiloxane compositions
WO2006033104A1 (en) 2004-09-22 2006-03-30 Shalon Ventures Research, Llc Systems and methods for monitoring and modifying behavior
ATE481920T1 (en) 2004-11-02 2010-10-15 Medtronic Inc METHOD FOR DATA RETENTION IN AN IMPLANTABLE MEDICAL DEVICE
US20060111621A1 (en) 2004-11-03 2006-05-25 Andreas Coppi Musical personal trainer
US7751878B1 (en) 2004-11-10 2010-07-06 Sandia Corporation Real-time human collaboration monitoring and intervention
US7819812B2 (en) 2004-12-15 2010-10-26 Neuropace, Inc. Modulation and analysis of cerebral perfusion in epilepsy and other neurological disorders
US20070168461A1 (en) 2005-02-01 2007-07-19 Moore James F Syndicating surgical data in a healthcare environment
US7805009B2 (en) 2005-04-06 2010-09-28 Carl Zeiss Meditec, Inc. Method and apparatus for measuring motion of a subject using a series of partial images from an imaging system
JP2006323547A (en) 2005-05-17 2006-11-30 Fuji Xerox Co Ltd Information processor, information processing method and program
WO2006133229A2 (en) 2005-06-06 2006-12-14 Better, Inc. System and method for generating effective advertisements in electronic commerce
WO2007008930A2 (en) 2005-07-13 2007-01-18 Ultimate Balance, Inc. Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices
US8784109B2 (en) 2005-08-03 2014-07-22 Bob Gottfried Cognitive enhancement
US20070048707A1 (en) 2005-08-09 2007-03-01 Ray Caamano Device and method for determining and improving present time emotional state of a person
JP4697949B2 (en) 2005-08-10 2011-06-08 親次 佐藤 Mental symptom / psychological state evaluation apparatus and evaluation method
WO2007030275A2 (en) 2005-09-02 2007-03-15 Emsense Corporation A device and method for sensing electrical activity in tissue
US20070055166A1 (en) * 2005-09-02 2007-03-08 Chandrashekhar Patil Method and system for recording and transmitting data from biometric sensors
US7865235B2 (en) 2005-09-12 2011-01-04 Tan Thi Thai Le Method and system for detecting and classifying the mental state of a subject
KR20080074099A (en) 2005-09-12 2008-08-12 Emotiv Systems Pty Ltd. Detection of and interaction using mental states
US20070060830A1 (en) 2005-09-12 2007-03-15 Le Tan Thi T Method and system for detecting and classifying facial muscle movements
JP3970920B2 (en) 2005-12-08 2007-09-05 松下電器産業株式会社 Information processing system, information processing apparatus and method
US9721480B2 (en) 2006-02-08 2017-08-01 Honeywell International Inc. Augmented tutoring
EP2007271A2 (en) 2006-03-13 2008-12-31 Imotions - Emotion Technology A/S Visual attention and emotional response detection and display system
US20070235716A1 (en) 2006-03-22 2007-10-11 Emir Delic Electrode
AU2007293092A1 (en) * 2006-09-05 2008-03-13 Innerscope Research, Inc. Method and system for determining audience response to a sensory stimulus
US20080081512A1 (en) * 2006-10-03 2008-04-03 Shawn Chawgo Coaxial Cable Connector With Threaded Post
JP4325875B2 (en) * 2006-11-06 2009-09-02 株式会社日立製作所 Friction stir welding tool and friction stir welding apparatus
USD565735S1 (en) 2006-12-06 2008-04-01 Emotiv Systems Pty Ltd Electrode headset
US20080211768A1 (en) 2006-12-07 2008-09-04 Randy Breen Inertial Sensor Input Device
US20090222330A1 (en) 2006-12-19 2009-09-03 Mind Metrics Llc System and method for determining like-mindedness
US20080144882A1 (en) 2006-12-19 2008-06-19 Mind Metrics, Llc System and method for determining like-mindedness
US20080159365A1 (en) 2006-12-22 2008-07-03 Branislav Dubocanin Analog Conditioning of Bioelectric Signals
US8768718B2 (en) 2006-12-27 2014-07-01 Cardiac Pacemakers, Inc. Between-patient comparisons for risk stratification of future heart failure decompensation
US8352980B2 (en) * 2007-02-15 2013-01-08 At&T Intellectual Property I, Lp System and method for single sign on targeted advertising
US20080218472A1 (en) 2007-03-05 2008-09-11 Emotiv Systems Pty., Ltd. Interface to convert mental states and facial expressions to application input
US20090105576A1 (en) 2007-10-22 2009-04-23 Nam Hoai Do Electrode conductive element
US20090030717A1 (en) 2007-03-29 2009-01-29 Neurofocus, Inc. Intra-modality synthesis of central nervous system, autonomic nervous system, and effector data
JP5361868B2 (en) 2007-05-01 2013-12-04 ニューロフォーカス・インコーポレーテッド Neural information storage system
US9886981B2 (en) 2007-05-01 2018-02-06 The Nielsen Company (Us), Llc Neuro-feedback based stimulus compression device
US20090024449A1 (en) 2007-05-16 2009-01-22 Neurofocus Inc. Habituation analyzer device utilizing central nervous system, autonomic nervous system and effector system measurements
EP2152155A4 (en) 2007-06-06 2013-03-06 Neurofocus Inc Multi-market program and commercial response monitoring system using neuro-response measurements
US8494905B2 (en) 2007-06-06 2013-07-23 The Nielsen Company (Us), Llc Audience response analysis using simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI)
US20090030287A1 (en) 2007-06-06 2009-01-29 Neurofocus Inc. Incented response assessment at a point of transaction
US20090024446A1 (en) * 2007-07-20 2009-01-22 Shan Jerry Z Providing a model of a life cycle of an enterprise offering
US20090036755A1 (en) 2007-07-30 2009-02-05 Neurofocus, Inc. Entity and relationship assessment and extraction using neuro-response measurements
JP5542051B2 (en) 2007-07-30 2014-07-09 ニューロフォーカス・インコーポレーテッド System, method, and apparatus for performing neural response stimulation and stimulation attribute resonance estimation
US8386313B2 (en) 2007-08-28 2013-02-26 The Nielsen Company (Us), Llc Stimulus placement system using subject neuro-response measurements
US8635105B2 (en) 2007-08-28 2014-01-21 The Nielsen Company (Us), Llc Consumer experience portrayal effectiveness assessment system
KR20100047865A (en) 2007-08-28 2010-05-10 뉴로포커스, 인크. Consumer experience assessment system
TW200910875A (en) * 2007-08-29 2009-03-01 Inventec Appliances Corp Method and system for instantly translating text within an image
US8392255B2 (en) 2007-08-29 2013-03-05 The Nielsen Company (Us), Llc Content based selection and meta tagging of advertisement breaks
US20090083129A1 (en) 2007-09-20 2009-03-26 Neurofocus, Inc. Personalized content delivery using neuro-response priming data
US8494610B2 (en) 2007-09-20 2013-07-23 The Nielsen Company (Us), Llc Analysis of marketing and entertainment effectiveness using magnetoencephalography
WO2009102430A1 (en) 2008-02-13 2009-08-20 Neurosky, Inc. Audio headset with bio-signal sensors
US7742623B1 (en) 2008-08-04 2010-06-22 Videomining Corporation Method and system for estimating gaze target, gaze sequence, and gaze map from video

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6292688B1 (en) * 1996-02-28 2001-09-18 Advanced Neurotechnologies, Inc. Method and apparatus for analyzing neurological response to emotion-inducing stimuli
US7113916B1 (en) * 2001-09-07 2006-09-26 Hill Daniel A Method of facial coding monitoring for the purpose of gauging the impact and appeal of commercially-related stimuli

Also Published As

Publication number Publication date
WO2008108799A8 (en) 2009-12-23
JP2010520552A (en) 2010-06-10
EP2135370A1 (en) 2009-12-23
US20080222671A1 (en) 2008-09-11
EP2135370A4 (en) 2012-08-08
JP5746472B2 (en) 2015-07-08
US8782681B2 (en) 2014-07-15
CN101755406A (en) 2010-06-23

Similar Documents

Publication Publication Date Title
US8782681B2 (en) Method and system for rating media and events in media based on physiological data
US8230457B2 (en) Method and system for using coherence of biological responses as a measure of performance of a media
US8347326B2 (en) Identifying key media events and modeling causal relationships between key events and reported feelings
US20080295126A1 (en) Method And System For Creating An Aggregated View Of User Response Over Time-Variant Media Using Physiological Data
US11343596B2 (en) Digitally representing user engagement with directed content based on biometric sensor data
US10839350B2 (en) Method and system for predicting audience viewing behavior
US9894399B2 (en) Systems and methods to determine media effectiveness
US20090030762A1 (en) Method and system for creating a dynamic and automated testing of user response
US20090150919A1 (en) Correlating Media Instance Information With Physiological Responses From Participating Subjects
WO2010123770A2 (en) Method and system for measuring user experience for interactive activities
Arifin et al. Can electrocardiographic response define the effectiveness of an advertisement video?
Sun et al. Personal rating prediction for on-line video lectures using gaze information

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780052879.2

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07796518

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2009552656

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2007796518

Country of ref document: EP