WO2008030542A2 - Methods for measuring emotive response and selection preference - Google Patents

Methods for measuring emotive response and selection preference

Info

Publication number
WO2008030542A2
Authority
WO
WIPO (PCT)
Prior art keywords
consumer
data
visual stimulus
presenting
aoi
Prior art date
Application number
PCT/US2007/019487
Other languages
French (fr)
Other versions
WO2008030542A3 (en)
Inventor
Charles John Berg, Jr.
David Keith Ewart
Nick Robert Harrington
Original Assignee
The Procter & Gamble Company
Priority date
Filing date
Publication date
Application filed by The Procter & Gamble Company filed Critical The Procter & Gamble Company
Priority to EP07837845A priority Critical patent/EP2062206A4/en
Priority to BRPI0716106-9A priority patent/BRPI0716106A2/en
Priority to JP2009527416A priority patent/JP5249223B2/en
Priority to CA002663078A priority patent/CA2663078A1/en
Priority to MX2009002419A priority patent/MX2009002419A/en
Publication of WO2008030542A2 publication Critical patent/WO2008030542A2/en
Publication of WO2008030542A3 publication Critical patent/WO2008030542A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • G06Q30/0203Market surveys; Market polls

Definitions

  • the present invention relates generally to methods for conducting consumer research.
  • the present invention attempts to address these and other needs by providing, in a first aspect of the invention, a method comprising the steps: presenting a visual stimulus to a consumer; collecting eye gazing data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer; and collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer.
  • Another aspect of the invention provides for a method of obtaining consumer research data comprising the steps: presenting a visual stimulus to a consumer; defining an area of interest (AOI) in the visual stimulus; collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer and with regard to the AOI; collecting biometric data from the consumer while presenting the visual stimulus to the consumer; and associating the collected biometric data and the collected eye gazing data regarding the AOI.
  • Another aspect of the invention provides for a method of obtaining consumer research data comprising the steps: presenting a visual stimulus to a consumer; defining an area of interest (AOI) in the visual stimulus; collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer and with regard to the AOI; collecting biometric data from the consumer while presenting the visual stimulus to the consumer; translating the collected biometric data to emotional metric data; and associating the emotional metric data and the collected eye gazing data regarding the AOI.
  • Another aspect of the invention provides for a method of obtaining consumer research data comprising the steps: presenting a visual stimulus to a consumer; collecting face direction data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer; collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer.
  • Systems and software are also provided.
  • consumer(s) is used in the broadest sense and refers to a mammal, usually a human, including but not limited to a shopper, user, beneficiary, or an observer or viewer of products or services through at least one physiological sense: visually (e.g., magazines, a sign, virtual media, TV); by hearing (music, speech, white noise); by smell (scent or olfactory insult); or by touch, among others.
  • a consumer can also be involved in a test (real world or simulation), in which case she may also be called a test panelist or panelist.
  • the consumer is an observer of another person who is using the product or service. The observation may be by way of viewing in-person or via photograph or video.
  • shopper is used in the broadest sense and refers to an individual who is considering the selection or purchase of a product for immediate or future use by themselves or someone else. A shopper may engage in comparisons between consumer products. A shopper can receive information and impressions by various methods.
  • Visual methods may include but are not limited to the product or its package within a retail store, a picture or description of a product or package, or the described or imaged usage or benefits of a product on a website; electronic or electrical media such as television, videos, illuminated panels & billboards & displays; or printed forms such as ads or information on billboards, posters, displays, "Point-of-purchase" (POP) materials, coupons, flyers, signage, banners, magazine or newspaper pages or inserts, circulars, mailers, etc.
  • a shopper sometimes is introduced into a shopping mode without prior planning or decision to do so, such as with a television program commercial, product placement within a feature film, etc.
  • the shopper / consumer / panelist may be referred to as "she" for efficiency but will collectively include both female and male shoppers, consumers, and panelists.
  • viewer is used in the broadest sense and refers to a recipient of visual media communication where the product is entertainment information including information needed for decisions or news. Similar to the shopper examples, visual methods may include but are not limited to websites; electronic or electrical media such as television, videos, illuminated panels & billboards & displays; or, printed forms. The visual media can be supplemented with other sensorial stimulus such as auditory, among others.
  • consumer analysis is used in the broadest sense and refers to research involving the consumer reacting to, or in relation to, a company's products, such as in shopping, usage, or post-application benefit-receipt situations.
  • Many current techniques, with significant drawbacks, exist to attempt to understand the emotive response or selection interest in one or more products, or a task involving one or more products. See e.g., US 2007/0005425.
  • product(s) is used in the broadest sense and refers to any product, product group, services, communications, entertainment, environments, organizations, systems, tools, and the like. Exemplary product forms and brands are described on The Procter & Gamble Company's website www.pg.com, and the linked sites found thereon. It is to be understood that consumer products that are part of product categories other than those listed above are also contemplated by the present invention, and that alternative product forms and brands other than those disclosed on the above-identified website are also encompassed by the present invention.
  • the term "emotive response indicator(s)” refers to a measure of a physiological or biological process or state of a human or mammal which is believed to be linked or influenced at least in part by the emotive state of the human or mammal at a point or over a period of time. It can also be linked or influenced to just one of the internal feelings at a point or period in time even if multiple internal feelings are present; or, it can be linked to any combination of present feelings. Additionally, the amount of impact or weighting that a given feeling influences an emotive response indicator can vary from person-to-person or other situational factors, e.g., the person is experiencing hunger, to even environmental factors such as room temperature.
  • the term "emotive state(s)” refers to the collection of internal feelings of the consumer at a point or over a period of time. It should be appreciated that multiple feelings can be present such as anxiousness and fear, or anxiousness and delight, among others.
  • imaging apparatus is used in the broadest sense and refers to an apparatus for viewing of visual stimulus images including, but not limited to: drawings, animations, computer renderings, photographs, and text, among others.
  • the images can be representations of real physical objects, or virtual images, or artistic graphics or text, and the like.
  • the viewable images can be static, or dynamically changing or transforming such as in sequencing through a deck of static images, showing motions, and the like.
  • the images can be presented or displayed in many different forms including, but not limited to print or painted media such as on paper, posters, displays, walls, floors, canvases, and the like.
  • the images can be presented or displayed via light imaging techniques and displayed for viewing by the consumer on a computer monitor, plasma screen, LCD screen, CRT, projection screen, fogscreen, water screen, VR goggles, headworn helmets or eyeglasses with image display screens, or any other structure that allows an image to be displayed, among others.
  • Projected imagery "in air” such as holographic and other techniques are also suitable.
  • An example of a means for displaying a virtual reality environment, as well as receiving feed-back response to the environment, is described in US 6,425,764; and US 2006/0066509 Al.
  • a method is provided comprising the steps: presenting a visual stimulus to a consumer; collecting head position tracking and/or face direction tracking of the consumer while presenting the visual stimulus to the consumer; optionally collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer; and collecting biometric data from the consumer while presenting the visual stimulus to the consumer.
  • face direction data means determining the field of view the consumer's face is facing from the wholly available visual environment surrounding the consumer. Without wishing to be bound by theory, this approach provides an estimation (for the sake of efficiency) of whether the consumer is viewing the visual stimulus (including any AOIs).
  • Face direction data can be gathered by various known means including head position tracking, and face tracking.
  • face direction data may be obtained by remote video tracking means, by remote electromagnetic wave tracking, or by placing fixed sensor(s) or tracking point(s) at or near the consumer's head or face.
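  • As a hedged illustration of the kind of geometric check that face direction data supports, the short Python sketch below (function names and the fixed viewing-cone half-angle are assumptions, not taken from the patent) estimates whether a tracked face direction points toward the visual stimulus.

```python
def is_facing_stimulus(face_yaw_deg, face_pitch_deg,
                       stimulus_yaw_deg, stimulus_pitch_deg,
                       half_angle_deg=20.0):
    """Rough check that the consumer's face direction falls within a viewing
    cone centered on the visual stimulus. Angles are in degrees in the same
    room-fixed frame reported by the head/face tracker (an assumption)."""
    d_yaw = abs(face_yaw_deg - stimulus_yaw_deg)
    d_yaw = min(d_yaw, 360.0 - d_yaw)          # wrap yaw difference
    d_pitch = abs(face_pitch_deg - stimulus_pitch_deg)
    return d_yaw <= half_angle_deg and d_pitch <= half_angle_deg

# Example: face turned 8 degrees left of, and 3 degrees above, a shelf display
print(is_facing_stimulus(-8.0, 3.0, 0.0, 0.0))  # True
```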
  • visual stimulus is used in the broadest sense and refers to any virtual or nonvirtual image including but not limited to a product, object, stimulus, and the like, that an individual may view with their eyes.
  • a non-visual stimulus, e.g., smell, sound, and the like, may accompany the visual stimulus.
  • the visual stimulus may be archived as a physical image (e.g., photograph) or digital image for analysis.
  • physiological measurement(s) broadly includes both biological measures and body language measures, which capture both the autonomic responses of the consumer and learned responses, whether executed consciously or subconsciously, often as a learned habit.
  • Physiological measurements are sometimes referred to as “biometric expressions” or “biometric data.” See e.g., US 5,676,138; US 6,190,314; US 6,309,342; US 7,249,603; and US 2005/0289582.
  • each emotion can cause a detectable physical response in the body.
  • any set, or even a newly derived set, of emotion definitions and hierarchies can be used that is recognized as capturing at least a human emotion element. See e.g., US2003/0028383.
  • body language broadly includes forms of communication using body movements or gestures, instead of, or in addition to, sounds, verbal language, or other forms of communication.
  • Body language is part of the category of paralanguage, which for purposes of the present invention describes all forms of human or mammalian communication that are not verbal language. This includes, but is not limited to, the most subtle movements of many consumers, including winking and slight movement of the eyebrows.
  • Examples of body language data include facial electromyography or vision-based facial expression data. See e.g., US 2005/0289582; US 5,436,638; US 7,227,976.
  • paralanguage or "paralinguistic element(s)” refers to the non-verbal elements of communication used to modify meaning and convey emotion.
  • Paralanguage may be expressed consciously or unconsciously, and it includes voice pitch, volume, intonation of speech, among others. Paralanguage can also comprise vocally-produced sounds. In text-only communication such as email, chat rooms, and instant messaging, paralinguistic elements can be displayed by emoticons, font and color choices, capitalization, the use of non-alphabetic or abstract characters, among others.
  • One example of evaluating paralanguage is provided with the layered voice analysis apparatus, which may include the determination of an emotional state of an individual. See, e.g., U.S. Patent No. 6,638,217. Another example is described in published PCT Application WO 97/01984 (PCT/IL96/00027).
  • "Layered voice analysis" or "LVA" is broadly defined as any means of detecting the mental state and/or emotional makeup conveyed by a speaker's voice at a given moment or voice segment, by detecting the emotional content of the speaker's speech.
  • Non-limiting examples of commercially available LVA products include those from Nemesysco Ltd., Zuran, Israel, such as LVA 6.50, TiPi 6.40, GK1 and SCA1. See e.g., US 6,638,217.
  • LVA identifies various types of stress levels, cognitive processes, and/or emotional reactions that are reflected in the properties of voice. In one embodiment, LVA divides a voice segment into: (i) emotional parameters; or (ii) categories of emotions.
  • the LVA analyzes an arousal level or an attention level in a voice segment.
  • voice is recorded by a voice recorder, wherein the voice recording is then analyzed by LVA.
  • Examples of recording devices include: a computer via a microphone, telephone, television, radio, voice recorder (digital or analogue), computer-to-computer, video, CD, DVD, or the like. The less compressed the voice sample, the more accurate the LVA is likely to be.
  • the voice being recorded / analyzed may be the same or different language than the investigator's native language. Alternatively the voice is not recorded but analyzed as the consumer / shopper / panelist is speaking.
  • a potential advantage of LVA is that the analysis may be done without looking at the language of the speech.
  • one approach of LVA uses data with regard to any sound (or lack thereof) that the consumer / shopper / panelist produces during testing. These sounds may include intonations, pauses, a gasp, an "err" or "hmm," or a sharp inhale/exhale of breath. Of course, words may also form part of the analysis. Frequency of sound (or lack thereof) may be used as part of the analysis, as illustrated in the sketch below.
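  • Purely as an illustration of using the frequency of such sounds, and not as a description of any commercial LVA product, the hypothetical sketch below tallies how often each annotated non-word event occurs per minute of a recording.

```python
from collections import Counter

def event_rate_per_minute(events, duration_s):
    """events: list of (time_s, label) pairs with labels such as "hmm",
    "pause", or "gasp". Returns occurrences of each label per minute."""
    counts = Counter(label for _, label in events)
    minutes = duration_s / 60.0
    return {label: n / minutes for label, n in counts.items()}

annotated = [(2.1, "hmm"), (4.7, "pause"), (9.3, "gasp"), (15.0, "pause")]
print(event_rate_per_minute(annotated, duration_s=60.0))
# {'hmm': 1.0, 'pause': 2.0, 'gasp': 1.0}
```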
  • LVA may be used in consumer or market research, including consumer analysis.
  • LVA may be used with or without other emotive response indicators or physiological measurements.
  • qualitative data is also obtained from the consumer / shopper / panelist.
  • Non-limiting examples of qualitative data are a written questionnaire or an oral interview (person-to-person or over the phone / Internet).
  • at least one facet of the consumer or market research is conducted with the consumer / shopper / panelist at home on the Internet.
  • the consumer / shopper / panelist submits her voice to the researcher via the phone or the Internet.
  • the qualitative data may be subsequently used to support LVA-drawn conclusions (such LVA conclusions being formed independently of the qualitative data).
  • the "passion" a consumer feels for an image, or an aspect of an image may obtained by the use of a "Passion Meter,” as provided by Unitec, Geneva, Switzerland and described in U.S. patent publication claiming the benefit of U.S. Prov. Appl. No. 60/823,531, filed Aug. 25, 2006 (and the non-provisional US publication claiming benefit thereof).
  • Other examples may include those described in "The Evaluative Movement Assessment (EMA)" - Brendl, Markman, and Messner (2005), Journal of Experimental Social Psychology, Volume 41 (4), pp. 346-368.
  • autonomic responses and measurements include but are not limited to changes or indications in: body temperature, e.g., measured by conductive or infrared thermometry, facial blood flow, skin impedance, EEG, EKG, blood pressure, blood transit time, heart rate, peripheral blood flow, perspiration or sweat, SDNN heart rate variability, galvanic skin response, pupil dilation, respiratory pace and volume per breath or an average taken, digestive tract peristalsis, large intestinal motility, and piloerection, i.e., goose bumps or body hair erectile state, saccades, temperature biofeedback, among others. See e.g., US 2007/010066.
  • Autonomic responses and measurements may also include body temperature (conductive or IR thermometry), facial blood flow, skin impedance, qEEG (quantified electroencephalography), stomach motility, and body hair erectile state, among others. Additional physiological measurements can be taken such as a facial electromyography, saliva viscosity and volume, measurement of salivary amylase activity, body metabolism, brain activity location and intensity, i.e., measured by fMRI or EEG.
  • the biometric data comprises cardiac data. Cardiovascular monitoring and other cardiac data obtaining techniques are described in US 2003/0149344. A commercial monitor may include the TANITA 6102 cardio pulse meter. Electrocardiography (using a Holter monitor) is another approach. Yet another approach is to employ UWB radar.
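  • One cardiac metric named above, SDNN heart rate variability, is simply the standard deviation of the intervals between successive normal heartbeats; the minimal sketch below assumes the monitor already provides RR intervals in milliseconds (the sample values are invented).

```python
import statistics

def sdnn_ms(rr_intervals_ms):
    """SDNN: standard deviation of normal-to-normal (RR) intervals, in ms."""
    return statistics.stdev(rr_intervals_ms)

def mean_heart_rate_bpm(rr_intervals_ms):
    """Average heart rate implied by the same RR intervals."""
    return 60000.0 / statistics.mean(rr_intervals_ms)

rr = [812, 790, 845, 801, 820, 835, 798]  # hypothetical RR intervals (ms)
print(round(sdnn_ms(rr), 1), "ms SDNN;", round(mean_heart_rate_bpm(rr)), "bpm")
```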
  • the biometric data is ocular biometric data or non-ocular biometric data.
  • Ocular biometric data is data obtained from the consumer's eye during research. Examples include pupil dilation, blink and eye tracking data.
  • Additional physiological measurements can be taken such as: electromyography of the facial, or other muscles; saliva viscosity and volume measures; measurement of salivary amylase activity; body biological function, e.g., metabolism via blood analysis, urine or saliva sample in order to evaluate changes in nervous system-directed responses, e.g., chemical markers can be measured for physiological data relating to levels of neuro-endocrine or endocrine-released hormones; brain function activity.
  • Brain function activity, e.g., location and intensity, may be measured by techniques including fMRI (functional magnetic resonance imaging), MRI (magnetic resonance imaging), radiography, fluoroscopy, CT (computerized tomography), ultrasonography, nuclear medicine, PET (positron emission tomography), OT (optical topography), NIRS (near infrared spectroscopy), and fNIR (functional near-infrared imaging).
  • monitoring brain function activity data may include the "brain-machine interface" developed by Hitachi, Inc., measuring brain blood flow. Yet another example includes "NIRS" or near infrared spectroscopy. Yet still another example is electroencephalography (EEG). See also e.g., US 6,572,562.
  • body language changes and measurements include all facial expressions, e.g., monitoring mouth, eye, neck, and jaw muscles; voluntary and involuntary muscle contractions; tissue, cartilage, and bone structure; body limb positioning and gestural activity; limb motion patterns, e.g., tapping; patterned head movements, e.g., rotating or nodding; head positioning relative to the body and relative to the applied stimulus; vocal cord tension and resulting tonality; vocal volume (decibels); and speed of speech.
  • a non-invasive apparatus and method can be used.
  • a video digital photography apparatus can be used that correlates any facial expression changes with facial elements analysis software, or the Facial Action Coding System by Ekman at: http://face-and-emotion.com/dataface/facs/description.jsp or www.paulekman.com. See e.g., US 2003/0032890.
  • selection preference refers to a decision made by a consumer for the selection of a product as a preference or non-preference, degree of appeal, probability of purchase or use, among others. This can also be thought of as having or choosing an opinion, or conscious or unconscious attitudes, whether openly expressed to another individual (via written or oral communication) or not.
  • selection preference query refers to any interaction with a subject that results in them identifying a single stimulus or specific group of stimuli from a broader selection of stimuli.
  • the identified stimulus may be a virtual or physical representation of that stimulus, e.g., a package in a real or virtual retail environment; an element of that stimulus, e.g., color of packaging, scent of product contained in the packaging, picture or text; or a result of using that stimulus, e.g., hair color resulting from hair colorant usage.
  • the "query” or “selection preference query” may be made in any medium, e.g., verbal, oral or written, and may be made consciously, e.g., when probed, or unconsciously, e.g., when a subject behaves automatically in response to given stimulus in a given context.
  • a “query” can result in the selection or deselection of a stimulus; whereas, “selection preference query” results in identification of a stimulus or group of stimuli with positive associations.
  • a “selection preference query” may or may not be related to an intention to purchase.
  • limited communicative consumer refers to mammals who cannot articulate meaningfully to researchers. Examples may include a baby who lacks communication development, adult humans with impaired communication abilities (e.g., low IQ, physical handicap), or companion animals (e.g., dogs, cats, horses). Within the human species, the term "limited communicative consumer" refers to babies, some young children, and adults impaired by disease, injury, or old age, who possess limited conscious communication skills compared to those of normal human adults. For these consumers, consumer research has found it difficult to ascertain their emotive response and selection preference to products and proposed products. The present invention relates to emotive response and selection preference methods for conducting such consumer research.
  • the present invention can be employed with a test subject when she is evaluating a consumer product, either in a virtual environment or a real environment, wherein the environment (virtual or real) is chosen from a home, office, test facility, restaurant, entertainment venue, outdoors, indoors, or retail store. See e.g., US 7,006,982; US 2002/0161651; US 2006/0010030; US 6,810,300; US 7,099,734; US 2003/0200129; US 2006/0149634. As a result, the location and use of the emotive response and selection system is not limited to any given environment.
  • the environment can be mobile, such that it can be moved and set up for use in the consumer's home, a retail store, a mall, a mall parking lot, a community building, a convention, a show, and the like.
  • the emotive response and selection preference systems can comprise a virtual or physical imaging apparatus, or combination thereof, which provides at least one visual stimulus.
  • the visual stimulus comprises a real store environment.
  • a "real store environment” means that the environment is non-virtual or real.
  • the store may be one open for business or may be prototypical (for testing).
  • the store may be a mass merchant, drug channel, warehouse store, or a high frequency store to provide a few examples of different store formats.
  • an imaging apparatus can display visual images, e.g., virtual, photographic, or physical images, of prospective or current product shelf arrangements to conduct consumer research regarding consumer products sold in a retail environment.
  • visual imaging may include human representations or avatars such as other product users, shoppers, or employees such as retail store clerks, or other mammals.
  • One advantage of such an imaging apparatus is faster screening and/or deeper insight regarding a consumer's reaction to a particular consumer product since the virtual environment can be realistic to a consumer.
  • a consumer's real-time reaction upon viewing the consumer product is one element in determining whether to buy the company's product or a competitor's product, and is referred to as the First Moment of Truth (FMOT).
  • the SMOT (Second Moment of Truth) is the assessment of product usage by the consumer, or of a usage experience by someone else that has been related to the consumer, such as by word-of-mouth, internet chat rooms, product reviews, and the like.
  • the visual stimulus is static or non-static.
  • the stimulus comprises the consumer participating (e.g., conducting, observing, etc.) in a task associated with a product's usage. Examples of tasks associated with a product's usage may include those described in US 7,249,603 (defining "task"); and US 2007/0100666 (listing "activity types" in Table 2B).
  • the SMOT refers to both at the time of product use, and product benefits lasting for a period after product use or application, such as in a use experience, or in product beneficiary situations.
  • Another component is the "Zero" Moment of Truth (ZMOT) which refers to the interaction with a representation of or information about a product outside of the retail purchase environment. ZMOT can take place when the consumer receives or views advertisements, tests a sample (which also then lends some SMOT experience). For a retailer, ZMOT can be pre-market launch trade materials shared by the manufacturer before a product is launched for commercial sale.
  • FMOT, SMOT or ZMOT can involve aesthetics, brand equity, textual and/or sensorial communications, and consumer benefit, among others.
  • Other factors include the appearance of the product at the point of sale or in an advertisement, the visual appearance (logo, copyrights, trademarks, or slogans, among others), olfactory (smell), and aural (sound) features communicated by and in support of the brand equity, and the graphic, verbal, pictorial or textual communication to the consumer such as value, unit price, performance, prestige, convenience.
  • the communication also focuses on how it is transmitted to the consumer, e.g., through a design, logo, text, pictures, imagery, and the like.
  • the virtual or physical imaging apparatus allows a company to evaluate these factors.
  • the virtual imaging apparatus gives a company, manufacturer, advertiser, or retailer the ability to quickly screen a higher number of factors that can affect a consumer's reaction to a product at each or all of the Moments of Truth, e.g., FMOT, SMOT, and ZMOT, and allows for a higher number of consumers to be used in the evaluation of the product. For instance, project development teams within a company can evaluate a large number of consumers and have the data saved in a large database for later evaluation. Another benefit is that the virtual imaging apparatus allows a company to have lower developmental costs, since costly physical prototypes, i.e., products, packaging, in-store environments, merchandise displays, etc., can be replaced with virtual renditions. For example, a high-resolution, large-scale imaging apparatus allows a company to generate a virtual computer image, photographic image, or photo-shopped image of various prototypes without physically having to make them.
  • An additional benefit of the virtual imaging apparatus when used in conjunction with eye-tracking and an emotive response and selection system, is the ability to detect a consumer's emotive state to a proposed product, advertising slogan, etc.
  • the virtual imaging apparatus allows for improved and faster innovation techniques for a company to evaluate the appeal of various advertising and in-store merchandising elements and/or methods that they employ.
  • the virtual imaging apparatus can be used in a retail store, or, in an in vitro virtual retail environment. See e.g., US 6,026,377; US 6,304,855; US 5,848,399.
  • the image is one that responds interactively with the consumer. See e.g., US 6,128,004.
  • the imaging apparatus of an in-store environment allows the consumer to have a natural orientation dedicated to a real-life shopping experience. It also can allow a consumer to give feedback and respond to the imaging apparatus or in-store imaging apparatus in real-time, including with real-scale displayed imagery.
  • the virtual in-store imaging apparatus can store how many times a consumer picks up a product and places it back on the shelf, how long the consumer looks at the product, and, the precise locations of where the products are chosen by the consumer on the shelf.
  • the virtual in-store imaging apparatus can also be configured to store and monitor all the consumer's responses to the product, e.g., oral, written, physical, or involuntary actions, in addition to data collected by an eye-tracking apparatus.
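  • A minimal sketch of the kind of per-product interaction record such a virtual in-store imaging apparatus might keep (the field layout below is an assumption for illustration, not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ProductInteractionLog:
    product_id: str
    shelf_location: Tuple[int, int]           # (row, column) on the shelf
    pickups: int = 0                          # times picked up and replaced
    look_durations_s: List[float] = field(default_factory=list)

    def record_pickup(self):
        self.pickups += 1

    def record_look(self, duration_s: float):
        self.look_durations_s.append(duration_s)

    def total_look_time_s(self) -> float:
        return sum(self.look_durations_s)

log = ProductInteractionLog("detergent_A", shelf_location=(2, 5))
log.record_look(1.8)
log.record_pickup()
log.record_look(3.2)
print(log.pickups, round(log.total_look_time_s(), 1))  # 1 5.0
```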
  • an imaging apparatus can be used with other apparatuses such as an eye-tracking apparatus, head-tracking apparatus, and/or a physiological apparatus that measures at least one physiological response.
  • the imaging apparatus provides the company, manufacturer, advertiser, or retailer, superior feedback with regard to consumer's behavior and reactions to their products.
  • the vast majority of a consumer's decision-making and emotional reactions to consumer products occurs at the sub-conscious level, and cannot be easily determined by conscious awareness or direct interrogation.
  • variations in the eye-tracking activity and physiological indicator(s) of a consumer, such as electrical brain activity, reveal the level and span of attention and the extent and type of emotions evoked by the product, which can be measured using the disclosed virtual imaging apparatus together with the eye-tracking and physiological apparatus.
  • while real-time study gives the fastest learning, such learning can also be done later by returning to stored data of the eye-tracking activity and physiological indicator(s) of a consumer.
  • Types of eye gazing data may include eye gaze fixation, eye gaze direction, path of eye gaze direction, eye gaze dwell time.
  • the eye gazing data is relative to the image displayed to the consumer as the data is obtained.
  • the image may be stored or archived during testing by methods well known to archive still and non-still images.
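  • To make eye gaze dwell time concrete, the sketch below assumes gaze samples arrive as timestamped coordinates in the archived image and that an AOI is a rectangle (neither format is mandated here); it accumulates how long the gaze stayed inside the AOI.

```python
def dwell_time_in_aoi(samples, aoi):
    """samples: time-sorted list of (time_s, x, y) gaze points in image coordinates.
    aoi: (x_min, y_min, x_max, y_max) rectangle in the same coordinates.
    The gap to the next sample is attributed to each sample falling inside."""
    x_min, y_min, x_max, y_max = aoi
    dwell = 0.0
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        if x_min <= x <= x_max and y_min <= y <= y_max:
            dwell += t1 - t0
    return dwell

gaze = [(0.00, 410, 220), (0.02, 415, 224), (0.04, 702, 130), (0.06, 708, 133)]
print(round(dwell_time_in_aoi(gaze, aoi=(400, 200, 500, 300)), 3))  # 0.04 s
```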
  • the physiological and imaging apparatus can combine neurological responses, motivational research, and physiological reactions, among others, to provide detailed depth analysis of a consumer's reaction to a product or environment.
  • the levels of arousal, involvement, engagement, attraction, degrees of memorization and brand attribution and association, and indices of predisposition and consideration can all be measured and evaluated with varying levels of degree.
  • the physiological and imaging apparatus allows the company to obtain the degree of arousal and degree of engagement with specificity.
  • it is now possible to more accurately and quickly capture an emotive response to a consumer product which may be an element involving opinion formation; and, a probable choice decision element on whether to use, not use, recommend, not recommend, select or not select for purchase.
  • this allows a company to develop FMOT strategies to stop, hold, and close as it relates to selling a company's product in a store.
  • the emotive response and selection system comprises at least one imaging apparatus, at least one eye-tracking apparatus used to monitor and track a consumer's eye movements in response to a product, and at least one physiological apparatus that measures a consumer's emotive state or feeling to a consumer product.
  • the at least one eye-tracking apparatus and the at least one physiological apparatus form an emotive response apparatus.
  • the at least one image apparatus provides at least one visual stimulus to a consumer.
  • the visual stimulus can be virtual, real, photographic, or holographic, a combination thereof, among others.
  • the measures obtained from the consumer of one or both of the eye-tracking or physiological apparatuses, or derivative analysis of one or both data such as a probable emotive response assignment can be used, in realtime, to manipulate and change the displayed images.
  • This can be accomplished using software integrated-analysis, or directed by a test observer monitoring the real-time consumer data, among other methods. For example, if it appears that the consumer's attention is drawn to blue products, then, a company or researcher can immediately change their displayed product from red to blue, to evaluate the consumer's reaction.
  • the ability to manipulate, modify, and change the displayed images is a powerful market feedback tool, notwithstanding that the present invention allows a company to do it in real-time. This can be done for not only product color, but shape, text, size, pricing, shelf location or any other possible visual or information form or arrangement. Alternatively, the feedback could be used to change the environment in addition to or separate from the visual stimulus.
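  • A hedged sketch of one such real-time feedback rule (the attention summary and the variant catalogue are hypothetical placeholders; the patent does not prescribe a particular implementation): display next the product-color variant that has attracted the most gaze so far.

```python
def choose_next_variant(dwell_by_color, variants, current_image):
    """dwell_by_color: accumulated gaze dwell seconds per displayed color.
    variants: color -> image identifier understood by the imaging apparatus.
    Returns the image to display next, favoring the most-attended color."""
    if not dwell_by_color:
        return current_image
    favored = max(dwell_by_color, key=dwell_by_color.get)
    return variants.get(favored, current_image)

variants = {"red": "pack_red.png", "blue": "pack_blue.png"}
dwell = {"red": 1.2, "blue": 4.7}
print(choose_next_variant(dwell, variants, "pack_red.png"))  # pack_blue.png
```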
  • One aspect of the invention is to better understand the emotive response element in combination with the attention element of the consumer analysis model in a more covert manner, whether in response to solely visual stimuli or a combination of a visual stimulus with at least one supplemental stimulus.
  • an eye-tracking apparatus or head-tracking apparatus may be used.
  • an emotive response apparatus can be used to provide the ability to understand the one or more emotive factors which causes a physiological response and/or change within a consumer.
  • the emotive response apparatus measures at least one physiological measure.
  • a physiological measure may include biological, body language expressed responses, and/or paralanguage, among others.
  • the probable emotive response is estimated by comparing the physiological measure and optionally the eye-gaze position data with a pre-determined dataset or model that gives probable emotive state or states associated with measures.
  • the use of multiple physiological measures can in some cases be helpful to ascertain probable emotive state or states.
  • an output of statistical confidence can be given to each emotive state or aggregate.
  • a report of likely weighting can be outputted.
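  • One simple way to realize such a pre-determined dataset is a table of measurement ranges with attached probabilities; the sketch below is illustrative only, and the ranges and probabilities in it are invented, not derived from any validated model.

```python
# Hypothetical rows: (measure name, low, high, {emotive state: probability})
EMOTIVE_LOOKUP = [
    ("heart_rate_bpm", 55, 75,  {"calm": 0.7, "bored": 0.3}),
    ("heart_rate_bpm", 75, 95,  {"interested": 0.6, "anxious": 0.4}),
    ("heart_rate_bpm", 95, 130, {"excited": 0.5, "anxious": 0.5}),
]

def probable_emotive_states(measure_name, value):
    """Return probability-weighted emotive states for one physiological
    measurement, or an empty dict if the value is outside the table."""
    for name, low, high, states in EMOTIVE_LOOKUP:
        if name == measure_name and low <= value < high:
            return states
    return {}

print(probable_emotive_states("heart_rate_bpm", 82))
# {'interested': 0.6, 'anxious': 0.4}
```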
  • the eye-tracking or head-tracking apparatus can be worn by the consumer, or, it can be a set of fixed sensors (or known position sensors which are either fixed or moving) remotely located from the consumer that monitors the consumer's eyes and/or head movements when viewing the visual stimulus.
  • the eye-tracking apparatus can further comprise a separate memory device that stores the data obtained from tracking the consumer's eyes and/or head movements, which may be located on the consumer or be remote from the consumer.
  • the memory device can then be electronically or wirelessly connected with a separate computer or storage system to transfer the data.
  • the memory device can further comprise a memory disk, cartridge, or other structure to facilitate the ease of transferring data, e.g., flash memory card.
  • the eye-tracking apparatus can also be configured to wirelessly transfer data to a separate data-capturing system that stores the data, e.g., through Bluetooth technology.
  • an eye-tracking apparatus that may be used with this invention is the Mobile Eye from ASL, a non-tethered eye-tracking system for use when total freedom of movement is required, which records scene video with an overlaid gaze cursor.
  • This system is designed to be easily worn by an active subject.
  • the eye-tracking optics are extremely lightweight and unobtrusive, and the recording device is small enough to be worn on a belt.
  • the eye image and scene image are interleaved and saved to the recording device.
  • one, two, three, four, five, or more types of the biometric data are obtained from the consumer in a non-tethered manner.
  • Non-tethered means the biometric obtaining devices obtain data from the consumer without the consumer having wires or cords or the like attached from the consumer to a stand-alone piece of equipment. The consumer may walk or move around without the restriction (albeit in some embodiments in a confined area such as seated in front of a video monitor) of a tethered wire.
  • wires that are attached to a transmitter worn on the consumer's person, such as a "wireless microphone," are still considered "non-tethered" as the term is herein defined.
  • eye gazing data is obtained by way of a non-tethered means.
  • non-tethered means of obtaining biometric data include a sensing system worn on the consumer's person, such as a wave reflective or transponding sensor, or a piece of material that is queried or probed by a remote piece of equipment (via, for example, transmission of an electromagnetic wave that may or may not carry encoded data within the transmitted wave or sequence of waves).
  • the non-tethered means include, as a subset, means of remotely obtaining biometric data.
  • biometric data are obtained remotely.
  • the term “remotely” or “remote” means that no biometric data obtaining equipment is on, or carried by, the consumer to obtain the biometric data.
  • heart data may be obtained remotely by way of UWB radar to sense heart beat or breathing rate.
  • obtaining data in a non-tethered manner provides better data from testing, given that the testing environment is more analogous to "real life": consumers typically do not have distractive or cumbersome equipment on their person and are not tethered to equipment. It also facilitates other avenues of testing, such as those requiring the consumer to participate in product usage or to visit a retail store (commercial or prototypical), that do not lend themselves well to tethered methods.
  • At least one physiological apparatus is used. For example, the physiological response of a consumer's blood pulse can be taken when viewing the visual stimulus while eye-tracking data is simultaneously gathered.
  • the measured data from the physiological apparatus is synchronized in time, by computer software, with the element to which the viewer has directed her attention at a point in time or over a period of time. While the recording of clock time is valuable, synchronization does not necessarily need to be tagged with actual clock time, but only needs to associate data that occurred at the same point or interval of time. This allows for later analysis and understanding of the emotive state with respect to various elements along the consumer's eye-gaze path.
  • certain emotive measurements, e.g., blood pulse measures, may exhibit a lag time that should be accounted for when associating the data in time, as illustrated below.
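  • A minimal sketch of that kind of time association, including a fixed lag offset for a slower-responding measure such as a blood pulse signal (the half-second lag and the data layout are assumptions for illustration):

```python
import bisect

def associate_by_time(gaze_events, biometric_samples, lag_s=0.0):
    """gaze_events: time-sorted list of (time_s, viewed element).
    biometric_samples: time-sorted list of (time_s, value).
    Each gaze event is paired with the first biometric sample recorded at or
    after (event time + lag_s), so a delayed physiological response is
    credited back to the element being viewed when it was triggered."""
    times = [t for t, _ in biometric_samples]
    paired = []
    for t, element in gaze_events:
        i = min(bisect.bisect_left(times, t + lag_s), len(biometric_samples) - 1)
        paired.append((element, biometric_samples[i][1]))
    return paired

gaze = [(0.0, "logo"), (1.0, "price"), (2.0, "claim")]
pulse = [(0.5, 71), (1.5, 78), (2.5, 84)]
print(associate_by_time(gaze, pulse, lag_s=0.5))
# [('logo', 71), ('price', 78), ('claim', 84)]
```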
  • the physiological apparatus can be worn by the consumer, or, it can be a set of fixed sensors or single sensor remotely located from the consumer that monitors the physiological responses of the consumer when viewing the visual stimulus.
  • the physiological apparatus can be a remotely located infrared camera to monitor changes in body or facial temperature, or the apparatus may be as simple as a watch worn on the wrist of the consumer to monitor heart rate.
  • the physiological apparatus is a wireless physiological apparatus.
  • the consumer is not constricted by any physical wires, e.g., electrical cords, limiting their movement or interaction with the visual stimulus.
  • the physiological apparatus can further comprise a separate memory device that stores the data obtained from tracking the consumer's physiological changes, which may be located on the consumer or be remote from the consumer.
  • the memory device can then be electronically or wirelessly connected with a separate computer or storage system to transfer the data.
  • the memory device can further comprise a memory disk, cartridge, or other structure to facilitate the ease of transferring data, e.g., flash memory card.
  • the physiological apparatus can also be configured to wirelessly transfer data to a separate data-capturing system that stores the data, e.g., through Bluetooth technology. Either way, the end result is that the data from the eye-tracking apparatus and the physiological apparatus is transferred to a separate apparatus that is configured to correlate, evaluate, and/or synchronize both sets of data, among other functions.
  • the separate apparatus is described as a data-capturing apparatus.
  • the data-capturing apparatus can be a separate computer, a laptop, a database, server, or any other electronic device configured to correlate, evaluate, and/or synchronize data from the physiological apparatus and the eye-tracking apparatus.
  • the data-capturing apparatus can further comprise additional databases or stored information.
  • known probable emotive states associated with certain physiological or eye-gaze measurement values, or derivative values such as from intermediate analysis can be stored and looked up in a table within the database and then time-associated, i.e., synchronized, with the viewed element for each or any time interval, or over a period of time, recorded during the period that the consumer is viewing the visual stimulus.
  • a given physiological measure can also indicate two or more possible feelings either singly or in combination. In these cases, all possible feelings can be associated with a given time interval in the database.
  • Another additional database or stored information can be known selection states associated with certain emotive states, physiological, or eye-gaze measurement values, or derivative values such as from intermediate analysis, which can be stored and looked up in a table within the database and then time-associated, i.e., synchronized, with the viewed element for each or any time interval, or over a period of time, recorded during the period that the consumer is viewing the visual stimulus.
  • the measurement and tracking with subsequent time- association entry into the data-capturing apparatus of multiple physiological data such as a blood pulse measurement and a voice measurement is possible.
  • a feeling or possible feelings or emotive state(s) can then be assigned for each and associated time interval in the database.
  • the recorded feeling(s) for each can be compared to each other to output a new value of a most likely feeling or emotive state, based on cross-reinforcement of the individual database ascribed feelings, or an analysis sub-routine based on a prior model or correlation created beforehand with the emotive response measures involved.
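  • As an illustration of such cross-reinforcement, the sketch below simply sums the per-measure probability assignments for the same time interval and reports the highest-scoring feeling; the additive combination rule is a deliberately naive assumption, not a rule taken from the patent.

```python
from collections import defaultdict

def most_likely_feeling(per_measure_assignments):
    """per_measure_assignments: one dict per physiological measure, each
    mapping feeling -> probability for the same time interval.
    Returns (feeling, combined score) after summing the evidence."""
    combined = defaultdict(float)
    for assignment in per_measure_assignments:
        for feeling, p in assignment.items():
            combined[feeling] += p
    best = max(combined, key=combined.get)
    return best, combined[best]

pulse_view = {"interested": 0.6, "anxious": 0.4}    # from a blood pulse lookup
voice_view = {"interested": 0.5, "delighted": 0.5}  # from a voice measure lookup
print(most_likely_feeling([pulse_view, voice_view]))  # ('interested', 1.1)
```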
  • the data obtained from the eye-tracking apparatus and physiological apparatus can be used in conjunction with other databases storing information in the data-capturing system to output processed data.
  • the processed data is in a synchronized format.
  • the assigned feelings from models, correlations, monographs, look-up tables and databases and the like can be adjusted internally for a specific consumer, or different environmental factors known or surmised to modify the feeling/emotive value correspondence can also be used.
  • a "control" measure conducted in advance, during or after the viewing test such as a specific consumer's response to controlled stimuli, questions, statements, and the like, can be used to modify the emotive value correspondence in that case.
  • a specific physiological response profile(s) modeled beforehand can be used as the "control.”
  • a consumer questionnaire is presented to the consumer and an answer thereto is obtained, wherein the questionnaire comprises one or more psychometric, psychographic, or demographic questions, among others.
  • the answers can be obtained before, during, after, or combination thereof at the time of presenting the visual stimulus to the consumer.
  • the emotive response and selection preference system can further obtain feedback from the consumer's response to the questions asked, with the questions optionally asked after the test and then obtained at that or a later time by the emotive response and selection system.
  • the data can also be correlated with psychometric measurements such as personality trait assessments to further enhance the reliability of the emotive response and selection preference system and methods.
  • the emotive response and selection preference system provides a company or researcher the ability to evaluate and monitor the body language of a consumer after he/she views a consumer product with the physiological apparatus.
  • the emotive response and selection preference system provides a company the ability to understand and critically evaluate the body language, conscious or unconscious responses, of a consumer to a consumer product.
  • the physiological apparatus can measure a single body language change or a plurality of body language changes of a consumer.
  • Body language changes and measurements include all facial expressions, e.g., monitoring mouth, eye, neck, and jaw muscles; voluntary and involuntary muscle contractions; tissue, cartilage, and bone structure; body limb positioning, hands, fingers, shoulder positioning, and the like; gestural activity; limb motion patterns, e.g., tapping; patterned head movements, e.g., rotating or nodding; head positioning relative to the body and relative to the applied stimulus; vocal cord tension and resulting tonality; vocal volume (decibels); and speed of speech.
  • a non-invasive physiological apparatus and method can be used.
  • a video digital photography apparatus can be used that captures and may correlate any facial expression change with facial elements analysis software.
  • the consumer is presented with questions soliciting attitude and/or behavioral data about the visual stimulus. See e.g., US 2007/0156515.
  • the data of the present invention may be stored and transferred according to known methods. See e.g., US 2006/0036751 ; US 2007/0100666.
  • One aspect of the invention provides for defining an area of interest (AOI) in the visual stimulus that is presented to the consumer.
  • the AOI may be defined by the investigator for numerous reasons. Some non-limiting reasons may be to test a certain characteristic of a product, or part of a graphic in an advertising message, or even a stain on a floor while the consumer performs the task of scrubbing the stain with a product.
  • the AOI may be defined, at least in part, by data (e.g., eye gaze duration in an area of the visual stimulus).
  • the visual stimulus and AOIs may be illustrated as a graphic.
  • the graphic may be an archived image of the visual stimulus or some other representation.
  • the AOI may be illustrated on the graphic by drawing a circle or some other indicium indicating the location or area of the AOI in the graphic ("AOI indicium").
  • the researcher may collect biometric data and eye gazing data from the consumer while presenting the visual stimulus to the consumer.
  • the researcher can determine when the consumer's gaze is directed within an AOI and thus associate the collected eye gazing data and the collected biometric data in relation to the AOI.
  • biometric data can be translated to emotional metric data before or after being associated with collected eye gazing data (in relation to the AOI).
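  • A hedged sketch of that association step: for each gaze sample that falls inside an AOI, look up the (already synchronized) biometric value at that time and average it per AOI. The rectangular AOIs and the callable biometric signal are assumptions for illustration.

```python
def mean_biometric_per_aoi(gaze_samples, aois, biometric_at):
    """gaze_samples: list of (time_s, x, y) in image coordinates.
    aois: dict of AOI name -> (x0, y0, x1, y1) rectangle.
    biometric_at: callable time_s -> biometric value (already synchronized).
    Returns AOI name -> mean biometric value over gaze samples inside it."""
    sums, counts = {}, {}
    for t, x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                sums[name] = sums.get(name, 0.0) + biometric_at(t)
                counts[name] = counts.get(name, 0) + 1
    return {name: sums[name] / counts[name] for name in sums}

aois = {"brand_logo": (0, 0, 100, 50), "price_tag": (0, 60, 100, 110)}
gaze = [(0.0, 30, 20), (0.1, 35, 25), (0.2, 40, 80)]
print(mean_biometric_per_aoi(gaze, aois, lambda t: 72 + 10 * t))
# {'brand_logo': 72.5, 'price_tag': 74.0}
```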
  • cardiac data will often have a lag time (versus, say, brain function activity data, which is essentially or nearly instantaneous).
  • the investigator may compare biometric data / emotional metric data / eye gazing data in relation to a first AOI to that of the data in relation to second AOI, and a third AOI, and the like.
  • the emotional metric data or biometric data in relation to the AOI may be presented on a graphic (comprising the visual stimulus) as an indicium.
  • the indicium may be simply presented as raw data or perhaps a symbol (e.g., a needle on a scale) or scalar color-coding or scalar indicium size or the like.
  • the indicium may also communicate a degree of statistical confidence or range or the like for either the emotional metric or biometric data.
  • there may be more than one indicium associated with a given AOI, such as two different biometric or emotional metric or combination indicia; or indicia based on data from different consumers, or from the same consumer in two different time-separated tests.
  • the indicium may represent positive or negative values relative to the specific metric chosen by the researcher.
  • the indicium can represent data collected from multiple consumers, such as an average, a total, a variation from the mean, a range, a probability, a difference versus a standard, expectation, or project goal, or a percentage or number of consumers whose data falls within a defined set of limits or above or below a defined minimum or maximum value.
  • the eye-gaze path or sequence of viewing may also be shown in whole or in part.
  • the researcher may choose to present the data obtained according to the methodologies described herein in a report that comprises: a graphic of the visual stimulus; an area of interest (AOI) indicium; an emotional metric data indicium or a biometric data indicium regarding the AOI; and an eye gazing indicium regarding the AOI.
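  • A minimal sketch of assembling such a report as a plain data structure (field names are assumptions; a real system could render these entries as graphical indicia drawn over the archived stimulus image):

```python
def build_aoi_report(stimulus_graphic, aoi_results):
    """stimulus_graphic: identifier of the archived graphic of the visual stimulus.
    aoi_results: AOI name -> dict with 'bounds', 'dwell_s', 'emotional_metric',
    and an optional 'confidence' entry."""
    report = {"graphic": stimulus_graphic, "aois": []}
    for name, res in aoi_results.items():
        report["aois"].append({
            "aoi": name,
            "aoi_indicium": res["bounds"],             # where to draw the AOI outline
            "eye_gazing_indicium": res["dwell_s"],     # e.g., total gaze dwell time
            "emotional_metric_indicium": res["emotional_metric"],
            "confidence": res.get("confidence"),       # optional statistical confidence
        })
    return report

results = {"brand_logo": {"bounds": (0, 0, 100, 50), "dwell_s": 2.4,
                          "emotional_metric": "interested", "confidence": 0.8}}
print(build_aoi_report("shelf_photo.png", results))
```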

Landscapes

  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

A method of obtaining consumer research data comprising the steps of presenting a visual stimulus to a consumer; collecting eye gazing data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer; and collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer.

Description

METHODS FOR MEASURING EMOTIVE RESPONSE AND SELECTION PREFERENCE
FIELD OF THE INVENTION
The present invention relates generally to methods for conducting consumer research.
BACKGROUND OF THE INVENTION
There is a continuing need for methods for measuring emotive response and selection preference that can provide accurate consumer feedback, whether conscious or sub-conscious, relating to a company's products for purposes of conducting consumer research, such as for shopping, usage analysis, and product beneficiary analysis. There is also a need for providing improved and more accurate consumer analyses models that avoid inaccuracies and inefficiencies associated with current methods.
See e.g., US 2003/0032890; US 2005/0243054; US 2005/0289582; US 5,676,138; US 6,190,314; US 6,309,342; US 6,572,562; US 6,638,217; US 7,046,924; US 7,249,603; WO 97/01984; WO 2007/043954; and Lindsey, Jeff, www.jefflindsay.com/market-research.shtml, entitled "The Historic Use of Computerized Tools for Marketing and Market Research: A Brief Survey."
SUMMARY OF THE INVENTION
The present invention attempts to address these and other needs by providing, in a first aspect of the invention, a method comprising the steps: presenting a visual stimulus to a consumer; collecting eye gazing data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer; and collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer.
Another aspect of the invention provides for a method of obtaining consumer research data comprising the steps: presenting a visual stimulus to a consumer; defining an area of interest (AOI) in the visual stimulus; collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer and with regard to the AOI; collecting biometric data from the consumer while presenting the visual stimulus to the consumer; and associating the collected biometric data and the collected eye gazing data regarding the AOI.
Another aspect of the invention provides for a method of obtaining consumer research data comprising the steps: presenting a visual stimulus to a consumer; defining an area of interest (AOI) in the visual stimulus; collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer and with regard to the AOI; collecting biometric data from the consumer while presenting the visual stimulus to the consumer; translating the collected biometric data to emotional metric data; and associating the emotional metric data and the collected eye gazing data regarding the AOI.
Another aspect of the invention provides for a method of obtaining consumer research data comprising the steps: presenting a visual stimulus to a consumer; collecting face direction data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer; collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer. Systems and software are also provided.
DETAILED DESCRIPTION OF THE INVENTION
The term "consumer(s)" is used in the broadest sense and is a mammal, usually human, that includes but is not limited to a shopper, user, beneficiary, or an observer or viewer of products or services by at least one physiological sense such as visually by magazines, a sign, virtual, TV, or, auditory by music, speech, white noise, or olfactory by smell, scent, insult; or, by tactile, among others. A consumer can also be involved in a test (real world or simulation) whereas they may also be called a test panelist or panelist. In one embodiment, the consumer is an observer of another person who is using the product or service. The observation may be by way of viewing in-person or via photograph or video. The term "shopper" is used in the broadest sense and refers to an individual who is considering the selection or purchase of a product for immediate or future use by themselves or someone else. A shopper may engage in comparisons between consumer products. A shopper can receive information and impressions by various methods. Visual methods may include but are not limited to the product or its package within a retail store, a picture or description of a product or package, or the described or imaged usage or benefits of a product on a website; electronic or electrical media such as television, videos, illuminated panels & billboards & displays; or, printed forms such as ads or information on billboards, posters, displays, "Point-of- purchase" POP materials, coupons, flyers, signage, banners, magazine or newspaper pages or inserts, circulars, mailers, etc. A shopper sometimes is introduced into a shopping mode without prior planning or decision to do so such as with television program commercial, product placement within feature films, etc. For brevity, the shopper / consumer / panelist may be referred to as "she" for efficiency but will collectively include both female and male shoppers / consumers / and panelists.
The term "viewer" is used in the broadest sense and refers to a recipient of visual media communication where the product is entertainment information including information needed for decisions or news. Similar to the shopper examples, visual methods may include but are not limited to websites; electronic or electrical media such as television, videos, illuminated panels & billboards & displays; or, printed forms. The visual media can be supplemented with other sensorial stimulus such as auditory, among others.
The term "consumer analysis" is used in the broadest sense and refers to research involving the consumer reacting to in relation to a company's products such as in shopping, usage, post-application benefits receipt situations. Many current techniques, with significant drawbacks, exist to attempt to understand the emotive response or selection interest in one or more products, or a task involving one or more products. See e.g., US 2007/0005425.
The term "product(s)" is used in the broadest sense and refers to any product, product group, services, communications, entertainment, environments, organizations, systems, tools, and the like. Exemplary product forms and brands are described on The Procter & Gamble Company's website www.pg.com, and the linked sites found thereon. It is to be understood that consumer products that are part of product categories other than those listed above are also contemplated by the present invention, and that alternative product forms and brands other than those disclosed on the above-identified website are also encompassed by the present invention.
The term "emotive response indicator(s)" refers to a measure of a physiological or biological process or state of a human or mammal which is believed to be linked or influenced at least in part by the emotive state of the human or mammal at a point or over a period of time. It can also be linked or influenced to just one of the internal feelings at a point or period in time even if multiple internal feelings are present; or, it can be linked to any combination of present feelings. Additionally, the amount of impact or weighting that a given feeling influences an emotive response indicator can vary from person-to-person or other situational factors, e.g., the person is experiencing hunger, to even environmental factors such as room temperature.
The term "emotive state(s)" refers to the collection of internal feelings of the consumer at a point or over a period of time. It should be appreciated that multiple feelings can be present such as anxiousness and fear, or anxiousness and delight, among others. The term "imaging apparatus" is used in the broadest sense and refers to an apparatus for viewing of visual stimulus images including, but not limited to: drawings, animations, computer renderings, photographs, and text, among others. The images can be representations of real physical objects, or virtual images, or artistic graphics or text, and the like. The viewable images can be static, or dynamically changing or transforming such as in sequencing through a deck of static images, showing motions, and the like. The images can be presented or displayed in many different forms including, but not limited to print or painted media such as on paper, posters, displays, walls, floors, canvases, and the like. The images can be presented or displayed via light imaging techniques and displayed for viewing by the consumer on a computer monitor, plasma screen, LCD screen, CRT, projection screen, fogscreen, water screen, VR goggles, headworn helmets or eyeglasses with image display screens, or any other structure that allows an image to be displayed, among others. Projected imagery "in air" such as holographic and other techniques are also suitable. An example of a means for displaying a virtual reality environment, as well as receiving feed-back response to the environment, is described in US 6,425,764; and US 2006/0066509 Al.
In one embodiment, a method is provided comprising the steps: presenting a visual stimulus to a consumer; collecting head position tracking and/or face direction tracking data of the consumer while presenting the visual stimulus to the consumer; optionally collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer; and collecting biometric data from the consumer while presenting the visual stimulus to the consumer. For purposes of the present invention, the term "face direction data" means data determining the field of view toward which the consumer's face is directed within the wholly available visual environment surrounding the consumer. Without wishing to be bound by theory, this approach provides an estimate (for the sake of efficiency) of whether the consumer is viewing the visual stimulus (including any AOIs). Face direction data can be gathered by various known means including head position tracking and face tracking. For example, face direction data may be obtained by remote video tracking means, by remote electromagnetic wave tracking, or by placing fixed sensor(s) or tracking point(s) at or near the consumer's head or face.
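By way of illustration only, the following sketch shows one way face direction data might be reduced to an estimate of whether the consumer is facing the visual stimulus. It is a minimal geometric check, assuming calibrated head position and facing direction and a flat, screen-like stimulus; the coordinate frame, dimensions, and function names are hypothetical and are not drawn from this disclosure.

import numpy as np

def is_facing_stimulus(head_pos, face_dir, screen_z=0.0,
                       screen_w=1.2, screen_h=0.9):
    # head_pos: (x, y, z) head position in metres; face_dir: facing direction vector.
    # The stimulus is modelled as a screen_w x screen_h rectangle centred on the
    # origin of the plane z = screen_z (all geometry is an illustrative assumption).
    head = np.asarray(head_pos, dtype=float)
    direction = np.asarray(face_dir, dtype=float)
    direction = direction / np.linalg.norm(direction)

    dz = screen_z - head[2]
    # Facing parallel to, or away from, the stimulus plane: not viewing.
    if abs(direction[2]) < 1e-6 or dz / direction[2] < 0:
        return False
    t = dz / direction[2]
    hit = head + t * direction            # intersection with the stimulus plane
    return abs(hit[0]) <= screen_w / 2 and abs(hit[1]) <= screen_h / 2

A real system would take the head position and direction from the head/face-tracking apparatus in its own calibrated coordinates; this sketch only shows the ray-plane test itself.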
The term "visual stimulus" is used in the broadest sense and refers to any virtual or nonvirtual image including but not limited to a product, object, stimulus, and the like, that an individual may view with their eyes. In one embodiment, a non-visual stimulus (e.g., smell, sound, and the like) is substituted for the visual stimulus or is presented concurrently / concomitantly with the visual stimulus. In one embodiment, the visual stimulus may be archived as a physical image (e.g., photograph) or digital image for analysis.
The term "physiological measurement(s)", as used herein, broadly includes both biological measures as well as body language measures which measure both the autonomic responses of the consumer, as well as learned responses whether executed consciously or subconsciously, often executed as a learned habit. Physiological measurements are sometimes referred to as "biometric expressions" or "biometric data." See e.g., US 5,676,138; US 6,190,314; US 6,309,342; US 7,249,603; and US 2005/0289582. For purposes of clarification, the terms "physiological measurement," "biometric expression," and "biometric data" are used interchangeably herein. Body language, among other things, can non-verbally communicate emotive states via body gestures, postures, body or facial expressions, and the like. Generally, algorithms for physiological measurements can be used to implement embodiments of the present invention. Some embodiments may capture only one or a couple of physiological measurement(s) to reduce costs while other embodiments may capture multiple physiological measurements for more precision. Many techniques have been described in translating physiological measurements or biometric data into an emotional metric data (e.g., type of emotion or emotional levels). See e.g., US 2005/0289582, ffl[ 37 - 44 and the references cited therein. Examples may include Hidden Markov Models, neural networks, and fuzzy logic techniques. See e.g., Comm. ACM, vol. 37, no. 3, pp. 77-84, Mar. 1994. For purposes of clarification, the definition of the term "emotional metric data" subsumes the terms "emotion", "type of emotion," and "emotional level."
Without wishing to be bound by theory, it is generally thought that each emotion can cause a detectable physical response in the body. There are different systems and categorizations of "emotions." For purposes of this innovation, any set - or even a newly derived set of emotion definitions and hierarchies, can be used which is recognized as capturing at least a human emotion element. See e.g., US2003/0028383.
The term "body language", as used herein, broadly includes forms of communication using body movements or gestures, instead of, or in addition to, sounds, verbal language, or other forms of communication. Body language is part of the category of paralanguage, which for purposes of the present invention describes all forms of human or mammalian communication that are not verbal language. This includes, but is not limited to, the most subtle movements of many consumers, including winking and slight movement of the eyebrows. Examples of body language data include facial electromyography or vision-based facial expression data. See e.g., US 2005/0289582; US 5,436,638; US 7,227,976. The term "paralanguage" or "paralinguistic element(s)" refers to the non-verbal elements of communication used to modify meaning and convey emotion. Paralanguage may be expressed consciously or unconsciously, and it includes voice pitch, volume, intonation of speech, among others. Paralanguage can also comprise vocally-produced sounds. In text-only communication such as email, chat rooms, and instant messaging, paralinguistic elements can be displayed by emoticons, font and color choices, capitalization, the use of non-alphabetic or abstract characters, among others. One example of evaluating paralanguage is provided with the layered voice analysis apparatus, which may include the determination of an emotional state of an individual. One example is described in U.S. Patent No. 6,638,217. Another example is described in published PCT Application WO 97/01984 (PCT/IL96/00027). "Layered voice analysis" or "LVA" is broadly defined as any means of detecting the mental state and/or emotional makeup of voice by a speaker at a given moment / voice segment by detecting the emotional content of the speaker's speech. Non-limiting examples of commercially available LVA products include those from Nemesysco Ltd., Zuran, Israel, such as LVA 6.50, TiPi 6.40, GKl and SCAl. See e.g., US 6,638,217. Without wishing to be bound by theory, LVA identifies various types of stress levels, cognitive processes, and/or emotional reactions that are reflected in the properties of voice. In one embodiment, LVA divides a voice segment into: (i) emotional parameters; or (ii) categories of emotions. In another embodiment, the LVA analyzes an arousal level or an attention level in a voice segment. In another embodiment, voice is recorded by a voice recorder, wherein the voice recording is then analyzed by LVA. Examples of recording devices include: a computer via a microphone, telephone, television, radio, voice recorder (digital or analogue), computer-to-computer, video, CD, DVD, or the like. The less compressed the voice sample, the more likely accurate the LVA will be. The voice being recorded / analyzed may be the same or different language than the investigator's native language. Alternatively the voice is not recorded but analyzed as the consumer / shopper / panelist is speaking. A potential advantage of LVA is that the analysis may be done without looking at the language of the speech. For example, one approach of LVA is using data with regard to any sound (or lack thereof) that the consumer / shopper / panelist produces during testing. These sounds may include intonations, pauses, a gasp, an "err" or "hmm" or a sharp inhale/exhale of breath. Of course words may also form part of the analysis. Frequency of sound (or lack thereof) may used as part of the analysis.
One aspect of the invention provides using LVA in consumer or market research, including consumer analysis. LVA may be used with or without other emotive response indicators or physiological measurements. In another embodiment, qualitative data is also obtained from the consumer / shopper / panelist. Non-limiting examples of qualitative data are a written questionnaire or an oral interview (person-to-person or over the phone / Internet). In one embodiment, at least one facet of the consumer or market research is conducted with the consumer / shopper / panelist at home on the Internet. In yet another embodiment, the consumer / shopper / panelist submits her voice to the researcher via the phone or the Internet. The qualitative data may subsequently be used to support conclusions drawn by LVA (such LVA conclusions being formed independently of the qualitative data).
In one embodiment, the "passion" a consumer feels for an image, or an aspect of an image, may obtained by the use of a "Passion Meter," as provided by Unitec, Geneva, Switzerland and described in U.S. patent publication claiming the benefit of U.S. Prov. Appl. No. 60/823,531, filed Aug. 25, 2006 (and the non-provisional US publication claiming benefit thereof). Other examples may include those described in "The Evaluative Movement Assessment (EMA)" - Brendl, Markman, and Messner (2005), Journal of Experimental Social Psychology, Volume 41 (4), pp. 346-368.
Generally, autonomic responses and measurements include, but are not limited to, changes or indications in: body temperature, e.g., measured by conductive or infrared thermometry; facial blood flow; skin impedance; EEG; EKG; blood pressure; blood transit time; heart rate; peripheral blood flow; perspiration or sweat; SDNN heart rate variability; galvanic skin response; pupil dilation; respiratory pace and volume per breath, or an average thereof; digestive tract peristalsis; large intestinal motility; piloerection, i.e., goose bumps or body hair erectile state; saccades; and temperature biofeedback, among others. See e.g., US 2007/010066. Autonomic responses and measurements may also include body temperature (conductive or IR thermometry), facial blood flow, skin impedance, qEEG (quantified electroencephalography), stomach motility, and body hair erectile state, among others. Additional physiological measurements can be taken, such as facial electromyography, saliva viscosity and volume, measurement of salivary amylase activity, body metabolism, and brain activity location and intensity, e.g., measured by fMRI or EEG. In one embodiment, the biometric data comprises cardiac data. Cardiovascular monitoring and other cardiac data obtaining techniques are described in US 2003/0149344. A commercial monitor may include the TANITA 6102 cardio pulse meter. Electrocardiography (e.g., using a Holter monitor) is another approach. Yet another approach is to employ UWB radar.
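As one concrete example of the cardiac measures listed above, SDNN heart rate variability can be computed as the standard deviation of beat-to-beat (RR) intervals. A minimal sketch, assuming RR intervals in milliseconds have already been extracted and cleaned from the cardiac signal:

import statistics

def sdnn(rr_intervals_ms):
    # Standard deviation of normal-to-normal RR intervals (SDNN), in ms.
    # Assumes the intervals are already free of ectopic beats and artifacts.
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    return statistics.stdev(rr_intervals_ms)

def mean_heart_rate(rr_intervals_ms):
    # Average heart rate in beats per minute from the same RR intervals.
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return 60000.0 / mean_rr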
In another embodiment, the biometric data is ocular biometric data or non-ocular biometric data. Ocular biometric data is data obtained from the consumer's eye during research. Examples include pupil dilation, blink and eye tracking data.
Additional physiological measurements can be taken such as: electromyography of the facial or other muscles; saliva viscosity and volume measures; measurement of salivary amylase activity; body biological function, e.g., metabolism via blood analysis, urine or saliva sample, in order to evaluate changes in nervous system-directed responses, e.g., chemical markers can be measured for physiological data relating to levels of neuro-endocrine or endocrine-released hormones; and brain function activity. Brain function activity (e.g., location and intensity) may be measured by fMRI, a form of medical imaging in this case directed toward the brain. A non-exhaustive list of medical imaging technologies that may be useful for understanding brain function activity (but which can also be used for observing other physiological metrics, such as the use of ultrasound for heart or lung movement) includes fMRI (functional magnetic resonance imaging), MRI (magnetic resonance imaging), radiography, fluoroscopy, CT (computed tomography), ultrasonography, nuclear medicine, PET (positron emission tomography), OT (optical topography), NIRS (near infrared spectroscopy) such as in oximetry, and fNIR (functional near-infrared imaging).
Another example of monitoring brain function activity data may include the "brain-machine interface" developed by Hitachi, Inc., measuring brain blood flow. Yet another example includes "NIRS" or near infrared spectroscopy. Yet still another example is electroencephalography (EEG). See also e.g., US 6,572,562. It should be appreciated that body language changes and measurements include all facial expressions, e.g., monitoring mouth, eye, neck, and jaw muscles; voluntary and involuntary muscle contractions; tissue, cartilage, and bone structure; body limb positioning and gestural activity; limb motion patterns, e.g., tapping; patterned head movements, e.g., rotating or nodding; head positioning relative to the body and relative to the applied stimulus; vocal chord tension and resulting tonality; vocal volume (decibels); and speed of speech. When monitoring body language such as facial expressions or vocal changes, a non-invasive apparatus and method can be used. For example, a video digital photography apparatus can be used that correlates any facial expression changes with facial elements analysis software, or the Facial Action Coding System by Ekman at: http://face-and-emotion.com/dataface/facs/description.jsp or www.paulekman.com. See e.g., US 2003/0032890.
The term "selection preference" refers to a decision made by a consumer for the selection of a product as a preference or non-preference, degree of appeal, probability of purchase or use, among others. This can additionally be thought of as having or choosing an opinion, or conscious or unconscious attitudes, whether openly expressed to another individual (via written or oral communication) or not. The term "query" or "selection preference query" refers to any interaction with a subject that results in them identifying a single stimulus or specific group of stimuli from a broader selection of stimuli. The identified stimulus may be a virtual or physical representation of that stimulus, e.g., a package in a real or virtual retail environment; an element of that stimulus, e.g., color of packaging, scent of product contained in the packaging, picture or text; or a result of using that stimulus, e.g., hair color resulting from hair colorant usage. The "query" or "selection preference query" may be made in any medium, e.g., verbal, oral or written, and may be made consciously, e.g., when probed, or unconsciously, e.g., when a subject behaves automatically in response to a given stimulus in a given context. A "query" can result in the selection or deselection of a stimulus; whereas a "selection preference query" results in identification of a stimulus or group of stimuli with positive associations. A "selection preference query" may or may not be related to an intention to purchase.
The term "limited communicative consumer" refers to mammals who cannot articulate meaningfully to researchers. Examples may include a baby who lacks communication development, adult humans with impaired communication abilities (e.g., low IQ, physical handicap), or companion animals (e.g., dogs, cats, horse). Within the human species, the term "limited communicative consumer" refers to babies, some young children, and impaired adults such as from disease, injury or old age condition that possess limited conscious communication skills compared to those of normal human adults. For these consumers, consumer research has found difficulty to ascertain their emotive response and selection preference to products and proposed products. The present invention relates to emotive response and selection preference methods to conduct consumer research. It should be appreciated that the present invention can be employed with a test subject when she is evaluating a consumer product, either in a virtual environment or a real environment, wherein the environment (virtual or real) is chosen from a home, office, test facility, restaurant, entertainment venue, outdoors, indoors, or retail store. See e.g., US 7,006,982; US 2002/0161651; US 2006/0010030; US 6,810,300; US 7,099,734; US 2003/0200129; US 2006/0149634. As a result, the location and use of the emotive response and selection system is not limited to any given environment. The environment can be mobile, such that it can be moved and set up for use in the consumer's home, a retail store, a mall, a mall parking lot, a community building, a convention, a show, and the like. It should also be appreciated that that the emotive response and selection preference systems can comprise a virtual or physical imaging apparatus, or combination thereof, which provides at least one visual stimulus. In one embodiment, the visual stimulus comprises a real store environment. In turn, a "real store environment" means that the environment is non-virtual or real. The store may be one open for business or may be prototypical (for testing). The store may be a mass merchant, drug channel, warehouse store, or a high frequency store to provide a few examples of different store formats.
For example, outside of an in-store retail environment, an imaging apparatus can display visual images, e.g., virtual, photographic, or physical images, of prospective or current product shelf arrangements to conduct consumer research regarding consumer products sold in a retail environment. Such visual imaging may include human representations or avatars such as other product users, shoppers, or employees such as retail store clerks, or other mammals. One advantage of such an imaging apparatus is faster screening and/or deeper insight regarding a consumer's reaction to a particular consumer product, since the virtual environment can be realistic to a consumer. A consumer's real-time reaction upon viewing the consumer product is one element in determining whether to buy the company's product or a competitor's product, and is referred to as the First Moment of Truth (FMOT). Two additional components may also influence the consumer's decision of whether to purchase or not. One is any prior use experience with the product and is referred to as the Second Moment of Truth (SMOT). The SMOT is the assessment of product usage by the consumer, or of a usage experience by someone else that has been related to the consumer, such as by word-of-mouth, internet chat rooms, product reviews, and the like. In one embodiment, the visual stimulus is static or non-static. In another embodiment, the stimulus comprises the consumer participating (e.g., conducting, observing, etc.) in a task associated with a product's usage. Examples of tasks associated with a product's usage may include those described in US 7,249,603 (defining "task"); and US 2007/0100666 (listing "activity types" in Table 2B). The SMOT refers both to the time of product use and to product benefits lasting for a period after product use or application, such as in a use experience or in product beneficiary situations. Another component is the "Zero" Moment of Truth (ZMOT), which refers to the interaction with a representation of, or information about, a product outside of the retail purchase environment. ZMOT can take place when the consumer receives or views advertisements or tests a sample (which also then lends some SMOT experience). For a retailer, ZMOT can be pre-market-launch trade materials shared by the manufacturer before a product is launched for commercial sale.
FMOT, SMOT or ZMOT can involve aesthetics, brand equity, textual and/or sensorial communications, and consumer benefit, among others. Other factors include the appearance of the product at the point of sale or in an advertisement, the visual appearance (logo, copyrights, trademarks, or slogans, among others), olfactory (smell), and aural (sound) features communicated by and in support of the brand equity, and the graphic, verbal, pictorial or textual communication to the consumer such as value, unit price, performance, prestige, convenience. The communication also focuses on how it is transmitted to the consumer, e.g., through a design, logo, text, pictures, imagery, and the like. The virtual or physical imaging apparatus allows a company to evaluate these factors.
The virtual imaging apparatus gives a company, manufacturer, advertiser, or retailer the ability to quickly screen a higher number of factors that can affect a consumer's reaction to a product at each or all of the Moments of Truth, e.g., FMOT, SMOT, and ZMOT, and allows for a higher number of consumers to be used in the evaluation of the product. For instance, project development teams within a company can evaluate a large number of consumers and have the data saved in a large database for later evaluation. Another benefit is that the virtual imaging apparatus allows a company to lower development costs, since costly physical prototypes, i.e., products, packaging, in-store environments, merchandise displays, etc., can be replaced with virtual renditions rather than being continually fabricated. For example, a high-resolution, large-scale imaging apparatus allows a company to generate a virtual computer image, photographic image, or photo-shopped image of various prototypes without physically having to make them.
An additional benefit of the virtual imaging apparatus, when used in conjunction with eye-tracking and an emotive response and selection system, is the ability to detect a consumer's emotive state to a proposed product, advertising slogan, etc. The virtual imaging apparatus allows for improved and faster innovation techniques for a company to evaluate the appeal of various advertising and in-store merchandising elements and/or methods that they employ. The virtual imaging apparatus can be used in a retail store, or, in an in vitro virtual retail environment. See e.g., US 6,026,377; US 6,304,855; US 5,848,399. In another embodiment, the image is one that responds interactively with the consumer. See e.g., US 6,128,004.
The imaging apparatus of an in-store environment allows the consumer to have a natural orientation dedicated to a real-life shopping experience. It also can allow a consumer to give feedback and respond to the imaging apparatus or in-store imaging apparatus in real-time, including with real-scale displayed imagery. For instance, the virtual in-store imaging apparatus can store how many times a consumer picks up a product and places it back on the shelf, how long the consumer looks at the product, and, the precise locations of where the products are chosen by the consumer on the shelf. The virtual in-store imaging apparatus can also be configured to store and monitor all the consumer's responses to the product, e.g., oral, written, physical, or involuntary actions, in addition to data collected by an eye-tracking apparatus. As indicated above, an imaging apparatus can be used with other apparatuses such as an eye-tracking apparatus, head-tracking apparatus, and/or a physiological apparatus that measures at least one physiological response.
The imaging apparatus provides the company, manufacturer, advertiser, or retailer superior feedback with regard to a consumer's behavior and reactions to their products. The vast majority of a consumer's decision-making and emotional reactions to consumer products occurs at the sub-conscious level, and cannot be easily determined by conscious awareness or direct interrogation. By studying, in real time, variations in the eye-tracking activity and physiological indicator(s) of a consumer (such as electrical brain activity), it is possible to gain insight into what the consumer is sub-consciously thinking or feeling. The level and span of attention, and the extent and type of emotions evoked by the product, can be measured using the disclosed virtual imaging apparatus with the eye-tracking and physiological apparatus. As a result, not only are conscious reactions measured and evaluated, but also sub-conscious ones. While real-time study gives the fastest learning, such learning can also be done later by returning to stored data of the eye-tracking activity and physiological indicator(s) of a consumer.
Methods of obtaining eye gazing data are described in US 2005/0243054 Al; US 7,046,924; US 4,950,069; US 4,836,670; and US 4,595,990. IBM developed a "Blue Eyes" camera capable of obtaining eye gazing data; Eyetracking, Inc., San Diego, CA is another example of a provider. Video-oculography (VOG) uses see-through goggles to measure eye-in-head position. Techniques may include electro-oculography; corneal reflection; limbus, pupil, and eyelid tracking; and contact lens methods. See e.g., US 2005/0243054, col. 4, ¶ 58 et seq. Types of eye gazing data may include eye gaze fixation, eye gaze direction, path of eye gaze direction, and eye gaze dwell time. The eye gazing data is relative to the image displayed to the consumer as the data is obtained. The image may be stored or archived during testing by methods well known for archiving still and non-still images.
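As an illustration of one of the eye gazing data types listed above, dwell time within a region of the displayed image can be accumulated from timestamped gaze samples. The following is a minimal sketch, assuming gaze coordinates are already mapped into the displayed image's coordinate system; the sample layout and names are hypothetical.

def dwell_time(gaze_samples, region):
    # gaze_samples: list of (t_seconds, x, y) in image pixel coordinates,
    #               assumed sorted by time.
    # region:       (x_min, y_min, x_max, y_max) in the same coordinates.
    x0, y0, x1, y1 = region
    total = 0.0
    for (t_prev, xp, yp), (t_cur, _, _) in zip(gaze_samples, gaze_samples[1:]):
        # Credit each inter-sample interval to the sample that started it.
        if x0 <= xp <= x1 and y0 <= yp <= y1:
            total += t_cur - t_prev
    return total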
The physiological and imaging apparatus can combine neurological responses, motivational research, and physiological reactions, among others, to provide detailed, in-depth analysis of a consumer's reaction to a product or environment. The levels of arousal, involvement, engagement, and attraction, degrees of memorization, brand attribution and association, and indices of predisposition and consideration can all be measured and evaluated to varying degrees. The physiological and imaging apparatus allows the company to obtain the degree of arousal and degree of engagement with specificity. In terms of the example shopper analysis model, it is now possible to more accurately and quickly capture an emotive response to a consumer product, which may be an element involved in opinion formation, and a probable choice decision element on whether to use, not use, recommend, not recommend, or select or not select for purchase. In turn, this allows a company to develop FMOT strategies to stop, hold, and close as they relate to selling a company's product in a store.
For example, in one embodiment, the emotive response and selection system comprises at least one imaging apparatus, at least one eye-tracking apparatus used to monitor and track a consumer's eye movements in response to a product, and at least one physiological apparatus that measures a consumer's emotive state or feeling to a consumer product. Collectively, the at least one eye-tracking apparatus and the at least one physiological apparatus form an emotive response apparatus. The at least one image apparatus provides at least one visual stimulus to a consumer. The visual stimulus can be virtual, real, photographic, or holographic, a combination thereof, among others.
As a feature of the disclosed emotive response and selection system, the measures obtained from the consumer by one or both of the eye-tracking or physiological apparatuses, or a derivative analysis of one or both sets of data (such as a probable emotive response assignment), can be used, in real time, to manipulate and change the displayed images. This can be accomplished using software-integrated analysis, or directed by a test observer monitoring the real-time consumer data, among other methods. For example, if it appears that the consumer's attention is drawn to blue products, then a company or researcher can immediately change their displayed product from red to blue to evaluate the consumer's reaction. The ability to manipulate, modify, and change the displayed images is a powerful market feedback tool, particularly since the present invention allows a company to do so in real time. This can be done not only for product color, but also for shape, text, size, pricing, shelf location, or any other possible visual or informational form or arrangement. Alternatively, the feedback could be used to change the environment in addition to, or separately from, the visual stimulus.
One aspect of the invention is to better understand the emotive response element in combination with the attention element of the consumer analysis model in a more covert manner, whether in response to solely visual stimuli or a combination of a visual stimulus with at least one supplemental stimulus. For measuring the attention element, an eye-tracking apparatus or head-tracking apparatus may be used. For measuring the emotive response element, an emotive response apparatus can be used to provide the ability to understand the one or more emotive factors which causes a physiological response and/or change within a consumer. The emotive response apparatus measures at least one physiological measure. A physiological measure may include biological, body language expressed responses, and/or paralanguage, among others.
The probable emotive response is estimated by comparing the physiological measure, and optionally the eye-gaze position data, with a pre-determined dataset or model that gives the probable emotive state or states associated with the measures. The use of multiple physiological measures can in some cases be helpful to ascertain the probable emotive state or states. Optionally, an output of statistical confidence can be given to each emotive state or aggregate. Optionally, if multiple emotive states are probable, a report of their likelihood weighting can be output.
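A minimal sketch of the comparison step just described follows, with the pre-determined dataset stood in for by a simple per-state profile of expected measure values and tolerances. The scoring rule and normalisation are assumptions for illustration, not the model of this disclosure.

def probable_emotive_states(measures, reference_model):
    # measures:        dict of measure name -> observed value
    # reference_model: dict of emotive state -> {measure name: (expected, tolerance)}
    # Returns states sorted by a normalised likelihood weighting.
    scores = {}
    for state, profile in reference_model.items():
        score = 0.0
        for name, (expected, tol) in profile.items():
            if name in measures:
                # Closer to the expected value -> larger contribution.
                score += max(0.0, 1.0 - abs(measures[name] - expected) / tol)
        scores[state] = score
    total = sum(scores.values()) or 1.0
    weighted = {state: s / total for state, s in scores.items()}
    return sorted(weighted.items(), key=lambda kv: kv[1], reverse=True)

The normalised weights can serve as the optional likelihood weighting report mentioned above; a statistical confidence output would require a properly fitted model rather than this heuristic.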
The eye-tracking or head-tracking apparatus can be worn by the consumer, or, it can be a set of fixed sensors (or known position sensors which are either fixed or moving) remotely located from the consumer that monitors the consumer's eyes and/or head movements when viewing the visual stimulus. The eye-tracking apparatus can further comprise a separate memory device that stores the data obtained from tracking the consumer's eyes and/or head movements, which may be located on the consumer or be remote from the consumer. The memory device can then be electronically or wirelessly connected with a separate computer or storage system to transfer the data. The memory device can further comprise a memory disk, cartridge, or other structure to facilitate the ease of transferring data, e.g., flash memory card. The eye-tracking apparatus can also be configured to wirelessly transfer data to a separate data-capturing system that stores the data, e.g., through Bluetooth technology.
One example of an eye-tracking apparatus that may be used with this invention is the Mobile Eye from ASL, which is a non-tethered eye-tracking system for use when total freedom of movement is required, providing video with an overlaid cursor. This system is designed to be easily worn by an active subject: the eye-tracking optics are extremely lightweight and unobtrusive, and the recording device is small enough to be worn on a belt. The eye image and scene image are interleaved and saved to the recording device. In one aspect of the invention, one, two, three, four, five, or more types of the biometric data are obtained from the consumer in a non-tethered manner. "Non-tethered" means the biometric obtaining devices obtain data from the consumer without the consumer having wires or cords or the like attached from the consumer to a stand-alone piece of equipment. The consumer may walk or move around without the restriction of a tethered wire (albeit in some embodiments in a confined area, such as seated in front of a video monitor). For purposes of clarification, wires that are attached to a transmitter that is worn on the consumer's person (such as a "wireless microphone") are still considered "non-tethered" as the term is herein defined. In one embodiment, eye gazing data is obtained by way of a non-tethered means. Other examples of a non-tethered means of obtaining biometric data include a sensing system worn on the consumer's person, such as a wave-reflective or transponding sensor, or a piece of material that is queried or probed by a remote piece of equipment (via, for example, transmission of an electromagnetic wave that may or may not carry encoded data within the transmitted wave or sequence of waves). In yet another example, the non-tethered means includes the subset means of remotely obtaining biometric data.
In another aspect of the invention, one, two, three, four, five, or more types of biometric data are obtained remotely. The term "remotely" or "remote" means that no biometric data obtaining equipment is on, or carried by, the consumer to obtain the biometric data. For example, heart data may be obtained remotely by way of UWB radar to sense heart beat or breathing rate. See e.g., Chia, Microwave Conference, Vol. 3, Oct. 2005.
Without wishing to be bound by theory, the use of non-tethered data collection provides better data from testing, given that the testing environment is more analogous to "real life": consumers typically do not have distracting or cumbersome equipment on their person and are not tethered to equipment. It also facilitates other avenues of testing, such as those requiring the consumer to participate in product usage or to visit a retail store (commercial or prototypical), that do not lend themselves well to tethered methods. To measure the emotive state of the consumer, at least one physiological apparatus is used. For example, the physiological response of a consumer's blood pulse can be taken when viewing the visual stimulus while eye-tracking data is simultaneously gathered. The measured data from the physiological apparatus is synchronized in time, by computer software, with the element to which the viewer has directed her attention at a point in time or over a period of time. While the recording of clock time is valuable, synchronization does not necessarily need to tag data with actual clock time, but rather associates data with each other that occurred at the same point or interval of time. This allows for later analysis and understanding of the emotive state with respect to various elements along the consumer's eye-gaze path. Another aspect of this invention is that certain emotive measurements, e.g., blood pulse measures, can be used to indicate topics or areas, e.g., visual elements, for later research such as a questionnaire, if the measurement value(s) meets, exceeds, or is less than some pre-determined level set by the researcher.
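The synchronization and threshold-flagging steps just described can be illustrated with a short sketch that attributes each pulse reading to the element being viewed at that moment and flags elements whose readings meet a researcher-set level; the data layouts, names, and threshold semantics are assumptions for illustration.

from bisect import bisect_right

def flag_elements_for_followup(gaze_events, pulse_readings, threshold_bpm):
    # gaze_events:    list of (t_start, element_id), sorted by t_start; each
    #                 element is treated as viewed until the next event starts.
    # pulse_readings: list of (t, bpm) in the same time base (clock time is
    #                 not required, only a shared time reference).
    starts = [t for t, _ in gaze_events]
    flagged = set()
    for t, bpm in pulse_readings:
        i = bisect_right(starts, t) - 1           # event active at time t
        if i >= 0 and bpm >= threshold_bpm:
            flagged.add(gaze_events[i][1])
    return flagged                                # candidates for a later questionnaire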
The physiological apparatus can be worn by the consumer, or, it can be a set of fixed sensors or single sensor remotely located from the consumer that monitors the physiological responses of the consumer when viewing the visual stimulus. For example, the physiological apparatus can be a remotely located infrared camera to monitor changes in body or facial temperature, or the apparatus may be as simple as a watch worn on the wrist of the consumer to monitor heart rate. It should be appreciated that in an exemplary embodiment, the physiological apparatus is a wireless physiological apparatus. In other words, the consumer is not constricted by any physical wires, e.g., electrical cords, limiting their movement or interaction with the visual stimulus. The physiological apparatus can further comprise a separate memory device that stores the data obtained from tracking the consumer's physiological changes, which may be located on the consumer or be remote from the consumer. The memory device can then be electronically or wirelessly connected with a separate computer or storage system to transfer the data. The memory device can further comprise a memory disk, cartridge, or other structure to facilitate the ease of transferring data, e.g., flash memory card. The physiological apparatus can also be configured to wirelessly transfer data to a separate data-capturing system that stores the data, e.g., through Bluetooth technology. Either way, the end result is that the data from the eye-tracking apparatus and the physiological apparatus is transferred to a separate apparatus that is configured to correlate, evaluate, and/or synchronize both sets of data, among other functions. For purposes of a simplified description, the separate apparatus is described as a data-capturing apparatus. The data-capturing apparatus can be a separate computer, a laptop, a database, server, or any other electronic device configured to correlate, evaluate, and/or synchronize data from the physiological apparatus and the eye-tracking apparatus.
The data-capturing apparatus can further comprise additional databases or stored information. For example, known probable emotive states associated with certain physiological or eye-gaze measurement values, or derivative values such as from intermediate analysis, can be stored and looked up in a table within the database and then time-associated, i.e., synchronized, with the viewed element for each or any time interval, or over a period of time, recorded during the period that the consumer is viewing the visual stimulus. It should be appreciated that a given physiological measure can also indicate two or more possible feelings either singly or in combination. In these cases, all possible feelings can be associated with a given time interval in the database.
Another additional database or stored information can be known selection states associated with certain emotive states, physiological, or eye-gaze measurement values, or derivative values such as from intermediate analysis, which can be stored and looked up in a table within the database and then time-associated, i.e., synchronized, with the viewed element for each or any time interval, or over a period of time, recorded during the period that the consumer is viewing the visual stimulus.
In another aspect of the invention, the measurement and tracking with subsequent time- association entry into the data-capturing apparatus of multiple physiological data such as a blood pulse measurement and a voice measurement is possible. For the measured values, a feeling or possible feelings or emotive state(s) can then be assigned for each and associated time interval in the database. The recorded feeling(s) for each can be compared to each other to output a new value of a most likely feeling or emotive state, based on cross-reinforcement of the individual database ascribed feelings, or an analysis sub-routine based on a prior model or correlation created beforehand with the emotive response measures involved. In other words, the data obtained from the eye-tracking apparatus and physiological apparatus, can be used in conjunction with other databases storing information in the data-capturing system to output processed data. The processed data is in a synchronized format.
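As a hedged illustration of the cross-reinforcement idea described above, the sketch below simply counts agreement among the feelings ascribed by each measure for a given time interval; a real analysis sub-routine would use a prior model or correlation rather than plain vote counting, and the names here are illustrative only.

from collections import Counter

def most_likely_feeling(ascribed_feelings):
    # ascribed_feelings: dict of measure name -> list of candidate feelings
    #                    looked up for that measure and time interval.
    # Returns (feeling, votes); ties are broken arbitrarily here.
    votes = Counter()
    for candidates in ascribed_feelings.values():
        for feeling in candidates:
            votes[feeling] += 1
    if not votes:
        return None, 0
    return votes.most_common(1)[0]

# Example: a blood-pulse lookup suggesting ["excitement", "anxiety"] and a voice
# measure suggesting ["excitement"] cross-reinforce to output "excitement".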
In all cases, whether one or multiple emotive states are measured, the assigned feelings from models, correlations, monographs, look-up tables and databases and the like, can be adjusted internally for a specific consumer, or different environmental factors known or surmised to modify the feeling/emotive value correspondence can also be used. In some cases, a "control" measure conducted in advance, during or after the viewing test such as a specific consumer's response to controlled stimuli, questions, statements, and the like, can be used to modify the emotive value correspondence in that case. Alternatively, a specific physiological response profile(s) modeled beforehand can be used as the "control."
In one embodiment, a consumer questionnaire is presented to the consumer and an answer thereto is obtained, wherein the questionnaire comprises one or more psychometric, psychographic, or demographic questions, among others. The answers can be obtained before, during, or after the time of presenting the visual stimulus to the consumer, or any combination thereof. The emotive response and selection preference system can further obtain feedback from the consumer's responses to the questions asked, with the questions optionally asked after the test and then obtained at that or a later time by the emotive response and selection system. The data can also be correlated with psychometric measurements such as personality trait assessments to further enhance the reliability of the emotive response and selection preference system and methods. In still yet another embodiment, the emotive response and selection preference system provides a company or researcher the ability to evaluate and monitor the body language of a consumer after he/she views a consumer product with the physiological apparatus. The emotive response and selection preference system provides a company the ability to understand and critically evaluate the body language, i.e., the conscious or unconscious responses, of a consumer to a consumer product. The physiological apparatus can measure a single body language change or a plurality of body language changes of a consumer. Body language changes and measurements include all facial expressions, i.e., monitoring mouth, eye, neck, and jaw muscles; voluntary and involuntary muscle contractions; tissue, cartilage, and bone structure; body limb positioning, hands, fingers, shoulder positioning, and the like; gestural activity; limb motion patterns, i.e., tapping; patterned head movements, i.e., rotating or nodding; head positioning relative to the body and relative to the applied stimulus; vocal chord tension and resulting tonality; vocal volume (decibels); and speed of speech. When monitoring body language such as facial expressions or vocal changes, a non-invasive physiological apparatus and method can be used. For example, a video digital photography apparatus can be used that captures and may correlate any facial expression change with facial elements analysis software.
In one aspect of the invention, the consumer is presented with questions soliciting attitude and/or behavioral data about the visual stimulus. See e.g., US 2007/0156515.
In another aspect of the invention, the data of the present invention may be stored and transferred according to known methods. See e.g., US 2006/0036751; US 2007/0100666.
One aspect of the invention provides for defining an area of interest (AOI) in the visual stimulus that is presented to the consumer. The AOI may be defined by the investigator for numerous reasons. Some non-limiting reasons may be to test a certain characteristic of a product, or part of a graphic in an advertising message, or even a stain on a floor while the consumer performs the task of scrubbing the stain with a product. Alternatively, the AOI may be defined, at least in part, by data (e.g., eye gaze duration in an area of the visual stimulus).
The visual stimulus and AOIs, for reporting purposes of the investigator, may be illustrated as a graphic. The graphic may be an archived image of the visual stimulus or some other representation. In turn, the AOI may be illustrated on the graphic by drawing a circle or some other indicium indicating the location or area of the AOI in the graphic (an "AOI indicium"). Of course, a visual stimulus (and the graphic of the visual stimulus) may comprise a plurality of AOIs (e.g., 2-10, or more). Each AOI (and thus each AOI indicium) need not be uniform in size.
Upon defining the AOI, the researcher may collect biometric data and eye gazing data from the consumer while presenting the visual stimulus to the consumer. By temporally sequencing the collected eye gazing data in relation to the AOI, the researcher can determine when the consumer's gaze is directed within an AOI and thus associate the collected eye gazing data and the collected biometric data in relation to the AOI. Of course, biometric data can be translated to emotional metric data before or after being associated with the collected eye gazing data (in relation to the AOI). One skilled in the art will know to take into account any "lag time" associated with the biometric data and the emotional response and/or eye gaze data. For example, cardiac data will often have a lag time (versus, say, brain function activity data, which is essentially or nearly instantaneous).
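A minimal sketch of this association step, including a simple lag-time correction, follows; the coordinate conventions, sample layouts, and single lag parameter are illustrative assumptions rather than the method of this disclosure.

def associate_biometric_with_aoi(gaze_samples, biometric_samples, aoi, lag_seconds=0.0):
    # gaze_samples:      list of (t, x, y) in stimulus coordinates, sorted by time.
    # biometric_samples: list of (t, value), sorted by time.
    # aoi:               (x_min, y_min, x_max, y_max) in the same coordinates.
    # lag_seconds:       physiological lag; a reading at time t is attributed to
    #                    the gaze position at t - lag_seconds.
    x0, y0, x1, y1 = aoi

    def gaze_at(t):
        # Last gaze sample at or before time t (simple backward lookup).
        last = None
        for ts, x, y in gaze_samples:
            if ts > t:
                break
            last = (x, y)
        return last

    readings = []
    for t, value in biometric_samples:
        pos = gaze_at(t - lag_seconds)
        if pos and x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1:
            readings.append((t, value))
    return readings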
In one embodiment, the investigator may compare biometric data / emotional metric data / eye gazing data in relation to a first AOI to the corresponding data in relation to a second AOI, a third AOI, and so on. The emotional metric data or biometric data in relation to the AOI may be presented on a graphic (comprising the visual stimulus) as an indicium. The indicium may be simply presented as raw data, or perhaps as a symbol (e.g., a needle on a scale), scalar color-coding, scalar indicium size, or the like. The indicium may also communicate a degree of statistical confidence, a range, or the like for either the emotional metric or biometric data. There may be more than one indicium associated with a given AOI, such as two different biometric or emotional metric or combination indicia; or indicia based on data from different consumers, or from the same consumer in two different time-separated tests. The indicium may represent positive or negative values relative to the specific metric chosen by the researcher. Additionally, the indicium can represent data collected from multiple consumers, such as an average, a total, a variation from the mean, a range, a probability, a difference versus a standard, expectation, or project goal, or the percentage or number of consumers whose data falls within a defined set of limits or above or below a defined minimum or maximum value. Optionally, the eye-gaze path or sequence of viewing may also be shown in whole or in part. Of course, the researcher may choose to present the data obtained (according to the methodologies described herein) in a report that comprises: a graphic of the visual stimulus; an area of interest (AOI) indicium; an emotional metric data indicium or a biometric data indicium regarding the AOI; and an eye gazing indicium regarding the AOI.
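For illustration, a report of the kind just described could be assembled as a simple data structure pairing the stimulus graphic with per-AOI indicia; the field names below are hypothetical, and any rendering of the indicia (symbols, color-coding, scaling) is left to the reporting tool.

def build_aoi_report(stimulus_image_path, aoi_results):
    # aoi_results: list of dicts with illustrative keys 'name', 'bounds',
    #              'emotional_metric', 'dwell_seconds'.
    report = {"graphic": stimulus_image_path, "aois": []}
    for r in aoi_results:
        report["aois"].append({
            "aoi_indicium": {"label": r["name"], "bounds": r["bounds"]},
            "emotional_metric_indicium": round(r["emotional_metric"], 2),
            "eye_gazing_indicium": {"dwell_s": round(r["dwell_seconds"], 2)},
        })
    return report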
The emotive response and selection preference methods described above merely illustrate and disclose preferred methods of many that could be used and produced. The above description and drawings illustrate embodiments, which achieve the objects, features, and advantages of the present invention. However, it is not intended that the present invention be strictly limited to the above-described and illustrated embodiments. Any modification, though presently unforeseeable, of the present invention that comes within the spirit and scope of the following claims should be considered part of the present invention. The dimensions and values disclosed herein are not to be understood as being strictly limited to the exact numerical values recited. Instead, unless otherwise specified, each such dimension is intended to mean both the recited value and a functionally equivalent range surrounding that value. For example, a dimension disclosed as "40 mm" is intended to mean "about 40 mm."

Claims

What is claimed is:
1. A method of obtaining consumer research data comprising the steps:
(a) presenting a visual stimulus to a consumer;
(b) collecting eye gazing data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer;
(c) collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer.
2. The method of claim 1, further comprising the step of associating said non-ocular biometric data with said eye gazing data, and translating said associated non-ocular biometric data to an associated emotional metric data.
3. The method of claim 1, further comprising the step of translating said non-ocular biometric data to an emotional metric data, and associating the emotional metric data with said eye gazing data.
4. A method of obtaining consumer research data comprising the steps:
(a) presenting a visual stimulus to a consumer;
(b) collecting face direction data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer;
(c) collecting non-ocular biometric data in a non-tethered manner from the consumer while presenting the visual stimulus to the consumer.
5. The method of claim 4, further comprising the step of associating said non-ocular biometric data with said face direction data, and translating said associated non-ocular biometric data to an associated emotional metric data.
6. The method of claim 4, further comprising the step of translating said non-ocular biometric data to an emotional metric data, and associating the emotional metric data with said face direction data.
7. A method of obtaining consumer research data comprising the steps:
(a) presenting a visual stimulus to a consumer;
(b) defining an area of interest (AOI) in the visual stimulus;
(c) collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer and with regard to the AOI;
(d) collecting non-ocular biometric data from the consumer while presenting the visual stimulus to the consumer; and
(e) associating the collected non-ocular biometric data and the collected eye gazing data regarding the AOI.
8. A method of obtaining consumer research data comprising the steps:
(a) presenting a visual stimulus to a consumer;
(b) defining an area of interest (AOI) in the visual stimulus;
(c) collecting eye gazing data from the consumer while presenting the visual stimulus to the consumer and with regard to the AOI;
(d) collecting non-ocular biometric data from the consumer while presenting the visual stimulus to the consumer; and
(e) translating the collected non-ocular biometric data to an emotional metric data;
(f) associating the emotional metric data and the collected eye gazing data regarding the AOI.
9. The method of claims 1-7, or 8, wherein at least a portion of said collected non-ocular biometric data is collected in a non-tethered manner, and is selected from brain function data, voice recognition data, body language data, cardiac data, or combination thereof.
10. The method of claims 1-8, or 9, wherein the biometric data comprises voice recognition data, and wherein the voice recognition data comprises layered voice analysis data.
PCT/US2007/019487 2006-09-07 2007-09-07 Methods for measuring emotive response and selection preference WO2008030542A2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP07837845A EP2062206A4 (en) 2006-09-07 2007-09-07 Methods for measuring emotive response and selection preference
BRPI0716106-9A BRPI0716106A2 (en) 2006-09-07 2007-09-07 METHODS FOR MEASURING EMOTIONAL RESPONSE AND PREFERENCE OF CHOICE
JP2009527416A JP5249223B2 (en) 2006-09-07 2007-09-07 Methods for measuring emotional responses and preference trends
CA002663078A CA2663078A1 (en) 2006-09-07 2007-09-07 Methods for measuring emotive response and selection preference
MX2009002419A MX2009002419A (en) 2006-09-07 2007-09-07 Methods for measuring emotive response and selection preference.

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US84275706P 2006-09-07 2006-09-07
US84275506P 2006-09-07 2006-09-07
US60/842,757 2006-09-07
US60/842,755 2006-09-07
US88600407P 2007-01-22 2007-01-22
US88599807P 2007-01-22 2007-01-22
US60/885,998 2007-01-22
US60/886,004 2007-01-22

Publications (2)

Publication Number Publication Date
WO2008030542A2 true WO2008030542A2 (en) 2008-03-13
WO2008030542A3 WO2008030542A3 (en) 2008-06-26

Family

ID=39157853

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/019487 WO2008030542A2 (en) 2006-09-07 2007-09-07 Methods for measuring emotive response and selection preference

Country Status (7)

Country Link
US (2) US20080065468A1 (en)
EP (1) EP2062206A4 (en)
JP (1) JP5249223B2 (en)
BR (1) BRPI0716106A2 (en)
CA (1) CA2663078A1 (en)
MX (1) MX2009002419A (en)
WO (1) WO2008030542A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2549428A2 (en) 2011-07-22 2013-01-23 Dil Brands Method and system for generating behavioral studies of an individual
WO2018167420A1 (en) * 2017-03-14 2018-09-20 Orange Method for enriching a digital content with spontaneous data
CN109828662A (en) * 2019-01-04 2019-05-31 杭州赛鲁班网络科技有限公司 A kind of perception and computing system for admiring commodity
CN113749656A (en) * 2021-08-20 2021-12-07 杭州回车电子科技有限公司 Emotion identification method and device based on multi-dimensional physiological signals
US12042260B2 (en) 2018-12-20 2024-07-23 Panasonic Intellectual Property Management Co., Ltd. Biometric apparatus, biometric method, and non-transitory computer-readable storage medium

Families Citing this family (290)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100649713B1 (en) * 2004-12-06 2006-11-28 한국전자통신연구원 Method for hierarchical system configuration and integrated scheduling to provide multimedia streaming service on a two-level double cluster system
EP1913555B1 (en) * 2005-08-04 2018-05-23 Philips Lighting Holding B.V. Apparatus for monitoring a person having an interest to an object, and method thereof
CA2622365A1 (en) * 2005-09-16 2007-09-13 Imotions-Emotion Technology A/S System and method for determining human emotion by analyzing eye properties
US9658473B2 (en) * 2005-10-07 2017-05-23 Percept Technologies Inc Enhanced optical and perceptual digital eyewear
US8469713B2 (en) * 2006-07-12 2013-06-25 Medical Cyberworlds, Inc. Computerized medical training system
US20080065468A1 (en) * 2006-09-07 2008-03-13 Charles John Berg Methods for Measuring Emotive Response and Selection Preference
WO2008055078A2 (en) * 2006-10-27 2008-05-08 Vivometrics, Inc. Identification of emotional states using physiological responses
US8260690B2 (en) * 2006-11-08 2012-09-04 Kimberly-Clark Worldwide, Inc. System and method for capturing test subject feedback
US20080213736A1 (en) * 2006-12-28 2008-09-04 Jon Morris Method and apparatus for emotional profiling
US8321797B2 (en) * 2006-12-30 2012-11-27 Kimberly-Clark Worldwide, Inc. Immersive visualization center for creating and designing a “total design simulation” and for improved relationship management and market research
WO2008081412A1 (en) * 2006-12-30 2008-07-10 Kimberly-Clark Worldwide, Inc. Virtual reality system including viewer responsiveness to smart objects
US8370207B2 (en) * 2006-12-30 2013-02-05 Red Dot Square Solutions Limited Virtual reality system including smart objects
US8341022B2 (en) * 2006-12-30 2012-12-25 Red Dot Square Solutions Ltd. Virtual reality system for environment building
US8295542B2 (en) 2007-01-12 2012-10-23 International Business Machines Corporation Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US8588464B2 (en) 2007-01-12 2013-11-19 International Business Machines Corporation Assisting a vision-impaired user with navigation based on a 3D captured image stream
US8269834B2 (en) 2007-01-12 2012-09-18 International Business Machines Corporation Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream
WO2008093361A2 (en) * 2007-02-01 2008-08-07 Techvoyant Infotech Private Limited Stimuli based intelligent electronic system
US20080215974A1 (en) * 2007-03-01 2008-09-04 Phil Harrison Interactive user controlled avatar animations
JP5309126B2 (en) 2007-03-29 2013-10-09 ニューロフォーカス・インコーポレーテッド System, method, and apparatus for performing marketing and entertainment efficiency analysis
US20080242951A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Effective low-profile health monitoring or the like
US20080243005A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US20090005654A1 (en) * 2007-03-30 2009-01-01 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US20090018407A1 (en) * 2007-03-30 2009-01-15 Searete Llc, A Limited Corporation Of The State Of Delaware Computational user-health testing
US20090024050A1 (en) * 2007-03-30 2009-01-22 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US20080242952A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liablity Corporation Of The State Of Delaware Effective response protocols for health monitoring or the like
US20090119154A1 (en) * 2007-11-07 2009-05-07 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20080242949A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US20090005653A1 (en) * 2007-03-30 2009-01-01 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US20090118593A1 (en) * 2007-11-07 2009-05-07 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20080242947A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Configuring software for effective health monitoring or the like
US20080319276A1 (en) * 2007-03-30 2008-12-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational user-health testing
US20080242948A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Effective low-profile health monitoring or the like
WO2008137579A1 (en) * 2007-05-01 2008-11-13 Neurofocus, Inc. Neuro-informatics repository system
WO2008137581A1 (en) 2007-05-01 2008-11-13 Neurofocus, Inc. Neuro-feedback based stimulus compression device
US8392253B2 (en) 2007-05-16 2013-03-05 The Nielsen Company (Us), Llc Neuro-physiology and neuro-behavioral based stimulus targeting system
WO2008141340A1 (en) * 2007-05-16 2008-11-20 Neurofocus, Inc. Audience response measurement and tracking system
US20090030287A1 (en) * 2007-06-06 2009-01-29 Neurofocus Inc. Incented response assessment at a point of transaction
US8494905B2 (en) * 2007-06-06 2013-07-23 The Nielsen Company (Us), Llc Audience response analysis using simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI)
JP5542051B2 (en) 2007-07-30 2014-07-09 ニューロフォーカス・インコーポレーテッド System, method, and apparatus for performing neural response stimulation and stimulation attribute resonance estimation
US20090036755A1 (en) * 2007-07-30 2009-02-05 Neurofocus, Inc. Entity and relationship assessment and extraction using neuro-response measurements
US8635105B2 (en) * 2007-08-28 2014-01-21 The Nielsen Company (Us), Llc Consumer experience portrayal effectiveness assessment system
US8386313B2 (en) 2007-08-28 2013-02-26 The Nielsen Company (Us), Llc Stimulus placement system using subject neuro-response measurements
KR20100047865A (en) 2007-08-28 2010-05-10 뉴로포커스, 인크. Consumer experience assessment system
US8392255B2 (en) 2007-08-29 2013-03-05 The Nielsen Company (Us), Llc Content based selection and meta tagging of advertisement breaks
US9191450B2 (en) * 2007-09-20 2015-11-17 Disney Enterprises, Inc. Measuring user engagement during presentation of media content
US8494610B2 (en) * 2007-09-20 2013-07-23 The Nielsen Company (Us), Llc Analysis of marketing and entertainment effectiveness using magnetoencephalography
US20090083129A1 (en) * 2007-09-20 2009-03-26 Neurofocus, Inc. Personalized content delivery using neuro-response priming data
US8327395B2 (en) * 2007-10-02 2012-12-04 The Nielsen Company (Us), Llc System providing actionable insights based on physiological responses from viewers of media
US20090112849A1 (en) * 2007-10-24 2009-04-30 Searete Llc Selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US20090112693A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Providing personalized advertising
US20090112696A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Method of space-available advertising in a mobile device
US8234262B2 (en) * 2007-10-24 2012-07-31 The Invention Science Fund I, Llc Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US8001108B2 (en) * 2007-10-24 2011-08-16 The Invention Science Fund I, Llc Returning a new content based on a person's reaction to at least two instances of previously displayed content
US20090112694A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Targeted-advertising based on a sensed physiological response by a person to a general advertisement
US8126867B2 (en) * 2007-10-24 2012-02-28 The Invention Science Fund I, Llc Returning a second content based on a user's reaction to a first content
US20090113297A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Requesting a second content based on a user's reaction to a first content
US20090112695A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Physiological response based targeted advertising
US9582805B2 (en) * 2007-10-24 2017-02-28 Invention Science Fund I, Llc Returning a personalized advertisement
US8112407B2 (en) * 2007-10-24 2012-02-07 The Invention Science Fund I, Llc Selecting a second content based on a user's reaction to a first content
US9513699B2 (en) * 2007-10-24 2016-12-06 Invention Science Fund I, LL Method of selecting a second content based on a user's reaction to a first content
US20090112697A1 (en) * 2007-10-30 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Providing personalized advertising
WO2009059246A1 (en) 2007-10-31 2009-05-07 Emsense Corporation Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
US20090158309A1 (en) * 2007-12-12 2009-06-18 Hankyu Moon Method and system for media audience measurement and spatial extrapolation based on site, display, crowd, and viewership characterization
US8615479B2 (en) 2007-12-13 2013-12-24 The Invention Science Fund I, Llc Methods and systems for indicating behavior in a population cohort
US9211077B2 (en) * 2007-12-13 2015-12-15 The Invention Science Fund I, Llc Methods and systems for specifying an avatar
US20090157481A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a cohort-linked avatar attribute
US8195593B2 (en) * 2007-12-20 2012-06-05 The Invention Science Fund I Methods and systems for indicating behavior in a population cohort
US20090164458A1 (en) * 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems employing a cohort-linked avatar
US8356004B2 (en) * 2007-12-13 2013-01-15 Searete Llc Methods and systems for comparing media content
US20090157660A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems employing a cohort-linked avatar
US20090157751A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying an avatar
US20090157625A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for identifying an avatar-linked population cohort
US20090164302A1 (en) * 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a cohort-linked avatar attribute
US20090157813A1 (en) * 2007-12-17 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for identifying an avatar-linked population cohort
US8069125B2 (en) * 2007-12-13 2011-11-29 The Invention Science Fund I Methods and systems for comparing media content
US20090156955A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for comparing media content
US20090171164A1 (en) * 2007-12-17 2009-07-02 Jung Edward K Y Methods and systems for identifying an avatar-linked population cohort
US20090164503A1 (en) * 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a media content-linked population cohort
US8150796B2 (en) * 2007-12-20 2012-04-03 The Invention Science Fund I Methods and systems for inducing behavior in a population cohort
US9418368B2 (en) * 2007-12-20 2016-08-16 Invention Science Fund I, Llc Methods and systems for determining interest in a cohort-linked avatar
US20090164131A1 (en) * 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a media content-linked population cohort
US9775554B2 (en) * 2007-12-31 2017-10-03 Invention Science Fund I, Llc Population cohort-linked avatar
US20090222305A1 (en) * 2008-03-03 2009-09-03 Berg Jr Charles John Shopper Communication with Scaled Emotional State
US8433612B1 (en) * 2008-03-27 2013-04-30 Videomining Corporation Method and system for measuring packaging effectiveness using video-based analysis of in-store shopper response
WO2009132312A1 (en) * 2008-04-25 2009-10-29 Sorensen Associates Inc. Point of view shopper camera system with orientation sensor
US8462996B2 (en) * 2008-05-19 2013-06-11 Videomining Corporation Method and system for measuring human response to visual stimulus based on changes in facial expression
US8429225B2 (en) 2008-05-21 2013-04-23 The Invention Science Fund I, Llc Acquisition and presentation of data indicative of an extent of congruence between inferred mental states of authoring users
US9161715B2 (en) * 2008-05-23 2015-10-20 Invention Science Fund I, Llc Determination of extent of congruity between observation of authoring user and observation of receiving user
US9192300B2 (en) * 2008-05-23 2015-11-24 Invention Science Fund I, Llc Acquisition and particular association of data indicative of an inferred mental state of an authoring user
US8615664B2 (en) * 2008-05-23 2013-12-24 The Invention Science Fund I, Llc Acquisition and particular association of inference data indicative of an inferred mental state of an authoring user and source identity data
US7904507B2 (en) 2008-05-23 2011-03-08 The Invention Science Fund I, Llc Determination of extent of congruity between observation of authoring user and observation of receiving user
US20090292658A1 (en) * 2008-05-23 2009-11-26 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Acquisition and particular association of inference data indicative of inferred mental states of authoring users
US9101263B2 (en) * 2008-05-23 2015-08-11 The Invention Science Fund I, Llc Acquisition and association of data indicative of an inferred mental state of an authoring user
US8082215B2 (en) * 2008-05-23 2011-12-20 The Invention Science Fund I, Llc Acquisition and particular association of inference data indicative of inferred mental states of authoring users
US8086563B2 (en) * 2008-05-23 2011-12-27 The Invention Science Fund I, Llc Acquisition and particular association of data indicative of an inferred mental state of an authoring user
SE0801267A0 (en) * 2008-05-29 2009-03-12 Cunctus Ab Method of a user unit, a user unit and a system comprising said user unit
US20090318773A1 (en) * 2008-06-24 2009-12-24 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Involuntary-response-dependent consequences
US8219438B1 (en) * 2008-06-30 2012-07-10 Videomining Corporation Method and system for measuring shopper response to products based on behavior and facial expression
CN102077236A (en) * 2008-07-03 2011-05-25 松下电器产业株式会社 Impression degree extraction apparatus and impression degree extraction method
US20100010317A1 (en) * 2008-07-09 2010-01-14 De Lemos Jakob Self-contained data collection system for emotional response testing
US20100010370A1 (en) 2008-07-09 2010-01-14 De Lemos Jakob System and method for calibrating and normalizing eye data in emotional testing
US8136944B2 (en) 2008-08-15 2012-03-20 iMotions - Eye Tracking A/S System and method for identifying the existence and position of text in visual media content and for determining a subjects interactions with the text
US20100070987A1 (en) * 2008-09-12 2010-03-18 At&T Intellectual Property I, L.P. Mining viewer responses to multimedia content
US20100094097A1 (en) * 2008-10-15 2010-04-15 Charles Liu System and method for taking responsive action to human biosignals
US20100123776A1 (en) * 2008-11-18 2010-05-20 Kimberly-Clark Worldwide, Inc. System and method for observing an individual's reaction to their environment
US20100168529A1 (en) * 2008-12-30 2010-07-01 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for presenting an inhalation experience
US8464288B2 (en) * 2009-01-21 2013-06-11 The Nielsen Company (Us), Llc Methods and apparatus for providing personalized media in video
US8270814B2 (en) * 2009-01-21 2012-09-18 The Nielsen Company (Us), Llc Methods and apparatus for providing video with embedded media
US9357240B2 (en) * 2009-01-21 2016-05-31 The Nielsen Company (Us), Llc Methods and apparatus for providing alternate media for video decoders
US8539359B2 (en) * 2009-02-11 2013-09-17 Jeffrey A. Rapaport Social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic
US20120071785A1 (en) * 2009-02-27 2012-03-22 Forbes David L Methods and systems for assessing psychological characteristics
US9558499B2 (en) * 2009-02-27 2017-01-31 The Forbes Consulting Group, Llc Methods and systems for assessing psychological characteristics
US9603564B2 (en) * 2009-02-27 2017-03-28 The Forbes Consulting Group, Llc Methods and systems for assessing psychological characteristics
WO2010100567A2 (en) 2009-03-06 2010-09-10 Imotions- Emotion Technology A/S System and method for determining emotional response to olfactory stimuli
US9886729B2 (en) 2009-03-10 2018-02-06 Gearbox, Llc Computational systems and methods for health services planning and matching
US9858540B2 (en) 2009-03-10 2018-01-02 Gearbox, Llc Computational systems and methods for health services planning and matching
US20180197636A1 (en) * 2009-03-10 2018-07-12 Gearbox Llc Computational Systems and Methods for Health Services Planning and Matching
US9911165B2 (en) 2009-03-10 2018-03-06 Gearbox, Llc Computational systems and methods for health services planning and matching
US10319471B2 (en) 2009-03-10 2019-06-11 Gearbox Llc Computational systems and methods for health services planning and matching
US9892435B2 (en) * 2009-03-10 2018-02-13 Gearbox Llc Computational systems and methods for health services planning and matching
US20100250325A1 (en) 2009-03-24 2010-09-30 Neurofocus, Inc. Neurological profiles for market matching and stimulus presentation
US8905298B2 (en) * 2009-03-24 2014-12-09 The Western Union Company Transactions with imaging analysis
US8473352B2 (en) 2009-03-24 2013-06-25 The Western Union Company Consumer due diligence for money transfer systems and methods
US8285706B2 (en) * 2009-06-10 2012-10-09 Microsoft Corporation Using a human computation game to improve search engine performance
US20120191542A1 (en) * 2009-06-24 2012-07-26 Nokia Corporation Method, Apparatuses and Service for Searching
US20110046502A1 (en) * 2009-08-20 2011-02-24 Neurofocus, Inc. Distributed neuro-response data collection and analysis
US8655437B2 (en) 2009-08-21 2014-02-18 The Nielsen Company (Us), Llc Analysis of the mirror neuron system for evaluation of stimulus
US10987015B2 (en) * 2009-08-24 2021-04-27 Nielsen Consumer Llc Dry electrodes for electroencephalography
US9560984B2 (en) 2009-10-29 2017-02-07 The Nielsen Company (Us), Llc Analysis of controlled and automatic attention for introduction of stimulus material
US20110106750A1 (en) 2009-10-29 2011-05-05 Neurofocus, Inc. Generating ratings predictions using neuro-response data
US8209224B2 (en) * 2009-10-29 2012-06-26 The Nielsen Company (Us), Llc Intracluster content management using neuro-response priming data
US8335715B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Advertisement exchange using neuro-response data
US8335716B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Multimedia advertisement exchange
CA2781753A1 (en) * 2009-11-25 2011-06-03 David J. Vining Advanced multimedia structured reporting
US20110166937A1 (en) * 2010-01-05 2011-07-07 Searete Llc Media output with micro-impulse radar feedback of physiological response
US9019149B2 (en) 2010-01-05 2015-04-28 The Invention Science Fund I, Llc Method and apparatus for measuring the motion of a person
US8884813B2 (en) 2010-01-05 2014-11-11 The Invention Science Fund I, Llc Surveillance of stress conditions of persons using micro-impulse radar
US9024814B2 (en) 2010-01-05 2015-05-05 The Invention Science Fund I, Llc Tracking identities of persons using micro-impulse radar
US20110166940A1 (en) * 2010-01-05 2011-07-07 Searete Llc Micro-impulse radar detection of a human demographic and delivery of targeted media content
US9069067B2 (en) 2010-09-17 2015-06-30 The Invention Science Fund I, Llc Control of an electronic apparatus using micro-impulse radar
US9767470B2 (en) 2010-02-26 2017-09-19 Forbes Consulting Group, Llc Emotional survey
US20110237971A1 (en) * 2010-03-25 2011-09-29 Neurofocus, Inc. Discrete choice modeling using neuro-response data
WO2011133548A2 (en) 2010-04-19 2011-10-27 Innerscope Research, Inc. Short imagery task (sit) research method
US20110263946A1 (en) * 2010-04-22 2011-10-27 Mit Media Lab Method and system for real-time and offline analysis, inference, tagging of and responding to person(s) experiences
US8655428B2 (en) 2010-05-12 2014-02-18 The Nielsen Company (Us), Llc Neuro-response data synchronization
JP5465089B2 (en) * 2010-05-31 2014-04-09 キヤノン株式会社 Visual stimulus presentation device for brain function measurement, functional magnetic resonance imaging device, magnetoencephalograph, brain function measurement method
US20140058828A1 (en) * 2010-06-07 2014-02-27 Affectiva, Inc. Optimizing media based on mental state analysis
GB2481323B (en) * 2010-06-17 2016-12-14 Forethought Pty Ltd Measurement of emotional response to sensory stimuli
US20120023161A1 (en) * 2010-07-21 2012-01-26 Sk Telecom Co., Ltd. System and method for providing multimedia service in a communication system
US20120022937A1 (en) * 2010-07-22 2012-01-26 Yahoo! Inc. Advertisement brand engagement value
US8392251B2 (en) 2010-08-09 2013-03-05 The Nielsen Company (Us), Llc Location aware presentation of stimulus material
US8392250B2 (en) 2010-08-09 2013-03-05 The Nielsen Company (Us), Llc Neuro-response evaluated stimulus in virtual reality environments
US20120042263A1 (en) 2010-08-10 2012-02-16 Seymour Rapaport Social-topical adaptive networking (stan) system allowing for cooperative inter-coupling with external social networking systems and other content sources
US8396744B2 (en) 2010-08-25 2013-03-12 The Nielsen Company (Us), Llc Effective virtual reality environments for presentation of marketing materials
WO2012083415A1 (en) * 2010-11-15 2012-06-28 Tandemlaunch Technologies Inc. System and method for interacting with and analyzing media on a display using eye gaze tracking
EP2492705B1 (en) * 2011-02-24 2015-12-30 Takasago International Corporation fMRI method to identify olfactive stimuli of the dopaminergic reward system
US8898091B2 (en) 2011-05-11 2014-11-25 Ari M. Frank Computing situation-dependent affective response baseline levels utilizing a database storing affective responses
US8676937B2 (en) 2011-05-12 2014-03-18 Jeffrey Alan Rapaport Social-topical adaptive networking (STAN) system allowing for group based contextual transaction offers and acceptances and hot topic watchdogging
US9707372B2 (en) * 2011-07-29 2017-07-18 Rosalind Y. Smith System and method for a bioresonance chamber
US8771206B2 (en) * 2011-08-19 2014-07-08 Accenture Global Services Limited Interactive virtual care
US20150135309A1 (en) * 2011-08-20 2015-05-14 Amit Vishram Karmarkar Method and system of user authentication with eye-tracking data
US8988350B2 (en) * 2011-08-20 2015-03-24 Buckyball Mobile, Inc Method and system of user authentication with bioresponse data
US9442565B2 (en) 2011-08-24 2016-09-13 The United States Of America, As Represented By The Secretary Of The Navy System and method for determining distracting features in a visual display
US8854282B1 (en) 2011-09-06 2014-10-07 Google Inc. Measurement method
US8489182B2 (en) 2011-10-18 2013-07-16 General Electric Company System and method of quality analysis in acquisition of ambulatory electrocardiography device data
US9015084B2 (en) 2011-10-20 2015-04-21 Gil Thieberger Estimating affective response to a token instance of interest
US9819711B2 (en) * 2011-11-05 2017-11-14 Neil S. Davey Online social interaction, education, and health care by analysing affect and cognitive features
JP5898970B2 (en) * 2012-01-20 2016-04-06 株式会社日立製作所 Mood evaluation system
US9451303B2 (en) 2012-02-27 2016-09-20 The Nielsen Company (Us), Llc Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing
US9569986B2 (en) 2012-02-27 2017-02-14 The Nielsen Company (Us), Llc System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
US9292858B2 (en) 2012-02-27 2016-03-22 The Nielsen Company (Us), Llc Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments
US20130254006A1 (en) * 2012-03-20 2013-09-26 Pick'ntell Ltd. Apparatus and method for transferring commercial data at a store
US9030505B2 (en) * 2012-05-17 2015-05-12 Nokia Technologies Oy Method and apparatus for attracting a user's gaze to information in a non-intrusive manner
US9888842B2 (en) * 2012-05-31 2018-02-13 Nokia Technologies Oy Medical diagnostic gaze tracker
KR20140011204A (en) * 2012-07-18 2014-01-28 삼성전자주식회사 Method for providing contents and display apparatus thereof
US8984065B2 (en) * 2012-08-01 2015-03-17 Eharmony, Inc. Systems and methods for online matching using non-self-identified data
US20140039857A1 (en) * 2012-08-03 2014-02-06 Daniel A. Hill Emotional analytics for performance improvement
US20140040945A1 (en) * 2012-08-03 2014-02-06 Elwha, LLC, a limited liability corporation of the State of Delaware Dynamic customization of audio visual content using personalizing information
US9300994B2 (en) 2012-08-03 2016-03-29 Elwha Llc Methods and systems for viewing dynamically customized audio-visual content
US10237613B2 (en) 2012-08-03 2019-03-19 Elwha Llc Methods and systems for viewing dynamically customized audio-visual content
US10455284B2 (en) 2012-08-31 2019-10-22 Elwha Llc Dynamic customization and monetization of audio-visual content
US8989835B2 (en) 2012-08-17 2015-03-24 The Nielsen Company (Us), Llc Systems and methods to gather and analyze electroencephalographic data
WO2014037937A2 (en) * 2012-09-06 2014-03-13 Beyond Verbal Communication Ltd System and method for selection of data according to measurement of physiological parameters
US10010270B2 (en) * 2012-09-17 2018-07-03 Verily Life Sciences Llc Sensing system
US9477993B2 (en) * 2012-10-14 2016-10-25 Ari M Frank Training a predictor of emotional response based on explicit voting on content and eye tracking to verify attention
KR20150086289A (en) * 2012-11-14 2015-07-27 카네기 멜론 유니버시티 Automated thumbnail selection for online video
US20140149177A1 (en) * 2012-11-23 2014-05-29 Ari M. Frank Responding to uncertainty of a user regarding an experience by presenting a prior experience
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
WO2014088637A1 (en) * 2012-12-07 2014-06-12 Cascade Strategies, Inc. Biosensitive response evaluation for design and research
US20140164056A1 (en) * 2012-12-07 2014-06-12 Cascade Strategies, Inc. Biosensitive response evaluation for design and research
US9230180B2 (en) * 2013-01-18 2016-01-05 GM Global Technology Operations LLC Eyes-off-the-road classification with glasses classifier
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US9320450B2 (en) 2013-03-14 2016-04-26 The Nielsen Company (Us), Llc Methods and apparatus to gather and analyze electroencephalographic data
US20190332656A1 (en) * 2013-03-15 2019-10-31 Sunshine Partners, LLC Adaptive interactive media method and system
CA3187490A1 (en) 2013-03-15 2014-09-18 Interaxon Inc. Wearable computing apparatus and method
US20140287387A1 (en) * 2013-03-24 2014-09-25 Emozia, Inc. Emotion recognition system and method for assessing, monitoring, predicting and broadcasting a user's emotive state
US9424411B2 (en) * 2013-05-23 2016-08-23 Honeywell International Inc. Athentication of device users by gaze
US20140365310A1 (en) * 2013-06-05 2014-12-11 Machine Perception Technologies, Inc. Presentation of materials based on low level feature analysis
US9710787B2 (en) * 2013-07-31 2017-07-18 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for representing, diagnosing, and recommending interaction sequences
US10013892B2 (en) * 2013-10-07 2018-07-03 Intel Corporation Adaptive learning environment driven by real-time identification of engagement level
US10546310B2 (en) 2013-11-18 2020-01-28 Sentient Decision Science, Inc. Systems and methods for assessing implicit associations
US20150213002A1 (en) * 2014-01-24 2015-07-30 International Business Machines Corporation Personal emotion state monitoring from social media
US9773258B2 (en) * 2014-02-12 2017-09-26 Nextep Systems, Inc. Subliminal suggestive upsell systems and methods
US9622702B2 (en) 2014-04-03 2017-04-18 The Nielsen Company (Us), Llc Methods and apparatus to gather and analyze electroencephalographic data
US20150294086A1 (en) * 2014-04-14 2015-10-15 Elwha Llc Devices, systems, and methods for automated enhanced care rooms
US20150302422A1 (en) * 2014-04-16 2015-10-22 2020 Ip Llc Systems and methods for multi-user behavioral research
US10222953B2 (en) * 2014-04-30 2019-03-05 Disney Enterprises, Inc. Systems and methods for editing virtual content of a virtual space
US20160015328A1 (en) * 2014-07-18 2016-01-21 Sony Corporation Physical properties converter
CN105354621A (en) * 2014-08-21 2016-02-24 国际商业机器公司 Method and apparatus for determining storage modes of articles in multiple storage regions
US11851279B1 (en) * 2014-09-30 2023-12-26 Amazon Technologies, Inc. Determining trends from materials handling facility information
US11107091B2 (en) 2014-10-15 2021-08-31 Toshiba Global Commerce Solutions Gesture based in-store product feedback system
US20160110737A1 (en) * 2014-10-17 2016-04-21 Big Heart Pet Brands Product Development Methods for Non-Verbalizing Consumers
WO2016086167A1 (en) * 2014-11-26 2016-06-02 Theranos, Inc. Methods and systems for hybrid oversight of sample collection
US20160253735A1 (en) * 2014-12-30 2016-09-01 Shelfscreen, Llc Closed-Loop Dynamic Content Display System Utilizing Shopper Proximity and Shopper Context Generated in Response to Wireless Data Triggers
US9510788B2 (en) * 2015-02-14 2016-12-06 Physical Enterprises, Inc. Systems and methods for providing user insights based on real-time physiological parameters
US20160292983A1 (en) * 2015-04-05 2016-10-06 Smilables Inc. Wearable infant monitoring device
CN107924643B (en) * 2015-04-05 2021-05-18 斯米拉布莱斯有限公司 Infant development analysis method and system
US10438215B2 (en) * 2015-04-10 2019-10-08 International Business Machines Corporation System for observing and analyzing customer opinion
US9668688B2 (en) 2015-04-17 2017-06-06 Mossbridge Institute, Llc Methods and systems for content response analysis
US9936250B2 (en) 2015-05-19 2018-04-03 The Nielsen Company (Us), Llc Methods and apparatus to adjust content presented to an individual
US20160364774A1 (en) * 2015-06-10 2016-12-15 Richard WITTSIEPE Single action multi-dimensional feedback graphic system and method
JP6553418B2 (en) * 2015-06-12 2019-07-31 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Display control method, display control device and control program
US10872354B2 (en) * 2015-09-04 2020-12-22 Robin S Slomkowski System and method for personalized preference optimization
WO2017047090A1 (en) * 2015-09-18 2017-03-23 日本電気株式会社 Fingerprint imaging system, fingerprint imaging device, image processing device, fingerprint imaging method, and recording medium
US10430810B2 (en) 2015-09-22 2019-10-01 Health Care Direct, Inc. Systems and methods for assessing the marketability of a product
US10242252B2 (en) * 2015-09-25 2019-03-26 Intel Corporation Expression recognition tag
US10148808B2 (en) 2015-10-09 2018-12-04 Microsoft Technology Licensing, Llc Directed personal communication for speech generating devices
US9679497B2 (en) * 2015-10-09 2017-06-13 Microsoft Technology Licensing, Llc Proxies for speech generating devices
US10262555B2 (en) 2015-10-09 2019-04-16 Microsoft Technology Licensing, Llc Facilitating awareness and conversation throughput in an augmentative and alternative communication system
JP6240289B2 (en) * 2015-10-15 2017-11-29 ダイキン工業株式会社 Evaluation device, market research device, and learning evaluation device
US10775882B2 (en) * 2016-01-21 2020-09-15 Microsoft Technology Licensing, Llc Implicitly adaptive eye-tracking user interface
CN105678591A (en) * 2016-02-29 2016-06-15 北京时代云英科技有限公司 Video-analysis-based commercial intelligent operation decision-making support system and method
US10178341B2 (en) * 2016-03-01 2019-01-08 DISH Technologies L.L.C. Network-based event recording
US10726465B2 (en) * 2016-03-24 2020-07-28 International Business Machines Corporation System, method and computer program product providing eye tracking based cognitive filtering and product recommendations
US10187694B2 (en) 2016-04-07 2019-01-22 At&T Intellectual Property I, L.P. Method and apparatus for enhancing audience engagement via a communication network
CA3020974A1 (en) 2016-04-15 2017-10-19 Walmart Apollo, Llc Partiality vector refinement systems and methods through sample probing
GB2564610A (en) 2016-04-15 2019-01-16 Walmart Apollo Llc Systems and methods for providing content-based product recommendations
WO2017180977A1 (en) 2016-04-15 2017-10-19 Wal-Mart Stores, Inc. Systems and methods for facilitating shopping in a physical retail facility
CA3021016A1 (en) * 2016-04-15 2017-10-19 Walmart Apollo, Llc Vector-based characterizations of products
CN105852831A (en) * 2016-05-10 2016-08-17 华南理工大学 Equipment based on virtual reality interaction technology and brain function real-time monitoring technology
US10373464B2 (en) 2016-07-07 2019-08-06 Walmart Apollo, Llc Apparatus and method for updating partiality vectors based on monitoring of person and his or her home
WO2018017868A1 (en) * 2016-07-21 2018-01-25 Magic Leap, Inc. Technique for controlling virtual image generation system using emotional states of user
US10108784B2 (en) * 2016-08-01 2018-10-23 Facecontrol, Inc. System and method of objectively determining a user's personal food preferences for an individualized diet plan
US10120747B2 (en) 2016-08-26 2018-11-06 International Business Machines Corporation Root cause analysis
US10878454B2 (en) 2016-12-23 2020-12-29 Wipro Limited Method and system for predicting a time instant for providing promotions to a user
US20180189802A1 (en) * 2017-01-03 2018-07-05 International Business Machines Corporation System, method and computer program product for sensory simulation during product testing
US10943100B2 (en) 2017-01-19 2021-03-09 Mindmaze Holding Sa Systems, methods, devices and apparatuses for detecting facial expression
WO2018142228A2 (en) 2017-01-19 2018-08-09 Mindmaze Holding Sa Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location including for at least one of a virtual and augmented reality system
KR102520627B1 (en) * 2017-02-01 2023-04-12 삼성전자주식회사 Apparatus and method and for recommending products
WO2018146558A2 (en) 2017-02-07 2018-08-16 Mindmaze Holding Sa Systems, methods and apparatuses for stereo vision and tracking
US10142686B2 (en) * 2017-03-30 2018-11-27 Rovi Guides, Inc. System and methods for disambiguating an ambiguous entity in a search query based on the gaze of a user
US10977674B2 (en) * 2017-04-28 2021-04-13 Qualtrics, Llc Conducting digital surveys that collect and convert biometric data into survey respondent characteristics
CA3175206A1 (en) 2017-05-08 2018-11-15 Symrise Ag Novel fragrance compositions and products with mood enhancing effects
WO2018226550A1 (en) 2017-06-06 2018-12-13 Walmart Apollo, Llc Rfid tag tracking systems and methods in identifying suspicious activities
JP6572943B2 (en) * 2017-06-23 2019-09-11 カシオ計算機株式会社 Robot, robot control method and program
US11010797B2 (en) 2017-07-05 2021-05-18 International Business Machines Corporation Sensors and sentiment analysis for rating systems
CN111629653B (en) 2017-08-23 2024-06-21 神经股份有限公司 Brain-computer interface with high-speed eye tracking features
US11559593B2 (en) * 2017-10-17 2023-01-24 Germbot, LLC Ultraviolet disinfection device
WO2019094953A1 (en) 2017-11-13 2019-05-16 Neurable Inc. Brain-computer interface with adaptations for high-speed, accurate, and intuitive user interactions
US11328533B1 (en) 2018-01-09 2022-05-10 Mindmaze Holding Sa System, method and apparatus for detecting facial expression for motion capture
CN111712192B (en) 2018-01-18 2024-07-02 神经股份有限公司 Brain-computer interface with adaptation to high-speed, accurate and intuitive user interactions
US20210106910A1 (en) 2018-02-20 2021-04-15 International Flavors & Fragrances Inc. Device and Method for Integrating Scent into Virtual Reality Environment
CN117679032A (en) * 2018-05-25 2024-03-12 丰田自动车欧洲公司 System and method for determining perceived load and level of stimulus perception of human brain
US10725536B2 (en) * 2018-08-21 2020-07-28 Disney Enterprises, Inc. Virtual indicium display system for gaze direction in an image capture environment
US10664050B2 (en) 2018-09-21 2020-05-26 Neurable Inc. Human-computer interface using high-speed and accurate tracking of user interactions
US11146867B2 (en) * 2018-10-12 2021-10-12 Blue Yonder Research Limited Apparatus and method for obtaining and processing data relating to user interactions and emotions relating to an event, item or condition
US10860104B2 (en) 2018-11-09 2020-12-08 Intel Corporation Augmented reality controllers and related methods
US11741376B2 (en) * 2018-12-07 2023-08-29 Opensesame Inc. Prediction of business outcomes by analyzing voice samples of users
JP2020119215A (en) * 2019-01-23 2020-08-06 トヨタ自動車株式会社 Information processor, information processing method, program, and demand search system
CN113906368A (en) * 2019-04-05 2022-01-07 惠普发展公司,有限责任合伙企业 Modifying audio based on physiological observations
US11797938B2 (en) 2019-04-25 2023-10-24 Opensesame Inc Prediction of psychometric attributes relevant for job positions
US11393252B2 (en) * 2019-05-01 2022-07-19 Accenture Global Solutions Limited Emotion sensing artificial intelligence
US11553871B2 (en) 2019-06-04 2023-01-17 Lab NINE, Inc. System and apparatus for non-invasive measurement of transcranial electrical signals, and method of calibrating and/or using same for various applications
ES2801024A1 (en) * 2019-06-26 2021-01-07 Banco De Espana BANKNOTE CLASSIFICATION METHOD AND SYSTEM BASED ON NEUROANALYSIS (Machine-translation by Google Translate, not legally binding)
JP7357244B2 (en) * 2019-09-09 2023-10-06 パナソニックIpマネジメント株式会社 Store usage information distribution device, store usage information distribution system equipped with the same, and store usage information distribution method
JP7283336B2 (en) * 2019-09-30 2023-05-30 富士通株式会社 IMPRESSION ESTIMATION METHOD, IMPRESSION ESTIMATION PROGRAM AND IMPRESSION ESTIMATION DEVICE
KR102203786B1 (en) * 2019-11-14 2021-01-15 오로라월드 주식회사 Method and System for Providing Interaction Service Using Smart Toy
WO2021107955A1 (en) * 2019-11-27 2021-06-03 Hewlett-Packard Development Company, L.P. Providing inputs to computing devices
JP7255751B2 (en) * 2020-03-31 2023-04-11 コニカミノルタ株式会社 DESIGN EVALUATION DEVICE, LEARNING DEVICE, PROGRAM AND DESIGN EVALUATION METHOD
US20210350223A1 (en) * 2020-05-07 2021-11-11 International Business Machines Corporation Digital content variations via external reaction
US20210406983A1 (en) * 2020-06-30 2021-12-30 L'oreal System for generating product recommendations using biometric data
US12002081B2 (en) * 2020-06-30 2024-06-04 L'oreal System for generating product recommendations using biometric data
FR3114426A1 (en) * 2020-09-18 2022-03-25 L'oreal SYSTEM FOR GENERATE PRODUCT RECOMMENDATIONS USING BIOMETRIC DATA
FR3113972A1 (en) * 2020-09-10 2022-03-11 L'oreal System for generating product recommendations using biometric data
US20220122096A1 (en) * 2020-10-15 2022-04-21 International Business Machines Corporation Product performance estimation in a virtual reality environment
WO2022192033A1 (en) * 2021-03-08 2022-09-15 Drive Your Art, Llc Billboard simulation and assessment system
US11887405B2 (en) 2021-08-10 2024-01-30 Capital One Services, Llc Determining features based on gestures and scale
US12069535B2 (en) 2022-02-09 2024-08-20 Bank Of America Corporation Intelligent precursory systematized authentication
WO2023224604A1 (en) 2022-05-17 2023-11-23 Symrise Ag Fragrance compositions and products conveying a positive mood
CN116421202B (en) * 2023-02-13 2024-04-02 华南师范大学 Brain visual function rapid detection method, device and storage medium based on electroencephalogram rapid periodic visual stimulus singular paradigm

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4348186A (en) * 1979-12-17 1982-09-07 The United States Of America As Represented By The Secretary Of The Navy Pilot helmet mounted CIG display with eye coupled area of interest
US5243517A (en) * 1988-08-03 1993-09-07 Westinghouse Electric Corp. Method and apparatus for physiological evaluation of short films and entertainment materials
US6330426B2 (en) * 1994-05-23 2001-12-11 Stephen J. Brown System and method for remote education using a memory card
US6292688B1 (en) * 1996-02-28 2001-09-18 Advanced Neurotechnologies, Inc. Method and apparatus for analyzing neurological response to emotion-inducing stimuli
US5676138A (en) * 1996-03-15 1997-10-14 Zawilinski; Kenneth Michael Emotional response analyzer system with multimedia display
NL1002854C2 (en) * 1996-04-12 1997-10-15 Eyelight Research Nv Method and measurement system for measuring and interpreting respondents' responses to presented stimuli, such as advertisements or the like.
JPH10207615A (en) * 1997-01-22 1998-08-07 Tec Corp Network system
US6173260B1 (en) * 1997-10-29 2001-01-09 Interval Research Corporation System and method for automatic classification of speech based upon affective content
IL122632A0 (en) * 1997-12-16 1998-08-16 Liberman Amir Apparatus and methods for detecting emotions
US6190314B1 (en) * 1998-07-15 2001-02-20 International Business Machines Corporation Computer input device with biosensors for sensing user emotions
JP2000099612A (en) * 1998-09-25 2000-04-07 Hitachi Ltd Method for preparing electronic catalog and system therefor
JP4051798B2 (en) * 1999-02-12 2008-02-27 松下電工株式会社 Design construction support system
US7120880B1 (en) * 1999-02-25 2006-10-10 International Business Machines Corporation Method and system for real-time determination of a subject's interest level to media content
AU2248501A (en) * 1999-12-17 2001-06-25 Promo Vu Interactive promotional information communicating system
JP2002175339A (en) * 2000-12-07 2002-06-21 Kenji Mimura Design method for merchandise
GB0101794D0 (en) * 2001-01-24 2001-03-07 Central Research Lab Ltd Monitoring responses to visual stimuli
US6572562B2 (en) * 2001-03-06 2003-06-03 Eyetracking, Inc. Methods for monitoring affective brain function
US20030032890A1 (en) * 2001-07-12 2003-02-13 Hazlett Richard L. Continuous emotional response analysis with facial EMG
US8561095B2 (en) * 2001-11-13 2013-10-15 Koninklijke Philips N.V. Affective television monitoring and control in response to physiological data
US7249603B2 (en) * 2002-04-03 2007-07-31 The Procter & Gamble Company Method for measuring acute stress in a mammal
US7213600B2 (en) * 2002-04-03 2007-05-08 The Procter & Gamble Company Method and apparatus for measuring acute stress
US20040001616A1 (en) * 2002-06-27 2004-01-01 Srinivas Gutta Measurement of content ratings through vision and speech recognition
JP4117781B2 (en) * 2002-08-30 2008-07-16 セイコーインスツル株式会社 Data transmission system and body-mounted communication device
US7046924B2 (en) * 2002-11-25 2006-05-16 Eastman Kodak Company Method and computer program product for determining an area of importance in an image using eye monitoring information
US9274598B2 (en) * 2003-08-25 2016-03-01 International Business Machines Corporation System and method for selecting and activating a target object using a combination of eye gaze and key presses
KR100592934B1 (en) * 2004-05-21 2006-06-23 한국전자통신연구원 Wearable physiological signal detection module and measurement apparatus with the same
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
US20060041401A1 (en) * 2004-08-12 2006-02-23 Johnston Jeffrey M Methods, systems, and computer program products for facilitating user choices among complex alternatives using conjoint analysis in combination with psychological tests, skills tests, and configuration software
US7630522B2 (en) * 2006-03-08 2009-12-08 Microsoft Corporation Biometric measurement using interactive display systems
US20070288300A1 (en) * 2006-06-13 2007-12-13 Vandenbogart Thomas William Use of physical and virtual composite prototypes to reduce product development cycle time
US20080065468A1 (en) * 2006-09-07 2008-03-13 Charles John Berg Methods for Measuring Emotive Response and Selection Preference

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP2062206A4 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2549428A2 (en) 2011-07-22 2013-01-23 Dil Brands Method and system for generating behavioral studies of an individual
EP2549428A3 (en) * 2011-07-22 2014-05-14 Dil Brands Method and system for generating behavioral studies of an individual
WO2018167420A1 (en) * 2017-03-14 2018-09-20 Orange Method for enriching a digital content with spontaneous data
FR3064097A1 (en) * 2017-03-14 2018-09-21 Orange METHOD FOR ENRICHING DIGITAL CONTENT BY SPONTANEOUS DATA
US20200074482A1 (en) * 2017-03-14 2020-03-05 Orange Method for enriching a digital content with spontaneous data
US11954698B2 (en) 2017-03-14 2024-04-09 Orange Method for enriching a digital content with spontaneous data
US12042260B2 (en) 2018-12-20 2024-07-23 Panasonic Intellectual Property Management Co., Ltd. Biometric apparatus, biometric method, and non-transitory computer-readable storage medium
CN109828662A (en) * 2019-01-04 2019-05-31 杭州赛鲁班网络科技有限公司 A kind of perception and computing system for admiring commodity
CN113749656A (en) * 2021-08-20 2021-12-07 杭州回车电子科技有限公司 Emotion identification method and device based on multi-dimensional physiological signals
CN113749656B (en) * 2021-08-20 2023-12-26 杭州回车电子科技有限公司 Emotion recognition method and device based on multidimensional physiological signals

Also Published As

Publication number Publication date
MX2009002419A (en) 2009-03-16
US20100174586A1 (en) 2010-07-08
EP2062206A2 (en) 2009-05-27
WO2008030542A3 (en) 2008-06-26
EP2062206A4 (en) 2011-09-21
JP2010503110A (en) 2010-01-28
BRPI0716106A2 (en) 2014-07-01
US20080065468A1 (en) 2008-03-13
CA2663078A1 (en) 2008-03-13
JP5249223B2 (en) 2013-07-31

Similar Documents

Publication Publication Date Title
JP5249223B2 (en) Methods for measuring emotional responses and preference trends
US11200964B2 (en) Short imagery task (SIT) research method
Li et al. Current and potential methods for measuring emotion in tourism experiences: A review
Tracy et al. Show your pride: Evidence for a discrete emotion expression
CN101512574A (en) Methods for measuring emotive response and selection preference
Bonoma et al. Nonverbal communication in marketing: Toward a communicational analysis
US20100004977A1 (en) Method and System For Measuring User Experience For Interactive Activities
US20090119154A1 (en) Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20120164613A1 (en) Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20090118593A1 (en) Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20090132275A1 (en) Determining a demographic characteristic of a user based on computational user-health testing
Berger et al. Assessing advertising effectiveness: The potential of goal‐directed behavior
Białowąs et al. Eye-tracking in marketing research
US20090172540A1 (en) Population cohort-linked avatar
Schwarzkopf Measurement devices and the psychophysiology of consumer behaviour: A posthuman genealogy of neuromarketing
Drozdova Measuring emotions in marketing and consumer behavior: is face reader an applicable tool?
Wu et al. Neurophysiology of sensory imagery: An effort to improve online advertising effectiveness through science laboratory experimentation
Lopatovska Emotional aspects of the online information retrieval process
Chiu et al. Redesigning the user interface of a healthcare management system for the elderly with a systematic usability testing method
Roemer et al. Eye tracking as a research method for social media
Wang et al. An exploratory study on consumers’ attention towards social media advertising: An electroencephalography approach
Soleymani Implicit and Automated Emotional Tagging of Videos
Hutcherson Measuring arousal through physiological responses to packaging designs: Investigating the validity of electrodermal activity as a measure of arousal in a realistic shopping environment
Pierce Facial Expression Intelligence Scale (FEIS): Recognizing and interpreting facial expressions and implications for consumer behavior
García Díaz Neuromarketing

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 200780033304.6
Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 07837845
Country of ref document: EP
Kind code of ref document: A2

WWE Wipo information: entry into national phase
Ref document number: 2007837845
Country of ref document: EP

WWE Wipo information: entry into national phase
Ref document number: 1135/DELNP/2009
Country of ref document: IN

WWE Wipo information: entry into national phase
Ref document number: MX/A/2009/002419
Country of ref document: MX

WWE Wipo information: entry into national phase
Ref document number: 2663078
Country of ref document: CA

ENP Entry into the national phase
Ref document number: 2009527416
Country of ref document: JP
Kind code of ref document: A

NENP Non-entry into the national phase
Ref country code: DE

REG Reference to national code
Ref country code: BR
Ref legal event code: B01E
Ref document number: PI0716106
Country of ref document: BR
Free format text: THE APPLICANT MUST CLARIFY THE DISCREPANCY BETWEEN THE INVENTOR NAMES GIVEN IN INTERNATIONAL PUBLICATION WO 2008/030542 OF 13/03/2008 AND THOSE GIVEN IN THE ASSIGNMENT DOCUMENT FILED WITH PETITION 20090018681 OF 26/02/2009

ENP Entry into the national phase
Ref document number: PI0716106
Country of ref document: BR
Kind code of ref document: A2
Effective date: 20090226