CN112541668A - Automatic evaluation method and system for product experience

Automatic evaluation method and system for product experience

Info

Publication number
CN112541668A
CN112541668A (application CN202011429565.3A)
Authority
CN
China
Prior art keywords
product
experience
eye movement
electroencephalogram
tested
Prior art date
Legal status
Pending
Application number
CN202011429565.3A
Other languages
Chinese (zh)
Inventor
唐瑞鸿
韩可人
Current Assignee
Beijing Intention Technology Co ltd
Original Assignee
Beijing Intention Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Intention Technology Co ltd filed Critical Beijing Intention Technology Co ltd
Priority to CN202011429565.3A
Publication of CN112541668A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393 Score-carding, benchmarking or key performance indicator [KPI] analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Computer Interaction (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Neurosurgery (AREA)
  • Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Dermatology (AREA)
  • Game Theory and Decision Science (AREA)
  • Biomedical Technology (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention provides a method and system for automatically evaluating product experience. The method comprises the following steps: receiving electroencephalogram signals acquired by an electroencephalogram device while a subject experiences a product, to obtain time-varying electroencephalogram data; receiving eye movement signals acquired by an eye movement device while the subject experiences the product, to obtain time-varying eye movement data; dividing the product into a plurality of interest areas based on the visual elements of the product, and calculating, based on the eye movement data, visual attention information for each interest area while the subject observes the product; calculating, based on the electroencephalogram data, a vision-related experience emotion index value for each interest area during the period in which the subject attends to it; and evaluating each visual element of the product based on the visual attention information and the vision-related experience emotion to obtain a first evaluation result. Compared with the traditional method of scoring after the experience, the automatic evaluation method for product experience offers better real-time performance, continuity, objectivity and specificity.

Description

Automatic evaluation method and system for product experience
Technical Field
The invention relates to the technical field of neuroscience, and in particular to a method and system for automatically evaluating product experience that evaluate the product experience through neurological monitoring.
Background
To cater to user preferences, product manufacturers often conduct user experience surveys on products such as food (for example, taste experience surveys) in order to evaluate the product experience and better develop and upgrade the product. Current product experience assessment mostly relies on subjective scoring of designated aspects after the consumer has experienced the product, for example scoring the overall appearance, the smell, the taste, the sweetness, and so on. Such scoring covers only limited, isolated aspects of a product, and it is difficult to distinguish the detailed factors that shape the consumer's experience of those aspects. In addition, experiencing a product is a continuous process that generally comprises several links, and the accumulated experience of each link ultimately affects whether the consumer purchases the product again; the influence of each link on the consumer's overall experience is difficult to capture with the conventional scoring approach. Moreover, subjective scores sometimes fail to reveal the consumer's true feelings. Therefore, the conventional approach of evaluating a product through consumer experience scores often cannot intuitively and accurately assess the various aspects of the product, so product development and upgrading lack focus and are limited, which in turn limits growth of the repurchase rate.
How to reflect more truly and accurately the consumers' experience of each element and each link of the product is a problem that urgently needs to be solved.
Disclosure of Invention
In view of this, embodiments of the present invention provide an automatic product experience assessment method and system capable of detecting the objective experience of a consumer throughout a complete product experience process, so as to eliminate or mitigate one or more defects in the prior art.
The technical scheme of the invention is as follows:
in one aspect of the present invention, a method for automatically evaluating product experience is provided, which includes the following steps:
receiving electroencephalogram signals acquired by an electroencephalogram device from a plurality of subjects during the product experience process to obtain time-varying electroencephalogram data, wherein the product experience process comprises a product observation stage and a product use stage;
receiving eye movement signals acquired by an eye movement device from the plurality of subjects during the product experience process to obtain time-varying eye movement data;
dividing the product into a plurality of interest areas based on the visual elements of the product, and calculating, based on the eye movement data, visual attention information for each interest area of the product during the subjects' product observation stage, wherein the visual attention information comprises at least one of attention duration, arrival rate and browsing order information;
calculating, based on the electroencephalogram data, a vision-related experience emotion index value for each interest area of the product during the period in which the subject attends to it;
and evaluating each visual element of the product based on the visual attention information and the vision-related experience emotion to obtain a first evaluation result.
Optionally, the start time and end time of different experience events in the product use stage are marked in the electroencephalogram data, and an event-related experience emotion index value during each experience event is calculated based on the electroencephalogram data during that event, wherein the product use stage comprises at least one of the following experience events: an event of opening the package, an event of experiencing a product function, and an event of picking up the package; and the product function elements involved in each experience event are evaluated based on the event-related experience emotion index values to obtain a second evaluation result.
Optionally, the packaged product is boxed or bottled yogurt or a beverage, the product use stage is a product tasting experience stage, and the event of experiencing a product function is a product tasting event.
Optionally, the product tasting experience process includes a plurality of experience events of the same type, and the method further comprises: evaluating elements related to experience change across the experience events of the same type based on the event-related experience emotion index values corresponding to the same type of event at different times, to obtain a third evaluation result.
Optionally, the method further comprises: obtaining staged evaluation results for different stages of the product experience process and a comprehensive evaluation result based on at least two of the first evaluation result, the second evaluation result and the third evaluation result.
Optionally, the method further comprises generating a product upgrade suggestion based on each product evaluation result.
Optionally, the attention duration is an average attention duration obtained from the eye movement data of the plurality of subjects, and the experience emotion index value is an average experience emotion index value obtained from the electroencephalogram data of the plurality of subjects.
Optionally, the method further comprises preprocessing the electroencephalogram data before calculating the emotion index value, the preprocessing comprising filtering and removal of artifact components.
In another aspect of the present invention, there is also provided an automatic evaluation system for product experience, which includes a processor and a memory, the memory having stored therein computer instructions, the processor being configured to execute the computer instructions stored in the memory, and when the computer instructions are executed by the processor, the system implements the steps of the method as described above.
In another aspect of the present invention, a computer-readable storage medium is also provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the method as set forth above.
According to the automatic evaluation method and system for product experience of the present invention, electroencephalogram data and eye movement data are introduced into the product experience evaluation system. The evaluation results provided are more objective and direct, and can uncover product experience problems at a subconscious level of which the consumer is not aware. In addition, because the invention monitors the whole product experience process with electroencephalography and eye tracking, it offers real-time monitoring, continuity, specificity, objectivity and comprehensiveness compared with the traditional after-the-fact scoring method, thereby better promoting product improvement and upgrading and increasing the repurchase rate.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It will be appreciated by those skilled in the art that the objects and advantages that can be achieved with the present invention are not limited to the specific details set forth above, and that these and other objects that can be achieved with the present invention will be more clearly understood from the detailed description that follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. For purposes of illustrating and describing some portions of the present invention, corresponding parts of the drawings may be exaggerated, i.e., may be larger, relative to other components in an exemplary apparatus actually manufactured according to the present invention. In the drawings:
fig. 1 is a flowchart illustrating an automatic product experience assessment method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of the distribution of electrodes on the head of a common electroencephalogram device.
Fig. 3 is a schematic diagram of the electroencephalogram data of 3 subjects during the package observation period in an embodiment of the present invention.
Fig. 4 is a schematic diagram of visual information, emotion index information and experience score obtained based on eye movement data and electroencephalogram data in an embodiment of the present invention.
FIG. 5 is a schematic diagram of the evaluation results obtained for the subjects during the product observation stage and the product tasting stage according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of an experience evaluation result corresponding to a plurality of tasting actions according to an embodiment of the present invention.
Fig. 7 is a schematic diagram of visual information and emotional experience information obtained by analyzing a larger number of interest areas on a yogurt packaging box in another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the structures and/or processing steps closely related to the scheme according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
It is also noted herein that the term "coupled," if not specifically stated, may refer herein to not only a direct connection, but also an indirect connection in which an intermediate is present.
In order to solve problems with product experience evaluation in the prior art, such as lack of real-time feedback, incompleteness, lack of objectivity, inaccuracy and weak specificity, the present invention introduces electroencephalogram data and eye movement data into the product experience evaluation system, thereby providing an automatic evaluation method and system for product experience. The resulting product experience assessment is more comprehensive, real-time, continuous, objective, accurate and targeted; targeted improvement and upgrading can be carried out based on the assessment results, and the product repurchase rate can be greatly improved.
Fig. 1 is a schematic flow chart of a method for automatically evaluating product experience according to an embodiment of the present invention. As shown in Fig. 1, the method of this embodiment includes the following steps:
step S110, receiving electroencephalogram signals collected by the electroencephalogram equipment in the experience process of a plurality of tested products to obtain electroencephalogram data of the electroencephalogram signals changing along with time.
In this step, an electroencephalogram device is used to monitor the electroencephalogram signals of a plurality of subjects during the product experience process, and the electroencephalogram data of the subjects during the experience are obtained from the device. The electroencephalogram data are electroencephalograms (EEG), which record the electrical changes accompanying brain activity and reflect, at the cerebral cortex or scalp surface, the overall electrophysiological activity of brain nerve cells. When a user's emotion changes, the brain waves change correspondingly, so the user's emotion can be recognized from the brain waves. With the development of artificial intelligence, brain-wave-based emotion recognition technology has advanced rapidly.
The electroencephalogram signals (EEG) can be recorded with a 64-channel easycap EEG acquisition device (an electrode cap) laid out according to the extended international 10-20 system. Fig. 2 is a schematic diagram of the electrode distribution on the head when the EEG device monitors the EEG signals. In an embodiment of the invention, the sampling frequency of the EEG device may be 500 Hz, the scalp impedance may be kept below 5 kΩ, the ground electrode is AFz, and the reference electrode is FCz. For monitoring emotion-related EEG signals, 8 electrodes may be used on the right side of the head: F2, F4, F6, F8, FC2, FC4, FC6 and FT8, and 8 electrodes on the left side: F1, F3, F5, F7, FC1, FC3, FC5 and FT7. A greater number of electrodes may also be used. While the subject experiences the product, the EEG device is used to obtain time-varying electroencephalogram data.
In the present embodiment, the product may be a packaged product, such as a fully packaged food, for example boxed yogurt or bagged biscuits; the invention is not limited to these examples, and the product may be any commodity that is convenient to experience and has a plurality of design elements. The product experience process may include at least a plurality of different stages, including a product observation stage and a product use stage, and each stage may include the same or different experience events; for example, the product use stage may include an event of opening the product package and an event of experiencing a product function (e.g., a product tasting event), and the event of experiencing a product function may itself include a plurality of repeated function experience events. As an example, where the packaged product is a packaged food (such as boxed or bottled food, for example boxed yogurt or a bottled beverage), the product use stage is a product tasting experience stage and the event of experiencing a product function is a product tasting event. In the embodiment of the invention, the experience at each stage can be evaluated based on the eye movement data and the electroencephalogram data of that stage, so that the different design elements affecting the consumer experience at each stage can be improved and the product repurchase rate increased.
As mentioned above, in the case where the product is a food product, the product experience process may cover everything from picking up the packaged product and opening the package through to after tasting the product. The whole product experience process may comprise several stages, such as observing the product package, opening the package, using (tasting) the product, and collecting the product remains after tasting. In the embodiment of the invention, the electroencephalogram device monitors the subject's EEG throughout the product experience process, thereby generating electroencephalogram data that record the whole experience.
Step S120, receiving eye movement signals acquired by the eye movement device from the plurality of subjects during the product experience process to obtain time-varying eye movement data.
In the embodiment of the present invention, an SMI eye tracker, for example, may be used as the eye movement device to collect the eye movement indicators of each subject during the product experience process, thereby generating gaze tracking data, i.e., eye movement data, for each subject; the eye movement data may be exported with the SMI software. The SMI eye tracker also provides a video recording function. Of course, other eye trackers may be used to collect the eye movement indicators and generate time-varying eye movement data.
Throughout the subject's product experience, the eye tracker and the electroencephalogram device acquire the subject's eye movement data and electroencephalogram data synchronously. Therefore, in the embodiment of the invention, a marker can be used to label, in both the eye movement data and the electroencephalogram data, the time points at which the electroencephalogram device and the eye movement device simultaneously start or stop data acquisition, so as to align the electroencephalogram data with the eye movement data.
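The alignment described above can be illustrated with a short sketch. The snippet below is a minimal example, assuming the sample indices of the shared start marker in each stream are already known (the names eeg_start_idx and eye_start_idx are illustrative, not part of any vendor API):

    import numpy as np

    def align_streams(eeg, eeg_sfreq, eeg_start_idx, eye, eye_sfreq, eye_start_idx):
        """Trim the EEG and eye-tracking streams so that both begin at the
        shared start marker and cover the same duration."""
        eeg = eeg[..., eeg_start_idx:]            # EEG samples after the marker
        eye = eye[eye_start_idx:]                 # eye samples after the marker
        duration = min(eeg.shape[-1] / eeg_sfreq, len(eye) / eye_sfreq)
        return eeg[..., :int(duration * eeg_sfreq)], eye[:int(duration * eye_sfreq)]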
Step S130, dividing the product into a plurality of interest areas based on the visual elements of the product, and calculating, based on the eye movement data, the visual attention information of each interest area of the product during the subjects' product observation stage.
In embodiments of the present invention, the visual elements of the product are preferably the visual elements on the product packaging. For example, for boxed yogurt, each visual element designed on the box can serve as an Area of Interest (AOI); since each face of the box carries different text and/or graphics and serves as a design element for a different area, one box may include a plurality of areas of interest. For bottled yogurt, the visual elements on the yogurt bottle, including the bottle body and the text and graphics on it, can likewise be divided into multiple regions of interest.
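As a purely illustrative sketch of how the interest areas might be represented in software, each AOI can be stored as a named rectangle in the coordinate frame of the scene video, with a simple containment test for mapping gaze points to AOIs (the AOI names and coordinates below are hypothetical):

    from dataclasses import dataclass

    @dataclass
    class AOI:
        """One interest area: a visual element of the package as a rectangle
        in the scene-video coordinate frame."""
        name: str
        x_min: float
        y_min: float
        x_max: float
        y_max: float

        def contains(self, x: float, y: float) -> bool:
            return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    # Hypothetical AOIs for a yogurt box; the coordinates are placeholders.
    aois = [
        AOI("front_illustration", 100, 50, 300, 150),
        AOI("brand_logo", 320, 40, 420, 110),
        AOI("ingredient_text", 320, 200, 520, 320),
    ]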
Based on the eye movement indicators acquired by the eye tracker, the visual attention information of each interest area of the product during the subject's product observation stage can be calculated from the eye movement data for that interest area. The visual attention information may include, for example, one or more of the following: attention duration, arrival rate, first-view time, first-view duration, and attention order. The attention duration is the total time each subject's gaze dwells in a specific interest area. The arrival rate is the number of subjects who looked at a specific interest area divided by the total number of subjects. The attention order is the order in which each subject views the packaging elements. The first-view time is the time at which a subject first fixates a specific interest area, and the first-view duration is the length of that first fixation.
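A minimal sketch of these metrics, assuming the eye tracker export has been loaded into a fixation table with columns subject, aoi, onset and duration (the column names are assumptions, not the SMI export format), might look as follows:

    import pandas as pd

    def attention_metrics(fixations: pd.DataFrame, n_subjects: int) -> pd.DataFrame:
        """Per-AOI attention duration, arrival rate and mean first-view time
        computed from a fixation table (illustrative sketch)."""
        by_aoi = fixations.groupby("aoi")
        dwell = by_aoi["duration"].sum() / n_subjects          # mean attention duration
        arrival = by_aoi["subject"].nunique() / n_subjects     # arrival rate
        first_view = (fixations.groupby(["aoi", "subject"])["onset"].min()
                      .groupby(level="aoi").mean())            # mean first-view time
        return pd.DataFrame({"attention_duration": dwell,
                             "arrival_rate": arrival,
                             "first_view_time": first_view})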
Based on this visual attention information, it can be determined which interest areas attract more or less of the subjects' attention.
Different subjects are likely to experience the interest areas on the product packaging differently. Therefore, in one embodiment, in order to reflect the emotional experience that the visual elements of the product give to the majority of subjects, visual attention information such as the attention duration, first-view time and first-view duration of each interest area may be an average value obtained by averaging the corresponding values over a plurality of subjects, and the browsing order information may be determined from the browsing order shared by the largest proportion of subjects.
Step S140, calculating, based on the electroencephalogram data, the vision-related experience emotion index values of the subject during the attention periods of the interest areas.
Although step S130 can determine from the visual attention information which interest areas attract more or less of the subject's attention, it is difficult to tell whether the emotional experience that a specific interest area gives the subject is positive or negative. Therefore, in this step, the emotion index value evoked by vision during the attention period of each interest area, i.e., the vision-related experience emotion index value, is detected based on the electroencephalogram data.
Because the eye movement data and the electroencephalogram data are aligned, once the periods during which each subject attends to each interest area of the product (including the product packaging) are determined from the eye movement information, the emotion index synchronized with those periods can be obtained, i.e., the emotional experience evoked by the visual elements of the product (including the product packaging).
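Given the alignment, the vision-related emotion index per interest area can be obtained by averaging a time-resolved emotion index over the gaze intervals of that area. The sketch below assumes a per-sample emotion series on the EEG clock and a dictionary of gaze intervals per AOI (both names are illustrative):

    import numpy as np

    def aoi_emotion_index(emotion_ts: np.ndarray, sfreq: float, gaze_intervals: dict) -> dict:
        """Average a time-resolved emotion index over the gaze intervals of each AOI.
        gaze_intervals maps an AOI name to a list of (start_s, end_s) tuples."""
        result = {}
        for aoi, intervals in gaze_intervals.items():
            chunks = [emotion_ts[int(t0 * sfreq):int(t1 * sfreq)] for t0, t1 in intervals]
            samples = np.concatenate(chunks) if chunks else np.array([])
            result[aoi] = float(np.mean(samples)) if samples.size else float("nan")
        return result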
The experience emotion indices evoked by the interest areas on the product packaging are likely to differ between subjects; for the same interest area, some subjects may develop a positive emotion and others a negative one. Therefore, in one embodiment, in order to reflect the emotional experience that the visual elements of the product give to the majority of subjects, the calculated vision-related experience emotion index value for the attention period of each interest area may be obtained by averaging the experience emotion index values over a plurality of subjects.
According to the established frontal asymmetry theory, the asymmetric activation of the subject's frontal alpha band reflects indicators such as motivation and emotion, so the asymmetry of the frontal alpha band can be used to evaluate the consumer's real-time emotional experience. Accordingly, in the embodiment of the invention the subject's emotional experience index can be calculated as follows: the mean alpha-band (8-13 Hz) power of the left 8 electrodes (F1, F3, F5, F7, FC1, FC3, FC5 and FT7) is subtracted from the mean alpha-band (8-13 Hz) power of the right 8 electrodes (F2, F4, F6, F8, FC2, FC4, FC6 and FT8), finally yielding the emotional experience data of the subject (consumer). All the collected data are then combined with other data of the same category in a database and converted into percentiles for further analysis, where data falling between the 45th and 55th percentiles are defined as a neutral emotional experience, data above the 55th percentile as a positive emotional experience, and data below the 45th percentile as a negative emotional experience. The specific method of deriving emotional experience from electroencephalogram data can be implemented with existing techniques and is not described further here.
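A minimal sketch of the asymmetry index described above, computing band power with a Welch periodogram from SciPy over the electrode groups named in the text (the data layout and the choice of Welch are assumptions; the patent does not prescribe a particular spectral estimator):

    import numpy as np
    from scipy.signal import welch

    LEFT = ["F1", "F3", "F5", "F7", "FC1", "FC3", "FC5", "FT7"]
    RIGHT = ["F2", "F4", "F6", "F8", "FC2", "FC4", "FC6", "FT8"]

    def alpha_asymmetry(data: np.ndarray, ch_names: list, sfreq: float = 500.0) -> float:
        """Mean right-hemisphere alpha (8-13 Hz) power minus mean left-hemisphere
        alpha power, for preprocessed EEG of shape (n_channels, n_samples)."""
        def alpha_power(ch: str) -> float:
            freqs, psd = welch(data[ch_names.index(ch)], fs=sfreq, nperseg=int(2 * sfreq))
            band = (freqs >= 8) & (freqs <= 13)
            return float(np.trapz(psd[band], freqs[band]))
        return float(np.mean([alpha_power(c) for c in RIGHT]) -
                     np.mean([alpha_power(c) for c in LEFT]))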
Before the emotion index value is calculated, the electroencephalogram data may be preprocessed. The preprocessing may include filtering and removal of artifact components, which are standard means in current EEG signal processing and are not described further here.
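As an illustration only, a typical preprocessing pass of this kind could be written with MNE-Python as below; the component indices to exclude are left as a manual, inspection-based step, and nothing here is mandated by the patent:

    import mne

    def preprocess(raw: mne.io.BaseRaw) -> mne.io.BaseRaw:
        """High-pass filter the continuous EEG and remove artifact components
        with ICA (sketch; component selection is done by visual inspection)."""
        raw = raw.copy().filter(l_freq=0.5, h_freq=None)      # 0.5 Hz high-pass
        ica = mne.preprocessing.ICA(n_components=20, random_state=0)
        ica.fit(raw)
        ica.exclude = []   # fill in ocular/muscle/cardiac component indices after inspection
        ica.apply(raw)
        return raw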
Step S150, evaluating each visual element of the product based on the visual attention information and the vision-related experience emotion to obtain a first evaluation result.
That is, during product observation, each visual element of the product can be evaluated from the visual attention information obtained from the eye movement data together with the corresponding experience emotion, yielding an evaluation result, which may be called the first evaluation result.
For example, for a square-box package, the following evaluation results may be obtained from the electroencephalogram and eye movement indicators:
(1) Most subjects first attend to the illustrations and logo on the front of the package; more than half of the consumers then attend to the brand logo and icon, followed by the two posters on the side, and the remaining elements are attended to later.
(2) The front-side elements attract consumers' attention first, but they offer no advantage in design (color matching and material), and consumers have a poor emotional experience after viewing them.
(3) The side elements are attended to later; after viewing the promotional copy and examples there, most of the design elements give consumers a positive emotional experience.
(4) The copy and slogans are very helpful in building positive emotion in consumers; it is suggested that the key sentences conveying the core message be extracted from the copy and placed on the front of the package.
Based on the above evaluation results, the product designer can improve the visual elements of the product, thereby meeting consumer needs, improving the consumer experience and raising the product repurchase rate.
In addition, in the embodiment of the invention, not only can the subject's experience in the product observation stage be evaluated, but also the experience at other stages of the whole product experience process (such as the product use stage).
For example, where the product is a packaged product, the automatic evaluation of product experience in embodiments of the present invention may further include:
Step S160, marking the start time and end time of different experience events in the product use stage based on the video data obtained with the video recording function of the eye movement device, and calculating the event-related experience emotion index value during each experience event based on the electroencephalogram data during that event.
In this step, the product use stage may include an event of opening the product package and/or an event of experiencing a product function, but may also include other events, such as an event of picking up the product package.
For example, for a boxed yogurt product, the start time and end time of events such as unscrewing the cap, pouring the yogurt, drinking the yogurt and experiencing the aftertaste can be determined from the video recorded by the eye tracker. The electroencephalogram data within the period bounded by the start and end times of each experience event can then be extracted, and the event-related experience emotion index value during each event calculated from it.
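A sketch of this event-wise segmentation, assuming the event boundaries have been read off the eye-tracker video and expressed in seconds on the aligned EEG clock (the event names and times below are hypothetical):

    import numpy as np

    # Hypothetical event boundaries (seconds) annotated from the eye-tracker video.
    events = {
        "unscrew_cap":  (12.0, 18.5),
        "pour_yogurt":  (18.5, 24.0),
        "drink_yogurt": (24.0, 31.0),
        "aftertaste":   (31.0, 45.0),
    }

    def event_emotion(emotion_ts: np.ndarray, sfreq: float) -> dict:
        """Mean emotion index during each marked experience event."""
        return {name: float(np.mean(emotion_ts[int(t0 * sfreq):int(t1 * sfreq)]))
                for name, (t0, t1) in events.items()}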
For example, the subject's emotion while unscrewing the cap and while tasting the yogurt can be calculated. If the cap is difficult to unscrew, it may cause the subject to develop a negative emotion; if the yogurt is too sour or too sweet, it may likewise lead to a negative emotion.
Step S170, evaluating the product function elements involved in each experience event based on the event-related experience emotion index values to obtain a second evaluation result.
The product function elements referred to here are the elements of the product that affect the experience of using it during the product use stage. Taking boxed yogurt as an example, the product function elements may include: design elements that affect the cap-opening operation (such as whether the cap is designed to open easily), design elements that affect the drinking experience (such as whether the opening is convenient for drinking the yogurt), and product quality elements that affect the aftertaste experience (including yogurt qualities such as sweetness, acidity and viscosity).
In this step, the elements that affect the consumer experience during product use are evaluated from the emotion indices evoked during use, so that the product elements affecting the use experience can be improved in a targeted manner and the product repurchase rate further increased.
For example, the product elements related to events such as unscrewing the cap, pouring the yogurt, drinking the yogurt and experiencing the aftertaste can be evaluated based on the event-related experience emotion index values corresponding to those events; for instance, the reasonableness of the cap design, the bottle-mouth design, the yogurt taste and the components affecting the aftertaste can be evaluated to obtain an evaluation result reflecting the emotion index values (which may be called the second evaluation result), so that the product elements can be improved in a targeted manner according to the second evaluation result to further increase the repurchase rate.
In addition, in the embodiment of the invention, for products such as boxed or bottled yogurt or beverages, the product use stage includes a product tasting experience stage, and the event of experiencing a product function is the product tasting event. Where the product tasting experience process includes a plurality of experience events of the same type, for example where the act of drinking yogurt is repeated several times, the automatic evaluation method may further include the following step:
evaluating the elements related to experience change across experience events of the same type, based on the event-related experience emotion index values corresponding to the same type of event at different times, to obtain an evaluation result (which may be called the third evaluation result).
For example, a carton of yogurt is drunk in several sips, and the taste experienced on the first sip may differ from that on the second or n-th sip; that is, the taste experience at the beginning and at the end of the product may differ. The experience emotion index values produced by repeated experience events occurring at different times can therefore be monitored, and the experience-related elements evaluated from them to obtain the third evaluation result. In this way, the reasons for the taste differences at different points can be analyzed proactively in production, so that the product can be upgraded and its repurchase rate increased.
In some embodiments of the present invention, the product observation stage and the product use stage may occupy different periods or may overlap. For example, during the product use stage (e.g., the yogurt tasting stage), the subject may look at the product or its packaging while experiencing the aftertaste. In that case the eye movement behaviour and the product use behaviour can be analyzed in parallel to obtain the corresponding electroencephalogram indices.
Based on the above steps, the first and second evaluation results respectively represent staged evaluation results obtained by evaluating the visual elements and the functional elements of the product in two different stages, namely the product observation stage and the product use stage, while the third evaluation result represents an evaluation based on the experience emotion index values of the same experience event at different times within one stage. Presenting the first and second evaluation results stage by stage yields staged evaluation results for the whole experience process, and integrating them yields a comprehensive evaluation result. The deeper causes of changes in the experience of the same event over time can be analyzed from the third evaluation result, and integrating the second and third evaluation results gives a comprehensive evaluation of the factors affecting the subject's use experience.
Based on the above evaluation results, constructive product upgrade suggestions can be generated. The upgrade suggestions can be generated automatically by the system from the available product design information, for example targeted at specific product design elements. For instance, if the text in the interest area corresponding to the ingredient description on the package is small, the eye movement data show a high arrival rate and a long attention duration for that area, the subject's emotion there is negative, yet the emotion when tasting the product is positive, it can be inferred that the text is too small for users to read clearly, degrading the experience, and the suggestion to enlarge the text can be produced; a minimal rule of this kind is sketched below.
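The rule behind the example could be expressed, purely illustratively, as a simple check over the per-AOI metrics; the thresholds below are placeholders and are not taken from the patent:

    from typing import Optional

    def suggest_upgrade(aoi_name: str, arrival_rate: float,
                        dwell_time_s: float, emotion_percentile: float) -> Optional[str]:
        """Flag an AOI that is widely seen and studied for a long time but leaves
        a negative impression, and suggest enlarging or redesigning it."""
        if arrival_rate > 0.8 and dwell_time_s > 3.0 and emotion_percentile < 45:
            return (f"'{aoi_name}' draws long attention but a negative response; "
                    "consider enlarging or redesigning this element.")
        return None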
Through the above method steps, by combining the eye-movement video with the emotional experience computed from the corresponding segments of EEG data, the emotional experience of each part of the product experience process can finally be obtained, so that the product experience assessment is more comprehensive, real-time, continuous, objective, accurate and targeted; targeted improvement and upgrading can be carried out based on the assessment results, and the product repurchase rate can be greatly improved.
The automatic evaluation method of product experience according to the embodiment of the present invention is described below using subjects tasting yogurt as an example.
First, subjects meeting the requirements, e.g. individuals of a particular age and/or gender, are recruited. After enrollment, informed consent and privacy agreements are signed with the subjects, and pre-experiment preparations are carried out, such as washing the hair (to clean off dead skin, reduce impedance and facilitate data collection), fitting the electroencephalogram device (scrub and conductive paste can be used beforehand to further reduce the head impedance to below 5 kΩ), and fitting and calibrating the eye movement device.
After calibration, the subject's data can be collected. The subject sits in a comfortable chair, with a fixation cross on the screen serving as the baseline, and is presented, by voice or on the screen, with instructions such as: "Next you will drink a carton of yogurt. Please taste it according to your usual drinking habits, and when you have finished, tell us that you are done and whether you have any questions."
If the subject has no questions and is ready to begin, the subject experiences the product in the usual way of drinking it (which may include looking at the package design, opening the product, tasting the product, putting the package away, and so on). Throughout this period the electroencephalogram device and the eye movement device record the whole process. The electroencephalogram signals can be recorded with a 64-channel easycap EEG acquisition device at a sampling rate of 500 Hz, with AFz as the ground electrode and FCz as the reference electrode; the eye movement data can be recorded as gaze tracking data with an SMI eye tracker, together with a video recording. The two devices record simultaneously; a marker is inserted into both the EEG and the eye movement data at the start of the experiment and another at the end, so that the EEG and eye movement data are aligned and of equal length.
Based on the eye movement data, all or some of the eye movement indicators related to the areas of interest (AOIs), i.e. the visual information of each AOI, can be exported with the SMI software, including one or more of the following: attention duration, arrival rate, first-view time, first-view duration and attention order, where:
(1) Attention duration: the total time each subject's gaze dwells in a specific interest area.
(2) Arrival rate: the number of subjects who looked at a specific interest area divided by the total number of subjects.
(3) Attention order: the order in which the product packaging elements are viewed.
Electroencephalogram indices are used for evaluation throughout the product experience. The product experience can be divided into a product observation stage and a product tasting stage; the product observation stage may comprise the step of looking at the packaging, and the product tasting stage may comprise the steps of picking up the product, unscrewing the cap, pouring the first sip, drinking the first sip, pouring (averaged over several repetitions), drinking (averaged over several repetitions), pouring the last sip, drinking the last sip, experiencing the aftertaste and putting the product away.
The emotion index can be obtained based on the electroencephalogram data in the following way:
(1) data pre-processing
The data preprocessing may include: 0.5 Hz high-pass filtering; visual inspection and manual removal of data segments distorted by excessive subject movement; and removal of artifact components by independent component analysis (ICA), such as ocular, muscle and cardiac artifacts, bad channels and 50 Hz line noise.
(2) Re-referencing: the average of the two electrodes TP9 and TP10 is calculated and subtracted from each EEG channel. The TP9 and TP10 electrodes are located at the bilateral mastoid sites, one of the commonly used reference locations in EEG acquisition.
(3) Calculating the emotional experience index: the mean alpha-band (8-13 Hz) power of the left 8 electrodes (F1, F3, F5, F7, FC1, FC3, FC5 and FT7) is subtracted from the mean alpha-band (8-13 Hz) power of the right 8 electrodes (F2, F4, F6, F8, FC2, FC4, FC6 and FT8) to obtain the consumer's emotional experience. The emotional experience data can be converted to percentiles, where values between the 45th and 55th percentiles are defined as a neutral emotional experience, values above the 55th percentile as positive, and values below the 45th percentile as negative; a sketch of this conversion follows the list. Other relative numerical representations are of course possible, for example positive values for a positive emotional experience, negative values for a negative one, and 0 for a neutral one.
(4) Emotion index segmentation: the mean of the emotion index during a specific event is calculated according to the labels derived from the eye movement data, giving the emotional experience of that event. The mean of the emotion index during a specific event may be taken for each occurrence at each stage of the whole product experience.
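The percentile conversion and the three-way classification of step (3) can be sketched as follows, where the reference pool stands in for the "other data of the same category in the database" mentioned above (its contents are assumed, not specified by the patent):

    import numpy as np

    def to_percentile(value: float, reference: np.ndarray) -> float:
        """Percentile rank of a raw asymmetry value within a reference pool."""
        return 100.0 * float(np.mean(reference <= value))

    def classify_emotion(percentile: float) -> str:
        """Map a percentile to a label using the 45% / 55% thresholds above."""
        if percentile > 55:
            return "positive"
        if percentile < 45:
            return "negative"
        return "neutral"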
The 3 rows of electroencephalogram data in Fig. 3 are the EEG data of 3 subjects while they observe the packaging in the product observation stage, from which the emotional experience index is computed; the attention periods of the individual interest areas are marked on the EEG based on the eye tracker's video recording. The EEG data shown in Fig. 3 have already been preprocessed.
Fig. 4 shows the visual information (arrival rate and total fixation time) and the emotion index information (emo.i) obtained from the eye movement and EEG data of the 3 subjects, where the total fixation time and emo.i are the averages of the corresponding results of the 3 subjects. As can be seen in Fig. 4, the subjects' emotion is positive for the first area of interest AOI 1, negative for the second area of interest AOI 2, and neutral for the third area of interest AOI 3. The three interest areas can be scored on this basis; AOI 2 receives the lowest score, indicating that it should be the main focus of improvement. The scoring can be done in various ways: for example, the score of an interest area may be obtained from the weight and score of each index, but the invention is not limited thereto.
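One way to turn the indices of Fig. 4 into a single AOI score is a weighted sum, as sketched below; the weights and the normalization are illustrative assumptions, since the patent leaves the scoring scheme open:

    def aoi_score(arrival_rate: float, dwell_norm: float,
                  emotion_percentile: float, weights=(0.3, 0.3, 0.4)) -> float:
        """Weighted combination of arrival rate (0..1), normalized dwell time (0..1)
        and the emotion percentile (rescaled to 0..1)."""
        w_arrival, w_dwell, w_emotion = weights
        return (w_arrival * arrival_rate
                + w_dwell * dwell_norm
                + w_emotion * emotion_percentile / 100.0)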
Similarly, by combining the eye movement data (such as the eye tracker video) with the electroencephalogram data of the product tasting stage, the emotion indices corresponding to the subject's experience events during that stage can be obtained. Example experience events are: picking up the product, unscrewing the cap, pouring the first sip, drinking the first sip, pouring (averaged over several repetitions), drinking (averaged over several repetitions), pouring the last sip, drinking the last sip, and experiencing the aftertaste.
Fig. 5 is a schematic diagram of the evaluation results obtained for the subjects in the product observation stage and the product tasting stage. As shown in Fig. 5, neither the packaging nor the drinking style (pouring) is liked by the consumers, but the emotional response to the taste of the product is positive. The packaging and the drinking mode are therefore what should be optimized.
To understand more deeply how the taste of the product affects the consumer experience over the whole tasting period, the experience of the multiple yogurt-drinking events was evaluated further. The tasting experience is evaluated with the electroencephalogram indices; as shown in Fig. 6, the emotional experience of each of the repeated events in the product tasting stage is calculated.
As shown in Fig. 6, the taste of the product is experienced as better early on, while the taste experience declines later; combined with interviews, the reason why the consumers' emotional experience drops in the later part of the product experience can be identified.
Fig. 7 is a schematic diagram of the visual information and emotional experience information obtained by analyzing a larger number of interest areas on a yogurt packaging box in an embodiment of the present invention. Based on the results shown in Fig. 7, the following evaluation results and corresponding improvement suggestions for the yogurt product can be obtained:
(1) Most subjects first attend to the illustrations and logo on the front of the package; more than half of the consumers then attend to the brand logo and icon, followed by the two posters on the side, and the remaining elements are attended to later.
(2) The front-side elements attract consumers' attention first, but they offer no advantage in design (color matching and material), and consumers have a poor emotional experience after viewing them.
(3) The side elements are attended to later; after viewing the promotional copy and examples there, most of the design elements give consumers a positive emotional experience.
(4) The copy and slogans are very helpful in building positive emotion in consumers; it is suggested that the key sentences conveying the core message be extracted from the copy and placed on the front of the package.
The method can detect the consumer's objective experience throughout a complete product experience, so that shortcomings can be optimized and the repurchase rate improved. According to the automatic evaluation method and system for product experience of the present invention, electroencephalogram data and eye movement data are introduced into the product experience evaluation system. The evaluation results provided are more objective and direct, and can uncover product experience problems at a subconscious level of which the consumer is not aware. In addition, because the invention monitors the whole product experience process with electroencephalography and eye tracking, it offers real-time monitoring, continuity, specificity, objectivity and comprehensiveness compared with the traditional after-the-fact scoring method, thereby better promoting product improvement and upgrading and increasing the repurchase rate.
Corresponding to the above method for automatically evaluating product experience, the present invention further provides an automatic product experience evaluation system, which comprises a processor and a memory, wherein the memory stores computer instructions and the processor is used to execute the computer instructions stored in the memory; when the computer instructions are executed by the processor, the system implements the steps of the method described above.
The present invention also relates to a storage medium on which computer program code may be stored, which when executed may implement various embodiments of the method of the present invention, and which may be a tangible storage medium such as an optical disk, a Random Access Memory (RAM), a memory, a Read Only Memory (ROM), an electrically programmable ROM, an electrically erasable programmable ROM, a register, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
It is to be understood that the invention is not limited to the specific arrangements and instrumentality described above and shown in the drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present invention are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications and additions or change the order between the steps after comprehending the spirit of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems, and methods described in connection with the embodiments disclosed herein may be implemented as hardware, software, or combinations of both. Whether this is done in hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, plug-in, function card, or the like. When implemented in software, the elements of the invention are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link. A "machine-readable medium" may include any medium that can store or transfer information. Examples of a machine-readable medium include electronic circuits, semiconductor memory devices, ROM, flash memory, Erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, Radio Frequency (RF) links, and so forth. The code segments may be downloaded via computer networks such as the internet, intranet, etc.
It should also be noted that the exemplary embodiments mentioned in this patent describe some methods or systems based on a series of steps or devices. However, the present invention is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments in the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for automated assessment of product experience, the method comprising the steps of:
receiving electroencephalogram signals acquired by an electroencephalogram device while a plurality of test subjects go through a product experience process, to obtain time-varying electroencephalogram data, wherein the product experience process comprises a product observation stage and a product use stage;
receiving eye movement signals acquired by an eye movement device from the plurality of test subjects during the product experience process, to obtain time-varying eye movement data;
dividing the product into a plurality of interest areas based on the visual elements of the product, and calculating, based on the eye movement data, visual attention information of the test subjects for each interest area of the product during the product observation stage, wherein the visual attention information comprises at least one of attention duration, arrival rate, and browsing order information;
calculating, based on the electroencephalogram data, vision-related experience emotion index values of the test subjects during the periods of attention to each interest area of the product;
and evaluating each visual element of the product based on the visual attention information and the vision-related experience emotion index values to obtain a first evaluation result.
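By way of illustration only and not as part of the claims, the following minimal Python sketch shows one way the per-interest-area attention metrics of claim 1 (attention duration, arrival rate, and first-visit time as a proxy for browsing order) and a window-averaged emotion index might be computed; the fixation record layout, the helper names, and the assumption that an emotion index series has already been derived from the electroencephalogram data are all hypothetical.

    from collections import defaultdict
    import numpy as np

    def interest_area_attention(fixations, n_subjects):
        """fixations: dicts with 'subject', 'area', 'start', 'end' (seconds)."""
        duration = defaultdict(float)      # summed attention duration per interest area
        reached_by = defaultdict(set)      # subjects who fixated the area at least once
        first_visit = defaultdict(list)    # first-fixation times, a proxy for browsing order
        seen = set()
        for f in sorted(fixations, key=lambda f: f["start"]):
            duration[f["area"]] += f["end"] - f["start"]
            reached_by[f["area"]].add(f["subject"])
            if (f["subject"], f["area"]) not in seen:
                seen.add((f["subject"], f["area"]))
                first_visit[f["area"]].append(f["start"])
        return {
            area: {
                "mean_duration": duration[area] / n_subjects,
                "arrival_rate": len(reached_by[area]) / n_subjects,
                "mean_first_visit": float(np.mean(first_visit[area])),
            }
            for area in duration
        }

    def interest_area_emotion(fixations, emotion_index, sfreq):
        """Average a per-sample emotion index over the time spent on each interest area."""
        scores = defaultdict(list)
        for f in fixations:
            i0, i1 = int(f["start"] * sfreq), int(f["end"] * sfreq)
            if i1 > i0:
                scores[f["area"]].append(float(np.mean(emotion_index[i0:i1])))
        return {area: float(np.mean(v)) for area, v in scores.items()}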
2. The method of claim 1, wherein the product is a packaged product, the method further comprising the steps of:
marking the starting time and the ending time of different experience events in the product use stage based on video data recorded by the eye movement device, and calculating an event-related experience emotion index value for each experience event based on the electroencephalogram data within that event, wherein the product use stage comprises at least one of the following experience events: an event of opening the product package and an event of experiencing a product function;
and evaluating the product function elements involved in each experience event based on the event-related experience emotion index values to obtain a second evaluation result.
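As an illustrative note only, a sketch of the event-related scoring described in claim 2: each experience event window (marked from the eye-tracker video) is scored by the mean of an assumed precomputed emotion index series; the function and field names are hypothetical.

    def event_emotion_scores(events, emotion_index, sfreq):
        """events: dicts with 'name', 'start', 'end' in seconds, taken from video marking."""
        scores = {}
        for e in events:
            i0, i1 = int(e["start"] * sfreq), int(e["end"] * sfreq)
            scores[e["name"]] = float(sum(emotion_index[i0:i1]) / max(i1 - i0, 1))
        return scores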
3. The method of claim 2, wherein the packaged product is a boxed or bottled yogurt or beverage, the product use stage is a product tasting experience stage, and the event of experiencing a product function is a product tasting event.
4. The method of claim 3, wherein the product tasting experience stage comprises a plurality of experience events of the same type, the method further comprising:
evaluating elements related to experience change across the experience events of the same type, based on the event-related experience emotion index values corresponding to the same type of experience events at different times, to obtain a third evaluation result.
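Purely as an illustration of claim 4, one simple way to quantify experience change across repeated events of the same type (for example, successive tasting events) is to fit a linear trend to the per-event emotion index values; the helper below is a hypothetical sketch, not the claimed method.

    import numpy as np

    def experience_trend(event_scores):
        """event_scores: event-related emotion index values in chronological order."""
        x = np.arange(len(event_scores))
        slope, _intercept = np.polyfit(x, np.asarray(event_scores, dtype=float), 1)
        return slope  # > 0: the experience improves with repetition; < 0: it declines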
5. The method of claim 3, further comprising:
obtaining staged evaluation results for different stages of the product experience process and a comprehensive evaluation result based on at least two of the first evaluation result, the second evaluation result, and the third evaluation result.
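As an illustration of how staged results might be merged into a comprehensive result under claim 5, the sketch below takes a weighted average of stage-level scores; the stage names and weights are assumptions, since the claims do not prescribe a particular combination rule.

    def comprehensive_score(stage_results, weights=None):
        """stage_results: e.g. {'observation': 0.62, 'package_opening': 0.55, 'tasting': 0.71}."""
        weights = weights or {stage: 1.0 for stage in stage_results}
        total = sum(weights[stage] for stage in stage_results)
        return sum(score * weights[stage] for stage, score in stage_results.items()) / total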
6. The method of claim 5, further comprising:
generating a product upgrade suggestion based on each of the evaluation results.
7. The method of claim 1, wherein
the attention duration is an average attention duration obtained from the eye movement data of the plurality of test subjects;
the experience emotion index value is an average experience emotion index value obtained from the electroencephalogram data of the plurality of test subjects.
8. The method of claim 1, further comprising, before calculating the emotion index values, pre-processing the electroencephalogram data, the pre-processing comprising: filtering and removal of artifact components.
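For claim 8, a minimal sketch of the named pre-processing steps (filtering and removal of artifact components) using the open-source MNE-Python library; the file name, filter band, and excluded component indices are illustrative assumptions, not values fixed by the claims.

    import mne

    raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)  # hypothetical recording
    raw.filter(l_freq=1.0, h_freq=40.0)  # band-pass filter the electroencephalogram data

    ica = mne.preprocessing.ICA(n_components=15, random_state=0)
    ica.fit(raw)
    ica.exclude = [0, 1]   # artifact component indices (e.g. eye blinks), chosen here for illustration
    ica.apply(raw)         # remove the excluded artifact components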
9. An automated product experience assessment system, comprising a processor and a memory, wherein the memory stores computer instructions, the processor is configured to execute the computer instructions stored in the memory, and the computer instructions, when executed by the processor, implement the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
CN202011429565.3A 2020-12-09 2020-12-09 Automatic evaluation method and system for product experience Pending CN112541668A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011429565.3A CN112541668A (en) 2020-12-09 2020-12-09 Automatic evaluation method and system for product experience

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011429565.3A CN112541668A (en) 2020-12-09 2020-12-09 Automatic evaluation method and system for product experience

Publications (1)

Publication Number Publication Date
CN112541668A true CN112541668A (en) 2021-03-23

Family

ID=75019724

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011429565.3A Pending CN112541668A (en) 2020-12-09 2020-12-09 Automatic evaluation method and system for product experience

Country Status (1)

Country Link
CN (1) CN112541668A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107274223A (en) * 2017-06-13 2017-10-20 杭州电子科技大学 Fusion EEG signals and the advertisement evaluations method for watching tracking characteristics attentively
CN110414856A (en) * 2019-08-01 2019-11-05 秒针信息技术有限公司 A kind of method and device for assessing marketing message designing quality
CN111311070A (en) * 2020-01-20 2020-06-19 南京航空航天大学 Product design scheme decision method combining electroencephalogram and eye movement and combining user similarity

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112948691A (en) * 2021-03-29 2021-06-11 建信金融科技有限责任公司 Method and device for calculating experience index of entity place
CN113080532A (en) * 2021-05-14 2021-07-09 安徽中烟工业有限责任公司 Heating cigarette smoking set user experience quantitative analysis method based on eye movement tracking technology

Similar Documents

Publication Publication Date Title
CN110801237B (en) Cognitive ability evaluation system based on eye movement and electroencephalogram characteristics
CN107274223B (en) Advertisement evaluation method integrating electroencephalogram signal and gaze tracking characteristics
CN107783945B (en) Search result webpage attention evaluation method and device based on eye movement tracking
Kim et al. EEG based comparative measurement of visual fatigue caused by 2D and 3D displays
CN112541668A (en) Automatic evaluation method and system for product experience
CN109222888B (en) Method for judging reliability of psychological test based on eye movement technology
Chouinard et al. Using automatic face analysis to score infant behaviour from video collected online
US20130060602A1 (en) Systems and methods to determine impact of test subjects
US20230043838A1 (en) Method for determining preference, and device for determining preference using same
Couceiro et al. Spotting problematic code lines using nonintrusive programmers' biofeedback
CN112472089A (en) System and method for judging reliability of psychological test based on eye movement technology
Chynał et al. Shopping behaviour analysis using eyetracking and EEG
DeLapp et al. Preparing for racial discrimination and moving beyond reactive coping: A systematic review
CN112587136B (en) Taste sensory evaluation method and system
Rojas et al. Design perception: combining semantic priming with eye tracking and event-related potential (ERP) techniques to identify salient product visual attributes
Porcu et al. Towards the prediction of the quality of experience from facial expression and gaze direction
CN112637688B (en) Video content evaluation method and video content evaluation system
Meyerding Combining eye-tracking and choice-based conjoint analysis in a bottom-up experiment.
CN105825225A (en) Method for making electroencephalogram target judgment with assistance of machine vision
Shabani et al. Gender Effect on Neural Correlates of Autobiographical False Memories for Brand Images.
Yuvaraj et al. A real time neurophysiological framework for general monitoring awareness of air traffic controllers
KR101929124B1 (en) Attention Analyzing Device Using Consumer Interest Areas and Analyzing Method Thereof
Nomura et al. Extraction of unconscious emotions while watching TV commercials
CN112541667A (en) E-commerce page evaluation method and device
Cao et al. A facilitatory effect of perceptual incongruity on target-source matching in pictorial metaphors of chinese advertising: EEG evidence

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination