CN114648354A - Advertisement evaluation method and system based on eye movement tracking and emotional state - Google Patents

Info

Publication number
CN114648354A
CN114648354A (application CN202210184517.5A)
Authority
CN
China
Prior art keywords
emotion
data
advertisement
user
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210184517.5A
Other languages
Chinese (zh)
Inventor
李玉豪
潘煜
徐四华
金佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai International Studies University
Original Assignee
Shanghai International Studies University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai International Studies University
Priority to CN202210184517.5A
Publication of CN114648354A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0242Determining effectiveness of advertisements
    • G06Q30/0245Surveys
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Business, Economics & Management (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Strategic Management (AREA)
  • Psychiatry (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Accounting & Taxation (AREA)
  • Educational Technology (AREA)
  • Social Psychology (AREA)
  • Pathology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Developmental Disabilities (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Child & Adolescent Psychology (AREA)
  • Game Theory and Decision Science (AREA)

Abstract

The invention discloses an advertisement evaluation method and system based on eye tracking and emotional state. The method comprises the following steps: acquiring eye movement data, constructing a visual projection model based on a deep neural network, and obtaining the user's projection area within an information flow advertisement from the visual projection model; acquiring emotion data, constructing an emotion assessment model based on a deep neural network, and identifying the user's adverse emotion data with the emotion assessment model; locating, in the visual projection model, the projection area of the information flow advertisement associated with the identified adverse emotion data; and adjusting advertisement delivery according to that projection area. By combining eye movement data with emotional-state data, the method recognizes cognitive avoidance and adverse psychological states that may arise while the user browses advertisements, and adjusts the delivered advertisement content according to the detected states.

Description

Advertisement evaluation method and system based on eye movement tracking and emotional state
Technical Field
The invention relates to the technical field of advertising, and in particular to an advertisement evaluation method and system based on eye tracking and emotional state.
Background
Currently, native advertisements (mobile-side advertisements formatted to match the host medium) are evaluated mainly through two statistics: conversion rate and skip rate. The conversion rate is the fraction of page visitors who complete an intended action; it measures how well the advertisement content acquires mobile users, supports estimating the return on investment of a campaign, and allows different mobile delivery strategies to be compared. The skip rate helps determine user preferences, understand user needs, and assess both the accuracy of personalized content delivery and the user's attention and engagement. However, existing evaluation methods for information flow advertisements analyze only the content to which users respond explicitly; they cannot effectively identify content that triggers cognitive avoidance or is simply ignored. In a mobile-network context, information flow advertisements are susceptible to factors such as perceived intrusiveness and situational cognitive load, which cause cognitive avoidance: the user consciously ignores the content and develops an adverse psychological response. Traditional evaluation methods can neither assess these reactions nor adjust delivery in time, which compromises the real effectiveness of information flow advertisements and the user experience; they also struggle to reproduce the real usage scenarios of mobile users and cannot evaluate the user's adverse states and cognitive-avoidance behavior in real time.
Disclosure of Invention
One objective of the present invention is to provide an advertisement evaluation method and system based on eye tracking and emotional state. The method and system capture the user's eye movement data with the camera of a mobile terminal, acquire the user's emotional-state data with an intelligent wearable device, and combine the two to identify cognitive avoidance and adverse psychological states that may arise while the user browses advertisements; advertisement content delivery is then adjusted according to the detected states, reducing the negative experience of advertisement delivery.
Another object of the present invention is to provide an advertisement evaluation method and system based on eye tracking and emotional state, which identify areas of interest in the user's eye movement data and determine the degree of overlap between the information flow advertisement content and those areas of interest, so as to gauge the user's attention level and detect a possible cognitive-avoidance state.
The invention also aims to provide an advertisement evaluation method and system based on eye movement tracking and emotional states.
To achieve at least one of the above-mentioned objects, the present invention further provides an advertisement evaluation method based on eye tracking and emotional state, the method comprising the steps of:
the method comprises the steps of obtaining eye movement data, constructing a visual projection model based on a deep neural network, and obtaining a projection area of a user in an information flow advertisement according to the visual projection model;
acquiring emotion data, constructing an emotion assessment model based on a deep neural network, and identifying the user's adverse emotion data according to the emotion assessment model;
judging the user's cognitive-avoidance state according to the fixation data of the eye movement data on the projection area;
searching the visual projection model for the projection area related to the information flow advertisement according to the identified adverse emotion data and the cognitive-avoidance state data;
and adjusting advertisement delivery according to the projection area.
According to a preferred embodiment of the present invention, the visual projection model is constructed as follows: the user's screen data is acquired in real time, the user's eye movement data is acquired in real time through the camera of the user's mobile terminal, and the target content areas of the different information flow advertisements are determined according to the screen data.
According to another preferred embodiment of the present invention, the visual projection model is constructed by deriving visual projection decision parameters from the acquired eye movement data, the parameters comprising: target-area fixation duration, first fixation duration, single fixation duration, standard deviation of single fixation intervals, number of fixations on the target area, skip rate, re-fixation rate, and compensatory eye movement.
According to another preferred embodiment of the present invention, the visual projection model is constructed by taking as training data eye movement data comprising the features of multiple samples (eyeball size, eyeball shape, pupil color, true gaze direction, and illumination level), inputting the training data into the deep neural network, and computing the root mean square error as the basis for updating the network's default parameters.
According to another preferred embodiment of the invention, the target-area fixation duration and the single fixation duration are calculated with an attention algorithm from the visual projection decision parameters; a target-area fixation duration threshold and a single fixation duration threshold are configured, and if at least one of the calculated durations exceeds its corresponding threshold, the sample data is input into the emotion assessment model to judge adverse emotion.
According to another preferred embodiment of the present invention, the emotion assessment model judges adverse emotion as follows: pupil-variation discrimination parameters are constructed from the eye movement data, and heart-rate-variation discrimination parameters are constructed from the heart rate data within the emotion data. The heart-rate parameters comprise, from time-domain analysis, the standard deviation of heartbeat intervals, the adjacent heartbeat intervals, and the proportion of adjacent heartbeats among normal heartbeats in the browsing interval; and, from frequency-domain analysis, the overall heart rate variability, the high-frequency power, the low-frequency power, and the ratio of low- to high-frequency power. The pupil-variation discrimination parameters comprise the pupil diameters at the fixation points of the left and right eyes, the nystagmus frequency, and the blink frequency. The heart-rate-variation and pupil-variation discrimination parameters are input into the emotion assessment model, whose parameters are updated with a gradient descent algorithm.
According to another preferred embodiment of the present invention, a time interval in which adverse emotion is present is determined by the emotion assessment model; the user's gaze projection area on the screen, as detected by the visual projection model, is determined for that interval; and the corresponding advertisement content is then queried from the gaze projection area and the screen data.
According to another preferred embodiment of the present invention, the parameters of the emotion assessment model comprise a hidden-layer threshold and input weights updated iteratively on the training data; corresponding adverse-emotion level labels are set on the output layer, the labels comprising high, medium, and low, representing the degree of the adverse emotional state.
In order to accomplish at least one of the above objects of the present invention, the present invention further provides an advertisement evaluation system based on eye tracking and emotional state, which performs the above-mentioned advertisement evaluation method based on eye tracking and emotional state.
The present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the advertisement evaluation method based on eye tracking and emotional state.
Drawings
Fig. 1 is a schematic flow chart showing an advertisement evaluation method based on eye tracking and emotional states according to the present invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It should be understood that the terms "a" and "an" are not to be interpreted as limiting the number of an element: an element referred to as "a" or "an" may be single in one embodiment and plural in another.
Referring to fig. 1, the present invention discloses an advertisement evaluation method and system based on eye tracking and emotional state. First, when an information flow advertisement is displayed on the user's mobile terminal, its position on the screen is sampled in real time and the pixel area of the corresponding region is recorded to create screen data; the screen data records the display time, display position, and display extent of each information flow advertisement on the screen. At the same time, the front camera of the mobile terminal is activated to acquire eye movement data of the user facing the display interface. A visual projection model is then constructed from the eye movement data and the acquired screen data. Further, the user's heart rate at different moments is recorded by a wearable device with a heart-rate detection function, such as an Apple Watch, and an emotion assessment model is built from the eye movement data and the heart rate data. Cognitive avoidance and adverse psychological states of the user are recognized through the eye movement data and the emotion assessment model. The user's gaze position in the screen data is then looked up in the visual projection model for the time at which cognitive avoidance or an adverse state was detected, so that the advertisement on the screen currently attended to by the user's eyes can be identified, and advertisement delivery can be controlled and adjusted programmatically to reduce the user's negative experience of the advertisement.
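The screen data described above can be sketched as a simple per-advertisement record; the class and field names below are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass

@dataclass
class AdScreenRecord:
    """One sampled on-screen ad region (hypothetical structure)."""
    ad_id: str
    t_start: float   # display start time, seconds
    t_end: float     # display end time, seconds
    x: int           # top-left pixel of the ad region
    y: int
    width: int       # pixel extent of the ad region
    height: int

    def contains(self, px: int, py: int) -> bool:
        """True if a gaze point (px, py) falls inside this ad's pixel area."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)
```

Matching a gaze sample against such records is how the projection area can be mapped back to a specific advertisement.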
Specifically, the invention judges the user's cognitive avoidance mainly through the eye movement data, as follows: the fixation duration and fixation count of the user's eye movement data on the advertisement content in the various screen-data regions are obtained, and a fixation duration threshold and a fixation count threshold are configured. If at least one of the fixation duration and the fixation count on the advertisement content is below its corresponding threshold, the user's state at that moment is judged to be cognitive avoidance. If at least one of them exceeds its corresponding threshold, the eye movement data is further input into the emotion assessment model to judge whether adverse psychology is present.
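The two-threshold rule above can be sketched as follows; the threshold values and the function name are illustrative assumptions (the patent leaves the thresholds configurable).

```python
def classify_attention(fix_duration_ms: float, fix_count: int,
                       dur_threshold_ms: float = 300.0,
                       count_threshold: int = 2) -> str:
    """Sketch of the patent's cognitive-avoidance rule.

    Returns 'cognitive_avoidance' when the fixation duration or the
    fixation count on the ad content falls below its threshold;
    otherwise 'check_emotion', meaning the sample should be passed on
    to the emotion assessment model.
    """
    if fix_duration_ms < dur_threshold_ms or fix_count < count_threshold:
        return "cognitive_avoidance"
    return "check_emotion"
```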
It is worth mentioning that the visual projection model is constructed as follows. Eye movement data is acquired, and the visual projection decision parameters of the model are derived from it, comprising: target-area fixation duration TFD, first fixation duration FFD, single fixation duration SFD, standard deviation of single fixation intervals SDSF, number of fixations on the target area FN, skip rate SR, re-fixation rate RR, and compensatory eye movement CM. The invention preferably uses an existing deep neural network as the training model, and constructs multiple training samples from the acquired eye movement data, including but not limited to eyeball shape, eyeball size, pupil color, true gaze position, and illumination level. The deep neural network comprises an input layer, a hidden layer, and an output layer. A human pose estimation algorithm is preferably used for feature detection, and the hidden layer adopts a pre-trained ResNet. Based on the default parameters of the pre-trained ResNet, features such as eyeball shape, size, pupil color, true gaze position, and illumination level are constructed and input into the network's input layer for training; the trained visual projection decision parameters are output, and the root mean square error (RMSE) of each training round's decision parameters is computed. An attention evaluation algorithm is then applied to compute the user's attention projection on the screen data, and the corresponding advertisement content is obtained from the projected screen region.
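The root mean square error used here as the update criterion is the standard definition; a minimal sketch:

```python
import math

def rmse(predicted, target):
    """Root mean square error between predicted and reference values,
    as used in the patent to drive updates of the network parameters."""
    assert len(predicted) == len(target) and predicted
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(predicted, target))
                     / len(predicted))
```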
The attention estimation algorithm is implemented as follows: vectors for the face direction, the pupil, and the gaze estimation reference point are computed from the acquired eye movement data; these vectors are then tuned with the RMSprop adaptive learning method, and the root mean square error of the projection decision parameters is output again to normalize the result. The normalized root mean square error is used to evaluate whether the decision parameters satisfy an attention projection onto the target area. By configuring thresholds for the visual decision parameters, the attention projection of the region corresponding to each piece of screen data is evaluated; if a region's decision parameters meet the thresholds, the user's attention projection is deemed to exist in that region at that moment. For example, thresholds may be configured for the target-area fixation duration TFD and the single fixation duration SFD; if at least one of TFD and SFD exceeds its corresponding threshold, the threshold condition is satisfied, the target area of the corresponding screen data is taken as the user's attention-projection area, and the advertisement content within that area is extracted. It should be noted that the visual decision parameter thresholds may be chosen differently depending on the eye movement data; the invention does not limit them specifically.
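The example TFD/SFD rule can be sketched as follows; the threshold values are illustrative assumptions, since the patent explicitly leaves them unspecified.

```python
def is_attention_projection(tfd_ms: float, sfd_ms: float,
                            tfd_threshold_ms: float = 500.0,
                            sfd_threshold_ms: float = 200.0) -> bool:
    """A region counts as an attention-projection area when at least one
    of TFD (target-area fixation duration) or SFD (single fixation
    duration) exceeds its configured threshold."""
    return tfd_ms > tfd_threshold_ms or sfd_ms > sfd_threshold_ms
```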
Further, the emotion assessment model is constructed from the acquired eye movement data and the heart rate data collected by the wearable device, as follows: the user's electrocardiographic signal and eye movement data are collected in real time, and a heart-rate-variation discrimination parameter and a pupil-variation discrimination parameter are constructed from them, respectively. The system comprises a multi-modal emotion data processing module, into which the heart-rate-variation and pupil-variation discrimination parameters are input for processing, producing adverse emotions of different levels.
Specifically, the multi-modal emotion data processing module builds the emotion assessment model with a deep neural network comprising an input layer, a hidden layer, and an output layer; its data structure is similar to that of the visual projection model. The heart-rate-variation discrimination parameters comprise time-domain data detected by the wearable device, namely the standard deviation of heartbeat intervals SDNN, the adjacent-heartbeat-interval count NN50, and the proportion pNN50 of NN50 among normal heartbeats in the browsing interval, together with frequency-domain data obtained from the heartbeat intervals by discrete Fourier transform: the user's overall heart rate variability TP over the browsing period, the high-frequency power HFP, the low-frequency power LFP, and the low-to-high frequency ratio LF/HF. Heart-rate variation can be judged from the time-domain and frequency-domain data in various ways; for example, the heart rate may be judged abnormal when at least one time-domain value is abnormal, with the normal ranges of the heart rate data set according to existing medical norms, which are not detailed here. The pupil-variation discrimination parameters comprise: the pupil diameter PD at the fixation points of the left and right eyes, the nystagmus frequency PF, and the blink frequency EB. A complete adverse-emotion assessment parameter is then constructed from the pupil-variation and heart-rate-variation discrimination parameters.
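The time-domain parameters SDNN, NN50, and pNN50 named above follow standard heart-rate-variability definitions (NN50 counts successive interbeat-interval differences above 50 ms; pNN50 is their proportion). A minimal sketch under those definitions, with the proportion taken over the successive-difference pairs:

```python
import math

def hrv_time_domain(rr_intervals_ms):
    """Compute SDNN, NN50, and pNN50 from a sequence of adjacent
    heartbeat (R-R) intervals in milliseconds."""
    n = len(rr_intervals_ms)
    mean = sum(rr_intervals_ms) / n
    # SDNN: standard deviation of the intervals (population form here)
    sdnn = math.sqrt(sum((x - mean) ** 2 for x in rr_intervals_ms) / n)
    # NN50: successive differences exceeding 50 ms
    diffs = [abs(b - a) for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    nn50 = sum(1 for d in diffs if d > 50.0)
    pnn50 = nn50 / len(diffs) if diffs else 0.0
    return {"SDNN": sdnn, "NN50": nn50, "pNN50": pnn50}
```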
For example, if at least one of the heart-rate-variation discrimination parameters indicates abnormal heart rate data and at least one of the pupil-variation discrimination parameters is abnormal, the user can be judged to have an adverse emotion.
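The combination rule in this example, at least one abnormal heart-rate parameter together with at least one abnormal pupil parameter, can be sketched as:

```python
def has_adverse_emotion(hr_abnormal_flags, pupil_abnormal_flags) -> bool:
    """Adverse emotion is flagged only when at least one heart-rate
    parameter AND at least one pupil parameter are marked abnormal."""
    return any(hr_abnormal_flags) and any(pupil_abnormal_flags)
```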
Thus, after feature recognition is performed on the acquired eye movement data, the heart rate data and the recognized eye movement features are input into the emotion assessment model. The model is constrained by the adverse-emotion discrimination parameters built from the heart-rate-variation and pupil-variation parameters, and the trained emotion assessment model serves as the discriminator of the user's adverse emotion.
It is worth mentioning that when the emotion assessment model detects adverse emotion during some period in which the user is browsing the mobile terminal's display, the time period of the adverse emotion is obtained and used to locate the client's page-browsing interval. The user's current attention-projection area in the screen data is queried in the visual projection model over that time range, and the advertisement content present in the projection area is found from the screen data, so that operations such as adjusting the advertisement content or withdrawing the advertisement can be performed, reducing the user's negative experience of browsing advertisements in that area.
In the emotion assessment model established by the emotion processing module, the numbers of input-layer, hidden-layer, and output-layer nodes may be set according to the geometric pyramid rule: for example, with p input-layer nodes and u output-layer nodes, the number of hidden-layer nodes is q = sqrt(p × u). The emotion assessment model updates its parameters, the hidden-layer threshold and the input-layer weights, by gradient descent; since it adopts an existing deep neural network, the parameter-update procedure needs no further description here.
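The geometric pyramid rule for sizing the hidden layer can be sketched as follows; rounding to the nearest integer is an assumption, since the patent does not specify how a non-integer square root is handled.

```python
import math

def hidden_nodes(p: int, u: int) -> int:
    """Geometric pyramid rule: with p input nodes and u output nodes,
    the hidden layer gets q = sqrt(p * u) nodes (rounded here)."""
    return round(math.sqrt(p * u))
```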
The emotion assessment model outputs three levels of state at its output nodes for level-label assignment: for example, the output value (001) corresponds to a low adverse state, (010) to a medium adverse state, and (011) to a high adverse state. Through object labels, the level labels identify the corresponding advertisement content, so that the user's adverse emotion during browsing can be associated with the advertisement content browsed, facilitating subsequent adjustment of that content.
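The level-label assignment can be sketched as a lookup from the 3-bit output code; the handling of codes not listed in the patent is an assumption.

```python
# Code-to-label mapping as given in the patent's example.
ADVERSE_LEVELS = {
    (0, 0, 1): "low",
    (0, 1, 0): "medium",
    (0, 1, 1): "high",
}

def decode_adverse_level(output_bits) -> str:
    """Map the emotion model's 3-bit output code to an adverse-emotion
    level label; unlisted codes fall back to 'unknown' (an assumption)."""
    return ADVERSE_LEVELS.get(tuple(output_bits), "unknown")
```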
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program, when executed by a Central Processing Unit (CPU), performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless links, wireline, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be understood by those skilled in the art that the embodiments of the present invention described above and illustrated in the drawings are given by way of example only and not by way of limitation. The objects of the invention have been fully and effectively achieved, and the functional and structural principles of the invention have been shown and described in these embodiments; various changes or modifications may be made to the embodiments without departing from those principles.

Claims (10)

1. An advertisement evaluation method based on eye tracking and emotional state, characterized in that the method comprises the following steps:
obtaining eye movement data, constructing a visual projection model based on a deep neural network, and obtaining the user's projection area in the information-flow advertisement from the visual projection model;
acquiring emotion data, constructing an emotion assessment model based on a deep neural network, and judging the user's aversive (inverse) emotion data with the emotion assessment model;
judging the user's cognitive avoidance state from the fixation data of the eye movement data on the projection area;
retrieving, from the visual projection model, the projection area related to the information-flow advertisement according to the judged aversive emotion data and cognitive avoidance state data;
and adjusting advertisement delivery according to that projection area.
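The final decision of the claimed method, adjusting delivery when the emotion model reports an aversive state or the gaze data indicates cognitive avoidance, can be sketched as a simple predicate. The function name and threshold values below are illustrative assumptions, not taken from the claims.

```python
# Minimal sketch of the claimed delivery-adjustment decision: flag the ad
# when the emotion model's aversion score is high, or when the dwell time
# on the ad's projection area is so short that it suggests cognitive
# avoidance. Both thresholds are assumed values for illustration.

def should_adjust(aversion_score: float, dwell_s: float,
                  aversion_threshold: float = 0.5,
                  dwell_floor_s: float = 0.2) -> bool:
    """Return True when the advertisement's delivery should be adjusted."""
    aversive = aversion_score > aversion_threshold  # emotion-model output in [0, 1]
    avoiding = dwell_s < dwell_floor_s              # very short dwell on the ad
    return aversive or avoiding
```

Under this rule a strong aversion score or a fleeting glance both trigger an adjustment, while a neutral viewer who dwells on the advertisement does not.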
2. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 1, wherein the visual projection model is constructed by: acquiring the user's screen data in real time, acquiring the user's eye movement data in real time through the camera of the user's mobile terminal, and determining the target content areas of different information-flow advertisements from the screen data.
3. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 1, wherein the visual projection model is constructed by: constructing visual projection decision parameters from the acquired eye movement data, the parameters comprising: target-area fixation duration, first fixation, single-fixation duration, standard deviation of single-fixation intervals, target-area fixation count, skip rate, refixation rate, and refixation eye movements.
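Several of the decision parameters in claim 3 can be derived from a plain fixation log. The sketch below assumes each fixation is recorded as a (region, duration) pair and uses simplified definitions of the parameters, since the claim does not define them precisely.

```python
import statistics

def fixation_metrics(fixations, target):
    """Compute simplified visual projection decision parameters.

    `fixations` is a chronological list of (region_id, duration_s) pairs;
    `target` is the advertisement's target content area. The parameter
    definitions here are illustrative assumptions.
    """
    on_target = [d for r, d in fixations if r == target]
    durations = [d for _, d in fixations]
    return {
        "target_fixation_duration": sum(on_target),    # total dwell on the ad
        "target_fixation_count": len(on_target),       # times the ad was fixated
        "mean_single_fixation": statistics.mean(durations) if durations else 0.0,
        "skip_rate": 0.0 if on_target else 1.0,        # 1 when the ad is never fixated
        # Re-fixation rate: share of fixations that return to the target
        # after the first visit.
        "refixation_rate": (len(on_target) - 1) / len(fixations)
                           if len(on_target) > 1 else 0.0,
    }
```

A real implementation would first segment raw gaze samples into fixations (e.g. with a dispersion- or velocity-based algorithm) before computing these aggregates.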
4. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 3, wherein the visual projection model is constructed by: obtaining, as training data, the features of a plurality of samples constructed from eye movement data including eyeball size, eyeball shape, pupil color, true gaze direction and illumination level; inputting the training data into the deep neural network; and computing the root-mean-square error of the visual projection decision parameters as the basis for updating the deep neural network's default parameters.
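The root-mean-square error that claim 4 uses as the network's update criterion has the standard definition; a direct transcription follows (the retraining loop itself is not specified by the claim and is omitted).

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between predicted and reference values of the
    visual projection decision parameters."""
    if len(predicted) != len(observed):
        raise ValueError("prediction/reference length mismatch")
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
```

A retraining policy might, for instance, update the network's default parameters whenever this error exceeds a chosen tolerance.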
5. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 4, wherein the target-area fixation duration and the single-fixation duration are calculated with an attention algorithm from the visual projection decision parameters, and a target-area fixation duration threshold and a single-fixation duration threshold are configured respectively; if at least one of the calculated durations exceeds its corresponding threshold, the sample data is input into the emotion assessment model to judge the aversive emotion.
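Claim 5's gating rule, passing a sample to the emotion assessment model only when at least one of the two durations exceeds its threshold, reduces to a disjunction. The threshold values below are assumptions; the claim leaves them to configuration.

```python
def gate_for_emotion_model(target_dwell_s, single_fixation_s,
                           target_dwell_threshold_s=1.0,
                           single_fixation_threshold_s=0.4):
    """Return True when the sample should be fed to the emotion model.

    Fires when either the total target-area fixation duration or a single
    fixation duration exceeds its configured threshold (assumed values).
    """
    return (target_dwell_s > target_dwell_threshold_s
            or single_fixation_s > single_fixation_threshold_s)
```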
6. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 1, wherein the method by which the emotion assessment model judges the aversive emotion comprises the following steps: obtaining eye movement data to construct pupil-variation discrimination parameters, and obtaining heart rate data from the emotion data to construct heart-rate-variation discrimination parameters; the heart-rate-variation discrimination parameters comprise, based on time-domain analysis, the heartbeat standard deviation, the adjacent heartbeat interval, and the ratio of normal adjacent heartbeats within the browsing interval, and, based on frequency-domain analysis, the heart rate variability measure, the high-frequency-band measure, the low-frequency-band measure and the high-to-low frequency ratio; the pupil-variation discrimination parameters comprise the pupil diameter at the fixation point of the left and right eyes, the nystagmus frequency and the blink frequency; the heart-rate-variation discrimination parameters and the pupil-variation parameters are input into the emotion assessment model, and the model parameters are updated using a gradient descent algorithm.
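The time-domain heart-rate parameters in claim 6 read like the standard HRV measures SDNN (standard deviation of inter-beat intervals), the mean RR interval, and pNN50 (the share of adjacent-beat differences above 50 ms); mapping the claim's wording onto those definitions is an assumption. A minimal computation:

```python
import statistics

def hrv_time_domain(rr_intervals_ms):
    """Time-domain heart-rate-variation parameters from successive
    inter-beat (RR) intervals in milliseconds. The mapping of these
    standard HRV measures onto the claim's wording is an assumption."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return {
        "sdnn_ms": statistics.pstdev(rr_intervals_ms),   # heartbeat standard deviation
        "mean_rr_ms": statistics.mean(rr_intervals_ms),  # adjacent heartbeat interval
        # pNN50: proportion of adjacent-beat differences exceeding 50 ms
        "pnn50": (sum(abs(d) > 50 for d in diffs) / len(diffs)) if diffs else 0.0,
    }
```

The frequency-domain quantities (high- and low-frequency band power and their ratio) would additionally require a spectral estimate of the RR series, for example an FFT over a resampled tachogram.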
7. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 6, wherein the emotion assessment model determines a time period during which an aversive emotion is present, determines, for that period, the user's gaze projection area on the screen as detected by the visual projection model, and then queries the corresponding advertisement content from the gaze projection area and the screen data.
8. The advertisement evaluation method based on eye tracking and emotional state as claimed in claim 6, wherein the parameters of the emotion assessment model comprise hidden-layer thresholds and input weights that are updated iteratively with the training data, and the output layer is assigned corresponding aversive-emotion level labels, the labels comprising high, medium and low levels that respectively represent the degree of the aversive emotional state.
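Claim 8's three-level output can be read as a banding of a scalar aversion score produced by the output layer; the cutoff values below are assumptions.

```python
def aversion_level(score, low_cut=0.33, high_cut=0.66):
    """Map the emotion model's scalar output in [0, 1] to the three-level
    label of the claim. Cutoffs are illustrative assumptions."""
    if score >= high_cut:
        return "high"
    if score >= low_cut:
        return "medium"
    return "low"
```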
9. An advertisement evaluation system based on eye tracking and emotional state, characterized in that the system performs the advertisement evaluation method based on eye tracking and emotional state as claimed in any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program executable by a processor to perform the advertisement evaluation method based on eye tracking and emotional state according to any one of claims 1 to 8.
CN202210184517.5A 2022-02-23 2022-02-23 Advertisement evaluation method and system based on eye movement tracking and emotional state Pending CN114648354A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210184517.5A CN114648354A (en) 2022-02-23 2022-02-23 Advertisement evaluation method and system based on eye movement tracking and emotional state

Publications (1)

Publication Number Publication Date
CN114648354A true CN114648354A (en) 2022-06-21

Family

ID=81993587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210184517.5A Pending CN114648354A (en) 2022-02-23 2022-02-23 Advertisement evaluation method and system based on eye movement tracking and emotional state

Country Status (1)

Country Link
CN (1) CN114648354A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115358777A (en) * 2022-08-16 2022-11-18 支付宝(杭州)信息技术有限公司 Advertisement putting processing method and device of virtual world
CN115120240A (en) * 2022-08-30 2022-09-30 山东心法科技有限公司 Sensitivity evaluation method, equipment and medium for special industry target perception skills
CN115120240B (en) * 2022-08-30 2022-12-02 山东心法科技有限公司 Sensitivity evaluation method, equipment and medium for special industry target perception skills
CN115762772A (en) * 2022-10-10 2023-03-07 北京中科睿医信息科技有限公司 Target object emotion feature determination method, device, equipment and storage medium
CN115904075A (en) * 2022-11-28 2023-04-04 中国汽车技术研究中心有限公司 Vehicle configuration improvement method, system, device and storage medium
CN115904075B (en) * 2022-11-28 2024-01-02 中国汽车技术研究中心有限公司 Vehicle configuration improvement method, system, device and storage medium
CN115904911A (en) * 2022-12-24 2023-04-04 北京津发科技股份有限公司 Web human factor intelligent online evaluation method, system and device based on cloud server
CN116993405A (en) * 2023-09-25 2023-11-03 深圳市火星人互动娱乐有限公司 Method, device and system for implanting advertisements into VR game
CN116993405B (en) * 2023-09-25 2023-12-05 深圳市火星人互动娱乐有限公司 Method, device and system for implanting advertisements into VR game
CN117808536A (en) * 2024-02-23 2024-04-02 蓝色火焰科技成都有限公司 Interactive advertisement evaluation method, system and delivery terminal
CN117808536B (en) * 2024-02-23 2024-05-14 蓝色火焰科技成都有限公司 Interactive advertisement evaluation method, system and delivery terminal

Similar Documents

Publication Publication Date Title
CN114648354A (en) Advertisement evaluation method and system based on eye movement tracking and emotional state
EP3465532B1 (en) Control device, system and method for determining the perceptual load of a visual and dynamic driving scene
KR102668240B1 (en) Method and device for estimating physical state of a user
JP7070605B2 (en) Focus range estimator, its method and program
KR101697476B1 (en) Method for recognizing continuous emotion for robot by analyzing facial expressions, recording medium and device for performing the method
WO2014052938A1 (en) Systems and methods for sensory and cognitive profiling
Beringer et al. Reliability and validity of machine vision for the assessment of facial expressions
CN110970130A (en) Data processing method for attention defect hyperactivity disorder
CN114209324B (en) Psychological assessment data acquisition method based on image visual cognition and VR system
WO2018171875A1 (en) Control device, system and method for determining the perceptual load of a visual and dynamic driving scene
CN113208593A (en) Multi-modal physiological signal emotion classification method based on correlation dynamic fusion
CN108492855A (en) A kind of apparatus and method for training the elderly's attention
JPH08225028A (en) Awakening condition detecting device for driver
CN113764099A (en) Psychological state analysis method, device, equipment and medium based on artificial intelligence
US11935140B2 (en) Initiating communication between first and second users
Rafiqi et al. Work-in-progress, PupilWare-M: Cognitive load estimation using unmodified smartphone cameras
CN111222374A (en) Lie detection data processing method and device, computer equipment and storage medium
CN113017634B (en) Emotion evaluation method, emotion evaluation device, electronic device, and computer-readable storage medium
US20240233914A1 (en) Predicting mental state characteristics of users of wearable devices
KR20160053455A (en) Method for recognizing continuous emotion for robot by analyzing facial expressions, recording medium and device for performing the method
Ekiz et al. Long short-term memory network based unobtrusive workload monitoring with consumer grade smartwatches
CN114399752A (en) Eye movement multi-feature fusion fatigue detection system and method based on micro eye jump characteristics
US20220225917A1 (en) Electronic device, system and method for predicting the performance of an individual human during a visual perception task
Ortega et al. Open your eyes: Eyelid aperture estimation in Driver Monitoring Systems
CN117636488B (en) Multi-mode fusion learning ability assessment method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination