CN113807904A - Article recommendation method, device and system, computer system and storage medium - Google Patents

Article recommendation method, device and system, computer system and storage medium

Info

Publication number
CN113807904A
Authority
CN
China
Prior art keywords
emotion
electroencephalogram
data
article
feature data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010557074.0A
Other languages
Chinese (zh)
Inventor
伍悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Wodong Tianjun Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Wodong Tianjun Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Wodong Tianjun Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN202010557074.0A
Publication of CN113807904A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G06Q30/0601 - Electronic shopping [e-shopping]
    • G06Q30/0631 - Item recommendations
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 - Evaluating the state of mind, e.g. depression, anxiety
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/02 - Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201 - Market modelling; Market analysis; Collecting market data
    • G06Q30/0202 - Market predictions or forecasting for commercial activities

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Marketing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Psychiatry (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Social Psychology (AREA)
  • Public Health (AREA)
  • Psychology (AREA)
  • Educational Technology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Hospice & Palliative Care (AREA)
  • Veterinary Medicine (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Game Theory and Decision Science (AREA)
  • Bioinformatics & Computational Biology (AREA)

Abstract

The present disclosure provides an item recommendation method, including: acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period; determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article; sorting the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; and recommending other articles associated with a specified article in the article preference list to the user according to the arrangement order of the articles in the article preference list. The present disclosure also provides an article recommendation device and system, a computer system, and a storage medium.

Description

Article recommendation method, device and system, computer system and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an item recommendation method, apparatus and system, a computer system, and a storage medium.
Background
With the rapid development of the internet, people are more and more accustomed to online shopping, but the vast quantity of items on an online shopping platform also makes the choice of the items more difficult.
At present, an online shopping platform generally pushes hot content and user-preferred content to a user to help the user quickly locate an interested article in a mass of articles.
In the course of realizing the concept of the present disclosure, the inventors found that a user's potential preferences can be inferred from recently browsed content or recent orders, and personalized item recommendation can be performed accordingly. However, after a user has purchased a certain item, the user is unlikely to need the same item again within a short time; if that item, or items with similar attributes, keep being recommended to the user, the user's new needs cannot be matched in real time and a more accurate item recommendation effect cannot be achieved.
Disclosure of Invention
In view of the above, the present disclosure provides an item recommendation method, apparatus and system, a computer system and a storage medium.
One aspect of the present disclosure provides an item recommendation method, including: acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period; determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article; sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; recommending other items related to the specified items in the item preference list to the user according to the arrangement sequence of the items in the item preference list.
According to an embodiment of the present disclosure, determining the emotion type corresponding to each item according to the electroencephalogram signal and the electrocardiograph signal corresponding to each item includes: respectively extracting features of M dimensions from the electroencephalogram signal and the electrocardiosignal corresponding to each article to obtain electroencephalogram feature data of M dimensions and heart rate feature data of M dimensions, wherein M is an integer greater than 1; performing data fusion on the electroencephalogram characteristic data with M dimensions and the heart rate characteristic data with M dimensions to obtain fused characteristic data; optimizing the complexity of the fused feature data to obtain optimized feature data; inputting the optimized feature data into an emotion recognition model so as to output probabilities respectively corresponding to each emotion type in k emotion types through the emotion recognition model to obtain k probabilities, wherein k is an integer greater than 1; and determining the final emotion type corresponding to the optimized feature data according to the k probabilities.
According to an embodiment of the present disclosure, the data fusing the electroencephalogram feature data of M dimensions and the heart rate feature data of M dimensions includes: carrying out normalization processing on the electroencephalogram characteristic data of M dimensions and the heart rate characteristic data of M dimensions; performing first data fusion according to the mean value and variance of the electroencephalogram characteristic data of the M dimensions after the normalization processing and the mean value and variance of the heart rate characteristic data of the M dimensions after the normalization processing to obtain characteristic data after the first data fusion; performing second data fusion according to the standard deviation of the electroencephalogram characteristic data of the M dimensions after the normalization processing and the standard deviation of the heart rate characteristic data of the M dimensions after the normalization processing to obtain characteristic data after the second data fusion; and performing third data fusion according to the feature data after the first data fusion and the feature data after the second data fusion to obtain fused feature data.
According to an embodiment of the present disclosure, the k emotion types include negative emotion, calm emotion, and excited emotion; the emotion recognition model comprises a first emotion classification model, a second emotion classification model and a third emotion classification model, wherein: the input of the first emotion classification model is the optimized feature data, and the output of the first emotion classification model comprises the probability of negative emotion and the probability of calm emotion; the input of the second emotion classification model is the optimized feature data, and the output of the second emotion classification model comprises the probability of calm emotion and the probability of excited emotion; and the input of the third emotion classification model is the optimized feature data, and the output of the third emotion classification model comprises the probability of negative emotion and the probability of excited emotion.
According to an embodiment of the present disclosure, the inputting the optimized feature data into an emotion recognition model so as to output, through the emotion recognition model, probabilities respectively corresponding to each of k emotion types includes: inputting the optimized feature data into the first emotion classification model, the second emotion classification model and the third emotion classification model respectively so as to output a first classification result, a second classification result and a third classification result through the first emotion classification model, the second emotion classification model and the third emotion classification model; and determining an output of the emotion recognition model according to the first classification result, the second classification result, and the third classification result, wherein the output of the emotion recognition model includes a probability of a negative emotion, a probability of a calm emotion, and a probability of an excited emotion.
According to an embodiment of the present disclosure, the determining, according to the k probabilities, a final emotion type corresponding to the optimized feature data includes: generating an emotion recognition function according to the k probabilities; calculating a function value of negative emotion, a function value of calm emotion and a function value of exciting emotion according to the emotion recognition function; and determining a final emotion type corresponding to the optimized feature data according to the relationship among the negative emotion function value, the calm emotion function value and the exciting emotion function value.
According to the embodiment of the disclosure, the electroencephalogram feature data of the M dimensions includes at least M of the following: the mean value, the standard deviation, the first-order difference mean square error, the second-order difference mean square error, the characteristic entropy, the frequency band energy and the frequency band energy ratio of the electroencephalogram signals; the heart rate characteristic data of M dimensions includes at least M of: the mean value, the standard deviation, the first order difference mean square error, the second order difference mean square error, the characteristic entropy, the frequency band energy and the frequency band energy ratio of the electrocardiosignals.
Another aspect of the present disclosure provides an item recommendation system, including: the system comprises an electroencephalogram monitoring device, a heart rate monitoring device, a terminal and a cloud server; the electroencephalogram monitoring equipment is used for acquiring electroencephalogram signals generated by a user in the process of browsing a plurality of articles according to a preset sampling frequency; preprocessing the electroencephalogram signals and then sending the preprocessed electroencephalogram signals to the terminal or the cloud server; the heart rate monitoring equipment is used for acquiring electrocardiosignals generated by a user in the process of browsing a plurality of articles according to the preset sampling frequency; preprocessing the electrocardiosignals and then sending the preprocessed electrocardiosignals to the terminal or the cloud server; the terminal is used for: acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period; determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article; sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; recommending other items related to the specified items in the item preference list to the user according to the arrangement sequence of the items in the item preference list; the terminal is further configured to: sending emotion types corresponding to the electroencephalogram signals and the electrocardiosignals in each preset acquisition period to the cloud server; the cloud server is used for receiving the electroencephalogram signals sent by the electroencephalogram monitoring equipment, the electrocardiosignals sent by the heart rate monitoring equipment, and the emotion types sent by the terminal and corresponding to the electroencephalogram signals and the electrocardiosignals in each preset acquisition period.
Another aspect of the present disclosure provides an item recommendation device including: the acquisition module is used for acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period; the determining module is used for determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article; the generating module is used for sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; and the recommending module is used for recommending other articles related to the specified articles in the article preference list to the user according to the arrangement sequence of the articles in the article preference list.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Another aspect of the present disclosure provides a computer system comprising: one or more processors; storage means for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described above.
According to the embodiment of the disclosure, the following technical means is adopted: electroencephalogram signals and electrocardiosignals generated while a user browses a plurality of articles within a preset time period are acquired; the emotion type corresponding to each article is determined according to the electroencephalogram signal and the electrocardiosignal corresponding to that article; the articles are sorted according to their corresponding emotion types to generate an article preference list; and other articles related to a specified article in the article preference list are recommended to the user according to the arrangement order of the articles in the list. Because the articles are sorted according to the emotion types they evoke, and other related articles are recommended to the user according to this order, the technical problem in the related art that recommending articles based only on a user's browsed content cannot match the user's new needs in real time is at least partially solved, and the technical effect of more accurate article recommendation is achieved.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments of the present disclosure with reference to the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario of the item recommendation method and apparatus according to the embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of an item recommendation method according to an embodiment of the present disclosure;
FIG. 3A schematically illustrates a flow chart of a method of determining a type of emotion corresponding to each item from a brain electrical signal and a heart electrical signal corresponding to each item according to an embodiment of the present disclosure;
FIG. 3B schematically illustrates a flow chart of a method of data fusion of M-dimensional electroencephalogram feature data and M-dimensional heart rate feature data according to an embodiment of the present disclosure;
fig. 3C schematically shows a flow chart of a method of outputting probabilities corresponding to each of k emotion types respectively by an emotion recognition model according to an embodiment of the present disclosure;
FIG. 3D schematically illustrates a flow chart of a method of determining a final emotion type corresponding to the optimized feature data from the k probabilities according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates an exemplary block diagram of an item recommendation system according to an embodiment of the disclosure;
FIG. 5 schematically illustrates a block diagram of an item pusher according to an embodiment of the present disclosure; and
FIG. 6 schematically illustrates a block diagram of a computer system suitable for an item pushing method according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, C together, etc.).
The embodiment of the disclosure provides an article recommendation method. Acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period; determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article; sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; recommending other items related to the specified items in the item preference list to the user according to the arrangement sequence of the items in the item preference list.
Fig. 1 schematically illustrates an application scenario of the item recommendation method and apparatus according to the embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a system architecture to which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, and does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, an application scenario 100 according to this embodiment may include an electroencephalogram monitoring device 101, a heart rate monitoring device 102, a terminal 103, and a cloud server 104.
The electroencephalogram monitoring device 101 may be, for example, a hair band device with a built-in electroencephalogram acquisition module. The heart rate monitoring device 102 may be, for example, a bracelet device with a built-in heart rate acquisition module. The user can browse the item by using the terminal 103, and the user can wear the electroencephalogram monitoring device 101 and the heart rate monitoring device 102 when using the terminal 103, so that the electroencephalogram monitoring device 101 can acquire an electroencephalogram (EEG) of the user and the heart rate monitoring device 102 can acquire an Electrocardiogram (ECG) of the user in the process that the user uses the terminal 103 to browse the item.
The electroencephalogram monitoring device 101 may send the acquired electroencephalogram signals to the terminal 103 or to the cloud server 104 through a network, where the network between the electroencephalogram monitoring device 101 and the terminal 103 may include various connection types, such as wired and/or wireless communication links, and the like.
The heart rate monitoring device 102 may transmit the acquired cardiac electrical signals to the terminal 103 or to the cloud server 104 through a network, wherein the network between the heart rate monitoring device 102 and the terminal 103 may include various connection types, such as a wired and/or wireless communication link, and the like.
The electroencephalogram monitoring device 101 and the heart rate monitoring device 102 can respectively acquire electroencephalogram signals and electrocardiosignals of a user at the same acquisition frequency, so that the terminal 103 can acquire the synchronized electroencephalogram signals and electrocardiosignals.
After acquiring the electroencephalogram signal sent by the electroencephalogram monitoring device 101 and the electrocardiosignal sent by the heart rate monitoring device 102, the terminal 103 may perform feature extraction, data fusion, pattern recognition and other processing on the electroencephalogram signal and the electrocardiosignal so as to obtain an emotion type corresponding to the electroencephalogram signal and the electrocardiosignal, and thus may determine an emotional response of the user in a process of browsing articles.
The terminal 103 may sort the browsed items according to the obtained emotion types to obtain an item preference list, and an arrangement order of the items in the item preference list may reflect a degree of interest of the user in the items, so that the items associated with each item in the item preference list may be recommended to the user according to the arrangement order of the items.
The terminal 103 can send the processed electroencephalogram signal and electrocardiosignal, the emotion types corresponding to the electroencephalogram signal and the electrocardiosignal, and the item preference list ordered according to the emotion types to the cloud server 104, so that the data can be stored in the cloud.
The cloud server 104 may process the electroencephalogram signals sent by the electroencephalogram monitoring device 101 and the electrocardiograph signals sent by the heart rate monitoring device 102 to obtain emotion types corresponding to the electroencephalogram signals and the electrocardiograph signals, sort the items according to the emotion types, and send the sorted item preference list to the terminal 103, so that the terminal 103 may recommend other items to the user according to the item preference list.
The cloud server 104 can also receive the processed electroencephalogram signal and the processed electrocardiosignal sent by the terminal 103, the emotion types corresponding to the electroencephalogram signal and the electrocardiosignal, and the item preference list ordered according to the emotion types, and store the information so that the terminal 103 can obtain the information at any time.
It should be noted that the item recommendation method provided by the embodiment of the present disclosure may generally be executed by the terminal 103. Accordingly, the item recommendation device provided by the embodiment of the present disclosure may generally be disposed in the terminal 103. The item recommendation method provided by the embodiment of the present disclosure may also be executed by the cloud server 104. Accordingly, the item recommendation device provided by the embodiment of the present disclosure may also be disposed in the cloud server 104.
It should be understood that the number of electroencephalogram monitoring devices 101, heart rate monitoring devices 102, terminals 103, and cloud servers 104 in fig. 1 is merely illustrative. There may be any number of electroencephalogram monitoring devices 101, heart rate monitoring devices 102, terminals 103, and cloud servers 104, as desired for implementation.
Fig. 2 schematically shows a flow chart of an item recommendation method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S201 to S204.
In operation S201, an electroencephalogram signal and an electrocardiograph signal generated in a process that a user browses a plurality of articles within a preset time period are acquired, and both the electroencephalogram signal and the electrocardiograph signal are physiological signals capable of reflecting mood fluctuations.
According to the embodiment of the disclosure, the terminal 103 may acquire an electroencephalogram signal and an electrocardiograph signal of a user in a process of browsing an article within a preset time period, where the preset time period may be 5 minutes, for example.
In operation S202, an emotion type corresponding to each item is determined according to the electroencephalogram signal and the electrocardiograph signal corresponding to each item.
According to the embodiment of the disclosure, a user can browse a plurality of articles within 5 minutes, and the electroencephalogram signal and the electrocardiosignal corresponding to each article can be determined according to the time stamps carried by the electroencephalogram signal and the electrocardiosignal.
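As a non-limiting illustration, matching signal samples to one item's browsing window by timestamp may be sketched as follows in Python (the data layout and field names are illustrative assumptions and are not specified by the present disclosure):

# Illustrative sketch (not the patent's data format): select the signal samples whose
# timestamps fall inside one item's browsing window.
def segments_for_item(samples, browse_start, browse_end):
    """samples: iterable of (timestamp, value) pairs; timestamps are assumed to be in seconds."""
    return [value for timestamp, value in samples if browse_start <= timestamp < browse_end]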
According to the embodiment of the disclosure, the electroencephalogram signal and the electrocardiosignal corresponding to each article can be processed respectively, for example, preprocessing such as filtering and amplification can be performed, so as to remove the influence of surrounding noise. Then, feature extraction can be respectively carried out, and feature data of different physiological signals showing different characteristics when the emotion changes can be obtained. And then, data fusion can be carried out on the characteristic data of the electroencephalogram signals and the characteristic data of the electrocardiosignals, and the fused characteristic data can comprehensively reflect the emotional fluctuation condition. From the fused feature data, a mood type corresponding to the feature data can be identified by using a pattern recognition algorithm, wherein the pattern recognition algorithm can be a neural network algorithm, for example, and the mood type can include excitement, calmness, and passivity, for example.
In operation S203, a plurality of items are sorted according to the emotion type corresponding to each item, and an item preference list is generated.
According to the embodiment of the disclosure, the emotion type corresponding to each item may reflect the degree of interest of the user in the item. For example, if the user browses A, B, C, D, E for a predetermined period of time, if the emotional type corresponding to the physiological signal generated by the user when browsing items C and D is excited, it may be characterized that the user has a high interest level in items C and D. If the emotion types corresponding to the physiological signals generated when the user browses the articles a and E are calm, the general interest degree of the user in the articles a and E can be represented. If the emotional type corresponding to the physiological signal generated by the user when browsing item B is negative, it may be characterized that the user has a low interest level in item B.
According to the embodiment of the disclosure, the plurality of items browsed by the user can be sorted in the order excited, calm, negative. For example, when the five items A, B, C, D, E are sorted according to emotion type, the arrangement order of the items in the item preference list is C, D, A, E, B. Since the emotion types corresponding to C and D are the same, their order in the list can be interchanged; similarly, the order of A and E can be interchanged.
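Purely as an illustration, this ordering step may be sketched as follows in Python (the emotion-to-rank mapping follows the example above, while the item names and data layout are illustrative assumptions):

# Minimal sketch: rank browsed items by the emotion type recorded for each item.
# The priority (excited > calm > negative) follows the example in the text.
EMOTION_RANK = {"excited": 0, "calm": 1, "negative": 2}

def build_preference_list(item_emotions):
    """item_emotions: dict mapping item id -> emotion type string."""
    return sorted(item_emotions, key=lambda item: EMOTION_RANK[item_emotions[item]])

browsed = {"A": "calm", "B": "negative", "C": "excited", "D": "excited", "E": "calm"}
print(build_preference_list(browsed))  # ['C', 'D', 'A', 'E', 'B']; items with equal rank may swap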
In operation S204, other items associated with the specified item in the item preference list are recommended to the user according to the order of arrangement of the items in the item preference list.
According to the embodiment of the present disclosure, following the above example, items matching the user's high degree of interest can be recommended according to the ranking order of the five items A, B, C, D, E in the item preference list. For example, items similar to items C and D, which are ranked first in the item preference list, may be recommended to the user, or items of the same type or with the same attributes as items C and D may be recommended.
According to the embodiment of the disclosure, firstly, electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles in a preset time period are acquired, then, emotion types corresponding to the articles are determined according to the electroencephalogram signals and the electrocardiosignals corresponding to the articles, further, the articles are sorted according to the emotion types corresponding to the articles to generate an article preference list, and finally, other articles related to the specified articles in the article preference list are recommended to the user according to the arrangement sequence of the articles in the article preference list. The method and the device can determine the item preference list interested by the user according to the emotion type of the user when browsing the items, match new requirements of the user in real time according to the item preference list, accurately recommend the items preferred by the user to the user, and improve user experience.
The method shown in fig. 2 is further described with reference to fig. 3A-3D in conjunction with specific embodiments.
Fig. 3A schematically illustrates a flowchart of a method for determining an emotion type corresponding to each item from an electroencephalogram signal and an electrocardiograph signal corresponding to each item according to an embodiment of the present disclosure.
As shown in fig. 3A, operation S202 may specifically include operations S301 to S305.
In operation S301, for the electroencephalogram signal and the electrocardiograph signal corresponding to each item, feature extraction is performed on the electroencephalogram signal and the electrocardiograph signal in M dimensions, respectively, to obtain electroencephalogram feature data in M dimensions and heart rate feature data in M dimensions, where M is an integer greater than 1.
According to the embodiment of the disclosure, physiological signals present different characteristics when emotion changes, and different features can be extracted using a plurality of feature extraction manners; for example, feature extraction may cover time-domain features of the signal as well as entropy and energy features in the frequency domain. The time-domain features may include, for example, the mean, the standard deviation, the first-order difference mean square error and the second-order difference mean square error, and the frequency-domain energy features may include, for example, the band energy and the band energy ratio of the approximation signal obtained after wavelet transform of the physiological signal.
According to the embodiment of the disclosure, different features are respectively extracted from the electroencephalogram signal and the electrocardiosignal through multiple feature extraction modes, and multi-dimensional feature data of the electroencephalogram signal and multi-dimensional feature data of the electrocardiosignal can be obtained. For example, feature data for multiple dimensions of a brain electrical signal may include: mean, standard deviation, first order difference mean square error, second order difference mean square error, characteristic entropy, frequency band energy and frequency band energy ratio of the electroencephalogram signals. The characteristic data of the plurality of dimensions of the cardiac signal may include: mean value, standard deviation, first order difference mean square error, second order difference mean square error, characteristic entropy, frequency band energy and frequency band energy ratio of the electrocardiosignals.
According to the embodiment of the disclosure, the electroencephalogram signal and the electrocardiosignal are subjected to feature extraction, so that redundant and interfering components in the original electroencephalogram signal and electrocardiosignal can be removed, effective information of the original data can be reserved, the data volume is greatly reduced, and meanwhile, the emotion recognition precision is also improved.
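As a non-limiting illustration, such multi-dimensional feature extraction may be sketched as follows in Python (the sampling rate, the chosen frequency band, the FFT-based band energy used in place of the wavelet-based one, and the entropy definition are all illustrative assumptions):

import numpy as np

def extract_features(signal, fs=256):
    """Compute a few of the time- and frequency-domain features named in the text.
    `signal` is one preprocessed EEG or ECG segment; fs is an assumed sampling rate.
    """
    d1 = np.diff(signal)                      # first-order difference
    d2 = np.diff(signal, n=2)                 # second-order difference
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= 8) & (freqs < 13)        # example band (alpha); the band choice is an assumption
    band_energy = spectrum[band].sum()
    total_energy = spectrum.sum()
    p = spectrum / (total_energy + 1e-12)
    entropy = -np.sum(p * np.log2(p + 1e-12)) # spectral entropy as a stand-in for "characteristic entropy"
    return np.array([
        signal.mean(),                        # mean
        signal.std(),                         # standard deviation
        np.mean(d1 ** 2),                     # first-order difference mean square
        np.mean(d2 ** 2),                     # second-order difference mean square
        entropy,                              # characteristic entropy (assumed definition)
        band_energy,                          # band energy
        band_energy / (total_energy + 1e-12), # band energy ratio
    ])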
In operation S302, data fusion is performed on the M-dimensional electroencephalogram feature data and the M-dimensional heart rate feature data to obtain fused feature data.
According to the embodiment of the disclosure, the feature data of the electroencephalogram signals with multiple dimensions and the feature data of the electrocardiosignals after feature extraction can be fused, so that the emotional state of a user to an article can be comprehensively reflected according to the two physiological signals.
Fig. 3B schematically illustrates a flowchart of a method of data fusion of M-dimensional electroencephalogram feature data and M-dimensional heart rate feature data according to an embodiment of the present disclosure.
As shown in fig. 3B, operation S302 may specifically include operations S3021 to S3024.
In operation S3021, normalization processing is performed on the M-dimensional electroencephalogram feature data and the M-dimensional heart rate feature data.
According to the embodiment of the disclosure, the feature data of the electroencephalogram signals of multiple dimensions after feature extraction can be called an electroencephalogram feature set, and the feature data of the electrocardiosignals of multiple dimensions after feature extraction can be called a heart rate feature set. Taking the normalization of the electroencephalogram feature set as an example, the following formula (one) can be used to perform normalization processing on each feature data in the electroencephalogram feature set, so that the magnitude of the feature value of each feature data is between [0, 1 ]:
x_i' = (x_i - x_min) / (x_max - x_min)    (formula one)

where x_i' represents the normalized feature value of the i-th feature data in the electroencephalogram feature set, x_min represents the feature data with the minimum feature value in the electroencephalogram feature set, x_max represents the feature data with the maximum feature value in the electroencephalogram feature set, and i denotes the i-th feature data in the electroencephalogram feature set.
According to the embodiment of the present disclosure, the heart rate feature set after the normalization processing can be obtained according to the first formula, which is not described herein again.
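For illustration only, the min-max normalization of formula (one) may be sketched as follows (the array handling is an illustrative assumption):

import numpy as np

def min_max_normalize(features):
    """Scale the feature values of one feature set into [0, 1], as in formula (one)."""
    features = np.asarray(features, dtype=float)
    x_min, x_max = features.min(), features.max()
    return (features - x_min) / (x_max - x_min + 1e-12)  # epsilon guards against a zero range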
In operation S3022, performing first data fusion according to the mean and variance of the normalized electroencephalogram feature data of M dimensions and the mean and variance of the normalized heart rate feature data of M dimensions, to obtain feature data after the first data fusion.
According to the embodiment of the disclosure, the first data fusion can be performed on the normalized electroencephalogram feature set and the normalized heart rate feature set according to formula (two), which combines the means and variances of the two feature sets: μ_b and σ_b denote the mean and variance of the electroencephalogram feature set, μ_h and σ_h denote the mean and variance of the heart rate feature set, and R_bh denotes the feature data after the first data fusion.
In operation S3023, performing second data fusion according to the standard deviation of the normalized electroencephalogram feature data of M dimensions and the standard deviation of the normalized heart rate feature data of M dimensions, to obtain feature data after the second data fusion.
According to the embodiment of the disclosure, a least-squares weighting method can be adopted to perform the second data fusion on the normalized electroencephalogram feature set and heart rate feature set, and the least-squares weighting method can adopt the following formula (three):

R_bh' = w_b * E_b + w_h * E_h    (formula three)

where w_b represents the weight of the electroencephalogram feature set, E_b represents the feature data of the electroencephalogram feature set, w_h represents the weight of the heart rate feature set, E_h represents the feature data of the heart rate feature set, and R_bh' represents the feature data after the second data fusion.

According to the embodiment of the disclosure, the weight of the electroencephalogram feature set and the weight of the heart rate feature set are determined according to their respective standard deviations. Taking the electroencephalogram feature set as an example, the weight of each dimension of feature data can be determined from the standard deviations using formula (four), where w_j represents the weight of the j-th feature data in the electroencephalogram feature set, the weights satisfy the constraint that they sum to 1, σ_j represents the standard deviation of the j-th dimension of feature data in the electroencephalogram feature set, and m is the dimension of the electroencephalogram feature set.
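As a non-limiting illustration, the weighted fusion of formula (three) may be sketched as follows (the inverse-standard-deviation weighting rule is an illustrative assumption, since formula (four) is not reproduced above; only the constraint that the weights sum to 1 is taken from the description):

import numpy as np

def weighted_fusion(eeg_set, hr_set):
    """Second data fusion R_bh' = w_b * E_b + w_h * E_h over normalized feature sets.
    eeg_set and hr_set: arrays of shape (n_segments, n_dims). Per-dimension weights are
    derived from the standard deviations and normalized to sum to 1 within each set; the
    exact weighting rule used here (inverse standard deviation) is an assumption.
    """
    def weights(feature_set):
        inv_sigma = 1.0 / (feature_set.std(axis=0) + 1e-12)
        return inv_sigma / inv_sigma.sum()    # weights sum to 1, per the stated constraint

    w_b, w_h = weights(eeg_set), weights(hr_set)
    return w_b * eeg_set + w_h * hr_set       # element-wise weighted combination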
In operation S3024, third-time data fusion is performed according to the feature data after the first-time data fusion and the feature data after the second-time data fusion, so as to obtain fused feature data.
According to the embodiment of the disclosure, the third data fusion can be performed on the feature data R_bh obtained from the first fusion and the feature data R_bh' obtained from the second fusion, for example according to formula (five), where R_s represents the feature data after the third data fusion.
According to the embodiment of the disclosure, the feature data subjected to data fusion three times may include information of all dimensions of the electroencephalogram signal and the electrocardiograph signal, for example, the electroencephalogram feature set may include feature data of 7 dimensions, the heart rate feature set may include feature data of 7 dimensions, and the feature data subjected to data fusion may include feature data of 14 dimensions.
In operation S303, the complexity of the fused feature data is optimized to obtain optimized feature data.
According to the embodiment of the disclosure, the feature data after feature fusion is a relatively pure signal, but because the data volume is very large, the direct processing is too complex, and the complexity optimization can be performed on the data after feature fusion.
For example, a PCA (Principal Component Analysis, PCA for short) algorithm may be used to reduce the complexity of the feature data, and the algorithm may extract the main information of the feature data and remove redundant information.
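For illustration only, this complexity optimization may be sketched with scikit-learn's PCA as follows (the number of retained components is an illustrative assumption):

from sklearn.decomposition import PCA

def optimize_features(fused_features, n_components=8):
    """Reduce the fused (e.g. 14-dimensional) feature data to its principal components.
    fused_features: array of shape (n_samples, n_dims); n_components is illustrative.
    """
    pca = PCA(n_components=n_components)
    return pca.fit_transform(fused_features)  # optimized feature data with redundancy removed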
In operation S304, the optimized feature data is input into the emotion recognition model, so that probabilities respectively corresponding to each of k emotion types are output through the emotion recognition model, and k probabilities are obtained, where k is an integer greater than 1.
According to the embodiment of the disclosure, the feature data subjected to complexity optimization can be classified and identified so as to identify the emotion type corresponding to the feature data. The emotion recognition accuracy and robustness can be improved by utilizing the multi-feature fusion feature data for classification, and the classification response time can be effectively shortened on the basis of keeping the stable classification accuracy of each feature extraction algorithm by utilizing the feature data subjected to complexity optimization.
According to an embodiment of the present disclosure, the k emotion types may include a negative emotion, a calm emotion, and an excited emotion. The emotion recognition model comprises a first emotion classification model, a second emotion classification model and a third emotion classification model, wherein: the input of the first emotion classification model is optimized feature data, and the output of the first emotion classification model comprises the probability of negative emotions and the probability of calm emotions; the input of the second emotion classification model is optimized feature data, and the output of the second emotion classification model comprises the probability of calm emotion and the probability of excited emotion; and the input of the third emotion classification model is the optimized feature data, and the output of the third emotion classification model comprises the probability of negative emotion and the probability of exciting emotion.
According to the embodiment of the disclosure, the first emotion classification model, the second emotion classification model and the third emotion classification model may be, for example, two classification models obtained by training with a Support Vector Machine (SVM) algorithm. Each two-classification model can be obtained by utilizing a support vector machine algorithm to train and learn electroencephalogram characteristic signals and heart rate characteristic signals of a user.
According to the embodiment of the disclosure, since the output of a two-class model includes the probabilities of two emotion types, the three emotion types of negative emotion, calm emotion and excited emotion can be paired to obtain three pairings: negative emotion versus calm emotion, calm emotion versus excited emotion, and negative emotion versus excited emotion. If the negative emotion is recorded as 1, the calm emotion as 2, and the excited emotion as 3, the three pairings are recorded as: {c1 = 1, 2}, {c2 = 2, 3}, and {c3 = 1, 3}.
According to the embodiment of the disclosure, electroencephalogram data and electrocardiosignal data of 40 users with independent shopping decision-making capability (for example, the ages of the users can be distributed in 15-60 years) can be selected for analysis, and 560 sample feature data of the 40 users can be obtained by taking the fused feature data as 14 dimensions as an example. The sample characteristic data can be respectively input into the initial models for training, the initial models are continuously optimized, and the first classification model, the second emotion classification model and the third emotion classification model are respectively obtained.
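As a non-limiting illustration, the three pairwise classifiers may be trained with a support vector machine as follows (scikit-learn, the kernel choice and the data layout are illustrative assumptions; the labels 1, 2 and 3 follow the description above):

import numpy as np
from sklearn.svm import SVC

PAIRINGS = [(1, 2), (2, 3), (1, 3)]  # c1, c2, c3: negative/calm, calm/excited, negative/excited

def train_pairwise_models(X, y):
    """Train one probabilistic binary SVM per emotion pairing.
    X: optimized feature data, shape (n_samples, n_dims); y: labels in {1, 2, 3}.
    """
    models = {}
    for pair in PAIRINGS:
        mask = np.isin(y, pair)
        clf = SVC(kernel="rbf", probability=True)  # probability=True enables per-class probabilities
        clf.fit(X[mask], y[mask])
        models[pair] = clf
    return models

def pairwise_probabilities(models, x):
    """Return, for each pairing, the two class probabilities for one feature vector x."""
    return {pair: dict(zip(clf.classes_, clf.predict_proba([x])[0]))
            for pair, clf in models.items()}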
Fig. 3C schematically shows a flowchart of a method of outputting probabilities corresponding to each of k emotion types respectively through an emotion recognition model according to an embodiment of the present disclosure.
As shown in fig. 3C, operation S304 may specifically include operations S3041 through S3042.
In operation S3041, the optimized feature data is input into the first emotion classification model, the second emotion classification model, and the third emotion classification model, respectively, so that the first classification result, the second classification result, and the third classification result are output through the first emotion classification model, the second emotion classification model, and the third emotion classification model, respectively.
According to the embodiment of the disclosure, three emotion classification results can be output based on the first emotion classification model, the second emotion classification model and the third emotion classification model, and the three classification results can be recorded as: {p1_c1, p2_c1}, {p2_c2, p3_c2} and {p1_c3, p3_c3}, where pk_ci is the probability that the emotion type is k in the classification result of pairing ci, with k belonging to {1, 2, 3}: when k is 1, pk_ci corresponds to the probability that the emotion type is negative; when k is 2, pk_ci corresponds to the probability that the emotion type is calm; and when k is 3, pk_ci corresponds to the probability that the emotion type is excited.
In operation S3042, an output of the emotion recognition model is determined according to the first classification result, the second classification result, and the third classification result, wherein the output of the emotion recognition model includes a probability of a negative emotion, a probability of a calm emotion, and a probability of an excited emotion.
According to the embodiment of the disclosure, utilizing the classification results {p1_c1, p2_c1}, {p2_c2, p3_c2} and {p1_c3, p3_c3} respectively output by the first emotion classification model, the second emotion classification model and the third emotion classification model, the output of the emotion recognition model can be obtained by adopting formula (six), where p(k|F) represents the probability that the feature vector F belongs to the emotion class k, f_i is the classification result for each feature vector F_i, and A and B are constants.
According to formula (six), the probability of the emotion type k output by the emotion recognition model is given by formula (seven).
according to the embodiments of the present disclosure, according to formula six and formula seven, in the case that k is 1, 2, 3, respectively, pkCan be expressed by the following formula eight, formula nine and formula ten:
Figure BDA0002543776750000153
Figure BDA0002543776750000154
Figure BDA0002543776750000155
wherein p is1Probability, p, that indicates the type of emotion as negative2Probability of the emotion type being calm, p3Indicating the probability that the mood type is excited.
In operation S305, a final emotion type corresponding to the optimized feature data is determined according to the k probabilities.
According to the embodiment of the disclosure, three emotion classification results can be output according to the first emotion classification model, the second emotion classification model and the third emotion classification model, and a final emotion type corresponding to the feature data can be determined.
Fig. 3D schematically shows a flow chart of a method of determining a final emotion type corresponding to the optimized feature data from the k probabilities according to an embodiment of the present disclosure.
As shown in fig. 3D, operation S305 may specifically include operations S3051-S3053.
In operation S3051, an emotion recognition function is generated according to the k probabilities.
According to formula (six) and formula (seven), the emotion recognition function can be obtained as the following formula (eleven):

f(x) = ln(1 - P_x) - ln(P_x) + C    (formula eleven)

where P_x is the probability of the emotion type x, and C is a constant.
In operation S3052, a function value of a negative emotion, a function value of a calm emotion, and a function value of an excited emotion are calculated according to the emotion recognition function.
According to formula (eleven), the function value of the negative emotion is f(1), the function value of the calm emotion is f(2), and the function value of the excited emotion is f(3).
In operation S3053, a final emotion type corresponding to the optimized feature data is determined according to a relationship among the negative emotion function value, the calm emotion function value, and the excited emotion function value.
According to the embodiment of the present disclosure, the final emotion type may be determined according to the relationship among f(1), f(2) and f(3).
For example, in the case where f(1) < -1, f(1) < f(2), and f(1) < f(3), the final emotion type may be determined to be negative. In the case where f(2) < -1, f(2) < f(3), and f(2) < f(1), the final emotion type may be determined to be calm. In the case where f(3) < -1, f(3) < f(2), and f(3) < f(1), the final emotion type may be determined to be excited.
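Purely as an illustration, this decision rule may be sketched as follows (the constant C is assumed to be 0, and the threshold of -1 follows the example above):

import math

def emotion_function(p_x, C=0.0):
    """f(x) = ln(1 - P_x) - ln(P_x) + C, as in formula (eleven)."""
    return math.log(1.0 - p_x) - math.log(p_x) + C

def final_emotion(p_negative, p_calm, p_excited):
    """Pick the final emotion type from the three probabilities using the comparisons in the text."""
    f = {"negative": emotion_function(p_negative),
         "calm": emotion_function(p_calm),
         "excited": emotion_function(p_excited)}
    for label, value in f.items():
        others = [v for k, v in f.items() if k != label]
        if value < -1 and all(value < o for o in others):
            return label
    return None  # no type satisfies the conditions; handling of this case is an assumption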
Fig. 4 schematically illustrates an exemplary block diagram of an item recommendation system according to an embodiment of the present disclosure.
As shown in fig. 4, the item recommendation system 400 includes an electroencephalogram monitoring device 410, a heart rate monitoring device 420, a terminal 430, and a cloud server 440.
The electroencephalogram monitoring device 410 is used for acquiring electroencephalogram signals generated by a user in the process of browsing a plurality of articles according to a preset sampling frequency; preprocessing the electroencephalogram signals and then sending the preprocessed electroencephalogram signals to a terminal or a cloud server.
As shown in FIG. 4, the electroencephalogram monitoring device 410 may include an electroencephalogram acquisition module 411, an electroencephalogram preprocessing module 412, and a first Bluetooth module 413. The electroencephalogram signal acquisition module 411 may include an electroencephalogram electrode, a reference electrode, and a ground electrode, where the electroencephalogram electrode may be an electroencephalogram acquisition sensor for acquiring electroencephalogram signals.
According to the embodiment of the disclosure, the electroencephalogram monitoring device 410 can be a wearable headset, and because the electroencephalogram activity of an adult is more obvious on the left forehead, the electroencephalogram electrode of the electroencephalogram signal acquisition module 411 can be installed at the inner top end of the headset, so that when a user wears the electroencephalogram electrode, the electroencephalogram electrode is arranged at the left forehead of the user, and the accuracy of signal acquisition can be improved. The electroencephalogram electrode can be a high-input impedance amplifier dry electrode, and the high-input impedance amplifier dry electrode has an array design structure, so that the high-input impedance amplifier dry electrode does not need to be in large-area contact with the surface layer of the scalp of a user, the user does not need to paint a large amount of conductive paste on the scalp, and the electroencephalogram signal can be acquired in real time only by placing the dry electrode in the position of the left forehead, so that the electroencephalogram signal acquisition process is more convenient. The reference electrode may be placed at a relatively zero potential point of the body, for example, the reference electrode may be placed on the earlobe or the back of the ear. The ground electrode, the reference electrode and the brain electrode may be connected by wireless communication.
According to the embodiment of the present disclosure, the electroencephalogram electrodes of the electroencephalogram signal acquisition module 411 may acquire electroencephalogram signals according to a certain sampling frequency, and the electroencephalogram signal acquisition module 411 may then send the differential signal between the electroencephalogram electrode and the reference electrode to the electroencephalogram signal preprocessing module 412. Because the electroencephalogram signal obtained by the electroencephalogram signal acquisition module 411 is not pure and contains a large amount of artifacts and noise, the electroencephalogram signal can be preprocessed by the electroencephalogram signal preprocessing module 412, where the preprocessing may include filtering at a suitable frequency, denoising, A/D conversion, and the like.
According to the embodiment of the present disclosure, the electroencephalogram signal preprocessing module 412 may be an intelligent chip, for example a ThinkGear AM chip, which can perform low-noise preprocessing of the electroencephalogram spectrum signal. In order to improve the quality of the acquired electroencephalogram signal, a shielded cable may be connected between the electroencephalogram signal acquisition module 411 and the electroencephalogram signal preprocessing module 412 to attenuate noise interference from the surrounding environment as much as possible.
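For illustration, a minimal sketch of such electroencephalogram preprocessing (power-line notch filtering plus band-pass filtering) is given below; the sampling rate, cut-off frequencies and filter order are assumptions, and the sketch does not reproduce the behavior of any particular chip.

import numpy as np
from scipy import signal

def preprocess_eeg(raw, fs=512.0):
    """Illustrative EEG preprocessing: power-line notch plus band-pass filtering."""
    # Remove 50 Hz power-line interference with a notch filter.
    b_notch, a_notch = signal.iirnotch(w0=50.0, Q=30.0, fs=fs)
    x = signal.filtfilt(b_notch, a_notch, raw)
    # Keep the 0.5-45 Hz band that carries the usual EEG rhythms.
    b_band, a_band = signal.butter(4, [0.5, 45.0], btype="bandpass", fs=fs)
    return signal.filtfilt(b_band, a_band, x)

eeg = preprocess_eeg(np.random.randn(5120))  # one 10 s segment at an assumed 512 Hz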
According to the embodiment of the disclosure, the electroencephalogram signal preprocessing module 412 may communicate with the terminal over a serial port through the first Bluetooth module 413 and send the preprocessed electroencephalogram signal to the terminal in a certain data format. The first Bluetooth module 413 has the advantages of ultra-low power consumption, low cost, small size, stable transmission performance, and quick start with near-instantaneous connection, and its data rate can be matched to that of the electroencephalogram signal preprocessing module 412 to ensure stable transmission.
The heart rate monitoring device 420 is used for acquiring, at the preset sampling frequency, the electrocardiosignals generated by the user in the process of browsing the plurality of articles, preprocessing the electrocardiosignals, and then sending the preprocessed electrocardiosignals to the terminal or the cloud server.
As shown in fig. 4, the heart rate monitoring device 420 may include an electrocardiographic signal acquisition module 421, an electrocardiographic signal preprocessing module 422, and a second Bluetooth module 423. The electrocardiographic signal acquisition module 421 may include a heart rate sensor configured to acquire the electrocardiographic signal at the preset sampling frequency. The heart rate monitoring device 420 may be a wristband device; the user wears the wristband at the wrist to collect the electrocardiographic signal, so the operation is convenient and flexible.
According to the embodiment of the present disclosure, the sampling frequency of the electrocardiographic signal acquisition module 421 may be the same as the sampling frequency of the electroencephalogram signal acquisition module 411, so as to ensure synchronization between the acquired electroencephalogram signal and electrocardiographic signal.
According to the embodiment of the present disclosure, the electrocardiographic signal acquired by the electrocardiographic signal acquisition module 421 may be sent to the electrocardiographic signal preprocessing module 422, and the electrocardiographic signal preprocessing module 422 may preprocess the electrocardiographic signal, where the preprocessing may include filtering power frequency interference by a digital band-stop filter, removing baseline drift of the signal by a zero-phase shift digital filter, and eliminating motion artifacts and noise in measurement by a butterworth band-pass filter and a wavelet threshold denoising method.
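For illustration, the preprocessing chain described above (a band-stop filter for power-frequency interference, zero-phase filtering for baseline drift, a Butterworth band-pass filter, and wavelet threshold denoising) could be sketched as follows; the filter orders, cut-off frequencies, wavelet choice and sampling rate are assumptions rather than values given in the disclosure.

import numpy as np
import pywt
from scipy import signal

def preprocess_ecg(raw, fs=512.0):
    """Sketch of the ECG preprocessing chain described above."""
    # Band-stop (notch) filter for 50 Hz power-frequency interference.
    b, a = signal.iirnotch(w0=50.0, Q=30.0, fs=fs)
    x = signal.filtfilt(b, a, raw)
    # Zero-phase high-pass filtering (filtfilt) to remove baseline drift.
    b, a = signal.butter(2, 0.5, btype="highpass", fs=fs)
    x = signal.filtfilt(b, a, x)
    # Butterworth band-pass to suppress motion artifacts outside the ECG band.
    b, a = signal.butter(4, [0.5, 40.0], btype="bandpass", fs=fs)
    x = signal.filtfilt(b, a, x)
    # Wavelet threshold denoising of the remaining high-frequency noise.
    coeffs = pywt.wavedec(x, "db4", level=4)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, "db4")[: len(x)]

ecg = preprocess_ecg(np.random.randn(5120))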
According to the embodiment of the disclosure, the electrocardiographic signal preprocessing module 422 may communicate with the terminal over a serial port through the second Bluetooth module 423 and send the preprocessed electrocardiosignal to the terminal in a certain data format.
The terminal 430 is configured to: acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period; determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article; transmitting the emotion type corresponding to each item to the cloud server.
Cloud server 440 is configured to: receiving the emotion type corresponding to each article sent by the terminal, and sequencing the articles according to the emotion type corresponding to each article to generate an article preference list; recommending other items associated with the specified items in the item preference list to the user according to the arrangement order of the items in the item preference list.
According to the embodiment of the present disclosure, the operations performed by the terminal 430 and the cloud server 440 correspond to the item recommendation method part in the embodiment of the present disclosure, and the description of the operations performed by the terminal 430 and the cloud server 440 specifically refers to the item recommendation method part, which is not described herein again.
The cloud server 440 may also be configured to receive an electroencephalogram signal sent by the electroencephalogram monitoring device and an electrocardiograph signal sent by the heart rate monitoring device.
According to the embodiment of the disclosure, the cloud server 440 may process the electroencephalogram signal sent by the electroencephalogram monitoring device 410 and the electrocardiograph signal sent by the heart rate monitoring device 420 to obtain emotion types corresponding to the electroencephalogram signal and the electrocardiograph signal, sort the items according to the emotion types, and send the sorted item preference list to the terminal 430, so that the terminal 430 may recommend other items to the user according to the item preference list.
According to the embodiment of the present disclosure, the cloud server 440 may further receive the processed electroencephalogram signal and the processed electrocardiograph signal sent by the terminal 430, the emotion types corresponding to the electroencephalogram signal and the electrocardiograph signal, and the item preference list ordered according to the emotion types, and store these information, so that the terminal 430 may obtain the information at any time.
Fig. 5 schematically illustrates a block diagram of an item recommendation device according to an embodiment of the present disclosure.
As shown in fig. 5, the item recommendation device 500 includes an obtaining module 510, a determining module 520, a generating module 530, and a recommending module 540.
The obtaining module 510 is configured to obtain electroencephalogram signals and electrocardiograph signals generated in a process that a user browses a plurality of articles within a preset time period.
The determining module 520 is configured to determine, according to the electroencephalogram signal and the electrocardiograph signal corresponding to each item, the emotion type corresponding to each item.
The generating module 530 is configured to sort the plurality of items according to the emotion type corresponding to each item and generate an item preference list.
The recommending module 540 is configured to recommend, to the user, other items associated with specified items in the item preference list according to the arrangement order of the items in the item preference list, as illustrated by the sketch below.
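For illustration, a minimal sketch of the sorting and recommending steps performed by the generating module 530 and the recommending module 540 is given below; the preference ordering of emotion types, the related-item mapping and the cut-off are assumptions, since the disclosure does not fix these details.

# Illustrative preference ordering: an excited emotion is treated as the strongest
# preference and a negative emotion as the weakest (assumed, not stated in the disclosure).
EMOTION_SCORE = {"excited": 2, "calm": 1, "negative": 0}

def build_preference_list(item_emotions):
    """Sort browsed items by the emotion type determined for each of them."""
    return sorted(item_emotions, key=lambda item: EMOTION_SCORE[item_emotions[item]], reverse=True)

def recommend(item_emotions, related_items, top_n=2):
    """Recommend other items associated with the highest-ranked browsed items."""
    preference_list = build_preference_list(item_emotions)
    recommendations = []
    for item in preference_list[:top_n]:
        recommendations.extend(related_items.get(item, []))
    return recommendations

emotions = {"shoes": "excited", "mug": "negative", "headphones": "calm"}
related = {"shoes": ["socks", "insoles"], "headphones": ["headphone case"]}
print(recommend(emotions, related))  # ['socks', 'insoles', 'headphone case']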
According to an embodiment of the present disclosure, the determining module 520 may include a feature extracting unit, a data fusing unit, an optimization processing unit, a recognizing unit, and a determining unit.
The feature extraction unit is configured to perform feature extraction in M dimensions on the electroencephalogram signal and the electrocardiograph signal corresponding to each item, respectively, to obtain electroencephalogram feature data of M dimensions and heart rate feature data of M dimensions, where M is an integer greater than 1.
The data fusion unit is configured to perform data fusion on the electroencephalogram feature data of M dimensions and the heart rate feature data of M dimensions to obtain fused feature data.
The optimization processing unit is configured to optimize the complexity of the fused feature data to obtain optimized feature data.
The recognition unit is configured to input the optimized feature data into the emotion recognition model, so that the emotion recognition model outputs a probability for each of k emotion types, yielding k probabilities, where k is an integer greater than 1.
The determining unit is configured to determine, according to the k probabilities, the final emotion type corresponding to the optimized feature data.
According to an embodiment of the present disclosure, the data fusion unit may include: a processing subunit, a first fusion subunit, a second fusion subunit, and a third fusion subunit.
The processing subunit is configured to normalize the electroencephalogram feature data of M dimensions and the heart rate feature data of M dimensions.
The first fusion subunit is configured to perform a first data fusion according to the mean and variance of the normalized electroencephalogram feature data of M dimensions and the mean and variance of the normalized heart rate feature data of M dimensions, to obtain feature data after the first data fusion.
The second fusion subunit is configured to perform a second data fusion according to the standard deviation of the normalized electroencephalogram feature data of M dimensions and the standard deviation of the normalized heart rate feature data of M dimensions, to obtain feature data after the second data fusion.
The third fusion subunit is configured to perform a third data fusion according to the feature data after the first data fusion and the feature data after the second data fusion, to obtain the fused feature data.
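For illustration, the three-stage fusion could be sketched as follows; the disclosure does not give the fusion formulas, so simple min-max normalization and concatenation of the named statistics are used here as assumptions.

import numpy as np

def fuse_features(eeg_feat, hr_feat):
    """Sketch of the three-stage fusion of M-dimensional EEG and heart-rate features."""
    def normalize(v):
        # Min-max normalization of one feature vector (assumed normalization scheme).
        v = np.asarray(v, dtype=float)
        rng = v.max() - v.min()
        return (v - v.min()) / rng if rng else np.zeros_like(v)

    eeg_n, hr_n = normalize(eeg_feat), normalize(hr_feat)
    # First fusion: built from the means and variances of the normalized data.
    first = np.array([eeg_n.mean(), eeg_n.var(), hr_n.mean(), hr_n.var()])
    # Second fusion: built from the standard deviations of the normalized data.
    second = np.array([eeg_n.std(), hr_n.std()])
    # Third fusion: combine the two intermediate results into one vector.
    return np.concatenate([first, second])

fused = fuse_features([0.1, 0.4, 0.3, 0.9], [0.2, 0.8, 0.5, 0.7])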
According to an embodiment of the present disclosure, the k emotion types include negative emotion, calm emotion, and excited emotion. The emotion recognition model comprises a first emotion classification model, a second emotion classification model and a third emotion classification model, wherein: the input of the first emotion classification model is optimized feature data, and the output of the first emotion classification model comprises the probability of negative emotions and the probability of calm emotions; the input of the second emotion classification model is optimized feature data, and the output of the second emotion classification model comprises the probability of calm emotion and the probability of excited emotion; and the input of the third emotion classification model is the optimized feature data, and the output of the third emotion classification model comprises the probability of negative emotion and the probability of exciting emotion.
According to an embodiment of the present disclosure, the recognition unit may include: a first identifying subunit and a second identifying subunit.
The first identifying subunit is configured to input the optimized feature data into the first emotion classification model, the second emotion classification model, and the third emotion classification model respectively, so as to output a first classification result, a second classification result, and a third classification result through the first emotion classification model, the second emotion classification model, and the third emotion classification model.
The second identifying subunit is configured to determine the output of the emotion recognition model according to the first classification result, the second classification result, and the third classification result, where the output of the emotion recognition model includes a probability of a negative emotion, a probability of a calm emotion, and a probability of an excited emotion.
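For illustration, one way the second identifying subunit might combine the three pairwise classification results into a single probability per emotion is sketched below; averaging the two pairwise probabilities of each emotion is an assumption, as the disclosure only states that the output is determined from the three classification results.

def combine_pairwise_outputs(p_neg_calm, p_calm_exc, p_neg_exc):
    """Combine three pairwise classification results into one probability per emotion.

    Each argument is the probability pair output by one binary classifier,
    e.g. p_neg_calm = (P(negative), P(calm)).
    """
    p_negative = (p_neg_calm[0] + p_neg_exc[0]) / 2
    p_calm = (p_neg_calm[1] + p_calm_exc[0]) / 2
    p_excited = (p_calm_exc[1] + p_neg_exc[1]) / 2
    total = p_negative + p_calm + p_excited
    return {"negative": p_negative / total, "calm": p_calm / total, "excited": p_excited / total}

print(combine_pairwise_outputs((0.7, 0.3), (0.6, 0.4), (0.8, 0.2)))  # ~{0.5, 0.3, 0.2}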
According to an embodiment of the present disclosure, the determining unit may include: the device comprises a generating subunit, a calculating subunit and a determining subunit.
The generating subunit is configured to generate an emotion recognition function according to the k probabilities.
The calculating subunit is configured to calculate a function value for the negative emotion, a function value for the calm emotion, and a function value for the excited emotion according to the emotion recognition function.
The determining subunit is configured to determine the final emotion type corresponding to the optimized feature data according to the relationship among the negative emotion function value, the calm emotion function value, and the excited emotion function value.
According to the embodiment of the disclosure, the electroencephalogram feature data of M dimensions comprises at least M of the following types: mean, standard deviation, first order difference mean square error, second order difference mean square error, characteristic entropy, frequency band energy and frequency band energy ratio of the electroencephalogram signals. The heart rate characteristic data of the M dimensions comprises at least M of the following: mean value, standard deviation, first order difference mean square error, second order difference mean square error, characteristic entropy, frequency band energy and frequency band energy ratio of the electrocardiosignals.
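For illustration, the listed feature types could be computed for one signal segment as follows; the entropy estimate, the example frequency band and the sampling rate are assumptions, since the disclosure names the feature types but not their exact definitions.

import numpy as np
from scipy import signal

def extract_features(x, fs=512.0, band=(8.0, 13.0)):
    """Compute the feature types listed above for one EEG or ECG segment."""
    x = np.asarray(x, dtype=float)
    diff1, diff2 = np.diff(x), np.diff(x, n=2)
    # Power spectral density for the band-energy features.
    freqs, psd = signal.welch(x, fs=fs, nperseg=min(len(x), 256))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_energy = psd[in_band].sum()
    total_energy = psd.sum()
    # Shannon entropy of the normalized spectrum as the "characteristic entropy" (assumed definition).
    p = psd / total_energy
    entropy = -np.sum(p * np.log2(p + 1e-12))
    return {
        "mean": x.mean(),
        "std": x.std(),
        "diff1_mse": np.mean(diff1 ** 2),   # first-order difference mean square
        "diff2_mse": np.mean(diff2 ** 2),   # second-order difference mean square
        "entropy": entropy,
        "band_energy": band_energy,
        "band_energy_ratio": band_energy / total_energy,
    }

feats = extract_features(np.random.randn(2560))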
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any plurality of the obtaining module 510, the determining module 520, the generating module 530 and the recommending module 540 may be combined and implemented in one module/unit/sub-unit, or any one of the modules/units/sub-units may be split into a plurality of modules/units/sub-units. Alternatively, at least part of the functionality of one or more of these modules/units/sub-units may be combined with at least part of the functionality of other modules/units/sub-units and implemented in one module/unit/sub-unit. According to an embodiment of the present disclosure, at least one of the obtaining module 510, the determining module 520, the generating module 530 and the recommending module 540 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or may be implemented by any one of three implementations of software, hardware and firmware, or any suitable combination of any of them. Alternatively, at least one of the obtaining module 510, the determining module 520, the generating module 530 and the recommending module 540 may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
It should be noted that the item recommendation device portion in the embodiment of the present disclosure corresponds to the item recommendation method portion in the embodiment of the present disclosure, and the description of the item recommendation device portion specifically refers to the item recommendation method portion, which is not described herein again.
Fig. 6 schematically shows a block diagram of a computer system suitable for implementing the above described method according to an embodiment of the present disclosure. The computer system illustrated in FIG. 6 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 6, a computer system 600 according to an embodiment of the present disclosure includes a processor 601, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. Processor 601 may include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 601 may also include onboard memory for caching purposes. Processor 601 may include a single processing unit or multiple processing units for performing different actions of a method flow according to embodiments of the disclosure.
In the RAM 603, various programs and data necessary for the operation of the system 600 are stored. The processor 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. The processor 601 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 602 and/or RAM 603. It is to be noted that the programs may also be stored in one or more memories other than the ROM 602 and RAM 603. The processor 601 may also perform various operations of the method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, system 600 may also include an input/output (I/O) interface 605, the input/output (I/O) interface 605 also being connected to bus 604. The system 600 may also include one or more of the following components connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 610 as necessary, so that a computer program read out therefrom is installed into the storage section 608 as necessary.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program, when executed by the processor 601, performs the above-described functions defined in the system of the embodiments of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium. Examples may include, but are not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
For example, according to embodiments of the present disclosure, a computer-readable storage medium may include the ROM 602 and/or RAM 603 described above and/or one or more memories other than the ROM 602 and RAM 603.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special-purpose hardware-based systems which perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.

Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure can be combined in various ways, even if such combinations are not expressly recited in the present disclosure. In particular, the features recited in the various embodiments and/or claims of the present disclosure may be combined without departing from the spirit or teaching of the present disclosure. All such combinations are within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (11)

1. An item recommendation method comprising:
acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period;
determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article;
sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; and
recommending other items related to the specified items in the item preference list to the user according to the arrangement sequence of the items in the item preference list.
2. The method of claim 1, wherein determining the type of emotion corresponding to each item from the electroencephalographic signal and the electrocardiographic signal corresponding to each item comprises:
respectively extracting features of M dimensions from the electroencephalogram signal and the electrocardiosignal corresponding to each article to obtain electroencephalogram feature data of M dimensions and heart rate feature data of M dimensions, wherein M is an integer greater than 1;
performing data fusion on the electroencephalogram characteristic data with M dimensions and the heart rate characteristic data with M dimensions to obtain fused characteristic data;
optimizing the complexity of the fused feature data to obtain optimized feature data;
inputting the optimized feature data into an emotion recognition model so as to output probabilities respectively corresponding to each emotion type in k emotion types through the emotion recognition model to obtain k probabilities, wherein k is an integer greater than 1; and
determining the final emotion type corresponding to the optimized feature data according to the k probabilities.
3. The method of claim 2, wherein the data fusing the M dimensions of the electroencephalogram characteristic data and the M dimensions of the heart rate characteristic data comprises:
carrying out normalization processing on the electroencephalogram characteristic data of M dimensions and the heart rate characteristic data of M dimensions;
performing first data fusion according to the mean value and variance of the electroencephalogram characteristic data of the M dimensions after the normalization processing and the mean value and variance of the heart rate characteristic data of the M dimensions after the normalization processing to obtain characteristic data after the first data fusion;
performing second data fusion according to the standard deviation of the electroencephalogram characteristic data of the M dimensions after the normalization processing and the standard deviation of the heart rate characteristic data of the M dimensions after the normalization processing to obtain characteristic data after the second data fusion; and
performing third data fusion according to the feature data after the first data fusion and the feature data after the second data fusion to obtain the fused feature data.
4. The method of claim 2, wherein the k emotion types include negative emotion, calm emotion, and excited emotion;
the emotion recognition model comprises a first emotion classification model, a second emotion classification model and a third emotion classification model, wherein:
the input of the first emotion classification model is the optimized feature data, and the output of the first emotion classification model comprises the probability of negative emotion and the probability of calm emotion;
the input of the second emotion classification model is the optimized feature data, and the output of the second emotion classification model comprises the probability of calm emotion and the probability of excited emotion; and
the input of the third emotion classification model is the optimized feature data, and the output of the third emotion classification model comprises the probability of a negative emotion and the probability of an excited emotion.
5. The method of claim 4, wherein the inputting the optimized feature data into an emotion recognition model so as to output, by the emotion recognition model, probabilities respectively corresponding to each of the k emotion types comprises:
inputting the optimized feature data into the first emotion classification model, the second emotion classification model and the third emotion classification model respectively, so as to output a first classification result, a second classification result and a third classification result through the first emotion classification model, the second emotion classification model and the third emotion classification model; and
determining an output of the emotion recognition model from the first classification result, the second classification result, and the third classification result, wherein the output of the emotion recognition model includes a probability of a negative emotion, a probability of a calm emotion, and a probability of an excited emotion.
6. The method of claim 5, wherein said determining a final emotion type corresponding to the optimized feature data from the k probabilities comprises:
generating an emotion recognition function according to the k probabilities;
calculating a function value of negative emotion, a function value of calm emotion and a function value of exciting emotion according to the emotion recognition function; and
determining a final emotion type corresponding to the optimized feature data according to the relationship among the negative emotion function value, the calm emotion function value and the excited emotion function value.
7. The method of claim 2, wherein the electroencephalogram feature data of M dimensions comprises at least M of:
the mean value, the standard deviation, the first-order difference mean square error, the second-order difference mean square error, the characteristic entropy, the frequency band energy and the frequency band energy ratio of the electroencephalogram signals;
the heart rate characteristic data of M dimensions includes at least M of:
the mean value, the standard deviation, the first order difference mean square error, the second order difference mean square error, the characteristic entropy, the frequency band energy and the frequency band energy ratio of the electrocardiosignals.
8. An item recommendation system comprising: the system comprises an electroencephalogram monitoring device, a heart rate monitoring device, a terminal and a cloud server;
the electroencephalogram monitoring equipment is used for acquiring electroencephalogram signals generated by a user in the process of browsing a plurality of articles according to a preset sampling frequency; preprocessing the electroencephalogram signals and then sending the preprocessed electroencephalogram signals to the terminal or the cloud server;
the heart rate monitoring equipment is used for acquiring electrocardiosignals generated by a user in the process of browsing a plurality of articles according to the preset sampling frequency; preprocessing the electrocardiosignals and then sending the preprocessed electrocardiosignals to the terminal or the cloud server;
the terminal is used for:
acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period;
determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article;
transmitting an emotion type corresponding to each item to the cloud server;
the cloud server is configured to:
sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; and
recommending other items related to the specified items in the item preference list to the user according to the arrangement sequence of the items in the item preference list.
9. An item recommendation device comprising:
the acquisition module is used for acquiring electroencephalogram signals and electrocardiosignals generated in the process that a user browses a plurality of articles within a preset time period;
the determining module is used for determining the emotion type corresponding to each article according to the electroencephalogram signal and the electrocardiosignal corresponding to each article;
the generating module is used for sequencing the plurality of articles according to the emotion type corresponding to each article to generate an article preference list; and
and the recommending module is used for recommending other articles related to the specified articles in the article preference list to the user according to the arrangement sequence of the articles in the article preference list.
10. A computer system, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
11. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 7.
CN202010557074.0A 2020-06-17 2020-06-17 Article recommendation method, device and system, computer system and storage medium Pending CN113807904A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010557074.0A CN113807904A (en) 2020-06-17 2020-06-17 Article recommendation method, device and system, computer system and storage medium

Publications (1)

Publication Number Publication Date
CN113807904A true CN113807904A (en) 2021-12-17

Family

ID=78943496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010557074.0A Pending CN113807904A (en) 2020-06-17 2020-06-17 Article recommendation method, device and system, computer system and storage medium

Country Status (1)

Country Link
CN (1) CN113807904A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106345034A (en) * 2016-11-09 2017-01-25 武汉智普天创科技有限公司 Device based on brain electricity acquisition terminal for cognitive emotion regulation
KR101719546B1 (en) * 2015-10-08 2017-03-27 세종대학교산학협력단 Real-time emotion recognition interface using multiple physiological signals of eeg and ecg
CN107123019A (en) * 2017-03-28 2017-09-01 华南理工大学 A kind of VR shopping commending systems and method based on physiological data and Emotion identification
CN107463874A (en) * 2017-07-03 2017-12-12 华南师范大学 The intelligent safeguard system of Emotion identification method and system and application this method
CN107766898A (en) * 2017-12-08 2018-03-06 南京邮电大学盐城大数据研究院有限公司 The three classification mood probabilistic determination methods based on SVM
CN108073284A (en) * 2017-12-15 2018-05-25 南京信息工程大学 Purchase system based on brain wave identification mood

Similar Documents

Publication Publication Date Title
Singh et al. A comprehensive review on critical issues and possible solutions of motor imagery based electroencephalography brain-computer interface
WO2016154298A1 (en) System and method for automatic interpretation of eeg signals using a deep learning statistical model
CN108932511B (en) Shopping decision method based on brain-computer interaction
Kose et al. A new approach for emotions recognition through EOG and EMG signals
CN114787883A (en) Automatic emotion recognition method, system, computing device and computer-readable storage medium
Vazquez-Rodriguez et al. Transformer-based self-supervised learning for emotion recognition
US20230080175A1 (en) Method and device for predicting user state
Alvarado-González et al. P300 detection based on EEG shape features
CN111954250A (en) Lightweight Wi-Fi behavior sensing method and system
WO2015153240A1 (en) Directed recommendations
Zhu et al. Physiological signals-based emotion recognition via high-order correlation learning
Joy et al. Multiclass mi-task classification using logistic regression and filter bank common spatial patterns
Mohdiwale et al. Investigating Feature Ranking Methods for Sub‐Band and Relative Power Features in Motor Imagery Task Classification
JP7173482B2 (en) Health care data analysis system, health care data analysis method and health care data analysis program
CN114190897A (en) Training method of sleep staging model, sleep staging method and device
Wankhade et al. IKKN predictor: An EEG signal based emotion recognition for HCI
You et al. Multivariate time–frequency analysis of electrohysterogram for classification of term and preterm labor
CN113807904A (en) Article recommendation method, device and system, computer system and storage medium
Chugh et al. The hybrid deep learning model for identification of attention-deficit/hyperactivity disorder using EEG
CN116687409A (en) Emotion recognition method and system based on digital twin and deep learning
CN116392148A (en) Electroencephalogram signal classification method, device, equipment and storage medium
Nagar et al. Orthogonal features-based eeg signal denoising using fractionally compressed autoencoder
CHEN et al. Removal of muscle artifact from EEG data based on independent vector analysis
Lapsa et al. Adaptive signal-to-noise ratio indicator for wearable bioimpedance monitoring
Dan et al. Sensor selection and miniaturization limits for detection of interictal epileptiform discharges with wearable EEG

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination