CN110517085A - Method for generating a display report, electronic device, and computer-readable storage medium - Google Patents

Method for generating a display report, electronic device, and computer-readable storage medium

Info

Publication number
CN110517085A
CN110517085A (application CN201910797205.XA; granted as CN110517085B)
Authority
CN
China
Prior art keywords
information
works
user
report
during watching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910797205.XA
Other languages
Chinese (zh)
Other versions
CN110517085B (en)
Inventor
杨育松
王曦光
王晨
王勇
徐峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XINHUA NETWORK CO Ltd
Original Assignee
XINHUA NETWORK CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XINHUA NETWORK CO Ltd
Priority to CN201910797205.XA
Publication of CN110517085A
Application granted
Publication of CN110517085B
Active legal status
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201: Market modelling; Market analysis; Collecting market data

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Accounting & Taxation (AREA)
  • Physics & Mathematics (AREA)
  • Finance (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Psychiatry (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Social Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • General Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Game Theory and Decision Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a method for generating a display report, an electronic device, and a computer-readable storage medium. The method comprises: obtaining emotion-change information of a user while viewing works as first information; collecting information about the works viewed by the user during viewing as second information; and generating a display report based on the first information, the second information, and the temporal matching relationship between the first information and the second information. The display report is used to show the user's emotion information corresponding to each work. Embodiments of the present application make it possible to intuitively observe the user's emotional changes while viewing works, intuitively determine how much the user likes each work on that basis, and improve the user experience.

Description

Method for generating a display report, electronic device, and computer-readable storage medium
Technical field
This application relates to the fields of computing and biometric technology, and in particular to a method for generating a display report, an electronic device, and a computer-readable storage medium.
Background
Art adds value to individual lives and to society as a whole, and participation in artistic activities can support the cognitive and affective processes underlying personal and social development. Art engages audiences on emotional, intellectual, and aesthetic levels. Under normal circumstances, the value of works, especially artistic works, is measured through the audience's experience, but that experience is difficult to observe directly.
In the prior art, after viewers watch works, their viewing experience and how much they like each work are generally obtained through questionnaires. For example, after viewers see an exhibition of works, questionnaires are distributed and each viewer is asked to fill in his or her favorite works, so as to determine each viewer's degree of liking for the works. However, this approach cannot intuitively capture the user's emotional changes while viewing the exhibition, nor the user's degree of liking for each work. How to observe these more intuitively has therefore become a key problem.
Summary of the invention
This application provides a method and apparatus for generating a display report, an electronic device, and a computer-readable storage medium, which can solve at least one of the above technical problems. The technical solution is as follows:
In a first aspect, this application provides a method for generating a display report, comprising:
obtaining emotion-change information of a user during viewing of works as first information;
collecting information about the works viewed by the user during viewing as second information;
generating a display report based on the first information, the second information, and the temporal matching relationship between the first information and the second information;
wherein the display report is used to show the user's emotion information corresponding to each work.
In one possible implementation, obtaining the emotion-change information of the user during viewing of works comprises:
obtaining electrophysiological signals collected by biosensors while the user views the works;
determining, based on the obtained electrophysiological signals, the emotion-change information of the user during viewing.
In another possible implementation, before generating the display report based on the first information, the second information, and the temporal matching relationship between them, the method further comprises:
obtaining, from a positioning sensor, location information of the user during viewing of the works;
determining, based on the location information of the user during viewing, dwell-position information and travel-route information of the user during viewing;
marking the user's dwell positions during viewing on the travel-route information to obtain third information, the third information being the travel-route information annotated with dwell positions.
In another possible implementation, generating the display report based on the first information, the second information, and the temporal matching relationship between them comprises:
generating the display report based on the following information:
the first information; the second information; the third information; the temporal matching relationship between the first information and the second information; and the positional matching relationship between the second information and the third information.
In another possible implementation, the method further comprises:
obtaining fourth information, the fourth information comprising multimedia information collected from a third-party viewpoint while the user views the works; the multimedia information collected from the third-party viewpoint comprises the behavior information of the user, the content of the works viewed by the user, and information about the environment around the user.
In another possible implementation, determining the emotion-change information of the user during viewing based on the obtained electrophysiological signals comprises:
determining the emotion-change information of the user during viewing based on the collected electrophysiological signals and the multimedia information collected from the third-party viewpoint while the user views the works.
In another possible implementation, generating the display report based on the above information comprises:
generating the display report based on the following information:
the first information; the second information; the third information; the fourth information; the temporal matching relationship between the first information and the second information; the positional matching relationship between the second information and the third information; and the positional matching relationship between the third information and the fourth information.
In another possible implementation, the method further comprises:
when a report-display operation triggered by the user is detected, controlling a display to present the display report.
In another possible implementation, the method further comprises:
when a trigger operation by the user on any one of the first, second, third, and fourth information in the displayed report is detected, determining the display content corresponding to the trigger operation within that information, as well as the corresponding display content within each of the other information items;
controlling the display to show the content corresponding to the trigger operation within the triggered information together with the corresponding content within the other information, thereby achieving linked display of the first, second, third, and fourth information in the report;
the other information being the information among the first, second, third, and fourth information other than the triggered information.
In a second aspect, this application provides an apparatus for generating a display report, comprising:
a first obtaining module, configured to obtain emotion-change information of a user during viewing of works as first information;
a collection module, configured to collect information about the works viewed by the user during viewing as second information;
a generation module, configured to generate a display report based on the first information, the second information, and the temporal matching relationship between the first information and the second information;
wherein the display report is used to show the user's emotion information corresponding to each work.
In one possible implementation, the first obtaining module comprises an obtaining unit and a determining unit, wherein:
the obtaining unit is configured to obtain electrophysiological signals collected by biosensors while the user views the works;
the determining unit is configured to determine, based on the obtained electrophysiological signals, the emotion-change information of the user during viewing.
In another possible implementation, the apparatus further comprises a second obtaining module, a first determining module, and a marking module, wherein:
the second obtaining module is configured to obtain, from a positioning sensor, location information of the user during viewing of the works;
the first determining module is configured to determine, based on the location information of the user during viewing, dwell-position information and travel-route information of the user during viewing;
the marking module is configured to mark the user's dwell positions during viewing on the travel-route information to obtain third information, the third information being the travel-route information annotated with dwell positions.
In another possible implementation, the generation module is specifically configured to generate the display report based on the following information:
the first information; the second information; the third information; the temporal matching relationship between the first information and the second information; and the positional matching relationship between the second information and the third information.
In another possible implementation, the apparatus further comprises a third obtaining module, wherein:
the third obtaining module is configured to obtain fourth information, the fourth information comprising multimedia information collected from a third-party viewpoint while the user views the works; the multimedia information collected from the third-party viewpoint comprises the behavior information of the user, the content of the works viewed by the user, and information about the environment around the user.
In another possible implementation, the determining unit is specifically configured to determine the emotion-change information of the user during viewing based on the collected electrophysiological signals and the multimedia information collected from the third-party viewpoint while the user views the works.
In another possible implementation, the generation module is specifically configured to generate the display report based on the following information:
the first information; the second information; the third information; the fourth information; the temporal matching relationship between the first information and the second information; the positional matching relationship between the second information and the third information; and the positional matching relationship between the third information and the fourth information.
In another possible implementation, the apparatus further comprises a first control display module, wherein:
the control display module is configured to control a display to present the display report when a report-display operation triggered by the user is detected.
In another possible implementation, the apparatus further comprises a second determining module and a second control display module, wherein:
the second determining module is configured to, when a trigger operation by the user on any one of the first, second, third, and fourth information in the displayed report is detected, determine the display content corresponding to the trigger operation within that information, as well as the corresponding display content within each of the other information items;
the second control display module is configured to control the display to show the content corresponding to the trigger operation within the triggered information together with the corresponding content within the other information, thereby achieving linked display of the first, second, third, and fourth information in the report;
the other information being the information among the first, second, third, and fourth information other than the triggered information.
In a third aspect, an electronic device is provided, the electronic device comprising:
one or more processors;
a memory;
one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform the operations corresponding to the method for generating a display report shown in the first aspect or any possible implementation of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, the storage medium storing at least one instruction, at least one program, code set, or instruction set, which is loaded and executed by a processor to implement the method for generating a display report shown in the first aspect or any possible implementation of the first aspect.
The technical solution provided by this application has the following beneficial effects:
This application provides a method and apparatus for generating a display report, an electronic device, and a computer-readable storage medium. Compared with the prior-art approach of obtaining viewers' experience and their preferences for each work through questionnaires, this application generates a display report based on the user's emotion-change information during viewing and the content of the works viewed during that time. Based on the display report, the user's emotional changes during viewing can be observed more intuitively, and the user's degree of liking for each work can be determined intuitively, thereby saving the time spent on questionnaires and improving the user experience.
Brief description of the drawings
To explain the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below.
Fig. 1 is a schematic flowchart of a method for generating a display report provided by an embodiment of this application;
Fig. 2 is a schematic structural diagram of an apparatus for generating a display report provided by an embodiment of this application;
Fig. 3 is a schematic structural diagram of an electronic device for generating a display report provided by an embodiment of this application;
Fig. 4 is a schematic display of travel-route information annotated with dwell positions in an embodiment of this application;
Fig. 5 is a schematic diagram of the emotional changes of a user during viewing of works in an embodiment of this application.
Detailed description of embodiments
Embodiments of this application are described in detail below, with examples shown in the accompanying drawings, where identical or similar reference numerals denote identical or similar elements, or elements with identical or similar functions, throughout. The embodiments described below with reference to the drawings are exemplary, are used only to explain this application, and are not to be construed as limiting the claims.
Those skilled in the art will appreciate that, unless expressly stated otherwise, the singular forms "a", "an", "the", and "said" used herein may also include plural forms. It should be further understood that the word "comprising" used in this specification indicates the presence of the stated features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It should be understood that when an element is said to be "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In addition, "connected" or "coupled" as used herein may include wireless connection or wireless coupling. The term "and/or" as used herein includes all or any unit of, and all combinations of, one or more of the associated listed items.
How the technical solutions of this application solve the above technical problems is described in detail below with specific embodiments. The specific embodiments below may be combined with each other, and identical or similar concepts or processes may not be repeated in some embodiments. Embodiments of this application are described below with reference to the drawings.
An embodiment of this application provides a method for generating a display report. As shown in Fig. 1, the method comprises:
Step S101: obtaining emotion-change information of a user during viewing of works as first information.
In the embodiments of this application, the works may be artistic works, but are not limited to artistic works; other works that a user can view also fall within the scope of protection of this application.
For the embodiments of this application, the user viewing works may include: the user viewing works at a science and technology exhibition, the user viewing works at a painting exhibition, or the user watching a film; the embodiments of this application place no limitation on this. The emotion-change information of the user during viewing may be as shown in Fig. 5.
Step S102: collecting information about the works viewed by the user during viewing as second information.
For example, step S102 may include: collecting the content of each exhibit viewed by the user during a science and technology exhibition; collecting each painting viewed by the user during a painting exhibition; or collecting each frame or clip watched by the user during a film.
For the embodiments of this application, step S102 may specifically include: collecting, from a first-person viewpoint, information about the works viewed by the user during viewing.
For example, the user may wear glasses while viewing an exhibition of works, with an image-capture device mounted on the glasses to collect information about each work the user views. Alternatively, the information about each work viewed may be collected in other ways, and is not limited to an image-capture device mounted on glasses.
For the embodiments of this application, step S101 may be performed before, after, or simultaneously with step S102; the embodiments of this application place no limitation on this. The execution order of the steps shown in Fig. 1 is only an example and is not intended to limit this application.
Step S103: generating a display report based on the first information, the second information, and the temporal matching relationship between the first information and the second information.
The display report is used to show the user's emotion information corresponding to each work.
For the embodiments of this application, since the user's emotional changes are collected and determined in real time while the user views the works, the content of the works viewed during viewing and the user's emotion-change information during viewing have a temporal matching relationship. A display report is generated on this basis to show the user's emotional changes while viewing each work.
For example, if the user views work 1 from 8:00 to 8:05 and work 2 from 8:06 to 8:10, the emotion-change information corresponding to the user viewing work 1 during 8:00 to 8:05 and the emotion-change information corresponding to the user viewing work 2 during 8:06 to 8:10 can be obtained, and a display report generated on this basis shows the emotion-change information corresponding to works 1 and 2.
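The temporal matching of the first and second information described in the example above can be sketched as follows. This is a minimal illustration under assumed data shapes, not the patented implementation: the HH:MM timestamp format, the per-work viewing intervals, and the emotion labels are assumptions introduced for the example.

```python
from datetime import datetime

def parse(t):
    """Parse an HH:MM timestamp into a datetime (the date part is irrelevant here)."""
    return datetime.strptime(t, "%H:%M")

def match_emotions_to_works(viewing_intervals, emotion_samples):
    """Group timestamped emotion samples by the work being viewed at that time.

    viewing_intervals: list of (work_id, start, end) with HH:MM strings.
    emotion_samples:   list of (timestamp, emotion_label).
    Returns {work_id: [emotion_label, ...]}, i.e. the temporal matching
    relationship between the first information and the second information.
    """
    report = {work_id: [] for work_id, _, _ in viewing_intervals}
    for ts, emotion in emotion_samples:
        t = parse(ts)
        for work_id, start, end in viewing_intervals:
            if parse(start) <= t <= parse(end):
                report[work_id].append(emotion)
                break
    return report

# Mirroring the example in the text: work 1 viewed 8:00-8:05, work 2 viewed 8:06-8:10.
intervals = [("work 1", "8:00", "8:05"), ("work 2", "8:06", "8:10")]
samples = [("8:01", "calm"), ("8:04", "excited"), ("8:07", "happy")]
print(match_emotions_to_works(intervals, samples))
# → {'work 1': ['calm', 'excited'], 'work 2': ['happy']}
```

A real system would work with continuous sensor timestamps rather than minute-level strings, but the grouping step is the same.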
An embodiment of this application provides a method for generating a display report. Compared with the prior-art approach of obtaining viewers' experience and their preferences for each work through questionnaires, the embodiment generates a display report based on the user's emotion-change information during viewing and the content of the works viewed during that time. Based on the display report, the user's emotional changes during viewing can be observed more intuitively, and the user's degree of liking for each work can be determined intuitively, thereby saving the time spent on questionnaires and improving the user experience.
In one possible implementation of the embodiments of this application, step S101 may specifically include: obtaining electrophysiological signals collected by biosensors while the user views the works; and determining, based on the obtained electrophysiological signals, the emotion-change information of the user during viewing.
For the embodiments of this application, the electrophysiological signals collected by the biosensors while the user views artistic works may include at least one of electrocardiogram (ECG) signals, skin-conductance signals, electroencephalogram (EEG) signals, and eye-movement signals.
For the embodiments of this application, the user may wear the biosensors while viewing each work to collect the user's electrophysiological signals during viewing (including at least one of ECG, skin-conductance, EEG, and eye-movement signals), and the emotion-change information of the user during viewing of each work is determined based on these signals.
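As an illustration of how coarse emotion labels might be derived from such signals, the rule-based sketch below maps heart rate and skin conductance to arousal-level labels. The thresholds and the two-feature input are assumptions made for the example; the patent does not specify a classification method, and a practical system would more likely use a model trained over ECG, skin-conductance, EEG, and eye-movement features.

```python
def classify_emotion(heart_rate_bpm, skin_conductance_us):
    """Map two electrophysiological features to a coarse emotion label.

    Illustrative thresholds only (assumed, not from the patent):
    elevated heart rate together with elevated skin conductance is read
    as high arousal ("excited"), a low heart rate as low arousal
    ("calm"), and anything else as "neutral".
    """
    if heart_rate_bpm > 90 and skin_conductance_us > 8.0:
        return "excited"
    if heart_rate_bpm < 65:
        return "calm"
    return "neutral"

# A stream of (heart rate, skin conductance) readings taken while viewing a work:
readings = [(60, 2.0), (72, 5.5), (95, 9.1)]
print([classify_emotion(hr, sc) for hr, sc in readings])
# → ['calm', 'neutral', 'excited']
```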
In another possible implementation of the embodiments of this application, step S103 may be preceded by step Sa (not shown), step Sb (not shown), and step Sc (not shown), wherein:
Step Sa: obtaining, from a positioning sensor, location information of the user during viewing of the works.
For the embodiments of this application, the biosensors worn by the user during viewing may have a positioning function, the user may wear a separate sensor with a positioning function, or the glasses worn by the user during viewing may have a positioning function. The positioning sensor may be a standalone sensor, may be integrated with the other biosensors, or may be integrated into the glasses worn during viewing; this is not limited.
Step Sb: determining, based on the location information of the user during viewing, dwell-position information and travel-route information of the user during viewing.
For the embodiments of this application, each user's travel route and dwell places during viewing can be determined from the location information of the positioning sensor. In the embodiments of this application, if the user's residence time at a certain position is detected to exceed a preset time threshold, the location information corresponding to that position is determined to be dwell-location information.
Step Sc: marking the user's dwell positions during viewing on the travel-route information to obtain third information.
The third information is the travel-route information annotated with dwell positions.
For the embodiments of this application, a user who dwells for a longer time at a position during viewing is generally more interested in the work at that position; marking the user's dwell positions on the travel-route information therefore helps determine which works the user is interested in.
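The dwell-detection logic of steps Sb and Sc can be sketched as below. The coordinate units, the dwell radius, and the 30-second threshold are assumptions for illustration; the patent only requires that the residence time at a position exceed a preset time threshold.

```python
import math

def detect_dwell_positions(trace, min_dwell_s=30.0, radius_m=1.0):
    """Find dwell positions in a time-ordered location trace.

    trace: list of (t_seconds, x, y) samples from the positioning sensor.
    A dwell position is reported when the user stays within `radius_m`
    of a point for at least `min_dwell_s` seconds (the preset threshold).
    Returns the dwell positions in visit order; these are the points
    that would be marked on the travel-route information (step Sc).
    """
    dwells = []
    i = 0
    while i < len(trace):
        t0, x0, y0 = trace[i]
        j = i
        # Extend the window while samples stay within the dwell radius.
        while j + 1 < len(trace) and math.hypot(trace[j + 1][1] - x0,
                                                trace[j + 1][2] - y0) <= radius_m:
            j += 1
        if trace[j][0] - t0 >= min_dwell_s:
            dwells.append((x0, y0))
            i = j + 1
        else:
            i += 1
    return dwells

# User lingers near (0, 0) for 40 s, then passes (5, 5) for only 5 s.
trace = [(0, 0.0, 0.0), (10, 0.1, 0.0), (40, 0.2, 0.1), (50, 5.0, 5.0), (55, 5.1, 5.0)]
print(detect_dwell_positions(trace))
# → [(0.0, 0.0)]
```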
For example, Fig. 4 shows travel-route information annotated with dwell positions, where the black circles indicate the user's dwell positions during viewing.
For the embodiments of this application, the execution order of steps Sa, Sb, and Sc relative to steps S101 and S102 is not limited.
In another possible implementation of the embodiments of this application, step S103 may specifically include step S1031 (not shown), wherein:
Step S1031: generating the display report based on the following information:
the first information; the second information; the third information; the temporal matching relationship between the first information and the second information; and the positional matching relationship between the second information and the third information.
Specifically, on the basis of steps Sa, Sb, and Sc, step S103 may specifically include step S1031.
For the embodiments of this application, the display report generated in step S1031 includes: the user's emotion-change information during viewing, the content of the works viewed during viewing, and the travel-route information annotated with dwell positions; the generated display report may also show the associations among the first, second, and third information.
For the embodiments of this application, the generated display report is presented when a report-display instruction triggered by the user is received; when a certain work is shown, the user's emotion-change information while viewing that work and the corresponding dwell-position information can be shown at the same time.
In another possible implementation of the embodiments of this application, the method further includes step Sd (not shown), wherein:
Step Sd: obtaining fourth information.
The fourth information includes multimedia information collected from a third-party viewpoint while the user views the works.
The multimedia information collected from the third-party viewpoint includes the behavior information of the user, the content of the works viewed by the user, and information about the environment around the user.
For the embodiments of this application, based on the user's current position located by the positioning sensor, the camera at the corresponding position is controlled to monitor the user, so as to collect, from the third-party viewpoint, the behavior information of the user, the content of the works viewed by the user, and information about the environment around the user.
For the embodiments of this application, the behavior information of the user is the user's behavior during viewing, for example, using a mobile phone or making a phone call. In the embodiments of this application, the behavior information of the user may also include the user's facial-expression information.
In another possible implementation of the embodiment of the present application, step S101 may specifically include step S1011 (not shown in the figures), in which:
Step S1011: the emotional change information of the user during viewing of the works is determined based on the acquired electrophysiological signals and the multimedia information collected from the third-party viewing angle while the user views the works.
In an embodiment of the present application, if the user's behavior information while viewing a certain work indicates that the user is playing with a mobile phone, it can be determined that the detected emotional change information corresponding to that work does not reflect the user's true emotion while viewing it. Furthermore, since the multimedia information collected from the third-party viewing angle may include the user's facial expression information while viewing the works, determining the emotional change information based on both the acquired electrophysiological signals and the multimedia information collected from the third-party viewing angle yields higher accuracy.
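The correction described here, treating electrophysiological readings as unreliable while the third-party view shows the user distracted, could be sketched as follows. The behavior labels and the notion of a "distracted" set are illustrative assumptions, not the patent's specification:

```python
DISTRACTED = {"using phone", "making a call"}

def behavior_at(ts, behavior_samples):
    """Behavior in effect at time ts: the last sample at or before ts.
    behavior_samples is assumed sorted by timestamp."""
    current = None
    for bts, label in behavior_samples:
        if bts <= ts:
            current = label
        else:
            break
    return current

def filter_emotions(emotion_samples, behavior_samples):
    """Drop emotion samples recorded while the user was distracted."""
    return [(ts, emo) for ts, emo in emotion_samples
            if behavior_at(ts, behavior_samples) not in DISTRACTED]

emotions = [(1.0, "calm"), (3.0, "bored"), (6.0, "excited")]
behaviors = [(0.0, "watching"), (2.5, "using phone"), (5.0, "watching")]
print(filter_emotions(emotions, behaviors))
# → [(1.0, 'calm'), (6.0, 'excited')]
```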
In another possible implementation of the embodiment of the present application, step S1031 may specifically include step S1031a (not shown in the figures), in which:
Step S1031a: the display report is generated based on the following information:
the first information; the second information; the third information; the fourth information; the matching relationship in time between the first information and the second information; the matching relationship in position between the second information and the third information; and the matching relationship in position between the third information and the fourth information.
In an embodiment of the present application, the generated display report may include: the first information (the emotional change information of the user during viewing of the works), the second information (the information of the works viewed by the user during viewing), the third information (the travel route information annotated with stop positions), the fourth information (the multimedia information collected from the third-party viewing angle while the user views the works), the matching relationship in time between the first information and the second information, the matching relationship in position between the second information and the third information, and the matching relationship in position between the third information and the fourth information.
In an embodiment of the present application, preset information of each work viewed by the user during viewing may be determined based on the second information, as fifth information; the display report is then generated based on the first, second, third, fourth, and fifth information. The display report may also be generated based on the first, fifth, third, and fourth information, or based on the first information together with at least two of the second, third, fourth, and fifth information.
In an embodiment of the present application, the generated display report may be an ordinary electronic report, or an electronic report capable of linked display. This is not limited in the embodiment of the present application.
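The report assembly of step S1031a, once the matching relationships have been resolved, amounts to grouping the four kinds of information per work. A minimal sketch of such an electronic report as a plain data structure (all field names are hypothetical):

```python
def build_report(works):
    """works: one dict per viewed work, already matched in time (emotions)
    and in position (stop point, third-party media clips)."""
    return {"entries": [
        {
            "work": w["name"],            # second information
            "emotions": w["emotions"],    # first information
            "stop_position": w["stop"],   # from third information
            "media_clips": w["clips"],    # fourth information
        }
        for w in works
    ]}

report = build_report([
    {"name": "painting A", "emotions": ["calm"],
     "stop": (3.0, 4.5), "clips": ["clip-17.mp4"]},
])
print(report["entries"][0]["work"])  # → painting A
```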
In another possible implementation of the embodiment of the present application, the method may further include step Se (not shown in the figures), in which:
Step Se: when a display operation triggered by the user on the display report is detected, the display report is controlled to be displayed.
In an embodiment of the present application, when a display operation triggered by the user on the display report is detected, the display report generated in step S1031a is controlled to be displayed.
In the embodiment of the present application, if the display report to be shown is an ordinary display report, it can be displayed on any screen. If the display report to be shown is capable of linked display, it can be displayed in a linked manner according to steps Sf and Sg.
In another possible implementation of the embodiment of the present application, the method may further include steps Sf and Sg (not shown in the figures), in which:
Step Sf: when a trigger operation by the user on any one of the first information, the second information, the third information, and the fourth information in the display report is detected, the display information corresponding to the trigger operation in that information is determined, together with the corresponding display information in the other information;
Step Sg: the display information corresponding to the trigger operation in that information, and the respective corresponding display information in the other information, are controlled to be displayed, so as to realize linked display of the first information, the second information, the third information, and the fourth information in the display report.
Here, the other information is the information among the first information, the second information, the third information, and the fourth information other than the triggered information.
For example, when it is detected that the user triggers display of a certain work, the user's emotional change information while viewing that work, the multimedia information collected from the third-party viewing angle while the user views that work, and the corresponding location marked on the travel route annotated with stop positions are determined based on the matching relationships, and their display is controlled.
As another example, when the user clicks a certain position on the travel route annotated with stop positions, the work information corresponding to that position, the user's emotional information, and the multimedia information monitored from the third-party viewing angle are determined, and their display is controlled.
In the embodiment of the present application, displaying, by way of linked display, the user's emotional change information during viewing of the works, the information of the works viewed, the travel route information annotated with stop information, and the multimedia information collected from the third-party viewing angle improves the richness of the displayed report and the convenience of determining the user's emotional changes while viewing each work, thereby improving the user experience.
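The linked display of steps Sf and Sg reduces to an index lookup: a trigger on any one kind of information resolves, through the stored matching relationships, to the corresponding items of the other kinds. A minimal sketch under that assumption (the index layout and method names are hypothetical):

```python
class LinkedReport:
    """Index report entries so a trigger on any field finds the rest."""

    def __init__(self, entries):
        self.by_work = {e["work"]: e for e in entries}
        self.by_stop = {e["stop_position"]: e for e in entries}

    def on_work_clicked(self, work):
        e = self.by_work[work]
        return e["emotions"], e["media_clips"], e["stop_position"]

    def on_stop_clicked(self, stop_position):
        e = self.by_stop[stop_position]
        return e["work"], e["emotions"], e["media_clips"]

report = LinkedReport([
    {"work": "painting A", "emotions": ["calm"],
     "media_clips": ["clip-17.mp4"], "stop_position": (3.0, 4.5)},
])
print(report.on_stop_clicked((3.0, 4.5))[0])  # → painting A
```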
The above embodiments describe the display report generation method from the perspective of the method flow; the following embodiments describe a display report generation apparatus from the perspective of virtual modules or virtual units. It is applicable to the above method embodiments, as detailed below:
An embodiment of the present application provides a display report generation apparatus. As shown in Fig. 2, the display report generation apparatus 20 may include: a first acquisition module 21, a collection module 22, and a generation module 23, wherein:
the first acquisition module 21 is configured to acquire the emotional change information of the user during viewing of the works as first information;
the collection module 22 is configured to collect the information of the works viewed by the user during viewing as second information;
the generation module 23 is configured to generate a display report based on the first information, the second information, and the matching relationship in time between the first information and the second information;
the display report is used for showing the emotional information of the user corresponding to each work.
In another possible implementation of the embodiment of the present application, the first acquisition module 21 includes an acquisition unit and a determination unit, wherein:
the acquisition unit is configured to acquire the electrophysiological signals collected by a biosensor while the user views the works;
the determination unit is configured to determine the emotional change information of the user during viewing of the works based on the acquired electrophysiological signals.
In another possible implementation of the embodiment of the present application, the apparatus 20 further includes: a second acquisition module, a first determination module, and an annotation module, wherein:
the second acquisition module is configured to acquire the position information of the user during viewing of the works from a positioning sensor;
the first determination module is configured to determine, based on the position information of the user during viewing of the works, the stop position information and travel route information of the user during viewing;
the annotation module is configured to annotate the stop positions of the user during viewing of the works on the travel route information, obtaining third information;
the third information is the travel route information annotated with stop positions.
In another possible implementation of the embodiment of the present application, the generation module 23 is specifically configured to generate the display report based on the following information:
the first information; the second information; the third information; the matching relationship in time between the first information and the second information; and the matching relationship in position between the second information and the third information.
In another possible implementation of the embodiment of the present application, the apparatus 20 further includes a third acquisition module, wherein:
the third acquisition module is configured to acquire fourth information;
the fourth information includes multimedia information collected from a third-party viewing angle while the user views the works;
the multimedia information collected from the third-party viewing angle includes: the behavior information of the user, the content of the works viewed by the user, and the environment information around the user.
In another possible implementation of the embodiment of the present application, the determination unit is specifically configured to determine the emotional change information of the user during viewing of the works based on the acquired electrophysiological signals and the multimedia information collected from the third-party viewing angle while the user views the works.
In another possible implementation of the embodiment of the present application, the generation module 23 is specifically configured to generate the display report based on the following information:
the first information; the second information; the third information; the fourth information; the matching relationship in time between the first information and the second information; the matching relationship in position between the second information and the third information; and the matching relationship in position between the third information and the fourth information.
In another possible implementation of the embodiment of the present application, the apparatus 20 further includes a first control display module, wherein:
the first control display module is configured to control display of the display report when a display operation triggered by the user on the display report is detected.
In another possible implementation of the embodiment of the present application, the apparatus 20 further includes a second determination module and a second control display module, wherein:
the second determination module is configured to, when a trigger operation by the user on any one of the first information, the second information, the third information, and the fourth information in the display report is detected, determine the display information corresponding to the trigger operation in that information, and the corresponding display information in the other information;
the second control display module is configured to control display of the display information corresponding to the trigger operation in that information, and the respective corresponding display information in the other information, so as to realize linked display of the first information, the second information, the third information, and the fourth information in the display report;
the other information is the information among the first information, the second information, the third information, and the fourth information other than the triggered information.
It should be noted that concepts such as "first" and "second" in the embodiments of the present application are only used to distinguish apparatuses, modules, or units; they are not intended to require that these be different apparatuses, modules, or units, nor to limit the order or interdependence of the functions they perform.
An embodiment of the present application provides a display report generation apparatus. Compared with the prior-art approach of obtaining, through questionnaires, the viewing experience of spectators and their degree of preference for each work, the embodiment of the present application generates a display report based on the user's emotional change information during viewing of the works and the content of the works viewed during viewing. The user's emotional changes during viewing can thus be observed more intuitively from the display report, and the user's degree of preference for each work can be determined intuitively on that basis, saving the time of questionnaires and improving the user experience.
The display report generation apparatus of this embodiment can perform the display report generation method shown in any of the above embodiments of the present application; the implementation principles are similar and are not repeated here.
The above embodiments describe the display report generation method from the perspective of the method flow and the display report generation apparatus from the perspective of virtual modules or virtual units. The following introduces, from the perspective of a physical apparatus structure, an electronic device for performing the above method embodiments, as detailed below:
An embodiment of the present application provides an electronic device. As shown in Fig. 3, the electronic device 3000 includes a processor 3001 and a memory 3003, where the processor 3001 is connected with the memory 3003, for example via a bus 3002. Optionally, the electronic device 3000 may also include a transceiver 3004. It should be noted that in practical applications the transceiver 3004 is not limited to one, and the structure of the electronic device 3000 does not constitute a limitation on the embodiments of the present application.
The processor 3001 may be a CPU, a general-purpose processor, a DSP, an ASIC, an FPGA, or another programmable logic device, transistor logic device, hardware component, or any combination thereof. It may implement or execute the various illustrative logic blocks, modules, and circuits described in connection with the present disclosure. The processor 3001 may also be a combination realizing computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The bus 3002 may include a path for transferring information between the above components. The bus 3002 may be a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is used in Fig. 3, but this does not mean there is only one bus or one type of bus.
The memory 3003 may be a ROM or another type of static storage device capable of storing static information and instructions, or a RAM or another type of dynamic storage device capable of storing information and instructions; it may also be an EEPROM, a CD-ROM or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and can be accessed by a computer, but is not limited thereto.
The memory 3003 is used to store application program code for executing the solution of the present application, and execution is controlled by the processor 3001. The processor 3001 is configured to execute the application program code stored in the memory 3003 to realize the content shown in any of the foregoing method embodiments.
In the embodiment of the present application, the electronic device 3000 may be a terminal device or a server; this is not limited in the embodiment of the present application.
An embodiment of the present application provides an electronic device comprising a memory and a processor, the memory storing at least one program which, when executed by the processor, can achieve the following compared with the prior art: a display report is generated based on the user's emotional change information during viewing of the works and the content of the works viewed during viewing, so that the user's emotional changes during viewing can be observed more intuitively from the display report and the user's degree of preference for each work can be determined intuitively on that basis, saving the time of questionnaires and improving the user experience.
An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when run on a computer, it enables the computer to execute the corresponding content in the foregoing method embodiments. Compared with the prior art, the embodiment of the present application generates a display report based on the user's emotional change information during viewing of the works and the content of the works viewed during viewing, so that the user's emotional changes during viewing can be observed more intuitively from the display report and the user's degree of preference for each work can be determined intuitively on that basis, saving the time of questionnaires and improving the user experience.
It should be understood that although the steps in the flowcharts of the drawings are shown in sequence as indicated by the arrows, these steps are not necessarily executed in that order. Unless expressly stated otherwise herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least part of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments; their execution order is also not necessarily sequential, and they may be executed in turn or alternately with at least part of the sub-steps or stages of other steps.
The above are only some embodiments of the present invention. It should be noted that, for those of ordinary skill in the art, various improvements and modifications may be made without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (17)

1. A display report generation method, comprising:
acquiring emotional change information of a user during viewing of works as first information;
collecting information of the works viewed by the user during viewing as second information;
generating a display report based on the first information, the second information, and a matching relationship in time between the first information and the second information; the display report being used for showing emotional information of the user corresponding to each work.
2. The method according to claim 1, wherein acquiring the emotional change information of the user during viewing of the works comprises:
acquiring electrophysiological signals collected by a biosensor while the user views the works;
determining the emotional change information of the user during viewing of the works based on the acquired electrophysiological signals.
3. The method according to claim 1 or 2, wherein before generating the display report based on the first information, the second information, and the matching relationship in time between the first information and the second information, the method further comprises:
acquiring position information of the user during viewing of the works from a positioning sensor;
determining, based on the position information of the user during viewing of the works, stop position information and travel route information of the user during viewing of the works;
annotating stop positions of the user during viewing of the works on the travel route information to obtain third information, the third information being the travel route information annotated with the stop positions.
4. The method according to claim 3, wherein generating the display report based on the first information, the second information, and the matching relationship in time between the first information and the second information comprises:
generating the display report based on the following information:
the first information; the second information; the third information; the matching relationship in time between the first information and the second information; and a matching relationship in position between the second information and the third information.
5. The method according to claim 2, further comprising:
acquiring fourth information, wherein the fourth information comprises multimedia information collected from a third-party viewing angle while the user views the works, the multimedia information collected from the third-party viewing angle comprising: behavior information of the user, content of the works viewed by the user, and environment information around the user.
6. The method according to claim 5, wherein determining the emotional change information of the user during viewing of the works based on the acquired electrophysiological signals comprises:
determining the emotional change information of the user during viewing of the works based on the acquired electrophysiological signals and the multimedia information collected from the third-party viewing angle while the user views the works.
7. The method according to claim 5, wherein the display report is generated based on the following information:
the first information; the second information; the third information; the fourth information; the matching relationship in time between the first information and the second information; the matching relationship in position between the second information and the third information; and a matching relationship in position between the third information and the fourth information.
8. The method according to claim 7, further comprising:
controlling display of the display report when a display operation triggered by the user on the display report is detected.
9. The method according to claim 8, further comprising:
when a trigger operation by the user on any one of the first information, the second information, the third information, and the fourth information in the display report is detected, determining display information corresponding to the trigger operation in that information, and corresponding display information in other information;
controlling display of the display information corresponding to the trigger operation in that information, and the respective corresponding display information in the other information, so as to realize linked display of the first information, the second information, the third information, and the fourth information in the display report;
the other information being the information among the first information, the second information, the third information, and the fourth information other than the triggered information.
10. A display report generation apparatus, comprising:
a first acquisition module, configured to acquire emotional change information of a user during viewing of works as first information;
a collection module, configured to collect information of the works viewed by the user during viewing as second information;
a generation module, configured to generate a display report based on the first information, the second information, and a matching relationship in time between the first information and the second information;
the display report being used for showing emotional information of the user corresponding to each work.
11. The apparatus according to claim 10, wherein the first acquisition module comprises an acquisition unit and a determination unit, wherein:
the acquisition unit is configured to acquire electrophysiological signals collected by a biosensor while the user views the works;
the determination unit is configured to determine the emotional change information of the user during viewing of the works based on the acquired electrophysiological signals.
12. The apparatus according to claim 10 or 11, further comprising: a second acquisition module, a first determination module, and an annotation module, wherein:
the second acquisition module is configured to acquire position information of the user during viewing of the works from a positioning sensor;
the first determination module is configured to determine, based on the position information of the user during viewing of the works, stop position information and travel route information of the user during viewing of the works;
the annotation module is configured to annotate stop positions of the user during viewing of the works on the travel route information to obtain third information, the third information being the travel route information annotated with the stop positions.
13. The apparatus according to claim 12, wherein:
the generation module is specifically configured to generate the display report based on the following information:
the first information; the second information; the third information; the matching relationship in time between the first information and the second information; and a matching relationship in position between the second information and the third information.
14. The apparatus according to claim 11, further comprising a third acquisition module, wherein:
the third acquisition module is configured to acquire fourth information, the fourth information comprising multimedia information collected from a third-party viewing angle while the user views the works, the multimedia information collected from the third-party viewing angle comprising: behavior information of the user, content of the works viewed by the user, and environment information around the user.
15. The apparatus according to claim 10, further comprising a second determination module and a second control display module, wherein:
the second determination module is configured to, when a trigger operation by the user on any one of the first information, the second information, the third information, and the fourth information in the display report is detected, determine display information corresponding to the trigger operation in that information, and corresponding display information in other information;
the second control display module is configured to control display of the display information corresponding to the trigger operation in that information, and the respective corresponding display information in the other information, so as to realize linked display of the first information, the second information, the third information, and the fourth information in the display report;
the other information being the information among the first information, the second information, the third information, and the fourth information other than the triggered information.
16. An electronic device, comprising:
one or more processors;
a memory;
one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the display report generation method according to any one of claims 1 to 9.
17. A computer-readable storage medium, wherein the storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to realize the display report generation method according to any one of claims 1 to 9.
CN201910797205.XA 2019-08-27 2019-08-27 Display report generation method and device, electronic equipment and storage medium Active CN110517085B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910797205.XA CN110517085B (en) 2019-08-27 2019-08-27 Display report generation method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110517085A true CN110517085A (en) 2019-11-29
CN110517085B CN110517085B (en) 2022-06-07

Family

ID=68627191

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910797205.XA Active CN110517085B (en) 2019-08-27 2019-08-27 Display report generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110517085B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120233633A1 (en) * 2011-03-09 2012-09-13 Sony Corporation Using image of video viewer to establish emotion rank of viewed video
CN104202718A (en) * 2014-08-05 2014-12-10 百度在线网络技术(北京)有限公司 Method and device for providing information for user
US20150088542A1 (en) * 2013-09-26 2015-03-26 Be Labs, Llc System and method for correlating emotional or mental states with quantitative data
CN105615902A (en) * 2014-11-06 2016-06-01 北京三星通信技术研究有限公司 Emotion monitoring method and device
US9788777B1 (en) * 2013-08-12 2017-10-17 The Neilsen Company (US), LLC Methods and apparatus to identify a mood of media
CN107463874A (en) * 2017-07-03 2017-12-12 华南师范大学 The intelligent safeguard system of Emotion identification method and system and application this method
CN108225366A (en) * 2016-12-21 2018-06-29 丰田自动车株式会社 Car-mounted device and route information prompt system
CN108693974A (en) * 2018-05-11 2018-10-23 新华网股份有限公司 Data processing method, system and nonvolatile computer storage media
CN108764010A (en) * 2018-03-23 2018-11-06 姜涵予 Emotional state determines method and device
CN109211261A (en) * 2017-07-06 2019-01-15 新华网股份有限公司 data display method and device
CN109766759A (en) * 2018-12-12 2019-05-17 成都云天励飞技术有限公司 Emotion identification method and Related product
CN110096613A (en) * 2019-04-12 2019-08-06 北京奇艺世纪科技有限公司 A kind of video recommendation method, device, electronic equipment and storage medium

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120233633A1 (en) * 2011-03-09 2012-09-13 Sony Corporation Using image of video viewer to establish emotion rank of viewed video
US9788777B1 (en) * 2013-08-12 2017-10-17 The Neilsen Company (US), LLC Methods and apparatus to identify a mood of media
US20150088542A1 (en) * 2013-09-26 2015-03-26 Be Labs, Llc System and method for correlating emotional or mental states with quantitative data
CN104202718A (en) * 2014-08-05 2014-12-10 百度在线网络技术(北京)有限公司 Method and device for providing information for user
CN105615902A (en) * 2014-11-06 2016-06-01 北京三星通信技术研究有限公司 Emotion monitoring method and device
CN108225366A (en) * 2016-12-21 2018-06-29 丰田自动车株式会社 Car-mounted device and route information prompt system
CN107463874A (en) * 2017-07-03 2017-12-12 华南师范大学 Emotion recognition method and system, and intelligent care system applying the method
CN109211261A (en) * 2017-07-06 2019-01-15 新华网股份有限公司 Data display method and device
CN108764010A (en) * 2018-03-23 2018-11-06 姜涵予 Emotional state determination method and device
CN108693974A (en) * 2018-05-11 2018-10-23 新华网股份有限公司 Data processing method, system and nonvolatile computer storage media
CN109766759A (en) * 2018-12-12 2019-05-17 成都云天励飞技术有限公司 Emotion recognition method and related product
CN110096613A (en) * 2019-04-12 2019-08-06 北京奇艺世纪科技有限公司 Video recommendation method and device, electronic device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhu Jie, "An Empirical Study on the Influence of Melodic Form on Emotion", China Master's Theses Full-text Database (Philosophy and Humanities), 15 January 2018 (2018-01-15), pages 086-103 *

Also Published As

Publication number Publication date
CN110517085B (en) 2022-06-07

Similar Documents

Publication Publication Date Title
CN109154860B (en) Emotional/cognitive state trigger recording
US10045077B2 (en) Consumption of content with reactions of an individual
US11288310B2 (en) Presenting content items based on previous reactions
US9204836B2 (en) Sporadic collection of mobile affect data
JP6165846B2 (en) Selective enhancement of parts of the display based on eye tracking
US9934425B2 (en) Collection of affect data from multiple mobile devices
US20210099405A1 (en) Content item module arrangements
US20150264432A1 (en) Selecting and presenting media programs and user states based on user states
JP2018521381A (en) Emotion detection system
US20140200417A1 (en) Mental state analysis using blink rate
EP3058873A1 (en) Device for measuring visual efficacy
EP3198560A1 (en) User gesture driven avatar apparatus and method
US20140201207A1 (en) Mental state data tagging for data collected from multiple sources
Ma et al. Glancee: An adaptable system for instructors to grasp student learning status in synchronous online classes
CN104915646B (en) Conference management method and terminal
CN105551206A (en) Emotion-based prompting method, related device and prompting system
US20150078728A1 (en) Audio-visual work story analysis system based on tense-relaxed emotional state measurement and analysis method
CN110517085A (en) It generates and shows method for reporting, electronic equipment and computer readable storage medium
CN113709565B (en) Method and device for recording facial expression of watching video
US20180204471A1 (en) Methods, systems, and computer program products for providing feedback to a user in motion
Guo et al. Understanding mobile reading via camera based gaze tracking and kinematic touch modeling
JP2015146847A (en) Physical information acquisition apparatus, method, and program
WO2014106216A1 (en) Collection of affect data from multiple mobile devices
CN110908505A (en) Interest identification method and device, terminal equipment and storage medium
Yonezawa et al. Capturing Subjective Time as Context and Its Applications (poster)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant