KR101838792B1 - Method and apparatus for sharing user's feeling about contents - Google Patents
- Publication number
- KR101838792B1 (application number KR1020160019897A)
- Authority
- KR
- South Korea
- Prior art keywords
- emotion
- image
- user
- information
- image content
- Prior art date
Classifications
- H04N5/23219
- G06K9/00268
- G06K9/00302
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
Landscapes
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A method and apparatus for sharing a user's feelings about content are disclosed. A method for analyzing image content based on emotion recognition includes acquiring an image of a user captured while viewing the image content, recognizing the emotion of the user by analyzing the captured image, and storing emotion information for the recognized emotion in association with the image content.
Description
The following description relates to a technique for sharing a user's feelings about content, and more specifically to a technique for recognizing and storing a user's emotions about content and providing them to other users.
Conventional video players provide not only the content itself but also additional information related to it. For example, specialized services such as personal broadcasting, sports relays, and caption uploading may be provided as additional information. A user's reaction to content helps in analyzing the user's preference for it; for example, a content creator can produce better content based on such reactions.
According to an embodiment of the present invention, a method of analyzing image content based on emotion recognition comprises the steps of: acquiring an image of a user captured while viewing image content; recognizing the emotion of the user by analyzing the image; and storing emotion information for the recognized emotion in association with the image content.
The recognizing step may include recognizing the user's face region in the image, detecting feature points of the face within the face region, and determining the user's emotion based on the amount of change of the feature points over time.
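The recognition step above can be illustrated with a minimal sketch: track named facial feature points between frames and map their displacement to a coarse emotion. The landmark names, the threshold, and the emotion labels here are illustrative assumptions, not values specified by the patent.

```python
# Hypothetical sketch of emotion recognition from feature-point change.
# prev_points / curr_points map landmark name -> (x, y), with coordinates
# normalized to the detected face region.

def classify_emotion(prev_points, curr_points, threshold=0.05):
    """Classify a coarse emotion from the displacement of facial feature points."""
    deltas = {
        name: (curr_points[name][0] - prev_points[name][0],
               curr_points[name][1] - prev_points[name][1])
        for name in prev_points
    }
    # Mouth corners moving up (smaller y) relative to the previous frame are
    # read as a smile; moving down as a frown; small changes as neutral.
    mouth_dy = (deltas["mouth_left"][1] + deltas["mouth_right"][1]) / 2.0
    if mouth_dy < -threshold:
        return "joy"
    if mouth_dy > threshold:
        return "sadness"
    return "neutral"
```

A real implementation would of course use many more landmarks and a trained classifier; the point is only that the decision is driven by change over time, not by a single frame.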
The storing step may store at least one of: identification information of the image content, time information at which the emotion was recognized, image information of the image content output when the emotion was recognized, and type information of the recognized emotion.
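The stored record might look like the following sketch, which simply bundles the four fields named above. The field names and types are hypothetical; the patent does not prescribe a schema.

```python
# A minimal, assumed shape for the emotion record produced by the storing step.
from dataclasses import dataclass, asdict

@dataclass
class EmotionRecord:
    content_id: str       # identification information of the image content
    timestamp_sec: float  # time at which the emotion was recognized
    frame_ref: str        # frame of the content output when the emotion was recognized
    emotion_type: str     # type of the recognized emotion

record = EmotionRecord("movie-001", 754.2, "frame_754.bmp", "joy")
```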
According to an embodiment of the present invention, a method of reproducing image content comprises the steps of: receiving emotion information for the image content from a server; and reproducing the image content based on the emotion information, while displaying the types of emotions and the time periods in which each emotion appears.
The emotion information may be based on the emotion information of other users who watched the image content.
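To display "time periods in which each emotion appears," per-moment emotion records can be merged into contiguous intervals that a player can draw on the playback bar. The record format below (one emotion label per second) is an assumption for illustration.

```python
# Merge per-second (second, emotion) records into (start, end, emotion) intervals,
# suitable for rendering emotion periods on a playback bar.

def emotion_intervals(records):
    """records: list of (second, emotion), sorted by second."""
    intervals = []
    for second, emotion in records:
        last = intervals[-1] if intervals else None
        # Extend the previous interval when the same emotion continues
        # into the immediately following second.
        if last and last[2] == emotion and last[1] == second - 1:
            intervals[-1] = (last[0], second, emotion)
        else:
            intervals.append((second, second, emotion))
    return intervals
```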
FIG. 1 is a diagram illustrating the general configuration of a system for sharing emotion information about image content according to an exemplary embodiment of the present invention.
FIG. 2 is a flowchart illustrating an image content analysis method based on emotion recognition according to an exemplary embodiment of the present invention.
FIG. 3 is a flowchart illustrating an image content playback method according to an exemplary embodiment of the present invention.
FIG. 4A is a diagram illustrating a DirectShow interface of an image reproducing apparatus according to an embodiment.
FIG. 4B is a diagram illustrating an interface of a network according to an embodiment.
FIG. 5 is a diagram illustrating a system operation process according to an embodiment.
FIG. 6 is a diagram showing an example of emotion detection according to an embodiment.
FIG. 7 is a diagram illustrating operations between various types of video playback apparatuses and a server according to an embodiment.
FIG. 8A is a diagram illustrating an example of connecting to a server from a web page according to an embodiment.
FIG. 8B is a diagram showing an example of a user's emotion record according to an embodiment.
FIG. 8C is a diagram illustrating an example of a user's emotion statistics data according to an exemplary embodiment.
The specific structural or functional descriptions below are provided for purposes of illustration only and are not to be construed as limiting the scope of the present disclosure. Various modifications and variations may be made by those skilled in the art to which the present invention pertains. Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with that embodiment is included in at least one embodiment; such references are not necessarily all referring to the same embodiment.
Although the terms "first" and "second" may be used to distinguish various components, the components should not be construed as being limited by these terms. The terminology used in this description is by way of example only and is not intended to be limiting. Singular expressions include plural expressions unless the context clearly dictates otherwise.
In this specification, terms such as "comprises" or "having" indicate the presence of stated features, integers, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. In the following description, the same components are denoted by the same reference numerals regardless of the figure, and duplicate descriptions thereof are omitted.
FIG. 1 is a diagram illustrating the general configuration of a system for sharing emotion information about image content according to an exemplary embodiment of the present invention.
The system for sharing emotion information about image content according to an exemplary embodiment includes collecting audience information, deriving emotion information, storing the emotion information, and reproducing the image content based on the derived emotion information. Here, the image content may be any content that includes image information, such as a movie, a broadcast, an advertisement, or a game. The system may be operated through communication between the image reproducing apparatus and the server.
The emotion information may be generated either on the viewer side or by the server.
The emotion information about image content helps to grasp the user's detailed preferences regarding the content. For example, a filmmaker can produce a better movie by receiving feedback on users' responses to the film, and an advertisement creator can produce an advertisement that better motivates purchasers based on users' reactions.
FIG. 2 is a flowchart illustrating an image content analysis method based on emotion recognition according to an exemplary embodiment of the present invention.
According to one embodiment, the emotion information may be generated on the viewer side while the user watches the image content. The captured screen may be in the form of a BMP file. The emotion information generated on the viewer side may then be transmitted to the server.
According to another embodiment, the emotion information may be generated by the server rather than on the viewer side. In that case, the captured images are transmitted to the server, which analyzes them to recognize the user's emotion and stores the resulting emotion information in association with the image content.
FIG. 3 is a flowchart illustrating an image content playback method according to an exemplary embodiment of the present invention.
The image reproducing apparatus receives emotion information for the image content from the server and reproduces the content based on it. For example, the user may use the emotion bar shown on the playback bar, or the web page, to check in which sections of the content particular emotions appear.
According to one embodiment, the present invention may largely use two interfaces: the interface of the image reproducing apparatus and the network interface.
FIG. 4A is a diagram illustrating an interface of an image reproducing apparatus according to an embodiment.
The interface of the image reproducing apparatus is illustrated in FIG. 4A.
4B is a diagram illustrating an interface of a network according to one embodiment.
The network interface may include information for communication with the server.
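The message exchanged over the network interface might look like the following sketch, which serializes the recognized emotions for upload to the server. The payload fields and record shape are hypothetical; the patent does not specify a wire format.

```python
# Assumed JSON payload a viewer-side apparatus might send to the server.
import json

def build_upload_payload(user_id, content_id, records):
    """records: list of (second, emotion). Returns a JSON request body."""
    return json.dumps({
        "user": user_id,
        "content": content_id,
        "emotions": [{"t": t, "type": e} for t, e in records],
    })
```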
FIG. 5 is a diagram illustrating a system operation process according to an embodiment.
The display may present two status bars during playback. The first status bar shows other users' emotion information for the currently playing image content; this can be referred to as the accumulated emotion interval. The second status bar shows the current user's own emotion information for the content; this may be referred to as the current user's emotion interval.
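The two status bars described above can be sketched as a simple text rendering, one bar for the accumulated emotion intervals of other users and one for the current user. The per-emotion glyphs are assumptions standing in for the patterns or colors an actual display would use.

```python
# Text sketch of the two emotion status bars. Each bar cell is one second.
GLYPHS = {"joy": "J", "sadness": "S"}

def render_bar(intervals, length):
    """intervals: list of (start, end, emotion) in seconds; returns a string bar."""
    bar = ["."] * length
    for start, end, emotion in intervals:
        for t in range(start, min(end + 1, length)):
            bar[t] = GLYPHS.get(emotion, "?")
    return "".join(bar)

accumulated = render_bar([(2, 4, "joy")], 8)       # other users' emotions
current = render_bar([(5, 6, "sadness")], 8)       # current user's emotions
```

Drawing the two bars adjacent to each other lets a viewer compare, at a glance, where their own reactions agree with the accumulated reactions of others.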
FIG. 6 is a diagram showing an example of emotion detection according to an embodiment.
For example, when the emotion information is generated, the user's face is first detected in the captured image. Since the position of the face changes when the user moves his or her head or shifts posture while viewing the image content, the face region may be tracked continuously so that emotion recognition remains accurate.
FIG. 7 is a diagram illustrating operations between various types of video playback apparatuses and a server according to an embodiment.
According to one embodiment, when reproducing a specific image content, the image reproducing apparatus may request the corresponding emotion information from the server. According to another embodiment, the user may request the emotion information directly, for example through a web page connected to the server.
FIG. 8A is a diagram illustrating an example of connecting to the server from a web page according to an embodiment. The web page can display not only emotion information but also various other information, such as ratings given after watching the movie.
FIG. 8B is a view showing an example of a user's emotion record according to an embodiment. Through the web page, the user can review the emotion information for the movies he or she has watched, and can retrieve or delete the screen captures associated with those emotions.
FIG. 8C is a diagram illustrating an example of a user's emotion statistics data according to an exemplary embodiment. The statistics may summarize, for each user, which emotions were recognized and how often.
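Per-user emotion statistics of the kind shown in FIG. 8C could be derived by counting recognized emotions across the content a user has watched. The record shape below is assumed for illustration.

```python
# Count how often each emotion was recognized across a user's viewing history.
from collections import Counter

def emotion_statistics(records):
    """records: iterable of (content_id, emotion). Returns emotion -> count."""
    return Counter(emotion for _, emotion in records)

stats = emotion_statistics([("m1", "joy"), ("m1", "joy"), ("m2", "sadness")])
```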
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, the described techniques may be performed in a different order, and/or components of the described systems, structures, devices, or circuits may be combined in different forms or replaced by other components or their equivalents, while still achieving appropriate results.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
Claims (5)
Analyzing the image and recognizing the emotion of a user viewing the image content;
Storing emotion information for the recognized emotion in association with the image content;
Receiving emotion information about a video content from a server; And
And reproducing the image content based on emotion information of another user who watched the image content,
wherein, when the image content is reproduced, the types of emotions of the other user and the time periods in which those emotions appear are displayed in the form of a first emotion bar using a pattern or color depending on the emotion type, and the time periods in which the emotions of the user currently watching the image content appear are displayed in the form of a second emotion bar using a pattern or color depending on that user's emotion type, the first emotion bar and the second emotion bar being arranged adjacent to each other,
A method of reproducing image content based on emotion recognition.
Wherein the recognizing comprises:
Recognizing a face region of the user in the image;
Detecting feature points of a face in the face region; And
Determining the emotion of the user based on the amount of change of the feature points over time,
A method of analyzing image content based on emotion recognition.
Wherein the storing step stores at least one of: identification information of the image content, time information at which the emotion was recognized, image information of the image content output when the emotion was recognized, and type information of the recognized emotion.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160019897A KR101838792B1 (en) | 2016-02-19 | 2016-02-19 | Method and apparatus for sharing user's feeling about contents |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170098380A KR20170098380A (en) | 2017-08-30 |
KR101838792B1 true KR101838792B1 (en) | 2018-03-15 |
Family
ID=59760415
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160019897A KR101838792B1 (en) | 2016-02-19 | 2016-02-19 | Method and apparatus for sharing user's feeling about contents |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101838792B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11847827B2 (en) | 2020-07-09 | 2023-12-19 | Samsung Electronics Co., Ltd. | Device and method for generating summary video |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102664589B1 (en) * | 2018-09-11 | 2024-05-10 | 현대자동차주식회사 | Emotion classifying apparatus, and controlling method of the emotion classifying apparatus |
KR102555524B1 (en) * | 2021-06-29 | 2023-07-21 | 주식회사 케이에스앤픽 | Server for providing video contents platform and method thereof |
KR102427964B1 (en) * | 2021-12-16 | 2022-08-02 | 조현경 | Interactive Responsive Web Drama Playback system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008234431A (en) * | 2007-03-22 | 2008-10-02 | Toshiba Corp | Comment accumulation device, comment creation browsing device, comment browsing system, and program |
JP2016024577A (en) * | 2014-07-18 | 2016-02-08 | 株式会社Nttドコモ | User behavior recording device, user behavior recording method, and program |
Also Published As
Publication number | Publication date |
---|---|
KR20170098380A (en) | 2017-08-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7889073B2 (en) | Laugh detector and system and method for tracking an emotional response to a media presentation | |
US8726304B2 (en) | Time varying evaluation of multimedia content | |
US7953254B2 (en) | Method and apparatus for generating meta data of content | |
JP6282769B2 (en) | Engagement value processing system and engagement value processing device | |
KR101838792B1 (en) | Method and apparatus for sharing user's feeling about contents | |
JP5391144B2 (en) | Facial expression change degree measuring device, program thereof, and program interest degree measuring device | |
KR101895846B1 (en) | Facilitating television based interaction with social networking tools | |
US20120072936A1 (en) | Automatic Customized Advertisement Generation System | |
KR101618590B1 (en) | Method and system for providing immersive effects | |
CN110868554B (en) | Method, device and equipment for changing faces in real time in live broadcast and storage medium | |
US20140223474A1 (en) | Interactive media systems | |
WO2018097177A1 (en) | Engagement measurement system | |
KR20140043070A (en) | Devices, systems, methods, and media for detecting, indexing, and comparing video signals from a video display in a background scene using a camera-enabled device | |
CN106851395B (en) | Video playing method and player | |
JP6236875B2 (en) | Content providing program, content providing method, and content providing apparatus | |
KR101947079B1 (en) | Psychological reaction inference system and method of a watching viewer on broadcasting contents | |
JP7206741B2 (en) | HEALTH CONDITION DETERMINATION SYSTEM, HEALTH CONDITION DETERMINATION DEVICE, SERVER, HEALTH CONDITION DETERMINATION METHOD, AND PROGRAM | |
JP2011239158A (en) | User reaction estimation apparatus, user reaction estimation method and user reaction estimation program | |
CN113762156B (en) | Video data processing method, device and storage medium | |
CN105992065B (en) | Video on demand social interaction method and system | |
JP2011254232A (en) | Information processing device, information processing method, and program | |
TR201921941A2 (en) | A system that allows the audience to be presented with a mood-related scenario. | |
TW201826086A (en) | Engagement measurement system | |
Uribe et al. | New accessibility services in HbbTV based on a deep learning approach for media content analysis | |
JP2013114732A (en) | Recorder and recording method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |