US20180054564A1 - Apparatus and method for providing user's emotional information in electronic device - Google Patents
Apparatus and method for providing user's emotional information in electronic device
- Publication number: US20180054564A1 (application US 15/796,120)
- Authority: US (United States)
- Prior art keywords: electronic device, emotional information, user, information, moving picture
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N5/23219
- G06Q50/10—Services (under G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism)
- H04N23/611—Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
- G06V40/174—Facial expression recognition
- H04N1/00339—Connection or combination of a still picture apparatus with a data reading, recognizing or recording apparatus with an electronic or magnetic storage medium I/O device
- H04N1/32101—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N1/32128—Additional information attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
- H04N23/45—Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H04N5/2258
- H04N2101/00—Still video cameras
- H04N2201/0084—Digital still camera
- H04N2201/3204—Additional information relating to a user, sender, addressee, machine or electronic recording medium
- H04N2201/3225—Additional information relating to an image, a page or a document
Definitions
- the present disclosure relates to an apparatus and a method for providing a user's emotional information in an electronic device.
- the electronic device may provide a service using emotional information included in content as a way for meeting the user's various needs.
- a portable electronic device may provide a service using emotion regarding an object included in a photo.
- the electronic device may estimate emotional information included in the content such as, for example, a photo.
- however, the electronic device cannot always estimate emotion for the relevant content (e.g., the relevant photo).
- in particular, the electronic device cannot estimate, from a photo, the emotion of the user who took the picture. Accordingly, a method is required for determining the emotion of a user who uses content and adding that emotion to the content in an electronic device.
- an aspect of the present disclosure is to provide an apparatus and a method for adding emotional information of a user who uses content to content.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a second camera to an image shot via a first camera in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a second camera to a moving picture shot via a first camera in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a camera to a moving picture being reproduced when an electronic device reproduces the moving picture.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a camera to electronic book content when an electronic device provides an electronic book service.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a camera to purchasable goods information in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for retrieving content using emotional information added to content in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for classifying content using emotional information added to content in an electronic device.
- a method for providing emotional information in an electronic device includes displaying at least one content, extracting emotional information from an image obtained via a camera, and adding the emotional information to the content.
- an electronic device includes at least one camera, a display unit, and at least one processor, wherein the at least one processor operatively displays at least one content on the display unit, extracts emotional information from an image obtained via at least one of the at least one camera, and adds the emotional information to the content.
- a method for capturing emotional information in an electronic device includes capturing an image of at least one user via a camera while the electronic device is providing a service, extracting emotional information of the at least one user, and associating the emotional information of the at least one user with the service being provided when the image of the at least one user is captured.
- FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure
- FIG. 2 is a detailed block diagram illustrating a processor according to an embodiment of the present disclosure
- FIG. 3 is a flowchart illustrating a procedure for adding a user's emotional information to content in an electronic device according to an embodiment of the present disclosure
- FIG. 4 is a flowchart illustrating a procedure for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure
- FIGS. 5A, 5B, 5C, and 5D are views illustrating screen configuration for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure
- FIG. 6 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure
- FIG. 7 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure
- FIG. 8 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure
- FIGS. 9A, 9B, and 9C are views illustrating screen configuration for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure
- FIG. 10 is a flowchart illustrating a procedure for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure
- FIGS. 11A and 11B are views illustrating screen configuration for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure
- FIG. 12 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure
- FIG. 13 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure
- FIG. 14 is a flowchart illustrating a procedure for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIGS. 15A, 15B, 15C, and 15D are views illustrating screen configuration for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIG. 16 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIG. 17 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIGS. 18A, 18B, and 18C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIGS. 19A, 19B, and 19C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIG. 20 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIG. 21 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure
- FIGS. 22A and 22B are views illustrating screen configuration for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device includes a mobile communication terminal, a Personal Digital Assistant (PDA), a Personal Computer (PC), a laptop computer, a tablet computer, a smartphone, a video phone, an e-book reader, a netbook, a Mobile Internet Device (MID), an Ultra Mobile PC (UMPC), a tablet PC, a navigation device, a Portable Multimedia Player (PMP), an MP3 player having a camera, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.
- an electronic device may be a smart home appliance with communication functionality.
- a smart home appliance may be, for example, a television (TV) (e.g., a smart TV), a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSyncTM, Apple TVTM, or Google TVTM), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.
- an electronic device may be a medical device (e.g., Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.
- an electronic device may be furniture, part of a building/structure, an electronic board, electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas or electro-magnetic wave measuring devices), and/or the like that include communication functionality.
- an electronic device may be any combination of the foregoing devices.
- an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.
- FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure.
- the electronic device 100 includes a memory 110 , a processor unit 120 , an audio processor 130 , a first camera unit 140 , a second camera unit 150 , an Input/Output (I/O) controller 160 , a display unit 170 , and an input unit 180 .
- a plurality of memories 110 may exist.
- the memory 110 includes a program storage 111 for storing a program for controlling an operation of the electronic device 100 , and a data storage 112 for storing data generated during execution of a program.
- the data storage 112 stores a user's emotional information for content.
- the data storage 112 may store content to which an emotional tag for user emotional information extracted via an emotion extract program 114 has been added.
- the data storage 112 may store at least one content and metadata including emotional information of each content.
- the data storage 112 may store at least one content and an emotional information table including emotional information of each content.
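The storage arrangement described above (an emotional tag per content item, or a separate emotional information table) can be sketched as follows. This is a minimal in-memory illustration; the class, field, and file names are hypothetical, not taken from the disclosure.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class EmotionTag:
    emotion: str       # e.g., "joy" or "sadness"
    confidence: float  # emotion estimation value in [0.0, 1.0]
    timestamp_ms: int  # position within the content (0 for a still image)

class EmotionStore:
    """In-memory stand-in for the emotional information table in data storage 112."""

    def __init__(self):
        self._table = {}  # content_id -> list of EmotionTag

    def add_tag(self, content_id, tag):
        self._table.setdefault(content_id, []).append(tag)

    def tags_for(self, content_id):
        return self._table.get(content_id, [])

    def to_json(self):
        # Serialize the table, e.g., for sidecar metadata stored next to content.
        return json.dumps({cid: [asdict(t) for t in tags]
                           for cid, tags in self._table.items()})

store = EmotionStore()
store.add_tag("IMG_0001.jpg", EmotionTag("joy", 0.87, 0))
```

Whether the tags live in per-file metadata or in one shared table is an implementation choice; both variants the description mentions reduce to this kind of content-id-to-tags mapping.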
- the program storage 111 includes a Graphic User Interface (GUI) program 113 , the emotion extract program 114 , a file management program 115 , and at least one application 116 .
- a program included in the program storage 111 is a set of instructions and may be expressed as an instruction set.
- the GUI program 113 includes at least one software element for providing a user interface on the display unit 170 using graphics.
- the GUI program 113 may control to display information of an application driven by a processor 122 on the display unit 170 .
- the GUI program 113 may control to display a user's emotional information for content displayed on the display unit 170 .
- the GUI program 113 may control to display emotional information 1801 , 1803 and 1805 on a time search bar for the moving picture, as illustrated in FIGS. 18B and 18C .
- the GUI program 113 may control to additionally display emotional information 2211 and 2213 on electronic book content displayed on the display unit 170 as illustrated in FIG. 22B .
- the GUI program 113 may control to additionally display emotional information on goods information.
- the emotion extract program 114 includes at least one software element for extracting a user's emotional information. For example, the emotion extract program 114 estimates the movements of a plurality of facial muscles from a user's facial image obtained via the first camera unit 140 or the second camera unit 150 , and then extracts the user's emotion in consideration of the movement information for those facial muscles. If an emotion estimation value calculated from the facial muscle movement information exceeds a reference emotion value, the emotion extract program 114 recognizes that it has extracted the user's emotion for the relevant content.
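A minimal sketch of this thresholding step, assuming the emotion estimation value is a weighted sum of facial-muscle movement magnitudes. The weight table, feature names, and reference value are illustrative assumptions; the disclosure does not specify them.

```python
# Reference value the emotion estimation score must exceed (assumed).
REFERENCE_EMOTION_VALUE = 0.5

# Hypothetical per-emotion weights over facial-muscle movement features.
EMOTION_WEIGHTS = {
    "joy":     {"lip_corner_pull": 0.6, "cheek_raise": 0.4},
    "sadness": {"brow_lower": 0.5, "lip_corner_depress": 0.5},
}

def estimate_emotion(muscle_movements):
    """Return (emotion, score) when the best score exceeds the reference value.

    muscle_movements maps a muscle-movement name to a magnitude in [0, 1].
    Returns None when no emotion estimation value exceeds the reference value.
    """
    best = None
    for emotion, weights in EMOTION_WEIGHTS.items():
        score = sum(w * muscle_movements.get(f, 0.0) for f, w in weights.items())
        if best is None or score > best[1]:
            best = (emotion, score)
    if best is not None and best[1] > REFERENCE_EMOTION_VALUE:
        return best
    return None
```

Returning None when the score stays below the reference value models the described behavior of only recognizing an extracted emotion once the estimation value exceeds the threshold.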
- the emotion extract program 114 may selectively activate the first camera unit 140 or the second camera unit 150 in order to obtain a user image. For example, the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of capturing an image via the first camera unit 140 . At this point, the emotion extract program 114 may activate the second camera unit 150 when taking a photograph via the first camera unit 140 .
- the emotion extract program 114 may extract the user's emotional information using the user image obtained via the second camera unit 150 while capturing a moving picture via the first camera unit 140 . At this point, the emotion extract program 114 may activate the second camera unit 150 when capturing a moving picture via the first camera unit 140 .
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 while reproducing a moving picture. At this point, the emotion extract program 114 may activate the second camera unit 150 when reproducing the moving picture.
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 while providing an electronic book service. At this point, the emotion extract program 114 may activate the second camera unit 150 when providing the electronic book service.
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of determining purchasable goods information. At this point, the emotion extract program 114 may activate the second camera unit 150 when providing a shopping service.
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of purchasing goods. At this point, the emotion extract program 114 may activate the second camera unit 150 when providing a shopping service.
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of reading and/or viewing a communication (e.g., an email, a text message, an instant message, and/or the like). At this point, the emotion extract program 114 may activate the second camera unit 150 when a communication is being displayed and/or read.
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of providing an on-line dating service. At this point, the emotion extract program 114 may activate the second camera unit 150 when a communication is being displayed and/or read, when a profile is being viewed, and/or the like.
- the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of providing a Social Networking Service (SNS). At this point, the emotion extract program 114 may activate the second camera unit 150 when a communication is being displayed and/or read, when a profile is being viewed, when a status is being updated, when a status is being viewed and/or read, and/or the like.
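The selective activation running through the examples above can be sketched as a small dispatcher that switches the second camera unit on only while an emotion-aware service is active. The service names are hypothetical stand-ins for the scenarios listed (photo and moving-picture capture, playback, electronic book, shopping, messaging, SNS).

```python
# Services during which the second (front) camera unit is activated (assumed names).
EMOTION_AWARE_SERVICES = {
    "photo_capture", "video_capture", "video_playback",
    "ebook", "shopping", "messaging", "sns",
}

class CameraController:
    """Switches the second camera unit (150) on only for emotion-aware services."""

    def __init__(self):
        self.second_camera_active = False

    def on_service_start(self, service):
        if service in EMOTION_AWARE_SERVICES:
            self.second_camera_active = True

    def on_service_stop(self, service):
        # Deactivate once the service that triggered emotion capture ends.
        self.second_camera_active = False
```

Gating activation this way matches the stated design of capturing the user image only "at a point of" using a given service, rather than keeping the front camera running continuously.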
- the file management program 115 includes at least one software element for retrieving, classifying, and reproducing each content using emotional information for content stored in the data storage 112 .
- the file management program 115 may control to classify and display photo content stored in the data storage 112 depending on emotional information as illustrated in FIGS. 15B and 15C .
- the file management program 115 may control to classify and display emotional information included in electronic book content using a structure window 2201 as illustrated in FIG. 22A .
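The retrieval and classification functions of the file management program 115 can be sketched as follows, assuming each stored content item carries a single emotion tag; the pair-based layout is an illustrative simplification of the stored metadata.

```python
from collections import defaultdict

def classify_by_emotion(library):
    """Group content by its emotional tag: list of (filename, emotion) -> dict."""
    groups = defaultdict(list)
    for filename, emotion in library:
        groups[emotion].append(filename)
    return dict(groups)

def retrieve_by_emotion(library, emotion):
    """Return filenames whose emotional tag matches the queried emotion."""
    return [f for f, e in library if e == emotion]
```

A gallery view that shows per-emotion sections (as in the photo-classification figures) would render one section per key of `classify_by_emotion`, while an emotion search box would call `retrieve_by_emotion`.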
- the application 116 includes a software element for at least one application installed in the electronic device 100 .
- the processor unit 120 includes a memory interface 121 , at least one processor 122 , and a peripheral interface 124 .
- the memory interface 121 , the at least one processor 122 , and the peripheral interface 124 included in the processor unit 120 may be integrated in at least one integrated circuit, or implemented as separate elements.
- the memory interface 121 controls a memory access of an element such as the processor 122 or the peripheral interface 124 .
- the audio processor 130 provides an audio interface between the user and the electronic device 100 via a speaker 131 and a microphone 132 .
- the first camera unit 140 is positioned on the rear side of the electronic device 100 and provides a collected image to the processor unit 120 by capturing an object.
- the second camera unit 150 is positioned on the front side of the electronic device 100 and provides a collected image to the processor unit 120 by capturing an object.
- the first camera unit 140 and the second camera unit 150 may include a camera sensor for converting an optical signal to an electric signal, an image processor for converting an analog image signal to a digital image signal, and a signal processor for image-processing an image signal output from the image processor so that the image signal may be displayed on the display unit 170 .
- the I/O controller 160 provides an interface between an I/O unit, such as the display unit 170 and the input unit 180 , and the peripheral interface 124 .
- the display unit 170 displays status information of the electronic device 100 , a character input by the user, a moving picture, a still picture, and the like. For example, the display unit 170 displays information of an application driven by the processor 122 . If an emotion display menu has been set, the display unit 170 may additionally display a user's emotional information for content displayed on the display unit 170 under control of the GUI program 113 . For example, in the case in which the emotion display menu has been set when reproducing a moving picture, the display unit 170 may display emotional information 1801 , 1803 , and 1805 on a time search bar for the moving picture as illustrated in FIGS. 18B and 18C .
- the display unit 170 may additionally display emotional information on electronic book content displayed on the display unit 170 as illustrated in FIG. 22B .
- the display unit 170 may additionally display emotional information on goods information.
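The placement of the emotional information markers (1801, 1803, and 1805) on the time search bar can be sketched as a linear mapping from each tag's timestamp to a pixel offset along the bar. The pixel geometry and tag layout are assumptions for illustration.

```python
def marker_positions(tags, duration_ms, bar_width_px):
    """Map each (timestamp_ms, emotion) tag to (x_px, emotion) on the search bar.

    The x offset scales linearly with the tag's position in the moving picture.
    """
    return [(round(t / duration_ms * bar_width_px), emotion)
            for t, emotion in tags]
```

With a 60-second moving picture and a 300-pixel bar, a tag at 30 seconds lands at the bar's midpoint, which is the behavior the time-search-bar figures suggest.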
- the input unit 180 provides input data generated by the user's selection to the processor unit 120 via the I/O controller 160 .
- the input unit 180 includes a keypad including at least one hardware button and a touchpad for detecting touch information.
- the input unit 180 provides touch information detected via the touchpad to the processor 122 via the I/O controller 160 .
- the electronic device 100 may include a communication system for performing a communication function for voice communication and data communication.
- the communication system may be divided into a plurality of sub-modules supporting different communication networks.
- the communication network includes a Global System for Mobile communication (GSM) network, an Enhanced Data GSM Environment (EDGE) network, a Code Division Multiple Access (CDMA) network, a W-Code Division Multiple Access (W-CDMA) network, a Long Term Evolution (LTE) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a wireless Local Area Network (LAN), a Bluetooth network, Near Field Communications (NFC), and/or the like.
- FIG. 2 is a detailed block diagram illustrating a processor according to an embodiment of the present disclosure.
- the processor 122 includes an application driver 200 , an emotion extractor 210 , a file manager 220 , and a display controller 230 .
- the application driver 200 executes at least one application 116 stored in the program storage 111 to provide a service corresponding to the relevant program.
- the application driver 200 may execute an application stored in the program storage 111 to reproduce a moving picture.
- the application driver 200 may execute an application stored in the program storage 111 to capture a photo or a moving picture using the first camera unit 140 or the second camera unit 150 .
- the application driver 200 may execute an application stored in the program storage 111 to provide an electronic book service.
- the application driver 200 may execute an application stored in the program storage 111 to provide a shopping service.
- the emotion extractor 210 executes the emotion extract program 114 stored in the program storage 111 to extract a user's emotion. For example, the emotion extractor 210 estimates movements of a plurality of facial muscles from the user's facial image obtained via the first camera unit 140 or the second camera unit 150 , and then extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the emotion extractor 210 may recognize that it has extracted the user's emotion for the relevant content.
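The threshold test described above can be sketched as follows. The muscle names, weights, and reference value below are invented for illustration only; the disclosure does not specify how the estimation value is computed:

```python
# Hypothetical sketch of the threshold test: facial-muscle movement scores
# are combined into an emotion estimation value, which counts as an
# extracted emotion only when it exceeds a reference value.

REFERENCE_EMOTION_VALUE = 0.5  # assumed threshold, not from the patent

# Assumed weights relating muscle movements to the "happiness" emotion.
HAPPINESS_WEIGHTS = {"cheek_raiser": 0.6, "lip_corner_puller": 0.4}

def estimate_emotion(muscle_movements):
    """Return (emotion, score) if the weighted score passes the threshold,
    otherwise None."""
    score = sum(HAPPINESS_WEIGHTS.get(name, 0.0) * value
                for name, value in muscle_movements.items())
    if score > REFERENCE_EMOTION_VALUE:
        return ("happiness", round(score, 3))
    return None

print(estimate_emotion({"cheek_raiser": 0.9, "lip_corner_puller": 0.7}))
print(estimate_emotion({"cheek_raiser": 0.1}))  # below threshold -> None
```

A real implementation would derive the movement scores from the camera image (e.g., via facial landmark tracking); here they are passed in directly.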
- the emotion extractor 210 may selectively activate the first camera unit 140 or the second camera unit 150 for obtaining the user's image. For example, when taking a photograph using the first camera unit 140 , the emotion extractor 210 activates the second camera unit 150 . After activating the second camera unit 150 , the emotion extractor 210 extracts the user's emotional information using a user image obtained via the second camera unit 150 at the point of capturing an image via the first camera unit 140 .
- when capturing a moving picture via the first camera unit 140 , the emotion extractor 210 activates the second camera unit 150 . After activating the second camera unit 150 , the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 while capturing the moving picture using the first camera unit 140 .
- when reproducing a moving picture, the emotion extractor 210 activates the second camera unit 150 . After activating the second camera unit 150 , the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 while reproducing the moving picture.
- when providing the electronic book service, the emotion extractor 210 activates the second camera unit 150 . After activating the second camera unit 150 , the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 while providing the electronic book service.
- when providing a shopping service, the emotion extractor 210 activates the second camera unit 150 .
- the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of determining purchasable goods information.
- the emotion extractor 210 activates the second camera unit 150 . After activating the second camera unit 150 , the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of purchasing goods.
- the emotion extractor 210 may execute the emotion extract program 114 to transmit extracted user's emotional information to the memory 110 .
- the memory 110 may map the user's emotional information to relevant content and store the same.
- the file manager 220 may execute the file management program 115 stored in the program storage 111 to manage each content depending on emotional information of content. Specifically, the file manager 220 may retrieve, classify, and reproduce each content using emotional information of content stored in the data storage 112 . For example, the file manager 220 may control to classify and display photo content stored in the data storage 112 according to emotional information as illustrated in FIG. 15B or 15C . As another example, the file manager 220 may control to display a thumbnail of a moving picture depending on emotional information included in the moving picture as illustrated in FIG. 19A . As another example, the file manager 220 may control to classify and display emotional information included in electronic book content using the structure window 2201 as illustrated in FIG. 22A .
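The emotion-based classification performed by the file manager can be sketched as below. The content records, file names, and tag values are hypothetical; only the grouping idea comes from the description above:

```python
# Minimal sketch of the file manager's classification step: content items
# carrying an emotional tag are grouped by that tag so they can be
# retrieved and displayed per emotion.
from collections import defaultdict

def classify_by_emotion(contents):
    """Group content names by their emotional tag."""
    groups = defaultdict(list)
    for item in contents:
        groups[item.get("emotion", "untagged")].append(item["name"])
    return dict(groups)

photos = [
    {"name": "photo1.jpg", "emotion": "happiness"},
    {"name": "photo2.jpg", "emotion": "sadness"},
    {"name": "photo3.jpg", "emotion": "happiness"},
]
print(classify_by_emotion(photos))
# {'happiness': ['photo1.jpg', 'photo3.jpg'], 'sadness': ['photo2.jpg']}
```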
- the display controller 230 controls to execute the GUI program 113 stored in the program storage 111 to display a user interface on the display unit 170 using graphics. For example, the display controller 230 controls to display information of an application driven by the application driver 200 on the display unit 170 . If the emotion display menu has been set, the display controller 230 may control to display the user's emotional information for content displayed on the display unit 170 . For example, if the emotion display menu has been set while reproducing a moving picture, the display controller 230 may control to display emotional information 1801 , 1803 , and 1805 on a time search bar of the moving picture as illustrated in FIG. 18B or 18C .
- the display controller 230 may control to additionally display emotional information on electronic book content displayed on the display unit 170 as illustrated in FIG. 22B .
- the display controller 230 may control to additionally display emotional information on goods information.
- the electronic device 100 may add the user's emotional information to content and manage the content depending on the user's emotion using the processor 122 including the emotion extractor 210 and the file manager 220 .
- the electronic device 100 may include a separate control module for adding the user's emotional information to content, and managing the content depending on the user's emotion.
- FIG. 3 is a flowchart illustrating a procedure for adding a user's emotional information to content in an electronic device according to an embodiment of the present disclosure.
- the electronic device drives a first application to provide a service.
- the electronic device may drive a camera application to provide a photographing service or a moving picture capturing service using the first camera unit 140 or the second camera unit 150 .
- the electronic device may drive a moving picture reproduction application to reproduce a moving picture.
- the electronic device may drive an electronic book application to provide an electronic book service.
- the electronic device may drive a mobile shopping application to provide a mobile shopping service.
- the electronic device extracts emotional information of a user who uses a service corresponding to a first application, using a user image provided via at least one camera. For example, when providing a photographing service or a moving picture capturing service via the first camera unit 140 , the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. As another example, when reproducing a moving picture, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. As another example, when providing an electronic book service, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. As another example, when providing a shopping service, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image.
- after extracting the user's emotional information at operation 303 , the electronic device, at operation 305 , adds the user's emotional information to content corresponding to the first application and stores the same. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to content corresponding to the first application and store the same. As another example, the electronic device may generate and store metadata including the user's emotional information for content corresponding to the first application.
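The two storage options at operation 305 (embedded emotional tag versus separate metadata) can be sketched as follows. The record layout and field names are assumptions for illustration, not the patent's actual format:

```python
# Hedged sketch of the two storage options: either embed an emotional tag
# in the content record itself, or generate a separate metadata record
# carrying the emotional information.
import json

def add_emotion_tag(content, emotion):
    """Option 1: embed the emotional tag directly in the content record."""
    tagged = dict(content)
    tagged["emotion_tag"] = emotion
    return tagged

def build_emotion_metadata(content_name, emotion):
    """Option 2: generate separate metadata including the emotional info."""
    return json.dumps({"content": content_name, "emotion": emotion})

shot = {"name": "IMG_0001.jpg", "type": "photo"}
print(add_emotion_tag(shot, "happiness")["emotion_tag"])  # happiness
print(build_emotion_metadata("IMG_0001.jpg", "happiness"))
```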
- FIG. 4 is a flowchart illustrating a procedure for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether an image capturing service using the first camera unit 140 is provided. For example, the electronic device determines whether an image capturing event using the first camera unit 140 occurs depending on touch information provided via the input unit 180 .
- if the electronic device determines that the image capturing service using the first camera unit 140 is not being provided at operation 401 , then the electronic device continues to poll for an indication that the image capturing service using the first camera unit 140 is provided.
- if the electronic device determines that the image capturing service using the first camera unit 140 is being provided at operation 401 , then the electronic device proceeds to operation 403 at which the electronic device drives the second camera unit 150 .
- FIGS. 5A, 5B, 5C, and 5D are views illustrating screen configuration for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure.
- the electronic device activates the second camera unit 150 , which is positioned in the direction opposite to the first camera unit 140 that captures the object. For example, when a capturing service using the first camera unit 140 is provided, the electronic device displays a preview screen obtained via the first camera unit 140 on the display unit 170 as illustrated in FIG. 5A . When detecting selection of a set icon 501 included in the preview screen, the electronic device displays a setting menu 503 .
- after displaying the setting menu 503 , when the electronic device detects selection of an emotion display menu (sense tag) 511 in the setting menu 503 as illustrated in FIG. 5B , the electronic device recognizes that the emotion display menu has been set. Accordingly, as illustrated in FIG. 5C , the electronic device may display the user's emotional information 521 , extracted using the second camera unit 150 , on the preview screen obtained via the first camera unit 140 .
- the electronic device determines whether an image capturing event occurs. For example, the electronic device determines whether selection of a capturing icon 523 is detected.
- if the electronic device determines that the image capturing event does not occur at operation 405 , then the electronic device continues to poll for an indication that the image capturing event occurs.
- if the electronic device determines that the image capturing event occurs at operation 405 , then the electronic device proceeds to operation 407 at which the electronic device obtains a captured image via the first camera unit 140 .
- the electronic device may also proceed to operation 409 at which the electronic device extracts the user's emotional information from a user image obtained via the second camera unit 150 at the point at which the image capturing event occurred. For example, the electronic device estimates movements of a plurality of facial muscles from the user's facial image obtained via the second camera unit 150 . After estimating the movements of the facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the electronic device may recognize that it has extracted the user's emotion for the relevant shot image.
- after obtaining the shot image and extracting the user's emotional information at operations 407 and 409 , respectively, the electronic device proceeds to operation 411 at which the electronic device adds the user's emotional information extracted when obtaining the shot image to the shot image and stores the same. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the shot image, or generate metadata including the user's emotional information for the shot image. As illustrated in FIG. 5D , the electronic device displays the shot image on the display unit 170 . For example, when obtaining the shot image, the electronic device may display the user's emotional information 531 on a partial region of the display unit 170 .
- the electronic device may extract the user's emotional information using a user image obtained via the second camera unit 150 .
- the electronic device may extract the user's emotional information using a user image obtained via the first camera unit 140 .
- FIG. 6 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a moving picture capturing service using the first camera unit 140 is provided. For example, the electronic device determines whether a moving picture capturing event using the first camera unit 140 occurs depending on touch information provided via the input unit 180 .
- if the electronic device determines that the moving picture capturing service using the first camera unit 140 is not being provided at operation 601 , then the electronic device continues to poll for an indication that the moving picture capturing service using the first camera unit 140 is provided.
- if the electronic device determines that the moving picture capturing service using the first camera unit 140 is provided at operation 601 , then the electronic device proceeds to operation 603 at which the electronic device drives the second camera unit 150 .
- when capturing a moving picture, the electronic device activates the second camera unit 150 positioned in the direction opposite to the first camera unit 140 that captures the moving picture.
- the electronic device displays a preview screen obtained via the first camera unit 140 on the display unit 170 . If selection of a setting icon included in the preview screen is detected, the electronic device displays a setting menu.
- if the electronic device detects selection of the emotion display menu in the setting menu, the electronic device recognizes that the emotion display menu has been set.
- the electronic device determines whether a moving picture capturing event occurs. For example, the electronic device determines whether selection of a capturing icon displayed on the preview screen is detected.
- if the electronic device determines that the moving picture capturing event does not occur at operation 605 , then the electronic device may continue to poll for an indication that a moving picture capturing event occurs.
- if the electronic device determines that a moving picture capturing event occurs at operation 605 , then the electronic device proceeds to operation 607 at which the electronic device obtains a moving picture via the first camera unit 140 .
- the electronic device may proceed to operation 613 at which the electronic device determines whether the moving picture capturing ends.
- if the electronic device determines that a moving picture capturing event occurs at operation 605 , then the electronic device may also proceed to operation 609 at which the electronic device determines whether the user's emotional information is extracted from a user image obtained via the second camera unit 150 while the moving picture is captured. For example, the electronic device estimates movements of a plurality of facial muscles from the user's facial image obtained via the second camera unit 150 . After estimating the movements of the facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. At this point, when an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the electronic device may recognize that it has extracted the user's emotion for the frame at the capturing point of the relevant moving picture.
- if the electronic device determines that the user's emotional information has not been extracted at operation 609 , then the electronic device proceeds to operation 613 at which the electronic device determines whether the moving picture capturing ends.
- if the electronic device determines that the emotional information of the user is extracted at operation 609 , then the electronic device proceeds to operation 611 at which the electronic device adds the user's emotional information obtained while capturing the moving picture to the moving picture.
- the electronic device adds the user's emotional information to the moving picture so that the point at which the user's emotional information was extracted is displayed.
- FIGS. 9A, 9B, and 9C are views illustrating screen configuration for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure.
- the electronic device adds the user's emotional information 901 to the time search bar of the moving picture at the point at which the emotional information was extracted, as illustrated in FIG. 9A .
- the electronic device may add the user's emotional information 903 and 905 to the time search bar of the moving picture at the points at which the emotional information was extracted, as illustrated in FIGS. 9B and 9C .
- the electronic device may keep displaying the previously added emotional information on the time search bar as illustrated in FIGS. 9B and 9C .
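The time-search-bar annotation described above can be sketched as below. The timestamps, emotions, and marker structure are illustrative assumptions, not the patent's data format:

```python
# Sketch of annotating a moving picture's time search bar: each extracted
# emotion is recorded with its playback timestamp so a marker can be drawn
# at that point, and previously added markers remain on the bar.

def add_marker(search_bar, timestamp_s, emotion):
    """Append an (emotion, timestamp) marker, kept sorted by time."""
    search_bar.append({"t": timestamp_s, "emotion": emotion})
    search_bar.sort(key=lambda m: m["t"])
    return search_bar

bar = []
add_marker(bar, 42.0, "happiness")
add_marker(bar, 12.5, "surprise")
print([(m["t"], m["emotion"]) for m in bar])
# [(12.5, 'surprise'), (42.0, 'happiness')]
```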
- the electronic device determines whether the moving picture capturing ends.
- if the electronic device determines that the moving picture capturing does not end at operation 613 , then the electronic device proceeds to operation 607 at which the electronic device obtains a moving picture via the first camera unit 140 .
- if the electronic device determines that the moving picture capturing does not end at operation 613 , then the electronic device also proceeds to operation 609 at which the electronic device determines whether the user's emotional information is extracted from a user image obtained via the second camera unit 150 while the moving picture is captured.
- if the electronic device determines that the moving picture capturing ends at operation 613 , then the electronic device proceeds to operation 615 at which the electronic device stores the moving picture to which the emotional information has been added.
- the electronic device may add an emotional tag corresponding to the user's emotional information to the frame, among the frames forming the moving picture, at which the user's emotional information was extracted, and store the same.
- the electronic device may generate metadata including the user's emotional information obtained during moving picture capture, and store the same together with the moving picture.
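The separate-metadata option, including the extraction time points, can be sketched as follows. The JSON field names are invented for illustration; the disclosure does not specify a serialization format:

```python
# Hedged sketch of the metadata option: emotional information gathered
# during capture is written as a separate record that also carries the
# extraction time points, so it can be stored alongside the moving picture.
import json

def build_movie_emotion_metadata(movie_name, emotion_events):
    """emotion_events: list of (timestamp_seconds, emotion) pairs."""
    return json.dumps({
        "movie": movie_name,
        "emotions": [{"t": t, "emotion": e} for t, e in emotion_events],
    }, sort_keys=True)

meta = build_movie_emotion_metadata("clip.mp4",
                                    [(3.2, "surprise"), (47.9, "happiness")])
print(meta)
```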
- the metadata including the user's emotional information also includes information on the points at which the user's emotional information was extracted during moving picture capture.
- the electronic device may add the user's emotional information to the time search bar of the moving picture at every point at which the user's emotional information is extracted.
- alternatively, the electronic device may add the user's emotional information to the time search bar of the moving picture only at points at which the user's emotional information has changed. For example, if the user's emotion of happiness is extracted at a first point of the moving picture and happiness is extracted again at a second point, the electronic device may add only the information of the user's emotion of happiness at the first point to the time search bar of the moving picture.
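The change-only policy above can be sketched as below; the marker structure is a hypothetical stand-in for however the device records timeline annotations:

```python
# Sketch of the change-only policy: a marker is added to the time search
# bar only when the newly extracted emotion differs from the most recently
# added one, so repeated "happiness" at a later point adds nothing.

def add_if_changed(markers, timestamp_s, emotion):
    """Add a marker only when the emotion differs from the last one."""
    if markers and markers[-1]["emotion"] == emotion:
        return False  # same emotion as before -> no new marker
    markers.append({"t": timestamp_s, "emotion": emotion})
    return True

timeline = []
print(add_if_changed(timeline, 10.0, "happiness"))  # True: first marker
print(add_if_changed(timeline, 25.0, "happiness"))  # False: unchanged
print(add_if_changed(timeline, 40.0, "surprise"))   # True: emotion changed
print(len(timeline))  # 2
```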
- FIG. 7 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to another embodiment of the present disclosure.
- the electronic device determines whether a moving picture capturing service using the first camera unit 140 is provided. For example, the electronic device determines whether a moving picture capturing event using the first camera unit 140 occurs depending on touch information provided via the input unit 180 .
- if the electronic device determines that the moving picture capturing service using the first camera unit 140 is not being provided at operation 701 , then the electronic device continues to poll for an indication that the moving picture capturing service using the first camera unit 140 is provided.
- if the electronic device determines that the moving picture capturing service using the first camera unit 140 is being provided at operation 701 , then the electronic device proceeds to operation 703 at which the electronic device drives the second camera unit 150 .
- when capturing a moving picture, the electronic device activates the second camera unit 150 positioned in the direction opposite to the first camera unit 140 that captures the moving picture.
- the electronic device determines whether a moving picture capturing event occurs. For example, the electronic device determines whether selection of a capturing icon displayed on the preview screen is detected.
- if the electronic device determines that the moving picture capturing event occurs at operation 705 , then the electronic device proceeds to operation 707 at which the electronic device obtains a moving picture via the first camera unit 140 .
- the electronic device determines whether the moving picture capturing ends at operation 715 .
- the electronic device determines whether the moving picture capturing event occurs at operation 705 .
- the electronic device also proceeds to operation 709 at which the electronic device determines whether an emotion extraction period arrives.
- the emotion extraction period may change depending on the user's input information.
- if the electronic device determines that the emotion extraction period has not arrived at operation 709 , then the electronic device may continue to poll for an indication that the emotion extraction period arrives.
- the electronic device determines whether the user's emotional information is extracted from a user image obtained via the second camera unit 150 . For example, the electronic device estimates movements of a plurality of facial muscles from the user's facial image obtained via the second camera unit 150 . After estimating the movements of the facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the electronic device may recognize that it has extracted the user's emotion for the frame at the capturing point of the relevant moving picture.
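The period-driven flow described in operations 709 and 711 can be sketched as follows. The frame loop and extractor below are stand-ins for the actual camera pipeline; the period value is an assumption:

```python
# Sketch of period-driven extraction: during capture, emotion extraction
# runs only when the (user-configurable) extraction period elapses.

def periodic_extraction(frame_times_s, period_s, extract):
    """Call `extract` at most once per elapsed period; collect its results."""
    results, next_due = [], 0.0
    for t in frame_times_s:
        if t >= next_due:
            emotion = extract(t)
            if emotion is not None:
                results.append((t, emotion))
            next_due = t + period_s
    return results

# Stand-in extractor: pretend an emotion is detected at every sample point.
events = periodic_extraction([0.0, 0.5, 1.0, 1.5, 2.0, 2.5], 1.0,
                             lambda t: "happiness")
print(events)  # [(0.0, 'happiness'), (1.0, 'happiness'), (2.0, 'happiness')]
```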
- if the electronic device determines that the user's emotional information is not extracted from a user image obtained via the second camera unit 150 at operation 711 , then the electronic device proceeds to operation 715 at which the electronic device determines whether moving picture capturing ends.
- if the electronic device determines that the user's emotional information is extracted from a user image obtained via the second camera unit 150 at operation 711 , then the electronic device proceeds to operation 713 at which the electronic device adds the user's emotional information to the moving picture.
- the electronic device adds the user's emotional information to the moving picture so that the point at which the user's emotional information was extracted is displayed. For example, the electronic device adds the user's emotional information to the time search bar of the moving picture at the point at which the emotional information was extracted, as illustrated in FIG. 9A .
- the electronic device may add the user's emotional information to the time search bar of the moving picture at the points at which the emotional information was extracted, as illustrated in FIGS. 9B and 9C .
- the electronic device may keep displaying the previously added emotional information on the time search bar as illustrated in FIGS. 9B and 9C .
- the electronic device determines whether moving picture capturing ends.
- if the electronic device determines that the moving picture capturing does not end at operation 715 , then the electronic device proceeds to operation 707 at which the electronic device obtains a moving picture via the first camera unit 140 .
- the electronic device may also proceed to operation 709 at which the electronic device determines whether an emotion extract period arrives.
- if the electronic device determines that the moving picture capturing ends at operation 715 , then the electronic device proceeds to operation 717 at which the electronic device stores the moving picture to which emotional information has been added.
- the electronic device may add an emotional tag corresponding to the user's emotional information to the frame, among the frames forming the moving picture, at which the user's emotional information was extracted, and store the same.
- the electronic device may generate metadata including the user's emotional information obtained during moving picture capturing, and store the same together with the moving picture.
- the metadata including the user's emotional information also includes information on the points at which the user's emotional information was extracted during the moving picture capturing.
- the electronic device may add the user's emotional information to the time search bar of the moving picture at every point at which the user's emotional information is extracted.
- alternatively, the electronic device may add the user's emotional information to the time search bar of the moving picture only at points at which the user's emotional information has changed. For example, if the user's emotion of happiness is extracted at a first emotion extraction period of the moving picture and happiness is extracted again at a second emotion extraction period, the electronic device may add only the user's emotional information extracted at the first emotion extraction period to the time search bar of the moving picture.
- the electronic device may extract the user's emotional information from a user image obtained via the second camera unit 150 .
- the electronic device may extract the user's emotional information from a user image obtained via the first camera unit 140 .
- FIG. 8 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a moving picture reproduction service is provided. For example, the electronic device determines whether one of one or more moving picture files stored in the data storage 112 is selected depending on touch information provided via the input unit 180 .
- if the electronic device determines that the moving picture reproduction service is not being provided at operation 801 , then the electronic device may continue to poll for an indication that the moving picture reproduction service is being provided.
- if the electronic device determines that the moving picture reproduction service is being provided at operation 801 , then the electronic device proceeds to operation 803 at which the electronic device reproduces a moving picture selected for the moving picture reproduction service. For example, the electronic device displays the reproduced moving picture on the display unit 170 .
- the electronic device proceeds to operation 811 at which the electronic device determines whether the moving picture reproduction ends.
- the electronic device may also proceed to operation 805 at which the electronic device drives the second camera unit 150 .
- the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 for displaying the moving picture being reproduced in order to obtain the user's image while reproducing the moving picture. Thereafter, the electronic device proceeds to operation 807 .
- the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150 . For example, the electronic device estimates movements of a plurality of facial muscles from the user's facial image obtained via the second camera unit 150 . After estimating the movements of the facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the electronic device may recognize that it has extracted the user's emotion for the frame at the relevant point of the moving picture.
- if the electronic device determines that the user's emotional information is not extracted at operation 807 , then the electronic device proceeds to operation 811 at which the electronic device determines whether the moving picture reproduction ends.
- if the electronic device determines that the user's emotional information is extracted at operation 807 , then the electronic device proceeds to operation 809 at which the electronic device adds the user's emotional information to the moving picture.
- the electronic device adds the user's emotional information to the moving picture so that the point at which the user's emotional information was extracted is displayed. For example, the electronic device adds the user's emotional information to the time search bar of the moving picture at the point at which the emotional information was extracted, as illustrated in FIG. 9A .
- the electronic device may add the user's emotional information to the time search bar of the moving picture at the points at which the emotional information was extracted, as illustrated in FIGS. 9B and 9C .
- the electronic device may keep displaying the previously added emotional information on the time search bar as illustrated in FIGS. 9B and 9C .
- the electronic device determines whether the moving picture reproduction ends.
- If the electronic device determines that the moving picture reproduction does not end at operation 811, then the electronic device proceeds to operation 803 at which the electronic device reproduces the moving picture selected for the moving picture reproduction service.
- the electronic device may also proceed to operation 807 at which the electronic device determines whether the user's emotional information is extracted.
- If the electronic device determines that the moving picture reproduction ends at operation 811, then the electronic device proceeds to operation 813 at which the electronic device stores the moving picture to which the emotional information has been added.
- the electronic device may add an emotional tag corresponding to the user's emotional information to a frame of a point of extracting the user's emotional information among frames forming the moving picture, and store the same.
- the electronic device may generate metadata including the user's emotional information obtained during moving picture capturing, and store the same together with the moving picture.
- The metadata including the user's emotional information also includes information of the point of extracting the user's emotional information during moving picture capturing.
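The metadata described above might take a shape like the following minimal sketch. The field names and the use of JSON are assumptions for illustration; the patent does not specify a storage format.

```python
import json

# Minimal sketch (assumed structure) of metadata that stores the user's
# emotional information together with the points, in seconds, at which each
# emotion was extracted while the moving picture was captured or reproduced.
def build_emotion_metadata(video_name, extracted_emotions):
    """extracted_emotions: list of (timestamp_seconds, emotion) tuples."""
    return {
        "video": video_name,
        "emotions": [
            {"time": t, "emotion": e} for t, e in extracted_emotions
        ],
    }

meta = build_emotion_metadata("trip.mp4", [(12.5, "happiness"), (40.0, "surprise")])
print(json.dumps(meta))
```

Such metadata could be stored alongside the moving picture file, as the passage above describes, rather than tagging individual frames.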
- the electronic device may add the user's emotional information to a time search bar of the moving picture at every point of extracting the user's emotional information.
- the electronic device may add the user's emotional information to the time search bar of the moving picture at a point at which the user's emotional information has changed. For example, if the user's emotion of happiness at a first point of the moving picture is extracted and then the user's emotion of happiness at a second point is extracted, the electronic device may add only information of the user's emotion of happiness extracted at the first point to the time search bar of the moving picture.
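The change-point rule described above (add a marker only where the emotion differs from the previous one, so repeated "happiness" points collapse to the first) can be sketched as follows; timestamps and labels are illustrative.

```python
# Sketch of the change-point rule: when the same emotion is extracted at
# consecutive points, only the first occurrence is added to the time search
# bar. Input is a chronological list of (time_seconds, emotion) pairs.
def change_points(extracted):
    """Keep only entries whose emotion differs from the previous one."""
    markers, previous = [], None
    for time, emotion in extracted:
        if emotion != previous:
            markers.append((time, emotion))
        previous = emotion
    return markers

points = [(10, "happiness"), (25, "happiness"), (40, "sadness"), (55, "happiness")]
print(change_points(points))  # [(10, 'happiness'), (40, 'sadness'), (55, 'happiness')]
```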
- FIG. 10 is a flowchart illustrating a procedure for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether an electronic book service is provided. For example, the electronic device determines whether one of one or more electronic book files stored in the data storage 112 is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the electronic book service is provided.
- If the electronic device determines that the electronic book service is being provided at operation 1001, then the electronic device proceeds to operation 1003 at which the electronic device displays electronic book content selected for the electronic book service on the display unit 170.
- the electronic device proceeds to operation 1011 at which the electronic device determines whether the electronic book service ends.
- the electronic device may also proceed to operation 1005 at which the electronic device drives the second camera unit 150 .
- the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 for displaying the electronic book content in order to obtain a user's image while displaying the electronic book content. Thereafter, the electronic device proceeds to operation 1007 .
- the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150 . For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150 . After estimating movements of a plurality of facial muscles for estimating emotional information from the user's facial image, the electronic device extracts the user's emotion with consideration of movement information for the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for a relevant position of the electronic book content.
- the electronic device may proceed to operation 1011 at which the electronic device determines whether the electronic book service ends.
- If the electronic device determines that the user's emotional information has been extracted at operation 1007, then the electronic device proceeds to operation 1009 at which the electronic device adds the user's emotional information to the electronic book content and displays the same.
- FIGS. 11A and 11B are views illustrating screen configuration for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure.
- the electronic device adds the user's emotional information 1101 to a page from which the user's emotional information has been extracted and displays the user's emotional information 1101 as illustrated in FIG. 11A .
- the electronic device may add the user's emotional information to a sentence or a paragraph from which the user's emotional information has been extracted and display the user's emotional information 1103 as illustrated in FIG. 11B .
- the electronic device determines whether the electronic book service ends.
- If the electronic device determines that the electronic book service does not end at operation 1011, then the electronic device proceeds to operation 1003 at which the electronic device displays the electronic book content on the display unit 170.
- the electronic device may also proceed to operation 1007 at which the electronic device determines whether the user's emotional information is extracted.
- If the electronic device determines that the electronic book service ends at operation 1011, then the electronic device proceeds to operation 1013 at which the electronic device stores electronic book content to which the emotional information has been added.
- the electronic device may add an emotional tag corresponding to the user's emotional information to the electronic book content and store the same.
- the electronic device may generate metadata including the user's emotional information obtained from the electronic book content and store the same together with the electronic book content.
- The metadata including the user's emotional information also includes information of the position at which the user's emotional information has been extracted from the electronic book content.
- the electronic device may add the user's emotional information to the electronic book content. If the electronic book content is intended for learning, the electronic device may control a learning level of difficulty with consideration of the user's emotional information for the electronic book content.
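One way the learning-difficulty control mentioned above might work is sketched below. The emotion-to-adjustment mapping (frustration lowers difficulty, boredom raises it) and the level bounds are assumptions for illustration; the patent does not specify the control policy.

```python
# Hypothetical sketch of controlling a learning level of difficulty from the
# emotional information added to electronic book content. The mapping from
# emotions to level adjustments is an assumption, not the patent's method.
ADJUSTMENT = {"frustration": -1, "boredom": +1, "happiness": 0}

def adjust_difficulty(current_level, page_emotions, min_level=1, max_level=5):
    """Apply one adjustment per extracted emotion, clamped to the bounds."""
    for emotion in page_emotions:
        current_level += ADJUSTMENT.get(emotion, 0)
    return max(min_level, min(max_level, current_level))

print(adjust_difficulty(3, ["frustration", "frustration"]))  # 1
print(adjust_difficulty(3, ["boredom"]))  # 4
```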
- FIG. 12 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a shopping service is provided. For example, the electronic device determines whether a shopping icon is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the shopping service is being provided.
- If the electronic device determines that the shopping service is being provided at operation 1201, then the electronic device proceeds to operation 1203 at which the electronic device displays purchasable goods information on the display unit 170. Thereafter, the electronic device proceeds to operation 1211.
- the electronic device determines whether a shopping service ends.
- the electronic device may also proceed to operation 1205 at which the electronic device drives the second camera unit 150 .
- the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 for displaying the goods information in order to obtain a user's image while displaying the purchasable goods information. Thereafter, the electronic device proceeds to operation 1207 .
- the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150 . For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150 . After estimating movements of a plurality of facial muscles for estimating emotional information from the user's facial image, the electronic device extracts the user's emotion with consideration of movement information for the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for the relevant goods information.
- If the electronic device determines that the user's emotional information has not been extracted at operation 1207, then the electronic device proceeds to operation 1211 at which the electronic device determines whether the shopping service ends.
- If the electronic device determines that the user's emotional information has been extracted at operation 1207, then the electronic device proceeds to operation 1209 at which the electronic device adds the user's emotional information to the goods information and displays the same.
- the electronic device may add the user's emotional information to the goods information from which the user's emotional information has been extracted among a list of purchasable goods, and display the same.
- the electronic device may add the user's emotional information to goods detail information of goods from which the user's emotional information has been extracted among one or more purchasable goods, and display the same.
- the electronic device determines whether the shopping service ends.
- If the electronic device determines that the shopping service does not end at operation 1211, then the electronic device proceeds to operation 1203 at which the electronic device displays purchasable goods information on the display unit 170.
- the electronic device may also proceed to operation 1207 at which the electronic device determines whether the user's emotional information is extracted.
- If the electronic device determines that the shopping service ends at operation 1211, then the electronic device proceeds to operation 1213 at which the electronic device stores the goods information to which the emotional information has been added.
- the electronic device may add an emotional tag corresponding to the user's emotional information to the goods information from which the user's emotional information has been extracted, and store the same.
- the electronic device may generate metadata including the user's emotional information obtained from the goods information, and store the same together with the goods information.
- FIG. 13 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a shopping service is provided. For example, the electronic device determines whether a shopping icon is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the shopping service is provided.
- If the electronic device determines that the shopping service is being provided at operation 1301, then the electronic device proceeds to operation 1303 at which the electronic device displays purchasable goods information on the display unit 170.
- the electronic device determines whether a goods purchase event occurs.
- the electronic device may continue to poll for an indication that the goods purchase event occurs, while the electronic device displays the purchasable goods information on the display unit 170 .
- the electronic device may proceed to operation 1307 at which the electronic device drives the second camera unit 150 .
- the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 for displaying goods information in order to obtain a user's image while displaying the purchasable goods information.
- the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150 . For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150 . After estimating movements of a plurality of facial muscles for estimating emotional information from the user's facial image, the electronic device extracts the user's emotion with consideration of movement information for the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for the relevant goods information.
- If the electronic device determines that the user's emotional information has not been extracted, then the electronic device ends the procedure for adding the user's emotional information to shopping information.
- If the electronic device determines that the user's emotional information has been extracted at operation 1309, then the electronic device proceeds to operation 1311 at which the electronic device adds the user's emotional information to the goods information and displays the same.
- the electronic device may add the user's emotional information to goods information from which the user's emotional information has been extracted among a purchasable goods list and display the same.
- the electronic device may add the user's emotional information to goods detail information of goods from which the user's emotional information has been extracted among one or more purchasable goods, and display the same. Thereafter, the electronic device proceeds to operation 1313 .
- the electronic device stores the goods information to which the emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the goods information from which the user's emotional information has been extracted, and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained from the goods information, and store the same together with the goods information.
- the electronic device may transmit the user's emotional information added to the goods information to a shopping server.
- the shopping server may display a user compliance rate for relevant goods with consideration of the user's emotional information for the specific goods collected from a plurality of electronic devices.
- the electronic device may use the users' emotional information for marketing of the relevant goods.
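Server-side aggregation of the kind described above might look like the following sketch. The data shapes (one emotion label per device per item) and the rate computation are assumptions for illustration; the patent does not define how the shopping server combines the collected emotional information.

```python
from collections import Counter

# Sketch (assumed data shapes) of how a shopping server might aggregate the
# emotional information collected from a plurality of electronic devices to
# derive a per-goods rate for a given emotion.
def emotion_rate(reports, emotion):
    """reports: list of emotion labels for one item, one report per device.
    Returns the fraction of reports matching the requested emotion."""
    counts = Counter(reports)
    total = sum(counts.values())
    return counts[emotion] / total if total else 0.0

reports = ["happiness", "happiness", "sadness", "happiness"]
print(emotion_rate(reports, "happiness"))  # 0.75
```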
- the electronic device may add the extracted user's emotional information to relevant content when using the content. Accordingly, the electronic device may manage at least one content stored in the data storage 112 with consideration of the user's emotional information for each content.
- FIG. 14 is a flowchart illustrating a procedure for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a photo display event occurs. For example, the electronic device determines whether selection of a photo display icon is detected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that a photo display event occurs.
- If the electronic device determines that the photo display event occurs at operation 1401, then the electronic device proceeds to operation 1403 at which the electronic device displays a photo list for at least one photo stored in the data storage 112 on the display unit 170.
- FIGS. 15A, 15B, 15C, and 15D are views illustrating screen configuration for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device displays the photo list for at least one photo stored in the data storage 112 on the display unit 170 as illustrated in FIG. 15A .
- the electronic device determines whether a sort event corresponding to emotional information occurs. For example, the electronic device determines whether emotional information (“emotion”) 1503 is selected as a sort condition 1501 for a photo as illustrated in FIG. 15A .
- If the electronic device determines that a sort event corresponding to emotional information does not occur at operation 1405, then the electronic device ends the procedure for displaying a photo with consideration of emotional information.
- If the electronic device determines that a sort event corresponding to emotional information does occur at operation 1405, then the electronic device proceeds to operation 1407 at which the electronic device determines the user's emotional information added to each photo stored in the data storage 112. For example, the electronic device determines the user's emotional information for each photo via an emotional tag tagged to a photo. As another example, the electronic device may determine the user's emotional information for each photo with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 1409.
- the electronic device sorts and displays at least one photo file stored in the data storage 112 depending on the user's emotional information. For example, as illustrated in FIG. 15B , the electronic device may group and display at least one photo file depending on the user's emotional information. After grouping and displaying at least one photo file depending on the user's emotional information, in the case in which selection of a “happiness” folder 1505 is detected as illustrated in FIG. 15B , the electronic device displays at least one file list including emotion of “happiness” on the display unit 170 as illustrated in FIG. 15C . In addition, when selection of a specific photo file 1507 is detected in a file list, the electronic device may display the selected photo file on the display unit 170 as illustrated in FIG. 15D .
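The grouping step described above (photos sorted into per-emotion folders such as "happiness") can be sketched as follows; file names and tags are illustrative, and `None` stands for a photo with no emotional tag.

```python
from collections import defaultdict

# Sketch of the sort-by-emotion display: photo files carrying an emotional
# tag are grouped into per-emotion folders. A photo without a tag is left
# out of the grouped view.
def group_by_emotion(photos):
    """photos: list of (filename, emotion_tag) pairs; tag may be None."""
    groups = defaultdict(list)
    for name, tag in photos:
        if tag is not None:
            groups[tag].append(name)
    return dict(groups)

photos = [("a.jpg", "happiness"), ("b.jpg", "sadness"),
          ("c.jpg", "happiness"), ("d.jpg", None)]
print(group_by_emotion(photos))
```

Selecting the "happiness" folder would then display the list `["a.jpg", "c.jpg"]`, analogous to FIG. 15C.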
- the electronic device may display a thumbnail of a moving picture together with a photo list depending on the user's emotional information as illustrated in FIG. 15C .
- the electronic device may reproduce the moving picture for the thumbnail as illustrated in FIGS. 19B and 19C .
- the electronic device may reproduce the moving picture from a point displayed by the thumbnail.
- the electronic device may determine emotional information for each photo to sort and display photos depending on the emotional information.
- the electronic device determines the user's emotional information added to each photo stored in the data storage 112 at operation 1407 . Thereafter, at operation 1409 , the electronic device sorts and displays one or more photo files stored in the data storage 112 depending on the user's emotional information.
- the electronic device may display an emotion display icon on a photo to which emotional information has been added. Specifically, when a photo display event occurs, the electronic device determines whether the emotional information display menu has been set. If the emotional information display menu has been set, the electronic device determines the user's emotional information added to each photo stored in the data storage 112 . After determining the user's emotional information added to each photo, the electronic device may display an emotion display icon on a photo to which emotional information has been added.
- FIG. 16 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a moving picture reproduction service is provided. For example, the electronic device determines whether one of one or more moving picture files stored in the data storage 112 is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the moving picture reproduction service is being provided.
- If the electronic device determines that the moving picture reproduction service is being provided at operation 1601, then the electronic device proceeds to operation 1603 at which the electronic device reproduces the moving picture selected for the moving picture reproduction service. For example, the electronic device reproduces the moving picture and displays the same on the display unit 170 as illustrated in FIG. 18A .
- the electronic device determines whether an emotional information display event occurs. For example, the electronic device determines whether selection of the emotional information display menu is detected depending on touch information provided via the input unit 180 .
- the electronic device may end the procedure for displaying moving picture information with consideration of emotional information. For example, the electronic device constantly reproduces the moving picture selected for the moving picture reproduction service.
- If the electronic device determines that the emotional information display event occurs at operation 1605, then the electronic device proceeds to operation 1607 at which the electronic device determines the user's emotional information added to the moving picture. For example, the electronic device may determine the user's emotional information added to a moving picture via an emotional tag tagged to a frame forming the moving picture. As another example, the electronic device may determine the user's emotional information for the moving picture with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 1609.
- the electronic device displays the user's emotional information when reproducing the moving picture. For example, as illustrated in FIG. 18B , the electronic device displays the user's emotional information 1801 and 1803 at a point of extracting the user's emotion from the time search bar of the moving picture being reproduced. Additionally, as illustrated in FIG. 18C , if the user of the electronic device controls a reproduction point of the moving picture using the time search bar, the electronic device may display the user's emotional information 1805 at a point of extracting the user's emotion. Thereafter, the electronic device proceeds to operation 1611 .
- the electronic device determines whether selection of emotional information displayed on the time search bar is detected.
- the electronic device may continue to poll for an indication that selection of emotional information displayed on the time search bar is detected.
- the electronic device may proceed to operation 1613 at which the electronic device changes a reproduction point of the moving picture to the point at which the emotional information selected in operation 1611 has been extracted. For example, if selection of emotional information “depression” 1801 is detected from the emotional information illustrated in FIG. 18B , the electronic device changes the reproduction point of the moving picture to the point of extracting the emotional information “depression” 1801 .
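The jump described above (selecting the "depression" marker moves reproduction to the point where that emotion was extracted) can be sketched as follows; the marker list and times are illustrative.

```python
# Sketch of changing the reproduction point: each emotion marker on the
# time search bar stores the time at which that emotion was extracted, and
# selecting a marker changes the reproduction point to that time.
def seek_to_emotion(markers, selected_emotion):
    """markers: chronological list of (time_seconds, emotion) pairs.
    Return the reproduction time of the first marker whose emotion matches
    the selection, or None if no such marker exists."""
    for time, emotion in markers:
        if emotion == selected_emotion:
            return time
    return None

markers = [(72.0, "depression"), (130.5, "happiness")]
print(seek_to_emotion(markers, "depression"))  # 72.0
```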
- the electronic device may display the user's emotional information added to the moving picture on the display unit 170 .
- the electronic device may display the user's emotional information added to the moving picture from a moving picture reproduction point. In this case, the electronic device determines the emotional information added to the moving picture for reproduction before reproducing the moving picture.
- FIG. 17 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to another embodiment of the present disclosure.
- the electronic device determines whether to provide a moving picture reproduction service. For example, the electronic device determines whether an icon of a moving picture reproduction application is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that moving picture reproduction service is being provided.
- If the electronic device determines that the moving picture reproduction service is being provided at operation 1701, then the electronic device proceeds to operation 1703 at which the electronic device displays a moving picture file list for at least one moving picture file stored in the data storage 112 on the display unit 170. Thereafter, the electronic device proceeds to operation 1705.
- the electronic device determines whether selection of a first moving picture file which is one of moving picture files in the moving picture file list is detected.
- the electronic device may continue to poll for an indication that selection of the first moving picture file is detected.
- the electronic device may proceed to operation 1707 at which the electronic device determines the user's emotional information included in the first moving picture file. For example, the electronic device determines the user's emotional information added to the first moving picture file via an emotional tag tagged to a frame forming the first moving picture file. As another example, the electronic device may determine the user's emotional information for the first moving picture file with consideration of metadata for emotional information stored in the data storage 112 . Thereafter, the electronic device proceeds to operation 1709 .
- the electronic device displays a thumbnail for the user's emotional information added to the first moving picture file on the display unit 170 .
- FIGS. 19A, 19B, and 19C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device displays a thumbnail of a point of extracting the user's emotional information from the first moving picture file on the display unit 170 .
- the electronic device determines whether one of one or more thumbnails representing the user's emotional information for the first moving picture is selected. For example, as illustrated in FIG. 19A , the electronic device determines whether selection of a first thumbnail 1901 is detected among thumbnails for the first moving picture.
- the electronic device may continue to poll for an indication that one of one or more thumbnails representing the user's emotional information for the first moving picture is selected.
- If the electronic device determines that selection of one of one or more thumbnails representing the user's emotional information for the first moving picture is detected at operation 1711, then the electronic device proceeds to operation 1713 at which the electronic device reproduces the moving picture from the point from which emotional information of the thumbnail has been extracted.
- the electronic device reproduces the moving picture from a point from which emotional information of the thumbnail has been extracted.
- the electronic device may display the thumbnail information for the first moving picture or thumbnail information including the same emotional information as the thumbnail selected at operation 1711 on a partial region 1903 of the display unit 170 as illustrated in FIG. 19B .
- the electronic device may display emotional information 1905 and 1907 for the first moving picture on the time search bar of the first moving picture.
- the electronic device may display a thumbnail for each emotional information added to the first moving picture as illustrated in FIG. 19A .
- the electronic device may display only a thumbnail of a point at which emotional information has changed inside the first moving picture.
- FIG. 20 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether an electronic book service is provided. For example, the electronic device determines whether one of electronic book content stored in the data storage 112 is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the electronic book service is being provided.
- If the electronic device determines that the electronic book service is being provided at operation 2001, then the electronic device proceeds to operation 2003 at which the electronic device displays electronic book content selected for the electronic book service on the display unit 170.
- the electronic device determines whether an emotional information display event occurs. For example, the electronic device determines whether selection of the emotional information display menu is detected depending on touch information provided via the input unit 180 .
- If the electronic device determines that the emotional information display event does not occur at operation 2005, then the electronic device proceeds to operation 2003 at which the electronic device constantly displays electronic book content selected for the electronic book service on the display unit 170.
- If the electronic device determines that the emotional information display event occurs at operation 2005, then the electronic device proceeds to operation 2007 at which the electronic device determines the user's emotional information added to the electronic book content.
- the electronic device may determine the user's emotional information added to the electronic book content via an emotional tag tagged to the electronic book content.
- the electronic device may determine the user's emotional information added to the electronic book content with consideration of metadata for emotional information stored in the data storage 112 . Thereafter, the electronic device proceeds to operation 2009 .
- the electronic device displays the user's emotional information including position information where the emotional information has been extracted from the electronic book content on the display unit 170 .
- the electronic device displays a structure window 2201 for the emotional information added to the electronic book content on the display unit 170 as illustrated in FIG. 22A .
- the structure window 2201 classifies and displays an extraction position of the user's emotional information for each emotion kind.
- the electronic device determines whether selection of the emotional information displayed on the structure window 2201 is detected.
- the electronic device may continue to poll for an indication that one of one or more emotional information is selected.
- If the electronic device determines that selection of one of one or more emotional information displayed on the structure window 2201 is detected at operation 2011, then the electronic device proceeds to operation 2013 at which the electronic device changes the display region of the electronic book content to the position from which the selected emotional information has been extracted.
- the electronic device may display the emotional information added to the electronic book content using the emotional information structure window 2201 .
- the electronic device may display the user's emotional information inside the electronic book content as illustrated in FIG. 21 .
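- The structure-window flow of FIG. 20 — grouping extraction positions by emotion kind (operations 2007 to 2009) and jumping to a selected position (operations 2011 to 2013) — might look like the following sketch. All function and variable names are invented for illustration; the patent specifies behavior, not code.

```python
from collections import defaultdict

def build_structure_window(emotion_tags):
    """Group (position, emotion kind) tags by emotion kind, as the
    structure window 2201 classifies extraction positions per emotion."""
    window = defaultdict(list)
    for position, emotion in emotion_tags:
        window[emotion].append(position)
    return dict(window)

def select_entry(window, emotion, index):
    """Return the content position for a selected structure-window entry,
    i.e. where the display region would be moved to."""
    return window[emotion][index]

tags = [(12, "joy"), (48, "sadness"), (95, "joy")]
window = build_structure_window(tags)
# window groups positions 12 and 95 under "joy" and 48 under "sadness";
# selecting the second "joy" entry moves the display region to position 95.
```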
- FIG. 21 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether an electronic book service is provided. For example, the electronic device determines whether one of one or more electronic book content stored in the data storage 112 is selected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the electronic book service is provided.
- If the electronic device determines that the electronic book service is being provided at operation 2101, then the electronic device proceeds to operation 2103 at which the electronic device determines whether the emotion display menu has been set.
- If the electronic device determines that the emotion display menu has not been set at operation 2103, then the electronic device proceeds to operation 2109 at which the electronic device displays the electronic book content selected for the electronic book service on the display unit 170 .
- If the electronic device determines that the emotion display menu has been set at operation 2103, then the electronic device proceeds to operation 2105 at which the electronic device determines the user's emotional information added to the electronic book content.
- the electronic device may determine the user's emotional information added to the electronic book content via an emotional tag tagged to the electronic book content.
- the electronic device may determine the user's emotional information added to the electronic book content with consideration of metadata for emotional information stored in the data storage 112 . Thereafter, the electronic device proceeds to operation 2107 .
- the electronic device displays the electronic book content together with emotional information extracted from the electronic book content on the display unit 170 .
- the electronic device displays the user's emotional information at a position at which the emotional information has been extracted in the electronic book content.
- the electronic device may display an emotional icon 2211 depending on the user's emotional information on a page that has extracted the user's emotional information in the electronic book content.
- As illustrated in FIG. 22B, the electronic device may display an emotional icon 2213 corresponding to the user's emotional information on a paragraph from which the user's emotional information has been extracted in the electronic book content.
- As another example, the electronic device may mark a shade 2215 corresponding to the user's emotional information on a paragraph where the user's emotional information has been extracted in the electronic book content. At this point, the electronic device may determine at least one of the shape, the color, and the opacity of the shade depending on the user's emotional information. As another example, the electronic device may draw an underline 2217 on a paragraph from which the user's emotional information has been extracted in the electronic book content depending on the user's emotional information as illustrated in FIG. 22B. At this point, the electronic device may determine at least one of the shape, the color, and the thickness of the underline depending on the user's emotional information. As another example, the electronic device may mark a parenthesis 2219 on a paragraph where the user's emotional information has been extracted in the electronic book content depending on the user's emotional information. At this point, the electronic device may determine the shape of the parenthesis depending on the user's emotional information.
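- The inline markers just described (icon, shade, underline, parentheses, each styled by emotion kind) can be sketched as a rendering rule table. The concrete styles below are assumptions for illustration; the patent does not assign particular colors or shapes to particular emotions.

```python
# Hypothetical FIG. 22B-style rendering rules: each emotion kind maps to a
# marker type and style attributes (shape/color/opacity/thickness).
MARKER_STYLES = {
    "joy":     {"marker": "icon",      "color": "yellow"},
    "sadness": {"marker": "shade",     "color": "blue", "opacity": 0.3},
    "anger":   {"marker": "underline", "color": "red",  "thickness": 2},
}

def annotate(paragraphs, emotion_tags):
    """Attach a marker style to each paragraph whose index carries an
    extracted emotion; paragraphs without a tag get no marker."""
    annotated = []
    for i, text in enumerate(paragraphs):
        style = MARKER_STYLES.get(emotion_tags.get(i))
        annotated.append({"text": text, "marker": style})
    return annotated

result = annotate(["Once upon a time...", "The hero fell."], {1: "sadness"})
# Only the second paragraph receives a marker (a blue shade).
```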
- FIG. 23 is a flowchart illustrating a procedure for displaying shopping information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure.
- the electronic device determines whether a shopping service is provided. For example, the electronic device determines whether selection of an icon for a shopping application is detected depending on touch information provided via the input unit 180 .
- the electronic device may continue to poll for an indication that the shopping service is provided.
- If the electronic device determines that the shopping service is being provided at operation 2301, then the electronic device proceeds to operation 2303 at which the electronic device determines whether the emotion display menu has been set.
- If the electronic device determines that the emotion display menu has not been set at operation 2303, then the electronic device may proceed to operation 2309 at which the electronic device displays a list of goods purchasable via the shopping service on the display unit 170 .
- If the electronic device determines that the emotion display menu has been set at operation 2303, then the electronic device proceeds to operation 2305 at which the electronic device determines the user's emotional information added to goods information.
- the electronic device may determine the user's emotional information added to goods information via an emotional tag tagged to each goods information.
- the electronic device may determine the user's emotional information added to each goods information with consideration of metadata for emotional information stored in the data storage 112 . Thereafter, the electronic device proceeds to operation 2307 .
- the electronic device displays the goods information including emotional information on the display unit 170 .
- the electronic device may display an emotional icon on goods information to which the user's emotional information has been added in a purchasable goods list.
- the electronic device may display the user's emotional information.
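- Operations 2305 to 2307 — overlaying an emotion icon on each goods entry that has emotional information attached — might be sketched as follows. The field names and the icon naming scheme are assumptions for the example.

```python
# Hypothetical sketch: decorate a purchasable goods list with emotion icons.
def decorate_goods_list(goods, emotion_tags):
    """goods: list of dicts with an 'id' key; emotion_tags: id -> emotion
    kind. Entries without emotional information get no icon."""
    for item in goods:
        emotion = emotion_tags.get(item["id"])
        item["emotion_icon"] = f"icon_{emotion}" if emotion else None
    return goods

goods = [{"id": "g1", "name": "headphones"}, {"id": "g2", "name": "mug"}]
decorate_goods_list(goods, {"g2": "joy"})
# Only "mug" carries an icon afterwards.
```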
- the electronic device may extract the user's emotion from image information of the user obtained via a camera.
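- The facial-image path — estimating facial-muscle movements and accepting an emotion only when an emotion estimation value exceeds a reference emotion value, as described for the emotion extract program 114 — might look like this sketch. The muscle names, weights, and threshold are invented for illustration; the patent specifies no formula.

```python
# Hypothetical sketch of the emotion extract program 114's decision rule.
REFERENCE_EMOTION_VALUE = 0.6  # assumed threshold, not a value from the patent

def estimate_emotion(muscle_movements, weights):
    """Combine per-muscle movement measurements into one estimation value.
    muscle_movements/weights: dicts keyed by facial-muscle name."""
    return sum(weights.get(m, 0.0) * v for m, v in muscle_movements.items())

def extract_emotion(muscle_movements, weights, emotion):
    """Return the emotion kind only if the estimate exceeds the reference."""
    value = estimate_emotion(muscle_movements, weights)
    return emotion if value > REFERENCE_EMOTION_VALUE else None

movements = {"zygomaticus": 0.8, "orbicularis_oculi": 0.5}
weights   = {"zygomaticus": 0.6, "orbicularis_oculi": 0.4}
# 0.8*0.6 + 0.5*0.4 = 0.68 > 0.6, so "joy" would be extracted here.
```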
- the electronic device may measure a stimulus degree of the user's sympathetic nerve and parasympathetic nerve to estimate the user's emotion.
- the electronic device may further include a skin electricity measurement sensor for measuring the user's skin electricity in addition to the construction of the electronic device 100 illustrated in FIG. 1 .
- the electronic device may measure stimulus information of the sympathetic nerve with consideration of the user's skin electricity measured by the skin electricity measurement sensor.
- the electronic device measures the user's skin electricity using the skin electricity measurement sensor while driving a music application.
- the emotion extract program 114 of the electronic device may estimate the user's emotional information with consideration of the user's skin electricity measured by the skin electricity measurement sensor.
- the emotion extract program 114 may recognize that the electronic device has extracted the user's emotion for music content. Additionally, the electronic device may add relevant emotional information at a point of extracting the user's emotion while reproducing music content.
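- The skin-electricity path just described — sampling the sensor while music plays and tagging the track at the playback times where the emotion estimate crosses a reference value — can be sketched as follows. The sample format and threshold are assumptions for illustration.

```python
# Hypothetical sketch: find the playback times at which an emotion tag
# would be added to the reproduced music content.
REFERENCE_EMOTION_VALUE = 0.7  # assumed threshold

def extract_emotion_events(samples, reference=REFERENCE_EMOTION_VALUE):
    """samples: list of (playback_time_sec, emotion_estimate) pairs derived
    from skin-electricity measurements. Returns tagging times."""
    return [t for t, estimate in samples if estimate > reference]

samples = [(10, 0.2), (42, 0.9), (80, 0.75)]
extract_emotion_events(samples)  # -> [42, 80]
```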
- the electronic device may estimate the user's emotion with consideration of the user's skin temperature change.
- the electronic device may further include a skin temperature measurement sensor for measuring the user's skin temperature in addition to the construction of the electronic device 100 illustrated in FIG. 1 .
- the electronic device may extract the user's emotion with consideration of the user's skin temperature change measured by the skin temperature measurement sensor.
- the electronic device may estimate the user's emotion with consideration of the user's movement pattern measured by a motion sensor such as an acceleration sensor, a gravity sensor, and the like.
- the electronic device may further include a motion sensor in addition to the construction of the electronic device 100 illustrated in FIG. 1 .
- As described above, the electronic device may provide a service corresponding to the user's emotion by estimating the user's emotion as content is used and adding that emotion to the relevant content, and may retrieve, classify, and reproduce at least one content depending on the user's emotion associated with content use.
- Any such software may be stored in a non-transitory computer readable storage medium.
- the non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
- Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like.
- the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.
Description
- This application is a continuation application of prior application Ser. No. 14/147,842, filed on Jan. 6, 2014, which has issued as U.S. Pat. No. 9,807,298 on Oct. 31, 2017 and claimed the benefit under 35 U.S.C. §119(a) of a Korean patent application filed in the Korean Intellectual Property Office on Jan. 4, 2013 and assigned Serial number 10-2013-0001087, the entire disclosure of which is hereby incorporated by reference.
- The present disclosure relates to an apparatus and a method for providing a user's emotional information in an electronic device.
- As an electronic device provides a multimedia service, a user of the electronic device increasingly requires various services via the electronic device. Accordingly, the electronic device may provide a service using emotional information included in content as a way for meeting the user's various needs. For example, a portable electronic device may provide a service using emotion regarding an object included in a photo.
- When a service using emotional information included in content is provided, the electronic device may estimate emotional information included in the content such as, for example, a photo. However, when an object such as a person or an animal from which a user may estimate emotion does not exist in the content (e.g., a photo), the electronic device cannot estimate emotion for the relevant content (e.g., the relevant photo). In addition, because the electronic device cannot estimate the emotion of a user who has taken a picture from a photo, the electronic device has a limitation in providing a service using emotion. Accordingly, a method for determining emotion of a user who uses content and adding the emotion to content in an electronic device is required.
- The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
- Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an apparatus and a method for adding emotional information of a user who uses content to content.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a second camera to an image shot via a first camera in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a second camera to a moving picture shot via a first camera in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a camera to a moving picture being reproduced when an electronic device reproduces the moving picture.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a camera to electronic book content when an electronic device provides an electronic book service.
- Another aspect of the present disclosure is to provide an apparatus and a method for adding a user's emotional information recognized via a camera to purchasable goods information in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for retrieving content using emotional information added to content in an electronic device.
- Another aspect of the present disclosure is to provide an apparatus and a method for classifying content using emotional information added to content in an electronic device.
- In accordance with an aspect of the present disclosure, a method for providing emotional information in an electronic device is provided. The method includes displaying at least one content, extracting emotional information from an image obtained via a camera, and adding the emotional information to the content.
- In accordance with another aspect of the present disclosure, an electronic device is provided. The electronic device includes at least one camera, a display unit, and at least one processor, wherein the at least one processor operatively displays at least one content on the display unit, extracts emotional information from an image obtained via at least one of the at least one camera, and adds the emotional information to the content.
- In accordance with another aspect of the present disclosure, a method for capturing emotional information in an electronic device is provided. The method includes capturing an image of at least one user via a camera while the electronic device is providing a service, extracting emotional information of the at least one user, and associating the emotional information of the at least one user with the service being provided when the image of the at least one user is captured.
- Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
- The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure; -
FIG. 2 is a detailed block diagram illustrating a processor according to an embodiment of the present disclosure; -
FIG. 3 is a flowchart illustrating a procedure for adding a user's emotional information to content in an electronic device according to an embodiment of the present disclosure; -
FIG. 4 is a flowchart illustrating a procedure for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure; -
FIGS. 5A, 5B, 5C, and 5D are views illustrating screen configuration for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure; -
FIG. 6 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure; -
FIG. 7 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure; -
FIG. 8 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure; -
FIGS. 9A, 9B, and 9C are views illustrating screen configuration for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure; -
FIG. 10 is a flowchart illustrating a procedure for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure; -
FIGS. 11A and 11B are views illustrating screen configuration for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure; -
FIG. 12 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure; -
FIG. 13 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure; -
FIG. 14 is a flowchart illustrating a procedure for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIGS. 15A, 15B, 15C, and 15D are views illustrating screen configuration for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIG. 16 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIG. 17 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIGS. 18A, 18B, and 18C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIGS. 19A, 19B, and 19C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIG. 20 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIG. 21 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; -
FIGS. 22A and 22B are views illustrating screen configuration for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure; and -
FIG. 23 is a flowchart illustrating a procedure for displaying shopping information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Throughout the drawings, like reference numerals will be understood to refer to like parts, components and structures.
- The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
- The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure are provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
- Hereinafter, a technology for adding a user's emotional information to content in an electronic device is described.
- According to various embodiments of the present disclosure, the electronic device includes a mobile communication terminal, a Personal Digital Assistant (PDA), a Personal Computer (PC), a laptop computer, a tablet computer, a smartphone, a video phone, an e-book reader, a netbook, a Mobile Internet Device (MID), an Ultra Mobile PC (UMPC), a tablet PC, a navigation device, a Portable Multimedia Player (PMP), an MP3 player having a camera, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.
- According to various embodiments of the present disclosure, an electronic device may be a smart home appliance with communication functionality. A smart home appliance may be, for example, a television (TV) (e.g., a smart TV), a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.
- According to various embodiments of the present disclosure, an electronic device may be a medical device (e.g., Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.
- According to various embodiments of the present disclosure, an electronic device may be furniture, part of a building/structure, an electronic board, electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas or electro-magnetic wave measuring devices), and/or the like that include communication functionality.
- According to various embodiments of the present disclosure, an electronic device may be any combination of the foregoing devices. In addition, it will be apparent to one having ordinary skill in the art that an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.
-
FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 1, the electronic device 100 includes a memory 110, a processor unit 120, an audio processor 130, a first camera unit 140, a second camera unit 150, an Input/Output (I/O) controller 160, a display unit 170, and an input unit 180. According to various embodiments of the present disclosure, a plurality of memories 110 may exist. - The
memory 110 includes a program storage 111 for storing a program for controlling an operation of the electronic device 100, and a data storage 112 for storing data generated during execution of a program. - The
data storage 112 stores a user's emotional information for content. For example, the data storage 112 may store content to which an emotional tag for user emotional information extracted via an emotion extract program 114 has been added. For another example, the data storage 112 may store at least one content and metadata including emotional information of each content. For example, the data storage 112 may store at least one content and an emotional information table including emotional information of each content. - The
program storage 111 includes a Graphic User Interface (GUI) program 113, the emotion extract program 114, a file management program 115, and at least one application 116. According to various embodiments of the present disclosure, a program included in the program storage 111 is a set of instructions and may be expressed as an instruction set. -
FIGS. 18A, 18B, and 18C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - The
GUI program 113 includes at least one software element for providing a user interface on the display unit 170 using graphics. For example, the GUI program 113 may control to display information of an application driven by a processor 122 on the display unit 170. In the case in which an emotion display menu has been set, the GUI program 113 may control to display a user's emotional information for content displayed on the display unit 170. - Referring to
FIGS. 18A, 18B, and 18C, according to various embodiments of the present disclosure, when reproducing a moving picture, if an emotion display menu has been set, the GUI program 113 may control to display emotional information as illustrated in FIGS. 18B and 18C. -
FIGS. 22A and 22B are views illustrating screen configuration for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 22A and 22B, according to various embodiments of the present disclosure, if the emotion display menu has been set when providing an electronic book service, the GUI program 113 may control to additionally display emotional information on the display unit 170 as illustrated in FIG. 22B. -
GUI program 113 may control to additionally display emotional information on goods information. - The
emotion extract program 114 includes at least one software element for extracting a user's emotional information. For example, the emotion extract program 114 estimates movements of a plurality of facial muscles for estimating emotional information from a user's facial image obtained via the first camera unit 140 and the second camera unit 150. After estimating movements of a plurality of facial muscles for estimating emotional information from a user's facial image, the emotion extract program 114 extracts the user's emotion with consideration of movement information for the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the emotion extract program 114 may recognize that the emotion extract program 114 has extracted the user's emotion for relevant content. - The
emotion extract program 114 may selectively activate the first camera unit 140 or the second camera unit 150 in order to obtain a user image. For example, the emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of capturing an image via the first camera unit 140. At this point, the emotion extract program 114 may activate the second camera unit 150 when taking a photograph via the first camera unit 140. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using the user image obtained via the second camera unit 150 while capturing a moving picture via the first camera unit 140. At this point, the emotion extract program 114 may activate the second camera unit 150 when capturing a moving picture via the first camera unit 140. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 while reproducing a moving picture. At this point, the emotion extract program 114 may activate the second camera unit 150 when reproducing the moving picture. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 while providing an electronic book service. At this point, the emotion extract program 114 may activate the second camera unit 150 when providing the electronic book service. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of determining purchasable goods information. At this point, the emotion extract program 114 may activate the second camera unit 150 when providing a shopping service. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of purchasing goods. At this point, the emotion extract program 114 may activate the second camera unit 150 when providing a shopping service. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of reading and/or viewing a communication (e.g., an email, a text message, an instant message, and/or the like). At this point, the emotion extract program 114 may activate the second camera unit 150 when a communication is being displayed and/or read. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of providing an on-line dating service. At this point, the emotion extract program 114 may activate the second camera unit 150 when a communication is being displayed and/or read, when a profile is being viewed, and/or the like. - According to various embodiments of the present disclosure, the
emotion extract program 114 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of providing a Social Networking Service (SNS). At this point, the emotion extract program 114 may activate the second camera unit 150 when a communication is being displayed and/or read, when a profile is being viewed, when a status is being updated, when a status is being viewed and/or read, and/or the like. - The
file management program 115 includes at least one software element for retrieving, classifying, and reproducing each content using emotional information for content stored in thedata storage 112. -
FIGS. 15A, 15B, 15C, and 15D are views illustrating screen configuration for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 15A, 15B, 15C, and 15D, according to various embodiments of the present disclosure, the file management program 115 may control to classify and display photo content stored in the data storage 112 depending on emotional information, as illustrated in FIGS. 15B and 15C. -
FIGS. 19A, 19B, and 19C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 19A, 19B, and 19C, according to various embodiments of the present disclosure, the file management program 115 may control to display a thumbnail of a moving picture depending on emotional information included in the moving picture, as illustrated in FIG. 19A. -
FIGS. 22A and 22B are views illustrating screen configuration for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 22A and 22B, according to various embodiments of the present disclosure, the file management program 115 may control to classify and display emotional information included in electronic book content using a structure window 2201, as illustrated in FIG. 22A. - The
application 116 includes a software element for at least one application installed in the electronic device 100. - The
processor unit 120 includes a memory interface 121, at least one processor 122, and a peripheral interface 124. According to various embodiments of the present disclosure, the memory interface 121, the at least one processor 122, and the peripheral interface 124 included in the processor unit 120 may be integrated in at least one integrated circuit, or implemented as separate elements. - The
memory interface 121 controls a memory access of an element such as the processor 122 or the peripheral interface 124. - The
peripheral interface 124 controls connection between an I/O peripheral of the electronic device 100, and the processor 122 and the memory interface 121. - The
processor 122 controls the electronic device 100 to provide various multimedia services using at least one software program. The processor 122 executes at least one program stored in the memory 110 to provide a service corresponding to a relevant program. For example, the processor 122 may execute the emotion extract program 114 to add extracted user's emotional information to relevant content. As another example, the processor 122 may execute the file management program 115 to manage content depending on the user's emotional information. - The
audio processor 130 provides an audio interface between the user and the electronic device 100 via a speaker 131 and a microphone 132. - The
first camera unit 140 is positioned on the rear side of the electronic device 100 to provide a collected image to the processor unit 120 by capturing an object, and the second camera unit 150 is positioned on the front side of the electronic device 100 to provide a collected image to the processor unit 120 by capturing an object. For example, the first camera unit 140 and the second camera unit 150 may include a camera sensor for converting an optical signal to an electric signal, an image processor for converting an analog image signal to a digital image signal, and a signal processor for image-processing an image signal output from the image processor so that the image signal may be displayed on the display unit 170. According to various embodiments of the present disclosure, the camera sensor may be a Charge-Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS) sensor, or the like. The signal processor may be implemented as a Digital Signal Processor (DSP). - The I/
O controller 160 provides an interface between an I/O unit, such as the display unit 170 and the input unit 180, and the peripheral interface 124. - The
display unit 170 displays status information of the electronic device 100, a character input by the user, a moving picture, a still picture, and the like. For example, the display unit 170 displays information of an application driven by the processor 122. If an emotion display menu has been set, the display unit 170 may additionally display the user's emotional information for content displayed on the display unit 170 under control of the GUI program 113. For example, in the case in which the emotion display menu has been set when reproducing a moving picture, the display unit 170 may display emotional information as illustrated in FIGS. 18B and 18C. As another example, if the emotion display menu has been set when providing an electronic book service, the display unit 170 may additionally display emotional information on electronic book content displayed on the display unit 170, as illustrated in FIG. 22B. As another example, if the emotion display menu has been set when providing a shopping service, the display unit 170 may additionally display emotional information on goods information. - The
input unit 180 provides input data generated by the user's selection to the processor unit 120 via the I/O controller 160. At this point, the input unit 180 includes a keypad including at least one hardware button and a touchpad for detecting touch information. For example, the input unit 180 provides touch information detected via the touchpad to the processor 122 via the I/O controller 160. - Additionally, the
electronic device 100 may include a communication system for performing a communication function for voice communication and data communication. The communication system may be divided into a plurality of sub-modules supporting different communication networks. For example, though not limited thereto, the communication network includes a Global System for Mobile Communications (GSM) network, an Enhanced Data GSM Environment (EDGE) network, a Code Division Multiple Access (CDMA) network, a W-Code Division Multiple Access (W-CDMA) network, a Long Term Evolution (LTE) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a wireless Local Area Network (LAN), a Bluetooth network, Near Field Communications (NFC), and/or the like. -
FIG. 2 is a detailed block diagram illustrating a processor according to an embodiment of the present disclosure. - Referring to
FIG. 2, the processor 122 includes an application driver 200, an emotion extractor 210, a file manager 220, and a display controller 230. - The
application driver 200 executes at least one application 116 stored in the program storage 111 to provide a service corresponding to the relevant program. For example, the application driver 200 may execute an application stored in the program storage 111 to reproduce a moving picture. As another example, the application driver 200 may execute an application stored in the program storage 111 to capture a photo or a moving picture using the first camera unit 140 or the second camera unit 150. As another example, the application driver 200 may execute an application stored in the program storage 111 to provide an electronic book service. As another example, the application driver 200 may execute an application stored in the program storage 111 to provide a shopping service. - The
emotion extractor 210 executes the emotion extract program 114 stored in the program storage 111 to extract a user's emotion. For example, the emotion extractor 210 estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the first camera unit 140 or the second camera unit 150. After estimating movements of the facial muscles, the emotion extractor 210 extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the emotion extractor 210 may recognize that it has extracted the user's emotion for the relevant content. - According to various embodiments of the present disclosure, the
emotion extractor 210 may selectively activate the first camera unit 140 or the second camera unit 150 for obtaining the user's image. For example, when taking a photograph using the first camera unit 140, the emotion extractor 210 activates the second camera unit 150. After activating the second camera unit 150, the emotion extractor 210 extracts the user's emotional information using a user image obtained via the second camera unit 150 at a point of capturing an image via the first camera unit 140. - As another example, when capturing a moving picture via the
first camera unit 140, the emotion extractor 210 activates the second camera unit 150. After activating the second camera unit 150, the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 while capturing a moving picture using the first camera unit 140. - As another example, if a moving picture reproduction application is driven, the
emotion extractor 210 activates the second camera unit 150. After activating the second camera unit 150, the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 while reproducing a moving picture. - As another example, when an electronic book service is being provided, the
emotion extractor 210 activates the second camera unit 150. After activating the second camera unit 150, the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 while providing the electronic book service. - As another example, when a shopping service is being provided, the
emotion extractor 210 activates the second camera unit 150. After activating the second camera unit 150, the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of determining purchasable goods information. - As another example, when a shopping service is provided, the
emotion extractor 210 activates the second camera unit 150. After activating the second camera unit 150, the emotion extractor 210 may extract the user's emotional information using a user image obtained via the second camera unit 150 at a point of purchasing goods. - The
emotion extractor 210 may execute the emotion extract program 114 to transmit the extracted user's emotional information to the memory 110. The memory 110 may map the user's emotional information to relevant content and store the same. - The
file manager 220 may execute the file management program 115 stored in the program storage 111 to manage each content depending on emotional information of content. Specifically, the file manager 220 may retrieve, classify, and reproduce each content using emotional information of content stored in the data storage 112. For example, the file manager 220 may control to classify and display photo content stored in the data storage 112 according to emotional information, as illustrated in FIG. 15B or 15C. As another example, the file manager 220 may control to display a thumbnail of a moving picture depending on emotional information included in the moving picture, as illustrated in FIG. 19A. As another example, the file manager 220 may control to classify and display emotional information included in electronic book content using the structure window 2201, as illustrated in FIG. 22A. - The
display controller 230 controls to execute the GUI program 113 stored in the program storage 111 to display a user interface on the display unit 170 using graphics. For example, the display controller 230 controls to display information of an application driven by the application driver 200 on the display unit 170. If the emotion display menu has been set, the display controller 230 may control to display the user's emotional information for content displayed on the display unit 170. For example, if the emotion display menu has been set while reproducing a moving picture, the display controller 230 may control to display emotional information as illustrated in FIG. 18B or 18C. As another example, if the emotion display menu has been set when providing an electronic book service, the display controller 230 may control to additionally display emotional information on electronic book content displayed on the display unit 170, as illustrated in FIG. 22B. As another example, in the case in which the emotion display menu has been set when providing a shopping service, the display controller 230 may control to additionally display emotional information on goods information. - In the above various embodiments of the present disclosure, the
electronic device 100 may add the user's emotional information to content and manage the content depending on the user's emotion using the processor 122 including the emotion extractor 210 and the file manager 220. - According to various embodiments of the present disclosure, the
electronic device 100 may include a separate control module for adding the user's emotional information to content, and managing the content depending on the user's emotion. -
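The facial-muscle scoring described above (movement estimates combined into an emotion estimation value that must exceed a reference emotion value before an emotion is considered extracted) can be sketched as follows. This is an illustrative assumption only: the muscle names, per-emotion weights, and the reference value are hypothetical and do not come from the disclosure.

```python
# Hypothetical sketch of the emotion extractor 210's threshold test.
# Muscle names, weights, and the reference value are illustrative only.

REFERENCE_EMOTION_VALUE = 0.5  # assumed reference emotion value

# Assumed per-emotion weights applied to normalized movement estimates.
EMOTION_WEIGHTS = {
    "happiness": {"zygomaticus": 0.7, "orbicularis_oculi": 0.3},
    "sadness": {"corrugator": 0.6, "depressor_anguli": 0.4},
}

def estimate_emotion(muscle_movements):
    """Return (emotion, score) when the best weighted score exceeds the
    reference emotion value; return None when no emotion is extracted."""
    best = None
    for emotion, weights in EMOTION_WEIGHTS.items():
        score = sum(weights[m] * muscle_movements.get(m, 0.0)
                    for m in weights)
        if best is None or score > best[1]:
            best = (emotion, score)
    if best is not None and best[1] > REFERENCE_EMOTION_VALUE:
        return best
    return None

# A strong smile exceeds the threshold; a near-neutral face does not.
print(estimate_emotion({"zygomaticus": 0.9, "orbicularis_oculi": 0.8}))
print(estimate_emotion({"zygomaticus": 0.1}))
```

In this sketch, a `None` result corresponds to the case where the emotion estimation value never exceeds the reference value, so no emotional information is added to the content.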
FIG. 3 is a flowchart illustrating a procedure for adding a user's emotional information to content in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 3, at operation 301, the electronic device drives a first application to provide a service. For example, the electronic device may drive a camera application to provide a photographing service or a moving picture capturing service using the first camera unit 140 or the second camera unit 150. As another example, the electronic device may drive a moving picture reproduction application to reproduce a moving picture. As another example, the electronic device may drive an electronic book application to provide an electronic book service. As another example, the electronic device may drive a mobile shopping application to provide a mobile shopping service. - At
operation 303, the electronic device extracts emotional information of a user who uses a service corresponding to the first application using a user image provided via at least one camera. For example, when providing a photographing service or a moving picture capturing service via the first camera unit 140, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. As another example, when reproducing a moving picture, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. As another example, when providing an electronic book service, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. As another example, when providing a shopping service, the electronic device may activate the second camera unit 150 to extract emotional information from an obtained user image. - After extracting the user's emotional information at
operation 303, at operation 305, the electronic device adds the user's emotional information to content corresponding to the first application and stores the same. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to content corresponding to the first application and store the same. As another example, the electronic device may generate and store metadata including the user's emotional information for content corresponding to the first application. -
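The add-and-store step of operation 305 can be sketched as a minimal mapping from a content item to its extracted emotional information. The dictionary layout and names here are illustrative assumptions, not the disclosure's actual storage format for the data storage 112.

```python
import json

def add_emotion_metadata(content_id, emotion, store):
    """Map the user's emotional information to the relevant content and
    store it (operation 305). 'store' is a hypothetical stand-in for the
    data storage 112, keyed by content identifier."""
    meta = store.setdefault(content_id, {"emotions": []})
    meta["emotions"].append(emotion)
    return meta

store = {}
add_emotion_metadata("photo_001.jpg", "happiness", store)
print(json.dumps(store))  # → {"photo_001.jpg": {"emotions": ["happiness"]}}
```

A list is used for the emotions so that repeated extractions for the same content (for example, several points in one moving picture) can accumulate rather than overwrite each other.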
FIG. 4 is a flowchart illustrating a procedure for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 4, at operation 401, the electronic device determines whether an image capturing service using the first camera unit 140 is provided. For example, the electronic device determines whether an image capturing event using the first camera unit 140 occurs depending on touch information provided via the input unit 180. - If the electronic device determines that the image capturing service using the
first camera unit 140 is not being provided at operation 401, then the electronic device continues to poll for an indication that the image capturing service using the first camera unit 140 is provided. - If the electronic device determines that the image capturing service using the
first camera unit 140 is being provided at operation 401, then the electronic device proceeds to operation 403 at which the electronic device drives the second camera unit 150. -
FIGS. 5A, 5B, 5C, and 5D are views illustrating screen configuration for adding a user's emotional information to a shot image in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 5A, 5B, 5C, and 5D, if the emotion display menu has been set as illustrated in FIGS. 5A and 5B, the electronic device activates the second camera unit 150, positioned in the direction opposite to the first camera unit 140 that shoots an object, in order to obtain an image of the user when capturing the object. For example, when a capturing service using the first camera unit 140 is provided, the electronic device displays a preview screen obtained via the first camera unit 140 on the display unit 170, as illustrated in FIG. 5A. When detecting selection of a set icon 501 included in the preview screen, the electronic device displays a setting menu 503. After displaying the setting menu 503, when detecting selection of an emotion display menu (sense tag) 511 in the setting menu 503 as illustrated in FIG. 5B, the electronic device recognizes that the emotion display menu has been set. Accordingly, as illustrated in FIG. 5C, the electronic device may display the user's emotional information 521, extracted via the second camera unit 150, on the preview screen obtained via the first camera unit 140. - At
operation 405, the electronic device determines whether an image capturing event occurs. For example, the electronic device determines whether selection of a capturing icon 523 is detected. - If the electronic device determines that the image capturing event does not occur at
operation 405, then the electronic device continues to poll for an indication that the image capturing event occurs. - If the electronic device determines that the image capturing event occurs at
operation 405, then the electronic device proceeds to operation 407 at which the electronic device obtains a captured image via the first camera unit 140. - In addition, if the electronic device determines that the image capturing event occurs at
operation 405, then the electronic device may also proceed to operation 409 at which the electronic device extracts the user's emotional information from a user image obtained via the second camera unit 150 at the point at which the image capturing event has occurred. For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movements of the facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that it has extracted the user's emotion for the relevant shot image. - After extracting the captured image and the user's emotional information at
operations 407 and 409, the electronic device proceeds to operation 411 at which the electronic device adds the user's emotional information extracted when obtaining the shot image to the shot image and stores the same. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the shot image, or generate metadata including the user's emotional information for the shot image, to store the user's emotional information for the shot image. As illustrated in FIG. 5D, the electronic device displays the shot image on the display unit 170. For example, when obtaining the shot image, the electronic device may display the user's emotional information 531 on a partial region of the display unit 170. - In the above various embodiments of the present disclosure, when a capturing service using the
first camera unit 140 of the electronic device is provided, the electronic device may extract the user's emotional information using a user image obtained via the second camera unit 150. - According to various embodiments of the present disclosure, in case of providing a capturing service using the
second camera unit 150 of the electronic device, the electronic device may extract the user's emotional information using a user image obtained via the first camera unit 140. -
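The FIG. 4 flow above (obtain the shot image from the first camera at operation 407, extract emotion from the second camera's user image at operation 409, then attach the result at operation 411) can be sketched as below. The camera objects and the extractor are hypothetical callables standing in for the camera units and the emotion extract program; none of these names come from the disclosure.

```python
def capture_with_emotion(first_camera, second_camera, extract_emotion):
    """Sketch of the FIG. 4 flow on a capture event:
    operation 407 - obtain the shot image from the first camera;
    operation 409 - extract emotion from the second camera's user image;
    operation 411 - attach the emotional information to the shot image."""
    shot = first_camera()                  # operation 407
    user_image = second_camera()           # operation 409 input
    emotion = extract_emotion(user_image)  # may be None below threshold
    return {"image": shot, "emotion_tag": emotion}

# Stand-in callables for illustration only.
result = capture_with_emotion(
    lambda: "rear-image-bytes",
    lambda: "front-user-image",
    lambda img: "happiness",
)
print(result)
```

When the extractor returns `None` (no emotion above the reference value), the shot image is still stored, simply without an emotional tag.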
FIG. 6 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 6, at operation 601, the electronic device determines whether a moving picture capturing service using the first camera unit 140 is provided. For example, the electronic device determines whether a moving picture capturing event using the first camera unit 140 occurs depending on touch information provided via the input unit 180. - If the electronic device determines that the moving picture capturing service using the
first camera unit 140 is not being provided at operation 601, then the electronic device continues to poll for an indication that the moving picture capturing service using the first camera unit 140 is provided. - If the electronic device determines that the moving picture capturing service using the
first camera unit 140 is provided at operation 601, then the electronic device proceeds to operation 603 at which the electronic device drives the second camera unit 150. For example, if the emotion display menu has been set, the electronic device activates the second camera unit 150, positioned in the direction opposite to the first camera unit 140 that captures the moving picture, in order to obtain the user's image. For example, if a moving picture capturing service using the first camera unit 140 is being provided, the electronic device displays a preview screen obtained via the first camera unit 140 on the display unit 170. If selection of a setting icon included in the preview screen is detected, the electronic device displays a setting menu. When the electronic device detects selection of the emotion display menu in the setting menu, the electronic device recognizes that the emotion display menu has been set. - At
operation 605, the electronic device determines whether a moving picture capturing event occurs. For example, the electronic device determines whether selection of a capturing icon displayed on the preview screen is detected. - If the electronic device determines that a moving picture capturing event does not occur at
operation 605, then the electronic device may continue to poll for an indication that a moving picture capturing event occurs. - If the electronic device determines that a moving picture capturing event occurs at
operation 605, then the electronic device proceeds to operation 607 at which the electronic device obtains a moving picture via the first camera unit 140. - Thereafter, the electronic device may proceed to
operation 613 at which the electronic device determines whether the moving picture capturing ends. - If the electronic device determines that a moving picture capturing event occurs in
operation 605, then the electronic device may also proceed to operation 609 at which the electronic device determines whether the user's emotional information is extracted from a user image obtained via the second camera unit 150 while the moving picture is shot. For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movements of the facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. At this point, when an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that it has extracted the user's emotion for the frame at the capturing point of the relevant moving picture. - If the electronic device determines that the user's emotional information has not been extracted at
operation 609, then the electronic device proceeds to operation 613 at which the electronic device determines whether the moving picture capturing ends. - In contrast, if the electronic device determines that the emotional information of the user is extracted at
operation 609, then the electronic device proceeds to operation 611 at which the electronic device adds the user's emotional information obtained while capturing a moving picture to the moving picture. The electronic device adds the user's emotional information to the moving picture so that the point of extracting the user's emotional information is displayed. -
FIGS. 9A, 9B, and 9C are views illustrating screen configuration for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 9A, 9B, and 9C, the electronic device adds the user's emotional information 901 to a time search bar of the moving picture at the point of extracting the user's emotional information, as illustrated in FIG. 9A. After adding the user's emotional information 901 to the time search bar, if the user's emotional information is extracted again, the electronic device may add the user's emotional information again, as illustrated in FIGS. 9B and 9C. According to various embodiments of the present disclosure, the electronic device may constantly display the previously added user's emotional information on the time search bar, as illustrated in FIGS. 9B and 9C. - At
operation 613, the electronic device determines whether the moving picture capturing ends. - If the electronic device determines that the moving picture capturing does not end at
operation 613, then the electronic device proceeds to operation 607 at which the electronic device obtains a moving picture via the first camera unit 140. - In addition, if the electronic device determines that the moving picture capturing does not end at
operation 613, then the electronic device also proceeds to operation 609 at which the electronic device determines whether the user's emotional information is extracted from a user image obtained via the second camera unit 150 while the moving picture is captured. - In contrast, if the electronic device determines that the moving picture capturing ends at
operation 613, then the electronic device proceeds to operation 615 at which the electronic device stores the moving picture to which the emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the frame at the point of extracting the user's emotional information among the frames forming the moving picture, and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained during moving picture capture, and store the same together with the moving picture. According to various embodiments of the present disclosure, the metadata including the user's emotional information also includes information of the point of extracting the user's emotional information during moving picture capture. - In the above various embodiments of the present disclosure, the electronic device may add the user's emotional information to a time search bar of the moving picture at every point of extracting the user's emotional information.
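The time-search-bar markers described above can be sketched as time-stamped metadata entries kept alongside the moving picture. The marker structure (a timestamp plus an emotion label, kept sorted by time) is an illustrative assumption, not the disclosure's format.

```python
def add_marker(markers, timestamp_s, emotion):
    """Record an emotion marker at the extraction point so that it can be
    drawn on the time search bar (as in FIGS. 9A to 9C). Markers are kept
    sorted by playback time."""
    markers.append({"t": timestamp_s, "emotion": emotion})
    markers.sort(key=lambda m: m["t"])
    return markers

bar = []
add_marker(bar, 42.0, "happiness")
add_marker(bar, 7.5, "surprise")
print(bar)
```

Because each marker carries its own timestamp, the same list serves both for drawing the time search bar and for the stored metadata that records the extraction points.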
- According to various embodiments of the present disclosure, the electronic device may add the user's emotional information to the time search bar of the moving picture at a point at which the user's emotional information has changed. For example, in case of extracting the user's emotion of happiness at a first point of the moving picture and then extracting the user's emotion of happiness also at a second point, the electronic device may add only information of the user's emotion of happiness at the first point to the time search bar of the moving picture.
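The change-only variant above (a new marker is added only when the extracted emotion differs from the most recent one, as in the happiness-at-two-points example) can be sketched as a small filter. The marker structure is again an illustrative assumption.

```python
def add_marker_on_change(markers, timestamp_s, emotion):
    """Add a time-search-bar marker only when the extracted emotion
    differs from the most recently added one, so a repeated emotion is
    recorded only at its first point."""
    if markers and markers[-1]["emotion"] == emotion:
        return markers  # unchanged emotion: keep only the first point
    markers.append({"t": timestamp_s, "emotion": emotion})
    return markers

bar = []
add_marker_on_change(bar, 10.0, "happiness")
add_marker_on_change(bar, 25.0, "happiness")  # suppressed: unchanged
add_marker_on_change(bar, 40.0, "sadness")
print([m["emotion"] for m in bar])  # → ['happiness', 'sadness']
```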
-
FIG. 7 is a flowchart illustrating a procedure for adding a user's emotional information to a moving picture in an electronic device according to another embodiment of the present disclosure. - Referring to
FIG. 7, at operation 701, the electronic device determines whether a moving picture capturing service using the first camera unit 140 is provided. For example, the electronic device determines whether a moving picture capturing event using the first camera unit 140 occurs depending on touch information provided via the input unit 180. - If the electronic device determines that the moving picture capturing service using the
first camera unit 140 is not being provided at operation 701, then the electronic device continues to poll for an indication that the moving picture capturing service using the first camera unit 140 is provided. - If the electronic device determines that the moving picture capturing service using the
first camera unit 140 is being provided at operation 701, then the electronic device proceeds to operation 703 at which the electronic device drives the second camera unit 150. For example, if the emotion display menu has been set, the electronic device activates the second camera unit 150, positioned in the direction opposite to the first camera unit 140 that captures the moving picture, in order to obtain the user's image. - At
operation 705, the electronic device determines whether a moving picture capturing event occurs. For example, the electronic device determines whether selection of a capturing icon displayed on the preview screen is detected. - If the electronic device determines that the moving picture capturing event occurs at
operation 705, then the electronic device proceeds to operation 707 at which the electronic device obtains a moving picture via the first camera unit 140. - At
operation 715, the electronic device determines whether the moving picture capturing ends. - In addition, if the electronic device determines that the moving picture capturing event occurs at
operation 705, then the electronic device also proceeds to operation 709 at which the electronic device determines whether an emotion extraction period arrives. According to various embodiments of the present disclosure, the emotion extraction period may change depending on the user's input information. - If the electronic device determines that an emotion extraction period does not arrive at
operation 709, then the electronic device may continue to poll for an indication that the emotion extraction period arrives. - If the electronic device determines that the emotion extraction period arrives at
operation 709, then the electronic device proceeds to operation 711 at which the electronic device determines whether the user's emotional information is extracted from a user image obtained via the second camera unit 150. For example, the electronic device estimates movement of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movement of the plurality of facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for the frame of the point of capturing the relevant moving picture. - If the electronic device determines that the user's emotional information is not extracted from a user image obtained via the
second camera unit 150 at operation 711, then the electronic device proceeds to operation 715 at which the electronic device determines whether moving picture capturing ends. - In contrast, if the electronic device determines that the user's emotional information is extracted from a user image obtained via the
second camera unit 150 at operation 711, then the electronic device proceeds to operation 713 at which the electronic device adds the user's emotional information to the moving picture. The electronic device adds the user's emotional information to the moving picture so that the point at which the user's emotional information was extracted is displayed. For example, the electronic device adds the user's emotional information to a time search bar at the point of extracting the user's emotional information of the moving picture as illustrated in FIG. 9A. After adding the user's emotional information to the time search bar, when extracting the user's emotional information again, the electronic device may add the user's emotional information to the time search bar at the new extraction point as illustrated in FIGS. 9B and 9C. According to various embodiments of the present disclosure, the electronic device may keep displaying the previously added user's emotional information on the time search bar as illustrated in FIGS. 9B and 9C. - At
operation 715, the electronic device determines whether moving picture capturing ends. - If the electronic device determines that the moving picture capturing does not end at
operation 715, then the electronic device proceeds to operation 707 at which the electronic device obtains a moving picture via the first camera unit 140. - In addition, if the electronic device determines that the moving picture capturing does not end at
operation 715, then the electronic device may also proceed to operation 709 at which the electronic device determines whether an emotion extraction period arrives. - If the electronic device determines that the moving picture capturing ends at
operation 715, then the electronic device proceeds to operation 717 at which the electronic device stores the moving picture to which emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the frame, among the frames forming the moving picture, at the point at which the user's emotional information was extracted, and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained during moving picture capturing, and store the same together with the moving picture. According to various embodiments of the present disclosure, the metadata including the user's emotional information also includes information on the point at which the user's emotional information was extracted during the moving picture capturing. - According to the above various embodiments of the present disclosure, the electronic device may add the user's emotional information to the time search bar of the moving picture at every point at which the user's emotional information is extracted.
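The metadata variant described above might look like the following; a minimal sketch assuming a simple JSON representation whose field names (`video`, `emotions`, `timestamp_sec`, `label`) are illustrative, not defined by the disclosure.

```python
import json

# Sketch: represent emotional information, with its extraction points, as
# metadata stored together with the moving picture (hypothetical format).

def build_emotion_metadata(video_file, extractions):
    """extractions: list of (timestamp_sec, emotion_label) tuples."""
    return {
        "video": video_file,
        "emotions": [
            {"timestamp_sec": t, "label": label} for t, label in extractions
        ],
    }

metadata = build_emotion_metadata("capture_001.mp4", [(12.5, "happiness")])
serialized = json.dumps(metadata, indent=2)  # e.g., written as a sidecar file
print(serialized)
```

Because the extraction point travels with each label, a player can later place the markers on the time search bar without re-running emotion extraction.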
- According to various embodiments of the present disclosure, the electronic device may add the user's emotional information to the time search bar of the moving picture at a point at which the user's emotional information has changed. For example, if the user's emotion of happiness at a first emotion extraction period of the moving picture is extracted and then the user's emotion of happiness at a second emotion extraction period is extracted, the electronic device may add only the user's emotional information extracted at the first emotion extraction period to the time search bar of the moving picture.
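The threshold test described for operation 711 can be sketched as follows; a toy illustration in which the muscle-to-score weighting, the emotion label, and the reference value are all assumptions, since the disclosure specifies only that an emotion estimation value computed from facial-muscle movement is compared against a reference value.

```python
# Sketch: accept an emotion only when its estimation value exceeds a
# reference value (weights and threshold below are hypothetical).

REFERENCE_EMOTION_VALUE = 0.6  # assumed reference value

def extract_emotion(muscle_movements, weights):
    """muscle_movements/weights: dicts keyed by facial-muscle name."""
    score = sum(weights.get(m, 0.0) * v for m, v in muscle_movements.items())
    if score > REFERENCE_EMOTION_VALUE:
        return ("happiness", score)  # assumed label for this weighting
    return (None, score)             # below threshold: no emotion extracted

# Example: strong cheek / eye-corner movement pushes the score over the threshold.
emotion, score = extract_emotion(
    {"zygomaticus_major": 0.9, "orbicularis_oculi": 0.7},
    {"zygomaticus_major": 0.5, "orbicularis_oculi": 0.4},
)
```

Returning `None` below the threshold mirrors the flowcharts, where a frame with no extracted emotion simply receives no tag.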
- According to the above various embodiments of the present disclosure, if a moving picture capturing service using the
first camera unit 140 of the electronic device is being provided, the electronic device may extract the user's emotional information from a user image obtained via the second camera unit 150. - According to various embodiments of the present disclosure, if a moving picture capturing service using the
second camera unit 150 of the electronic device is being provided, the electronic device may extract the user's emotional information from a user image obtained via the first camera unit 140. -
FIG. 8 illustrates a procedure for adding a user's emotional information to a moving picture in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 8, at operation 801, the electronic device determines whether a moving picture reproduction service is provided. For example, the electronic device determines whether one of one or more moving picture files stored in the data storage 112 is selected depending on touch information provided via the input unit 180. - If the electronic device determines that the moving picture reproduction service is not being provided at
operation 801, then the electronic device may continue to poll for an indication that the moving picture reproduction service is being provided. - If the electronic device determines that the moving picture reproduction service is being provided at
operation 801, then the electronic device proceeds to operation 803 at which the electronic device reproduces a moving picture selected for the moving picture reproduction service. For example, the electronic device displays the reproduced moving picture on the display unit 170. - Thereafter, the electronic device proceeds to
operation 811 at which the electronic device determines whether the moving picture reproduction ends. - In addition, if the electronic device determines that the moving picture reproduction service is being provided at
operation 801, then the electronic device may also proceed to operation 805 at which the electronic device drives the second camera unit 150. For example, in the case in which the emotion display menu has been set, the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 displaying the moving picture being reproduced in order to obtain the user's image while reproducing the moving picture. Thereafter, the electronic device proceeds to operation 807. - At
operation 807, the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150. For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movements of the plurality of facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotion value, the electronic device may recognize that the electronic device has extracted the user's emotion for the frame of the point of capturing the relevant moving picture. - If the electronic device determines that the user's emotional information is not extracted at
operation 807, then the electronic device proceeds to operation 811 at which the electronic device determines whether the moving picture reproduction ends. - If the electronic device determines that the user's emotional information is extracted at
operation 807, then the electronic device proceeds to operation 809 at which the electronic device adds the user's emotional information to the moving picture. The electronic device adds the user's emotional information to the moving picture so that the point at which the user's emotional information was extracted is displayed. For example, the electronic device adds the user's emotional information to a time search bar at the point of extracting the user's emotional information of the moving picture as illustrated in FIG. 9A. After adding the user's emotional information to the time search bar, if the user's emotional information is extracted again, the electronic device may add the user's emotional information to the time search bar at the new extraction point as illustrated in FIGS. 9B and 9C. According to various embodiments of the present disclosure, the electronic device may keep displaying the previously added user's emotional information on the time search bar as illustrated in FIGS. 9B and 9C. - At
operation 811, the electronic device determines whether the moving picture reproduction ends. - If the electronic device determines that the moving picture reproduction does not end at
operation 811, then the electronic device proceeds to operation 803 at which the electronic device reproduces the moving picture selected for the moving picture reproduction service. - In addition, if the electronic device determines that the moving picture reproduction does not end at
operation 811, then the electronic device may also proceed to operation 807 at which the electronic device determines whether the user's emotional information is extracted. - If the electronic device determines that the moving picture reproduction ends at
operation 811, then the electronic device proceeds to operation 813 at which the electronic device stores the moving picture to which the emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the frame, among the frames forming the moving picture, at the point at which the user's emotional information was extracted, and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained during moving picture reproduction, and store the same together with the moving picture. According to various embodiments of the present disclosure, the metadata including the user's emotional information also includes information on the point at which the user's emotional information was extracted. - According to the above various embodiments of the present disclosure, the electronic device may add the user's emotional information to a time search bar of the moving picture at every point at which the user's emotional information is extracted.
- According to various embodiments of the present disclosure, the electronic device may add the user's emotional information to the time search bar of the moving picture at a point at which the user's emotional information has changed. For example, if the user's emotion of happiness at a first point of the moving picture is extracted and then the user's emotion of happiness at a second point is extracted, the electronic device may add only information of the user's emotion of happiness extracted at the first point to the time search bar of the moving picture.
-
FIG. 10 is a flowchart illustrating a procedure for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 10, at operation 1001, the electronic device determines whether an electronic book service is provided. For example, the electronic device determines whether one of one or more electronic book files stored in the data storage 112 is selected depending on touch information provided via the input unit 180. - If the electronic device determines that an electronic book service is not provided at
operation 1001, then the electronic device may continue to poll for an indication that the electronic book service is provided. - If the electronic device determines that the electronic book service is being provided at
operation 1001, then the electronic device proceeds to operation 1003 at which the electronic device displays electronic book content selected for the electronic book service on the display unit 170. - Thereafter, the electronic device proceeds to
operation 1011 at which the electronic device determines whether the electronic book service ends. - In addition, if the electronic device determines that the electronic book service is being provided at
operation 1001, then the electronic device may also proceed to operation 1005 at which the electronic device drives the second camera unit 150. For example, if the emotion display menu has been set, the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 displaying the electronic book content in order to obtain a user's image while displaying the electronic book content. Thereafter, the electronic device proceeds to operation 1007. - At
operation 1007, the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150. For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movements of the plurality of facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for the relevant position of the electronic book content. - If the electronic device determines that the user's emotional information has not been extracted at
operation 1007, then the electronic device may proceed to operation 1011 at which the electronic device determines whether the electronic book service ends. - In contrast, if the electronic device determines that the user's emotional information has been extracted at
operation 1007, then the electronic device proceeds to operation 1009 at which the electronic device adds the user's emotional information to the electronic book content and displays the same. -
FIGS. 11A and 11B are views illustrating screen configuration for adding a user's emotional information to electronic book content in an electronic device according to an embodiment of the present disclosure. - Referring to
FIGS. 11A and 11B, the electronic device adds the user's emotional information 1101 to a page from which the user's emotional information has been extracted and displays the user's emotional information 1101 as illustrated in FIG. 11A. As another example, the electronic device may add the user's emotional information to a sentence or a paragraph from which the user's emotional information has been extracted and display the user's emotional information 1103 as illustrated in FIG. 11B. - At
operation 1011, the electronic device determines whether the electronic book service ends. - If the electronic device determines that the electronic book service does not end at
operation 1011, then the electronic device proceeds to operation 1003 at which the electronic device displays the electronic book content on the display unit 170. - In addition, if the electronic device determines that the electronic book service does not end at
operation 1011, then the electronic device may also proceed to operation 1007 at which the electronic device determines whether the user's emotional information is extracted. - In contrast, if the electronic device determines that the electronic book service ends at
operation 1011, then the electronic device proceeds to operation 1013 at which the electronic device stores the electronic book content to which the emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the electronic book content and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained from the electronic book content and store the same together with the electronic book content. According to various embodiments of the present disclosure, the metadata including the user's emotional information also includes information on the position at which the user's emotional information was extracted from the electronic book content. - As described above, the electronic device may add the user's emotional information to the electronic book content. If the electronic book content is intended for learning, the electronic device may control a learning level of difficulty with consideration of the user's emotional information for the electronic book content.
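The difficulty-control idea above can be sketched as follows; this is a toy illustration in which the emotion labels, the counts, and the step-up/step-down policy are all hypothetical, since the disclosure does not specify a concrete policy.

```python
# Sketch: adjust a learning difficulty level from emotions extracted while
# the user reads learning content (labels and policy are hypothetical).

def adjust_difficulty(current_level, page_emotions):
    """Lower difficulty on repeated frustration, raise it on repeated boredom."""
    if page_emotions.count("frustration") >= 2:
        return max(1, current_level - 1)   # content too hard: step down
    if page_emotions.count("boredom") >= 2:
        return current_level + 1           # content too easy: step up
    return current_level                   # engaged: keep the level

print(adjust_difficulty(3, ["boredom", "neutral", "boredom"]))  # → 4
```

Any such policy would consume the per-page emotional information that the preceding operations attach to the electronic book content.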
-
FIG. 12 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 12, at operation 1201, the electronic device determines whether a shopping service is provided. For example, the electronic device determines whether a shopping icon is selected depending on touch information provided via the input unit 180. - If the electronic device determines that a shopping service is not being provided at
operation 1201, then the electronic device may continue to poll for an indication that the shopping service is being provided. - If the electronic device determines that the shopping service is being provided at
operation 1201, then the electronic device proceeds to operation 1203 at which the electronic device displays purchasable goods information on the display unit 170. Thereafter, the electronic device proceeds to operation 1211. - At
operation 1211, the electronic device determines whether a shopping service ends. - In addition, if the electronic device determines that the shopping service is being provided at
operation 1201, then the electronic device may also proceed to operation 1205 at which the electronic device drives the second camera unit 150. For example, if the emotion display menu has been set, the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 displaying the goods information in order to obtain a user's image while displaying the purchasable goods information. Thereafter, the electronic device proceeds to operation 1207. - At
operation 1207, the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150. For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movements of the plurality of facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for the relevant goods information. - If the electronic device determines that the user's emotional information has not been extracted at
operation 1207, then the electronic device proceeds to operation 1211 at which the electronic device determines whether the shopping service ends. - If the electronic device determines that the user's emotional information has been extracted at
operation 1207, then the electronic device proceeds to operation 1209 at which the electronic device adds the user's emotional information to the goods information and displays the same. For example, the electronic device may add the user's emotional information to the goods information from which the user's emotional information has been extracted among a list of purchasable goods, and display the same. As another example, the electronic device may add the user's emotional information to goods detail information of goods from which the user's emotional information has been extracted among one or more purchasable goods, and display the same. - At
operation 1211, the electronic device determines whether the shopping service ends. - If the electronic device determines that the shopping service does not end at
operation 1211, then the electronic device proceeds to operation 1203 at which the electronic device displays purchasable goods information on the display unit 170. - In addition, if the electronic device determines that the shopping service does not end at
operation 1211, then the electronic device may also proceed to operation 1207 at which the electronic device determines whether the user's emotional information is extracted. - In contrast, if the electronic device determines that the shopping service ends at
operation 1211, then the electronic device proceeds to operation 1213 at which the electronic device stores the goods information to which the emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the goods information from which the user's emotional information has been extracted, and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained from the goods information, and store the same together with the goods information. -
FIG. 13 is a flowchart illustrating a procedure for adding a user's emotional information to shopping information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 13, at operation 1301, the electronic device determines whether a shopping service is provided. For example, the electronic device determines whether a shopping icon is selected depending on touch information provided via the input unit 180. - If the electronic device determines that a shopping service is not being provided at
operation 1301, then the electronic device may continue to poll for an indication that the shopping service is provided. - If the electronic device determines that the shopping service is being provided at
operation 1301, then the electronic device proceeds to operation 1303 at which the electronic device displays purchasable goods information on the display unit 170. - At
operation 1305, the electronic device determines whether a goods purchase event occurs. - If the electronic device determines that the goods purchase event does not occur at
operation 1305, then the electronic device may continue to poll for an indication that the goods purchase event occurs, while the electronic device displays the purchasable goods information on the display unit 170. - In contrast, if the electronic device determines that the goods purchase event occurs at
operation 1305, then the electronic device may proceed to operation 1307 at which the electronic device drives the second camera unit 150. For example, if the emotion display menu has been set, the electronic device activates the second camera unit 150 positioned in the same direction as the display unit 170 displaying the goods information in order to obtain a user's image while displaying the purchasable goods information. - Thereafter, at
operation 1309, the electronic device determines whether the user's emotional information is extracted from the user image obtained via the second camera unit 150. For example, the electronic device estimates movements of a plurality of facial muscles for estimating emotional information from the user's facial image obtained via the second camera unit 150. After estimating the movements of the plurality of facial muscles, the electronic device extracts the user's emotion with consideration of the movement information of the facial muscles. If an emotion estimation value calculated with consideration of the movement information of the facial muscles exceeds a reference emotional value, the electronic device may recognize that the electronic device has extracted the user's emotion for the relevant goods information. - If the electronic device determines that the user's emotional information has not been extracted, then the electronic device ends the procedure for adding the user's emotional information to shopping information.
- In contrast, if the electronic device determines that the user's emotional information has been extracted at
operation 1309, then the electronic device proceeds to operation 1311 at which the electronic device adds the user's emotional information to the goods information and displays the same. For example, the electronic device may add the user's emotional information to goods information from which the user's emotional information has been extracted among a purchasable goods list and display the same. As another example, the electronic device may add the user's emotional information to goods detail information of goods from which the user's emotional information has been extracted among one or more purchasable goods, and display the same. Thereafter, the electronic device proceeds to operation 1313. - At
operation 1313, the electronic device stores the goods information to which the emotional information has been added. For example, the electronic device may add an emotional tag corresponding to the user's emotional information to the goods information from which the user's emotional information has been extracted, and store the same. As another example, the electronic device may generate metadata including the user's emotional information obtained from the goods information, and store the same together with the goods information. - As described above, when adding the user's emotional information for the goods information, the electronic device may transmit the user's emotional information added to the goods information to a shopping server. In this case, the shopping server may display a user compliance rate for relevant goods with consideration of the user's emotional information for the specific goods collected from a plurality of electronic devices. For example, the electronic device may use the users' emotional information for marketing of the relevant goods.
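The server-side aggregation described above can be sketched as follows; a toy illustration in which the per-device emotion reports and the computation of the compliance rate (here, the share of positive reactions) are assumptions, since the disclosure does not define either.

```python
from collections import Counter

# Sketch: a shopping server aggregates emotional information for one goods
# item collected from many electronic devices (hypothetical data model).

POSITIVE_EMOTIONS = {"happiness", "surprise"}  # assumed label mapping

def compliance_rate(device_reports):
    """device_reports: list of emotion labels, one per reporting device."""
    if not device_reports:
        return 0.0
    counts = Counter(device_reports)
    positive = sum(counts[e] for e in POSITIVE_EMOTIONS)
    return positive / len(device_reports)

# 3 of 4 devices reported a positive emotion for these goods.
print(compliance_rate(["happiness", "happiness", "sadness", "surprise"]))  # → 0.75
```

A rate computed this way could then be displayed next to the goods or fed into the marketing use mentioned above.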
- As described above, the electronic device may add the extracted user's emotional information to relevant content when using the content. Accordingly, the electronic device may manage at least one content stored in the
data storage 112 with consideration of the user's emotional information for each content. -
FIG. 14 is a flowchart illustrating a procedure for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 14, at operation 1401, the electronic device determines whether a photo display event occurs. For example, the electronic device determines whether selection of a photo display icon is detected depending on touch information provided via the input unit 180. - If the electronic device determines that a photo display event does not occur at
operation 1401, then the electronic device may continue to poll for an indication that a photo display event occurs. - If the electronic device determines that the photo display event occurs at
operation 1401, then the electronic device proceeds to operation 1403 at which the electronic device displays a photo list for at least one photo stored in the data storage 112 on the display unit 170. -
FIGS. 15A, 15B, 15C, and 15D are views illustrating screen configuration for displaying a photo with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to FIGS. 15A, 15B, 15C, and 15D, the electronic device displays the photo list for at least one photo stored in the
data storage 112 on the display unit 170 as illustrated in FIG. 15A. - At
operation 1405, the electronic device determines whether a sort event corresponding to emotional information occurs. For example, the electronic device determines whether emotional information (“emotion”) 1503 is selected as a sort condition 1501 for a photo as illustrated in FIG. 15A. - If the electronic device determines that a sort event corresponding to emotional information does not occur at
operation 1405, then the electronic device ends the procedure for displaying a photo with consideration of emotional information. - If the electronic device determines that a sort event corresponding to emotional information does occur at
operation 1405, the electronic device proceeds to operation 1407 at which the electronic device determines the user's emotional information added to each photo stored in the data storage 112. For example, the electronic device determines the user's emotional information for each photo via an emotional tag tagged to the photo. As another example, the electronic device may determine the user's emotional information for each photo with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 1409. - At
operation 1409, the electronic device sorts and displays at least one photo file stored in the data storage 112 depending on the user's emotional information. For example, as illustrated in FIG. 15B, the electronic device may group and display at least one photo file depending on the user's emotional information. After grouping and displaying at least one photo file depending on the user's emotional information, in the case in which selection of a “happiness” folder 1505 is detected as illustrated in FIG. 15B, the electronic device displays at least one file list including the emotion of “happiness” on the display unit 170 as illustrated in FIG. 15C. In addition, when selection of a specific photo file 1507 is detected in the file list, the electronic device may display the selected photo file on the display unit 170 as illustrated in FIG. 15D. Additionally, the electronic device may display a thumbnail of a moving picture together with the photo list depending on the user's emotional information as illustrated in FIG. 15C. When selection of a thumbnail of a moving picture 1509 is detected, the electronic device may reproduce the moving picture for the thumbnail as illustrated in FIGS. 19B and 19C. The electronic device may reproduce the moving picture from the point indicated by the thumbnail.
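The emotion-based grouping at operation 1409 can be sketched as follows; a minimal illustration in which the file names and the per-photo emotion tags are hypothetical stand-ins for the emotional tags or metadata described above.

```python
from collections import defaultdict

# Sketch: group photo files into per-emotion folders using the emotional
# tag added to each photo (file names and tags here are hypothetical).

def group_photos_by_emotion(photos):
    """photos: list of (filename, emotion_tag) tuples; tag may be None."""
    folders = defaultdict(list)
    for filename, emotion in photos:
        folders[emotion or "untagged"].append(filename)
    return dict(folders)

folders = group_photos_by_emotion(
    [("a.jpg", "happiness"), ("b.jpg", "sadness"), ("c.jpg", "happiness"), ("d.jpg", None)]
)
print(folders["happiness"])  # → ['a.jpg', 'c.jpg']
```

Selecting the “happiness” folder in FIG. 15B would then correspond to listing the files in `folders["happiness"]`.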
- According to various embodiments of the present disclosure, if emotional information is set as a basic sort condition, when a photo display event occurs at
operation 1401, the electronic device determines the user's emotional information added to each photo stored in the data storage 112 at operation 1407. Thereafter, at operation 1409, the electronic device sorts and displays one or more photo files stored in the data storage 112 depending on the user's emotional information. - According to various embodiments of the present disclosure, if the emotional information display menu has been set, when displaying a photo list, the electronic device may display an emotion display icon on a photo to which emotional information has been added. Specifically, when a photo display event occurs, the electronic device determines whether the emotional information display menu has been set. If the emotional information display menu has been set, the electronic device determines the user's emotional information added to each photo stored in the
data storage 112. After determining the user's emotional information added to each photo, the electronic device may display an emotion display icon on a photo to which emotional information has been added. -
FIG. 16 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 16, at operation 1601, the electronic device determines whether a moving picture reproduction service is provided. For example, the electronic device determines whether one of one or more moving picture files stored in the data storage 112 is selected depending on touch information provided via the input unit 180. - If the electronic device determines that the moving picture reproduction service is not being provided at
operation 1601, then the electronic device may continue to poll for an indication that the moving picture reproduction service is being provided. - If the electronic device determines that the moving picture reproduction service is being provided at
operation 1601, then the electronic device proceeds to operation 1603 at which the electronic device reproduces the moving picture selected for the moving picture reproduction service. For example, the electronic device reproduces the moving picture and displays the same on the display unit 170 as illustrated in FIG. 18A. - At
operation 1605, the electronic device determines whether an emotional information display event occurs. For example, the electronic device determines whether selection of the emotional information display menu is detected depending on touch information provided via the input unit 180. - If the electronic device determines that the emotional information display event does not occur at
operation 1605, then the electronic device may end the procedure for displaying moving picture information with consideration of emotional information. For example, the electronic device continues to reproduce the moving picture selected for the moving picture reproduction service. - If the electronic device determines that the emotional information display event occurs at
operation 1605, then the electronic device proceeds to operation 1607 at which the electronic device determines the user's emotional information added to the moving picture. For example, the electronic device may determine the user's emotional information added to a moving picture via an emotional tag tagged to a frame forming the moving picture. As another example, the electronic device may determine the user's emotional information for the moving picture with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 1609. - At
operation 1609, the electronic device displays the user's emotional information when reproducing the moving picture. For example, as illustrated in FIG. 18B, the electronic device displays the user's emotional information on the time search bar of the moving picture. As another example, as illustrated in FIG. 18C, if the user of the electronic device controls a reproduction point of the moving picture using the time search bar, the electronic device may display the user's emotional information 1805 at a point of extracting the user's emotion. Thereafter, the electronic device proceeds to operation 1611. - At
operation 1611, the electronic device determines whether selection of emotional information displayed on the time search bar is detected. - If the electronic device determines that selection of emotional information displayed on the time search bar is not detected at
operation 1611, then the electronic device may continue to poll for an indication that selection of emotional information displayed on the time search bar is detected. - In contrast, if the electronic device determines that selection of the emotional information displayed on the time search bar is detected at
operation 1611, then the electronic device may proceed to operation 1613 at which the electronic device changes a reproduction point of the moving picture to the point at which the emotional information selected in operation 1611 has been extracted. For example, if selection of emotional information “depression” 1801 is detected from the emotional information illustrated in FIG. 18B, the electronic device changes the reproduction point of the moving picture to the point of extracting the emotional information “depression” 1801. - According to the above various embodiments of the present disclosure, if the electronic device determines that an emotional information display event occurs at
operation 1605 while reproducing the moving picture at operation 1603, then the electronic device may display the user's emotional information added to the moving picture on the display unit 170. - According to various embodiments of the present disclosure, if the emotional information display menu has been set, then the electronic device may display the user's emotional information added to the moving picture from a moving picture reproduction point. In this case, the electronic device determines the emotional information added to the moving picture for reproduction before reproducing the moving picture.
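For illustration only, the time-search-bar behavior of FIGS. 18B and 18C can be sketched as follows. The tag format, a list of `(timestamp_seconds, emotion)` pairs per moving picture, is a hypothetical assumption, not the frame-tag or metadata format the disclosure defines:

```python
# Illustrative sketch (assumed tag format): place emotion markers on a
# time search bar and change the reproduction point when a marker is
# selected, as in FIGS. 18B and 18C.

def timeline_markers(emotion_tags, duration):
    """Map each (timestamp, emotion) tag to a fractional bar position."""
    return [
        {"emotion": emotion, "position": t / duration}
        for t, emotion in emotion_tags
        if 0 <= t <= duration
    ]

def seek_point(emotion_tags, selected_emotion):
    """Return the timestamp at which the selected emotion was extracted."""
    for t, emotion in emotion_tags:
        if emotion == selected_emotion:
            return t
    return None  # emotion never extracted in this moving picture

tags = [(12.0, "depression"), (45.0, "happiness")]
print(timeline_markers(tags, duration=60.0))
print(seek_point(tags, "depression"))  # 12.0
```

Selecting the "depression" marker would thus move reproduction to the 12-second point, mirroring the reproduction-point change of operation 1613.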
-
FIG. 17 is a flowchart illustrating a procedure for displaying moving picture information with consideration of emotional information in an electronic device according to another embodiment of the present disclosure. - Referring to
FIG. 17, at operation 1701, the electronic device determines whether to provide a moving picture reproduction service. For example, the electronic device determines whether an icon of a moving picture reproduction application is selected depending on touch information provided via the input unit 180. - If the electronic device determines that the moving picture reproduction service is not being provided at
operation 1701, then the electronic device may continue to poll for an indication that the moving picture reproduction service is being provided. - If the electronic device determines that the moving picture reproduction service is being provided at
operation 1701, then the electronic device proceeds to operation 1703 at which the electronic device displays a moving picture file list for at least one moving picture file stored in the data storage 112 on the display unit 170. Thereafter, the electronic device proceeds to operation 1705. - At
operation 1705, the electronic device determines whether selection of a first moving picture file which is one of moving picture files in the moving picture file list is detected. - If the electronic device determines that selection of the first moving picture file is not detected at
operation 1705, then the electronic device may continue to poll for an indication that selection of the first moving picture file is detected. - If the electronic device determines that selection of the first moving picture file is detected at
operation 1705, then the electronic device may proceed to operation 1707 at which the electronic device determines the user's emotional information included in the first moving picture file. For example, the electronic device determines the user's emotional information added to the first moving picture file via an emotional tag tagged to a frame forming the first moving picture file. As another example, the electronic device may determine the user's emotional information for the first moving picture file with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 1709. - At
operation 1709, the electronic device displays a thumbnail for the user's emotional information added to the first moving picture file on the display unit 170. -
FIGS. 19A, 19B, and 19C are views illustrating screen configuration for displaying a moving picture with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 19A, the electronic device displays a thumbnail of a point of extracting the user's emotional information from the first moving picture file on the display unit 170. - At
operation 1711, the electronic device determines whether one of one or more thumbnails representing the user's emotional information for the first moving picture is selected. For example, as illustrated in FIG. 19A, the electronic device determines whether selection of a first thumbnail 1901 is detected among thumbnails for the first moving picture. - If the electronic device determines that one of one or more thumbnails representing the user's emotional information for the first moving picture is not selected at
operation 1711, then the electronic device may continue to poll for an indication that one of one or more thumbnails representing the user's emotional information for the first moving picture is selected. - If the electronic device determines that selection of one of one or more thumbnails representing the user's emotional information for the first moving picture is detected at
operation 1711, then the electronic device proceeds to operation 1713 at which the electronic device reproduces the moving picture from the point from which the emotional information of the thumbnail has been extracted. For example, as illustrated in FIGS. 19B and 19C, the electronic device reproduces the moving picture from the point from which the emotional information of the thumbnail has been extracted. The electronic device may display the thumbnail information for the first moving picture, or thumbnail information including the same emotional information as the thumbnail selected at operation 1711, on a partial region 1903 of the display unit 170 as illustrated in FIG. 19B. Meanwhile, as illustrated in FIG. 19C, the electronic device may display the emotional information extracted from the first moving picture. - According to the above various embodiments of the present disclosure, the electronic device may display a thumbnail for each piece of emotional information added to the first moving picture as illustrated in
FIG. 19A. - According to various embodiments of the present disclosure, the electronic device may display only a thumbnail of a point at which emotional information has changed inside the first moving picture.
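For illustration only, both thumbnail policies, one thumbnail per extraction point as in FIG. 19A, or only the points at which the emotion changed, can be sketched as follows. The `(timestamp, emotion)` tag format is a hypothetical assumption for this example:

```python
# Illustrative sketch (assumed tag format): choose the points of a moving
# picture to render as emotion thumbnails, either one per extraction
# point (FIG. 19A) or only where the emotion changes.

def thumbnail_points(emotion_tags, only_changes=False):
    """Return sorted (timestamp, emotion) pairs to render as thumbnails."""
    points, last = [], None
    for t, emotion in sorted(emotion_tags):
        if only_changes and emotion == last:
            continue  # skip repeats of the same emotion
        points.append((t, emotion))
        last = emotion
    return points

tags = [(20.0, "sadness"), (5.0, "joy"), (10.0, "joy")]
print(thumbnail_points(tags, only_changes=True))
```

Selecting one of the returned points would then start reproduction from that timestamp, as in operation 1713.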
-
FIG. 20 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 20, at operation 2001, the electronic device determines whether an electronic book service is provided. For example, the electronic device determines whether one of electronic book content stored in the data storage 112 is selected depending on touch information provided via the input unit 180. - If the electronic device determines that the electronic book service is not being provided at
operation 2001, then the electronic device may continue to poll for an indication that the electronic book service is being provided. - If the electronic device determines that the electronic book service is being provided at
operation 2001, then the electronic device proceeds to operation 2003 at which the electronic device displays electronic book content selected for the electronic book service on the display unit 170. - At
operation 2005, the electronic device determines whether an emotional information display event occurs. For example, the electronic device determines whether selection of the emotional information display menu is detected depending on touch information provided via the input unit 180. - If the electronic device determines that the emotional information display event does not occur at
operation 2005, then the electronic device returns to operation 2003, at which the electronic device continues to display the electronic book content selected for the electronic book service on the display unit 170. - In contrast, if the electronic device determines that the emotional information display event occurs at
operation 2005, then the electronic device proceeds to operation 2007 at which the electronic device determines the user's emotional information added to the electronic book content. For example, the electronic device may determine the user's emotional information added to the electronic book content via an emotional tag tagged to the electronic book content. As another example, the electronic device may determine the user's emotional information added to the electronic book content with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 2009. - At
operation 2009, the electronic device displays, on the display unit 170, the user's emotional information including position information indicating where the emotional information has been extracted from the electronic book content. For example, as illustrated in FIG. 22A, the electronic device displays a structure window 2201 for the emotional information added to the electronic book content on the display unit 170. The structure window 2201 classifies and displays an extraction position of the user's emotional information for each emotion kind. - At
operation 2011, the electronic device determines whether selection of the emotional information displayed on the structure window 2201 is detected. - If the electronic device determines that selection of one of the one or more pieces of emotional information displayed on the
structure window 2201 is not detected at operation 2011, then the electronic device may continue to poll for an indication that one of the one or more pieces of emotional information is selected. - If the electronic device determines that selection of one of the one or more pieces of emotional information displayed on the
structure window 2201 is detected at operation 2011, then the electronic device proceeds to operation 2013 at which the electronic device changes a display region of the electronic book content to the position from which the selected emotional information has been extracted. - According to the above various embodiments of the present disclosure, the electronic device may display the emotional information added to the electronic book content using the emotional
information structure window 2201. - According to various embodiments of the present disclosure, the electronic device may display the user's emotional information inside the electronic book content as illustrated in
FIG. 21. -
FIG. 21 is a flowchart illustrating a procedure for displaying electronic book content with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 21, at operation 2101, the electronic device determines whether an electronic book service is provided. For example, the electronic device determines whether one of one or more electronic book content stored in the data storage 112 is selected depending on touch information provided via the input unit 180. - If the electronic device determines that an electronic book service is not provided at
operation 2101, then the electronic device may continue to poll for an indication that the electronic book service is provided. - If the electronic device determines that the electronic book service is being provided at
operation 2101, then the electronic device proceeds to operation 2103 at which the electronic device determines whether the emotional display menu has been set. - If the electronic device determines that the emotional display menu has not been set at
operation 2103, then the electronic device proceeds to operation 2109 at which the electronic device displays the electronic book content selected for the electronic book service on the display unit 170. - In contrast, if the electronic device determines that the emotional display menu has been set at
operation 2103, then the electronic device proceeds to operation 2105 at which the electronic device determines the user's emotional information added to the electronic book content. For example, the electronic device may determine the user's emotional information added to the electronic book content via an emotional tag tagged to the electronic book content. As another example, the electronic device may determine the user's emotional information added to the electronic book content with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 2107. - At operation 2107, the electronic device displays the electronic book content together with emotional information extracted from the electronic book content on the
display unit 170. The electronic device displays the user's emotional information at a position at which the emotional information has been extracted in the electronic book content. For example, as illustrated in FIG. 22B, the electronic device may display an emotional icon 2211 depending on the user's emotional information on a page from which the user's emotional information has been extracted in the electronic book content. As another example, as illustrated in FIG. 22B, the electronic device may display an emotional icon 2213 corresponding to the user's emotional information on a paragraph where the user's emotional information has been extracted in the electronic book content. As another example, as illustrated in FIG. 22B, the electronic device may mark a shade 2215 corresponding to the user's emotional information on a paragraph where the user's emotional information has been extracted in the electronic book content. At this point, the electronic device may determine at least one of the shape, the color, and the opacity of the shade depending on the user's emotional information. As another example, the electronic device may draw an underline 2217 on a paragraph from which the user's emotional information has been extracted in the electronic book content depending on the user's emotional information as illustrated in FIG. 22B. At this point, the electronic device may determine at least one of the shape, the color, and the thickness of the underline depending on the user's emotional information. As another example, as illustrated in FIG. 22B, the electronic device may mark a parenthesis 2219 on a paragraph where the user's emotional information has been extracted in the electronic book content depending on the user's emotional information. The electronic device may determine the shape of a parenthesis depending on the user's emotional information. -
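For illustration only, choosing a paragraph marking from an extracted emotion, in the spirit of the shade, underline, and icon variants of FIG. 22B, can be sketched as follows. The style table itself is hypothetical; the disclosure only states that shape, color, opacity, and thickness may vary with the emotion:

```python
# Illustrative sketch: map an extracted emotion to a paragraph marking
# style (shade color/opacity, underline style). The concrete values are
# assumptions made for this example.

EMOTION_STYLES = {
    "happiness": {"shade": "#fff3b0", "opacity": 0.4, "underline": "solid"},
    "sadness": {"shade": "#b0c4ff", "opacity": 0.3, "underline": "dotted"},
}
DEFAULT_STYLE = {"shade": "#e0e0e0", "opacity": 0.2, "underline": "none"}

def decoration_for(emotion):
    """Return the marking style for a paragraph's extracted emotion."""
    return EMOTION_STYLES.get(emotion, DEFAULT_STYLE)

print(decoration_for("happiness")["underline"])  # solid
```

A renderer would then apply the returned shade and underline attributes to the paragraph from which the emotion was extracted.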
FIG. 23 is a flowchart illustrating a procedure for displaying shopping information with consideration of emotional information in an electronic device according to an embodiment of the present disclosure. - Referring to
FIG. 23, at operation 2301, the electronic device determines whether a shopping service is provided. For example, the electronic device determines whether selection of an icon for a shopping application is detected depending on touch information provided via the input unit 180. - If the electronic device determines that the shopping service is not provided at
operation 2301, the electronic device may continue to poll for an indication that the shopping service is provided. - If the electronic device determines that the shopping service is being provided at
operation 2301, then the electronic device proceeds to operation 2303 at which the electronic device determines whether the emotion display menu has been set. - If the electronic device determines that the emotion display menu has not been set at
operation 2303, then the electronic device may proceed to operation 2309 at which the electronic device displays a list of goods purchasable via the shopping service on the display unit 170. - In contrast, if the electronic device determines that the emotion display menu has been set at
operation 2303, then the electronic device proceeds to operation 2305 at which the electronic device determines the user's emotional information added to goods information. For example, the electronic device may determine the user's emotional information added to goods information via an emotional tag tagged to each item of goods information. As another example, the electronic device may determine the user's emotional information added to each item of goods information with consideration of metadata for emotional information stored in the data storage 112. Thereafter, the electronic device proceeds to operation 2307. - At
operation 2307, the electronic device displays the goods information including emotional information on the display unit 170. For example, the electronic device may display an emotional icon on goods information to which the user's emotional information has been added in a purchasable goods list. As another example, when displaying detailed information of goods from which the user's emotional information has been extracted among one or more purchasable goods, the electronic device may display the user's emotional information. - According to the above various embodiments of the present disclosure, when using content, the electronic device may extract the user's emotion from image information of the user obtained via a camera.
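For illustration only, annotating a purchasable goods list with emotion icons, as in operation 2307, can be sketched as follows. The record layout and the icon mapping are hypothetical assumptions for this example:

```python
# Illustrative sketch (assumed record layout): attach an emotion icon to
# each entry of a purchasable goods list to which the user's emotional
# information has been added.

ICONS = {"happiness": "\U0001F60A", "sadness": "\U0001F622"}

def annotate_goods(goods):
    """Return the goods list with an 'icon' field where an emotion tag exists."""
    return [dict(item, icon=ICONS.get(item.get("emotion"))) for item in goods]

goods = [{"name": "shoes", "emotion": "happiness"}, {"name": "hat"}]
print(annotate_goods(goods))
```

Entries without emotional information receive no icon, so the list still renders uniformly for untagged goods.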
- According to various embodiments of the present disclosure, the electronic device may measure a stimulus degree of the user's sympathetic nerve and parasympathetic nerve to estimate the user's emotion. In this case, the electronic device may further include a skin electricity measurement sensor for measuring the user's skin electricity in addition to the construction of the
electronic device 100 illustrated in FIG. 1. For example, the electronic device may measure stimulus information of the sympathetic nerve with consideration of the user's skin electricity measured by the skin electricity measurement sensor. For example, the electronic device measures the user's skin electricity using the skin electricity measurement sensor while running a music application. After measuring the user's skin electricity, the emotion extract program 114 of the electronic device may estimate the user's emotional information with consideration of the user's skin electricity measured by the skin electricity measurement sensor. If a skin electricity value exceeds a reference emotional value, the emotion extract program 114 may recognize that the electronic device has extracted the user's emotion for music content. Additionally, the electronic device may add relevant emotional information at a point of extracting the user's emotion while reproducing music content. - According to various embodiments of the present disclosure, the electronic device may estimate the user's emotion with consideration of the user's skin temperature change. In this case, the electronic device may further include a skin temperature measurement sensor for measuring the user's skin temperature in addition to the construction of the
electronic device 100 illustrated in FIG. 1. For example, the electronic device may extract the user's emotion with consideration of the user's skin temperature change measured by the skin temperature measurement sensor. - According to various embodiments of the present disclosure, the electronic device may estimate the user's emotion with consideration of the user's movement pattern measured by a motion sensor such as an acceleration sensor, a gravity sensor, and the like. In this case, the electronic device may further include a motion sensor in addition to the construction of the
electronic device 100 illustrated in FIG. 1. - As described above, the electronic device may estimate the user's emotion while content is in use, add the estimated emotion to the relevant content so as to provide a service corresponding to the user's emotion, and retrieve, classify, and reproduce at least one content item depending on the user's emotion.
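For illustration only, the skin-electricity threshold test described above (a reading exceeding a reference emotional value marks an emotion-extraction point) can be sketched as follows. The sample format and threshold value are assumptions for this example, not values the disclosure specifies:

```python
# Illustrative sketch: flag the points at which a skin-electricity (GSR)
# reading exceeds a reference emotional value, modeling the threshold
# test performed by the emotion extract program. Sample format and
# reference value are assumed for this example.

def emotion_events(gsr_samples, reference=2.5):
    """Return timestamps whose conductance reading exceeds the reference."""
    return [t for t, value in gsr_samples if value > reference]

samples = [(0.0, 1.1), (10.0, 3.2), (20.0, 2.0), (30.0, 4.0)]
print(emotion_events(samples))  # [10.0, 30.0]
```

Each returned timestamp would then be the point at which relevant emotional information is added to the music content being reproduced.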
- It will be appreciated that various embodiments of the present disclosure according to the claims and description in the specification can be realized in the form of hardware, software or a combination of hardware and software.
- Any such software may be stored in a non-transitory computer readable storage medium. The non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
- Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like. It will be appreciated that the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.
- While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/796,120 US20180054564A1 (en) | 2013-01-04 | 2017-10-27 | Apparatus and method for providing user's emotional information in electronic device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130001087A KR102091848B1 (en) | 2013-01-04 | 2013-01-04 | Method and apparatus for providing emotion information of user in an electronic device |
KR10-2013-0001087 | 2013-01-04 | ||
US14/147,842 US9807298B2 (en) | 2013-01-04 | 2014-01-06 | Apparatus and method for providing user's emotional information in electronic device |
US15/796,120 US20180054564A1 (en) | 2013-01-04 | 2017-10-27 | Apparatus and method for providing user's emotional information in electronic device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/147,842 Continuation US9807298B2 (en) | 2013-01-04 | 2014-01-06 | Apparatus and method for providing user's emotional information in electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180054564A1 true US20180054564A1 (en) | 2018-02-22 |
Family
ID=51060690
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/147,842 Active US9807298B2 (en) | 2013-01-04 | 2014-01-06 | Apparatus and method for providing user's emotional information in electronic device |
US15/796,120 Abandoned US20180054564A1 (en) | 2013-01-04 | 2017-10-27 | Apparatus and method for providing user's emotional information in electronic device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/147,842 Active US9807298B2 (en) | 2013-01-04 | 2014-01-06 | Apparatus and method for providing user's emotional information in electronic device |
Country Status (2)
Country | Link |
---|---|
US (2) | US9807298B2 (en) |
KR (1) | KR102091848B1 (en) |
Families Citing this family (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6046931B2 (en) * | 2011-08-18 | 2016-12-21 | キヤノン株式会社 | Imaging apparatus and control method thereof |
KR20140089146A (en) * | 2013-01-04 | 2014-07-14 | 삼성전자주식회사 | Method for providing video communication and an electronic device thereof |
KR102091848B1 (en) * | 2013-01-04 | 2020-03-20 | 삼성전자주식회사 | Method and apparatus for providing emotion information of user in an electronic device |
US9569424B2 (en) * | 2013-02-21 | 2017-02-14 | Nuance Communications, Inc. | Emotion detection in voicemail |
AP00894S1 (en) * | 2013-02-23 | 2017-03-20 | Samsung Electronics Co Ltd | Display screen or portion thereof with animated graphical user interface |
USD776124S1 (en) * | 2013-09-03 | 2017-01-10 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
USD749626S1 (en) * | 2013-09-03 | 2016-02-16 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
US10405786B2 (en) * | 2013-10-09 | 2019-09-10 | Nedim T. SAHIN | Systems, environment and methods for evaluation and management of autism spectrum disorder using a wearable data collection device |
USD750125S1 (en) | 2013-12-30 | 2016-02-23 | Beijing Qihoo Technology Co., Ltd. | Display screen or portion thereof with animated icon for optimizing computer device resources |
KR102302871B1 (en) * | 2014-09-05 | 2021-09-16 | 삼성전자주식회사 | Method and device to monitor and analyzing bio signal of user |
TWI533240B (en) * | 2014-12-31 | 2016-05-11 | 拓邁科技股份有限公司 | Methods and systems for displaying data, and related computer program prodcuts |
CN106331586A (en) * | 2015-06-16 | 2017-01-11 | 杭州萤石网络有限公司 | Smart household video monitoring method and system |
KR102344063B1 (en) * | 2015-06-29 | 2021-12-28 | 엘지전자 주식회사 | Mobile terminal |
KR102361568B1 (en) | 2015-07-28 | 2022-02-10 | 삼성전자주식회사 | Apparatus and method for controlling a display |
USD771653S1 (en) * | 2015-07-29 | 2016-11-15 | Apple Inc. | Display screen or portion thereof with animated graphical user interface |
TWI597980B (en) * | 2015-08-10 | 2017-09-01 | 宏達國際電子股份有限公司 | Video menagement method and system thereof |
CN106502712A (en) | 2015-09-07 | 2017-03-15 | 北京三星通信技术研究有限公司 | APP improved methods and system based on user operation |
USD819042S1 (en) * | 2015-10-07 | 2018-05-29 | MAQUET CARDIOPULMONARY GmbH | Display screen or portion thereof with graphical user interface for a medical device |
CN105867781A (en) * | 2015-10-28 | 2016-08-17 | 乐视移动智能信息技术(北京)有限公司 | Photographic processing method and device, and equipment |
KR20190025549A (en) | 2016-05-06 | 2019-03-11 | 더 보드 어브 트러스티스 어브 더 리랜드 스탠포드 주니어 유니버시티 | Movable and wearable video capture and feedback flat-forms for the treatment of mental disorders |
US10921952B2 (en) | 2016-05-11 | 2021-02-16 | Sap Se | Dynamic button with visual indication of application action result |
USD827657S1 (en) * | 2016-06-03 | 2018-09-04 | Visa International Service Association | Display screen with animated graphical user interface |
USD835670S1 (en) * | 2016-09-22 | 2018-12-11 | Sap Se | Display screen or portion thereof with a graphical user interface having a transitional icon |
USD835671S1 (en) * | 2016-09-22 | 2018-12-11 | Sap Se | Display screen or portion thereof with a graphical user interface having a transitional icon |
USD836134S1 (en) * | 2016-09-22 | 2018-12-18 | Sap Se | Display screen or portion thereof with a graphical user interface having a transitional icon |
USD835669S1 (en) * | 2016-09-22 | 2018-12-11 | Sap Se | Display screen or portion thereof with a graphical user interface having a transitional icon |
USD816698S1 (en) * | 2016-11-10 | 2018-05-01 | Koninklijke Philips N.V. | Display screen with animated graphical user interface |
US10083162B2 (en) | 2016-11-28 | 2018-09-25 | Microsoft Technology Licensing, Llc | Constructing a narrative based on a collection of images |
US10636175B2 (en) * | 2016-12-22 | 2020-04-28 | Facebook, Inc. | Dynamic mask application |
USD851111S1 (en) * | 2017-09-09 | 2019-06-11 | Apple Inc. | Electronic device with graphical user interface |
USD910046S1 (en) * | 2017-09-29 | 2021-02-09 | Apple Inc. | Electronic device with graphical user interface |
CN108039988B (en) * | 2017-10-31 | 2021-04-30 | 珠海格力电器股份有限公司 | Equipment control processing method and device |
USD879122S1 (en) | 2017-11-30 | 2020-03-24 | MAQUET CARDIOPULMONARY GmbH | Display screen or portion thereof with graphical user interface for a clamp display of a cardiopulmonary bypass machine system |
US11249945B2 (en) * | 2017-12-14 | 2022-02-15 | International Business Machines Corporation | Cognitive data descriptors |
USD865000S1 (en) * | 2018-03-08 | 2019-10-29 | Capital One Services, Llc | Display screen with animated graphical user interface |
USD865001S1 (en) * | 2018-03-08 | 2019-10-29 | Capital One Services, Llc | Display screen with animated graphical user interface |
CN108960402A (en) * | 2018-06-11 | 2018-12-07 | 上海乐言信息科技有限公司 | Mixed-strategy emotion soothing system for chatbots |
USD938968S1 (en) | 2018-09-06 | 2021-12-21 | Apple Inc. | Electronic device with animated graphical user interface |
US10891969B2 (en) * | 2018-10-19 | 2021-01-12 | Microsoft Technology Licensing, Llc | Transforming audio content into images |
USD930031S1 (en) * | 2018-12-18 | 2021-09-07 | Spotify Ab | Media player display screen with graphical user interface |
USD910649S1 (en) * | 2018-12-20 | 2021-02-16 | Facebook, Inc. | Display screen with a graphical user interface |
US11157549B2 (en) | 2019-03-06 | 2021-10-26 | International Business Machines Corporation | Emotional experience metadata on recorded images |
USD924912S1 (en) | 2019-09-09 | 2021-07-13 | Apple Inc. | Display screen or portion thereof with graphical user interface |
WO2022230070A1 (en) * | 2021-04-27 | 2022-11-03 | 株式会社I’mbesideyou | Video analysis system |
JPWO2022230069A1 (en) * | 2021-04-27 | 2022-11-03 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090317060A1 (en) * | 2008-06-24 | 2009-12-24 | Samsung Electronics Co., Ltd. | Method and apparatus for processing multimedia |
US9807298B2 (en) * | 2013-01-04 | 2017-10-31 | Samsung Electronics Co., Ltd. | Apparatus and method for providing user's emotional information in electronic device |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7233684B2 (en) * | 2002-11-25 | 2007-06-19 | Eastman Kodak Company | Imaging method and system using affective information |
JP2007036874A (en) | 2005-07-28 | 2007-02-08 | Univ Of Tokyo | Viewer information measurement system and matching system employing same |
US8209182B2 (en) | 2005-11-30 | 2012-06-26 | University Of Southern California | Emotion recognition system |
KR100828371B1 (en) * | 2006-10-27 | 2008-05-08 | 삼성전자주식회사 | Method and Apparatus of generating meta data of content |
JP2009005094A (en) | 2007-06-21 | 2009-01-08 | Mitsubishi Electric Corp | Mobile terminal |
KR101444103B1 (en) * | 2008-03-14 | 2014-09-26 | 삼성전자주식회사 | Media signal generating method and apparatus using state information |
JP4659088B2 (en) | 2008-12-22 | 2011-03-30 | 京セラ株式会社 | Mobile device with camera |
US9277021B2 (en) | 2009-08-21 | 2016-03-01 | Avaya Inc. | Sending a user associated telecommunication address |
KR101708682B1 (en) * | 2010-03-03 | 2017-02-21 | 엘지전자 주식회사 | Apparatus for displaying image and and method for operationg the same |
KR20110115906A (en) * | 2010-04-16 | 2011-10-24 | 엘지전자 주식회사 | Method for displaying contents watching information, display apparatus and remote controller thereof |
KR101679860B1 (en) | 2010-07-14 | 2016-11-25 | 엘지전자 주식회사 | Compressor |
KR20120064563A (en) | 2010-12-09 | 2012-06-19 | 한국전자통신연구원 | Apparatus for controlling facial expression of virtual human using heterogeneous data |
US20130038756A1 (en) * | 2011-08-08 | 2013-02-14 | Samsung Electronics Co., Ltd. | Life-logging and memory sharing |
- 2013-01-04 KR KR1020130001087A patent/KR102091848B1/en active IP Right Grant
- 2014-01-06 US US14/147,842 patent/US9807298B2/en active Active
- 2017-10-27 US US15/796,120 patent/US20180054564A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090317060A1 (en) * | 2008-06-24 | 2009-12-24 | Samsung Electronics Co., Ltd. | Method and apparatus for processing multimedia |
US9210366B2 (en) * | 2008-06-24 | 2015-12-08 | Samsung Electronics Co., Ltd. | Method and apparatus for processing multimedia |
US9807298B2 (en) * | 2013-01-04 | 2017-10-31 | Samsung Electronics Co., Ltd. | Apparatus and method for providing user's emotional information in electronic device |
Also Published As
Publication number | Publication date |
---|---|
US9807298B2 (en) | 2017-10-31 |
KR20140089454A (en) | 2014-07-15 |
US20140192229A1 (en) | 2014-07-10 |
KR102091848B1 (en) | 2020-03-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180054564A1 (en) | Apparatus and method for providing user's emotional information in electronic device |
US11483268B2 (en) | Content navigation with automated curation | |
CN106575361B (en) | Method for providing visual sound image and electronic equipment for implementing the method | |
WO2021017932A1 (en) | Image display method and electronic device | |
US9767359B2 (en) | Method for recognizing a specific object inside an image and electronic device thereof | |
US9449107B2 (en) | Method and system for gesture based searching | |
US10003785B2 (en) | Method and apparatus for generating images | |
US20160132222A1 (en) | Apparatus and method for using blank area in screen | |
US20160127653A1 (en) | Electronic Device and Method for Providing Filter in Electronic Device | |
US10691402B2 (en) | Multimedia data processing method of electronic device and electronic device thereof | |
US20190236450A1 (en) | Multimodal machine learning selector | |
US20160306505A1 (en) | Computer-implemented methods and systems for automatically creating and displaying instant presentations from selected visual content items | |
US10175863B2 (en) | Video content providing scheme | |
US10055813B2 (en) | Electronic device and operation method thereof | |
KR102244248B1 (en) | Operating Method For content and Electronic Device supporting the same | |
KR102376700B1 (en) | Method and Apparatus for Generating a Video Content | |
US11477143B2 (en) | Trending content view count | |
WO2019105457A1 (en) | Image processing method, computer device and computer readable storage medium | |
US11297027B1 (en) | Automated image processing and insight presentation | |
CN105354231A (en) | Image selection method and apparatus, and image processing method and apparatus | |
EP3123352B1 (en) | Data sharing method and electronic device thereof | |
US10691717B2 (en) | Method and apparatus for managing data | |
KR102316846B1 (en) | Method for sorting a media content and electronic device implementing the same | |
CN104951445B (en) | Webpage processing method and device | |
CN108009251A | Image file searching method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |