KR20170034260A - Mobile terminal performing a life log and controlling method thereof - Google Patents
Mobile terminal performing a life log and controlling method thereof
- Publication number
- KR20170034260A (application number KR1020150132685A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- information
- mobile terminal
- image
- electronic device
- Prior art date
Classifications
- H04M1/72522—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Dermatology (AREA)
- General Health & Medical Sciences (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to a mobile terminal for performing a life log and a control method thereof. The mobile terminal includes a wireless communication unit for receiving bio-signal information from a first electronic device, and a control unit that calculates an emotion index of a user based on the bio-signal information, generates a control signal for photographing the user in a set time interval using the calculated emotion index, and transmits the generated control signal to a second electronic device through the wireless communication unit. An image received from the second electronic device is classified into categories according to the emotional state of the user and/or the event information of the image, based on the emotion index and the context information of the image, so that the life log can be performed by a plurality of categories based on the person, voice, and place information included in the image.
Description
The present invention relates to a method for performing a lifelog application and a mobile terminal therefor.
A terminal can be divided into a mobile (portable) terminal and a stationary terminal depending on whether it can be moved. A mobile terminal can further be divided into a handheld terminal and a vehicle-mounted terminal depending on whether the user can carry it directly.
The functions of mobile terminals are diversified. For example, there are data and voice communication, photographing and video shooting through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals are equipped with an electronic game play function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcast and video or television programs.
Such a terminal has various functions and is implemented in the form of a multimedia device with combined functions such as photographing still images and moving pictures, playing music or video files, gaming, and receiving broadcasts.
In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and / or software parts of the terminal. In recent years, such terminals are required to provide useful information to a user by using data from a plurality of wearable devices attached to the user's body.
In this regard, conventional techniques have been limited to providing healthcare to the user based on the user's heart rate or the like.
The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to enable a mobile terminal to perform a life-log based on bio-signals and photographed images from surrounding electronic devices.
According to an aspect of the present invention, there is provided a mobile terminal including: a wireless communication unit for receiving bio-signal information from a first electronic device; and a control unit that calculates an emotion index of a user based on the bio-signal information, generates a control signal for photographing the user in a set time interval using the calculated emotion index, and transmits the generated control signal to a second electronic device through the wireless communication unit, wherein an image received from the second electronic device is classified into categories according to the emotional state of the user and/or the event information of the image, based on the emotion index and the context information of the image, so that the life log can be performed by a plurality of categories based on the person, voice, and place information included in the image.
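The capture-trigger behaviour described above can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the scoring formula, the signal names (`heart_rate_bpm`, `skin_conductance`), and the band thresholds are all assumptions, since the specification fixes no formula for the emotion index.

```python
def emotion_index(heart_rate_bpm, skin_conductance):
    """Map two bio-signals onto a 0-100 emotion index (assumed formula).

    Deviation from a resting heart rate of 70 bpm lowers the score;
    elevated electrodermal activity lowers it further.
    """
    hr_penalty = min(abs(heart_rate_bpm - 70), 50)
    eda_penalty = min(skin_conductance * 10, 50)
    return max(0.0, 100.0 - hr_penalty - eda_penalty)

def capture_control_signal(index, low=30.0, high=70.0):
    """Return a capture control signal when the index leaves [low, high].

    Both strongly negative and strongly positive states trigger a capture,
    since the lifelog records both 'Blue' and 'Red' moments.
    """
    if index <= low or index >= high:
        return {"command": "capture", "emotion_index": index}
    return None  # index in the neutral band: no capture

calm = emotion_index(72, 0.5)       # near resting values: high index (93.0)
stressed = emotion_index(110, 4.0)  # elevated signals: low index (20.0)
```

A controller loop would send the returned dictionary to the second electronic device (the lifelog camera) over the wireless communication unit whenever it is not `None`.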
According to one embodiment, the first electronic device is a wearable device attached to the body of the user, the second electronic device is a lifelog camera, and the context information of the image includes at least one of the facial expression, gesture, voice, and place information of the user included in the image, wherein the control unit can recalculate the emotion index based on the facial expression of the user included in the image.
According to an embodiment, the control unit may control the wireless communication unit to transmit, to a social network server, a request message requesting information on an event related to the time interval that the user has uploaded to the social network server.
According to one embodiment, the information about the event may be information about the person who accompanied the user during the time period, the food consumed, and the place visited.
According to one embodiment, the mobile terminal may further include a display unit that receives an execution input of an application from the user and outputs a screen associated with the application.
According to one embodiment, the screen may include a first area in which an upper menu selectable in the application is displayed, and a second area in which information related to an event related to the selected upper menu is displayed.
According to one embodiment, the upper menu includes at least one of a diary, relationship management, favorite food, and visited place, and when the selected menu is the diary, the information on the event may include the user's emotion index classified by time and at least a portion of the image.
According to an embodiment, when the selected menu is the relationship management, the control unit may detect a contact event in which the distance between at least one of the first and second electronic devices and the mobile terminal of another user is maintained within a predetermined distance for a predetermined time or longer, and may generate a human network map based on the number of occurrences of the contact event during a certain period.
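The contact-event detection in this embodiment can be sketched from a log of distance measurements. This is a hedged illustration: the sampling format, the 2-metre threshold, and the 10-minute minimum duration are assumptions, not values from the specification.

```python
def count_contact_events(samples, max_distance=2.0, min_duration=600):
    """Count contact events in a proximity log.

    A contact event is a run of samples in which the measured distance to
    another user's terminal stays within max_distance (metres) for at
    least min_duration (seconds). `samples` is a list of
    (timestamp_seconds, distance_metres) tuples sorted by time.
    """
    events = 0
    run_start = None
    for t, d in samples:
        if d <= max_distance:
            if run_start is None:
                run_start = t  # a close-proximity run begins
        else:
            # run ended: count it if it lasted long enough
            if run_start is not None and t - run_start >= min_duration:
                events += 1
            run_start = None
    # close a run that lasts to the end of the log
    if run_start is not None and samples and samples[-1][0] - run_start >= min_duration:
        events += 1
    return events
```

The resulting per-person event counts would feed the human network map described above.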
According to one embodiment, when the selected menu is the favorite food, the control unit controls the display unit to display the foods and the places visited by the user in the order of the recalculated emotion index, and the information about the food and the visited place can be extracted based on the image from the second electronic device or the event uploaded to the social network server.
According to an embodiment, when the selected menu is the visited place, the control unit controls the display unit to display the places visited by the user and the events at those places in the order of the recalculated emotion index, and the information about the visited place and the event at the place can be extracted based on the image from the second electronic device or the event uploaded to the social network server.
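Both of these menus reduce to the same operation: sorting lifelog entries in descending order of the recalculated emotion index. A minimal sketch, with assumed field names (`name`, `emotion_index`):

```python
def rank_by_emotion(entries):
    """Sort lifelog entries (dicts carrying an 'emotion_index' field) in
    descending order of the recalculated emotion index, as the
    favorite-food and visited-place menus do."""
    return sorted(entries, key=lambda e: e["emotion_index"], reverse=True)

visits = [
    {"name": "park", "emotion_index": 55},
    {"name": "restaurant", "emotion_index": 91},
    {"name": "office", "emotion_index": 20},
]
print([v["name"] for v in rank_by_emotion(visits)])
# → ['restaurant', 'park', 'office']
```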
According to an embodiment of the present invention, the control unit may determine whether the recalculated emotion index is equal to or less than a first reference value at which the user feels bad, and if so, information related to at least a part of the image in a time interval in which the emotion index is equal to or greater than a second reference value at which the user feels good, or the contact of another user having a high connection score based on the relationship management, may be displayed on the display unit or on the display unit of the first electronic device.
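The mood-recovery response in this embodiment can be sketched as a simple selection rule. The preference order (pleasant image first, then the best-scored contact) and the threshold value are illustrative assumptions:

```python
def low_mood_response(index, pleasant_images, contacts, first_reference=30.0):
    """When the recalculated emotion index falls to or below the first
    reference value, return content meant to improve the user's mood:
    an image from a high-index interval if one exists, otherwise the
    contact with the highest connection score."""
    if index > first_reference:
        return None  # the user feels fine: nothing to display
    if pleasant_images:
        return {"type": "image", "payload": pleasant_images[0]}
    best = max(contacts, key=lambda c: c["score"])
    return {"type": "contact", "payload": best["name"]}
```

The returned item would be shown on the mobile terminal's display unit or pushed to the first electronic device (the wearable).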
According to another aspect of the present invention, there is provided an electronic device including: a wireless communication unit for transmitting bio-signal information to a mobile terminal; a control unit that generates a control signal for requesting, from the mobile terminal, a message including information capable of changing the emotion index of the user calculated based on the bio-signal information and an image of the user, and transmits the generated control signal through the wireless communication unit; and a display unit for displaying the message received from the mobile terminal.
According to an embodiment of the present invention, the message may include information associated with at least a part of the image in a time interval in which the emotion index is equal to or greater than a second reference value at which the user feels good, or the contacts of other users with high connection scores based on the relationship management.
According to another aspect of the present invention, there is provided a method for performing a life log of a mobile terminal, the method including: receiving bio-signal information from a first electronic device; a control signal generating step of determining the user's emotional state based on the bio-signal information and generating a control signal for controlling a second electronic device based on the determined emotional state; receiving, from the second electronic device, an image captured by the second electronic device using the control signal; and an emotion index recalculation and judgment process of recalculating the emotion index based on the facial expression of the user included in the image.
According to an exemplary embodiment, the method may further include a lifelog information extracting process of extracting the person, voice, and place information included in the image, and a life log execution process of performing the life log based on at least one of the extracted person, voice, and place information.
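Once person and place information have been extracted, building the multi-category lifelog is an indexing step. A minimal sketch, assuming extraction (face and voice recognition) has already run and that images carry `id`, `place`, and `persons` fields (all illustrative names):

```python
from collections import defaultdict

def build_lifelog(images):
    """Index captured images by the person and place information extracted
    from them, so the lifelog can be browsed by multiple categories."""
    by_person = defaultdict(list)
    by_place = defaultdict(list)
    for img in images:
        by_place[img["place"]].append(img["id"])
        for person in img["persons"]:
            by_person[person].append(img["id"])
    return {"person": dict(by_person), "place": dict(by_place)}
```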
Effects of the mobile terminal and the control method according to the present invention will be described as follows.
According to at least one of the embodiments of the present invention, the mobile terminal of the present invention, including a wireless communication unit, a control unit, and a display unit, has the advantage of being able to perform a life log using bio-signals and photographed images from the first and second electronic devices.
According to at least one embodiment of the present invention, there is an advantage that the life log can be performed by a plurality of categories based on the person, voice, and place information included in the photographed image.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention in different directions.
FIG. 2 is a perspective view illustrating an example of a watch-type mobile terminal according to another embodiment of the present invention.
FIG. 3 is a block diagram of a mobile terminal for performing the life log according to the present invention.
FIG. 4 illustrates a screen of a mobile terminal that performs a life log according to the first embodiment of the present invention.
FIG. 5 shows a screen of a mobile terminal performing a life log according to a second embodiment of the present invention.
FIG. 6 illustrates a screen of a mobile terminal performing a life log according to the third embodiment of the present invention.
FIG. 7 shows a screen of a mobile terminal performing a life log according to the fourth embodiment of the present invention.
FIG. 8 shows a screen of a mobile terminal performing a life log according to the fifth embodiment of the present invention.
FIG. 9 shows a block diagram of an electronic device for performing the life log according to the present invention.
FIG. 10 shows a screen of an electronic device that performs a life log according to an embodiment of the present invention.
FIG. 11 is a flowchart of a method of performing a life log of a mobile terminal according to the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably in consideration of ease of specification, and do not by themselves have distinct meanings or roles. In the following description of the embodiments, a detailed description of related known arts will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein, and the technical idea disclosed herein is not limited by them; the invention should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, the terms "comprises", "having", and the like are used to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
However, it will be appreciated by those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage, except where applicable only to mobile terminals.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of a mobile terminal according to the present invention viewed from different directions.
In addition to the operations related to the application program, the control unit typically controls the overall operation of the mobile terminal.
At least some of the components may operate in cooperation with one another to implement an operation, control, or control method of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory.
Hereinafter, the various components of the
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
Meanwhile, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen in proximity without touching it is referred to as a "proximity touch", and the act of an object actually touching the touch screen is referred to as a "contact touch". The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched.
The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) by using at least one of various touch methods such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.
For example, the touch sensor may be configured to convert a change in a pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion to an electrical input signal. The touch sensor may be configured to detect a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like where a touch object touching the touch screen is touched on the touch sensor. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit.
On the other hand, the touch sensors and proximity sensors discussed above can be used independently or in combination to sense various types of touches, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
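As a rough illustration of how a few of these touch types might be distinguished, the sketch below classifies a completed gesture from its duration and total finger travel. The thresholds and the two-feature model are assumptions; real touch frameworks use richer state machines covering multi-touch and velocity.

```python
def classify_touch(duration_s, path_length_px,
                   long_press_s=0.5, drag_threshold_px=20):
    """Minimal classifier for three of the touch types listed above.

    A gesture whose finger travel exceeds drag_threshold_px is a drag;
    otherwise duration decides between a short and a long touch.
    """
    if path_length_px >= drag_threshold_px:
        return "drag touch"
    return "long touch" if duration_s >= long_press_s else "short touch"
```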
The ultrasonic sensor can recognize position information of an object to be sensed by using ultrasonic waves.
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The identification module is a chip for storing various information for authenticating the usage right of the mobile terminal.
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
FIG. 2 is a perspective view showing an example of a watch-type mobile terminal according to another embodiment of the present invention.
Here, a sound signal may be output through the audio output unit.
As described above, there is an increasing need to provide useful information to the user using data from a number of wearable devices attached to the body of the user.
In order to solve such a problem, the present invention provides a mobile terminal (100) operating with a watch part (200), a wireless communication unit (110) for receiving information from a wearable camera, and a control unit (180).
Referring to FIG. 3, the mobile terminal of the present invention includes a wireless communication unit, a control unit, and a display unit, and performs the life log using bio-signals and photographed images from the first and second electronic devices.
FIG. 3 is a block diagram of a mobile terminal for performing the life log according to the present invention.
Here, the photographed image may include voices of the persons included in the image.
Here, the information on the event may be information on the person who accompanied the user during the time period, the food consumed, and the places visited.
FIGS. 4 to 9 illustrate a mobile terminal or an electronic device that performs a life log according to embodiments of the present invention. The life log may be performed by the mobile terminal, by an external server, or by both in combination.
FIG. 4 illustrates a screen of a mobile terminal that performs a life log according to the first embodiment of the present invention.
Here, the screen includes a first area 155 in which an upper menu selectable in the application is displayed, and a second area 156 in which information related to an event related to the selected upper menu is displayed. The upper menu may include at least one of diary, relationship management, favorite food, and visited place, and icons corresponding to these menus are displayed in the first area 155.
Referring to FIG. 4, when the selected menu is the diary, at least a part of the images photographed through the second electronic device in a time interval in which the emotion index is calculated to be less than the first reference value, at which the user feels bad, can be displayed on items marked 'Blue'.
In addition, images corresponding to at least a part of the images photographed through the second electronic device in a time interval in which the emotion index is higher than the second reference value, at which the user feels good, can be displayed on items marked 'Red'.
The images displayed on the 'Red' and 'Blue' items may be listed in chronological order or in the order of the emotion index. In addition, the images may be stored in at least one of the storage space of the mobile terminal and an external server.
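The 'Blue'/'Red' split described above can be sketched as a threshold filter followed by a chronological sort. The field names and threshold values are illustrative assumptions:

```python
def diary_items(images, first_reference=30.0, second_reference=70.0):
    """Split captured images into 'Blue' diary items (emotion index at or
    below the first reference value) and 'Red' items (at or above the
    second), each listed in chronological order."""
    blue = sorted((i for i in images if i["emotion_index"] <= first_reference),
                  key=lambda i: i["time"])
    red = sorted((i for i in images if i["emotion_index"] >= second_reference),
                 key=lambda i: i["time"])
    return {"Blue": blue, "Red": red}
```

Swapping the sort key to `i["emotion_index"]` gives the alternative emotion-index ordering mentioned above.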
FIG. 5 shows a screen of a mobile terminal performing a life log according to a second embodiment of the present invention.
Referring to FIG. 5(a), if the upper menu selected in the first area 155 is the 'diary', the information on the event displayed in the second area 156 may include the user's emotion index and at least a portion of the image.
For example, the emotion index may be displayed as a score for each day and night, and the score of the time interval in which the user felt best during the day or night may be displayed so as to be distinguished from the other scores. Also, a part of the image (e.g., a highlight image) photographed through the second electronic device in that time interval may be displayed.
According to FIG. 5(b), the emotion index can be displayed on a radial schedule table for each day and night time displayed in the second area.
FIG. 6 illustrates a screen of a mobile terminal performing a life log according to the third embodiment of the present invention.
Referring to FIG. 6(a), when the selected menu is the relationship management, the control unit generates a human network map and displays it in the second area.
Meanwhile, the human network map can be generated using person information recognized through face recognition or voice recognition of the persons included in the images photographed by the second electronic device. The map can be displayed in two-dimensional or three-dimensional form, and the distance to each acquaintance displayed around 'I' is determined on the basis of the number of occurrences of contact events, the number of contacts through telephone calls or message exchanges, and intimacy information. As the number of contact events, the number of contacts, and the intimacy increase, the distance to the acquaintance decreases in inverse proportion.
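The inverse relation between closeness signals and map distance can be sketched with a single formula. The specification fixes no formula, so the function below is only one plausible choice; the `scale` constant and the equal weighting of the three signals are assumptions:

```python
def map_distance(contact_events, calls_and_messages, intimacy, scale=100.0):
    """Distance from 'I' to an acquaintance on the human network map,
    shrinking as contact events, call/message exchanges, and intimacy grow
    (an assumed inverse-proportion relation)."""
    return scale / (1.0 + contact_events + calls_and_messages + intimacy)
```

Any strictly decreasing function of the three signals would satisfy the behaviour described above; this one simply keeps distances positive and bounded by `scale`.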
In addition, it is possible to classify the acquaintances in groups on the basis of 'I', and the acquaintances can be managed on a group basis.
According to FIG. 6(b), if the icon corresponding to an acquaintance is touched or clicked, the videos including that acquaintance may be displayed in order of favorability. Also, if the icon is touched or clicked in a different manner (e.g., a long press), moving images including members of the group to which the acquaintance belongs may be displayed in order of intimacy.
FIG. 7 shows a screen of a mobile terminal performing a life log according to the fourth embodiment of the present invention.
Referring to FIG. 7, if the selected menu is the favorite food, the control unit controls the display unit to display the foods and the places visited by the user in the order of the recalculated emotion index.
Here, the information on the food and the place visited can be extracted based on the image from the second electronic device or the event uploaded to the social network server.
The image of the consumed food shown in FIG. 7 can be obtained from an image from the second electronic device or from an event uploaded to the social network server.
On the other hand, each food item may include an image of the food, an emotion index, location information, and a distance from the user. Also, when a touch input from a user is received in the area corresponding to the food item, a web page for the restaurant or a map from the current location to the restaurant may be displayed. Further, when a touch input from a user is received in an area for an image related to the food, a moving picture photographed at the time of visiting a restaurant may be reproduced.
FIG. 8 shows a screen of a mobile terminal performing a life log according to the fifth embodiment of the present invention.
Referring to FIG. 8, when the selected menu is the visited place, the control unit controls the display unit to display the places visited by the user and the events at those places in the order of the recalculated emotion index.
Here, the information about the visited place and the event at the place can be extracted based on the image from the second electronic device or the event uploaded to the social network server.
The image of the visited place shown in FIG. 8 may include at least one of the image from the second electronic device and an image from an event uploaded to the social network server.
On the other hand, the item of each visited place may include an image related to the place, an emotion index, and event information. Further, when a touch input from the user is received in an area corresponding to the item of the visited place, a web page about the place of visit, a map from the current position to the place of visit, detailed event information, and the like may be displayed. In addition, when a touch input from the user is received in an area for the image of the visited place, the photographed moving image at the place can be reproduced.
FIG. 9 shows a block diagram of an electronic device for performing the life log according to the present invention.
Here, if the emotion index is equal to or less than the first reference value, at which the user feels bad, the message can be received. The message may include information associated with at least a part of the image in a time interval in which the emotion index is equal to or greater than the second reference value, at which the user feels good, or the contacts of other users with high connection scores based on the relationship management.
FIG. 10 shows a screen of an electronic device that performs a life log according to an embodiment of the present invention.
Referring to FIG. 10(a), the control unit of the mobile terminal can determine whether the recalculated emotion index is below the first reference value, at which the user feels bad. At this time, the control unit of the mobile terminal can transmit a message for improving the user's mood to the electronic device.
Referring to FIG. 10(b), the control unit of the mobile terminal can assign a deduction point to other persons identified within a time interval in which the emotion index is equal to or less than the first reference value. If the accumulated deduction points are equal to or greater than a reference value, the control unit of the mobile terminal can display the contact of the identified person (e.g., a disliked friend) on the display unit.
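The deduction-point bookkeeping described here can be sketched in two small functions. The per-interval deduction of 1 point and the reference value of 3 are illustrative assumptions:

```python
def update_penalties(penalties, identified_persons, deduction=1):
    """Add a deduction point to every person identified within a time
    interval in which the emotion index was at or below the first
    reference value. `penalties` maps person -> accumulated points."""
    for person in identified_persons:
        penalties[person] = penalties.get(person, 0) + deduction
    return penalties

def persons_to_mute(penalties, reference=3):
    """Persons whose accumulated deductions reach the reference value;
    for these persons the SNS news update would be declined."""
    return sorted(p for p, pts in penalties.items() if pts >= reference)
```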
FIG. 11 is a flowchart of a method of performing a life log of a mobile terminal according to the present invention.
The life log performing method includes a biological signal information receiving step S1110, a control signal generating step S1120, a photographed image receiving step S1130, and an emotion index re-calculation and judgment step S1140. In addition, the life log executing method may further include a life log information extracting process (S1150) and a life log executing process (S1160).
The emotion index recalculation and determination process (S1140), the life log information extraction process (S1150), and the life log execution process (S1160) may be performed by the mobile terminal, by an external server, or divided between the mobile terminal and the external server. This is because performing at least a part of these processes in the external server can shorten the processing time and reduce the power consumption compared to performing them in the mobile terminal.
The bio-signal information receiving step (S1110) receives bio-signal information from the first electronic device.
The control signal generation step (S1120) determines the user's emotional state based on the bio-signal information and generates a control signal for controlling the second electronic device based on the determined emotional state. Specifically, an emotion index may be calculated based on the bio-signal information, and a control signal may be generated for controlling the second electronic device to photograph the user in a set time interval using the calculated emotion index.
The photographed image receiving step S1130 receives an image photographed by the user at a time interval in which the calculated emotion index is out of a specific range, from the second electronic device.
The emotion index re-calculation and judgment process (S1140) recalculates the emotion index based on the facial expression of the user included in the image. Also, it is determined whether the recalculated emotion index is out of a specific range.
The life log information extraction process S1150 extracts the person, voice, and place information included in the image.
The life log execution step S1160 performs a life log based on a plurality of criteria for each time interval based on at least one of the extracted person, voice, and place information.
The plurality of criteria includes at least one of a diary, relationship management, favorite foods, visited places, and friend recommendation/rejection, and one of these may be selected.
If the selected criterion is the diary, the performed lifelog includes at least a part of the user's emotion indices and images classified by time.
If the selected criterion is relationship management, the performed lifelog is a relationship map generated from the number of occurrences of contact information and of persons identified through text and voice analysis of the stored images.
If the selected criterion is favorite foods, the performed lifelog presents the foods consumed and the places visited by the user in order of emotion index.
If the selected criterion is visited places, the performed lifelog presents the places visited by the user and the events at those places in order of emotion index.
If the selected criterion is friend recommendation/rejection, the performed lifelog may display the contact information of well-liked friends according to the emotion index, or suppress SNS feed updates from friends the user dislikes.
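Two of the criteria above can be sketched in a few lines: a relationship map built from how often each person appears across the stored images, and a favorite-foods ranking ordered by emotion index. The record layouts (`people`, `food`, `emotion_index` keys) are assumptions made for illustration.

```python
from collections import Counter

# Hypothetical sketch of two lifelog criteria from step S1160.

def relationship_map(images: list[dict]) -> list[tuple[str, int]]:
    """Relationship-management criterion: rank people by how often they
    appear across the stored image records."""
    counts = Counter(p for img in images for p in img.get("people", []))
    return counts.most_common()

def favorite_foods(entries: list[dict]) -> list[dict]:
    """Favorite-foods criterion: order food/place entries by the emotion
    index recorded when they were consumed, highest first."""
    return sorted(entries, key=lambda e: e["emotion_index"], reverse=True)
```

A real system would populate the records from face recognition, voice analysis, and place tags; here they are passed in as plain dictionaries so the ranking logic stands alone.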
It should be noted that the respective processes shown in FIG. 11 can be combined with the features of the configurations and methods described above with reference to the preceding figures.
As described above, according to at least one embodiment of the present invention, the mobile terminal including the wireless communication unit, the control unit, and the display unit has the advantage of being able to perform a lifelog using the bio-signals and captured images received from the first and second electronic devices.
According to at least one embodiment of the present invention, there is also the advantage that the lifelog can be organized into a plurality of categories based on the person, voice, and place information included in the captured images.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit of the terminal.
100: mobile terminal 110: wireless communication unit
151: display unit 155: first area
156: second area 200: first electronic device
300: second electronic device
Claims (15)
A wireless communication unit for receiving vital signal information from a first electronic device; And
A control unit for calculating an emotion index of the user based on the bio-signal information, generating a control signal for photographing the user at a predetermined time interval using the calculated emotion index, and transmitting the control signal to a second electronic device through the wireless communication unit,
Wherein the image is classified into categories according to the emotional state of the user and/or the event information of the image, based on the emotion index and the context information of the image.
Wherein the first electronic device is a wearable device attached to the user's body, the second electronic device is a lifelog camera,
Wherein the context information of the image includes at least one of facial expression, gesture, and voice information of the user included in the image, and place information of the image,
Wherein the emotion index is re-computed based on a facial expression of the user included in the image.
Wherein the control unit controls the wireless communication unit to transmit, to a social network server, a request message requesting information on events during the time period uploaded to the social network server by the user.
Wherein the information on the events is information on the people who accompanied the user during the time period, the food consumed, and the places visited.
Further comprising: a display unit for receiving an execution input of an application from the user and outputting a screen associated with the application.
Wherein the screen includes a first area in which an upper menu selectable in the application is displayed and a second area in which information related to an event associated with the selected upper menu is displayed.
Wherein the upper menu includes at least one of a diary, relationship management, favorite foods, and visited places, and when the selected menu is the diary, the information on the events includes at least a part of the user's emotion indices and images classified by time.
When the selected menu is relationship management, the control unit determines, using at least one of the first electronic device, the second electronic device, and the mobile terminal, whether a contact event has occurred in which the distance to another person remains within a predetermined distance for a predetermined time or longer, and generates a relationship map based on the number of occurrences of the contact event over a certain period.
When the selected menu is favorite foods, the control unit controls the display unit to display the foods consumed and the places visited by the user in order of the recalculated emotion index,
Wherein the information on the consumed foods and the visited places is extracted from images from the second electronic device or from events uploaded to the social network server.
When the selected menu is visited places, the control unit controls the display unit to display the places visited by the user and the events at those places in order of the recalculated emotion index,
Wherein the information on the visited places and the events at those places is extracted from images from the second electronic device or from events uploaded to the social network server.
Wherein the control unit determines whether the recalculated emotion index is less than or equal to a first reference value, at which the user is judged to feel bad, and if so, controls the display unit of the mobile terminal or the display unit of the first electronic device to display at least a part of the information related to images from time intervals in which the emotion index is greater than or equal to a second reference value, at which the user is judged to feel good, or the contact information of another user having a high relationship score based on the relationship management.
A wireless communication unit for transmitting vital signal information to a mobile terminal;
A control unit for generating a control signal requesting, from the mobile terminal, a message including information capable of changing an emotion index of the user calculated based on the bio-signal information and an image of the user, and transmitting the generated control signal through the wireless communication unit; And
And a display unit for displaying the message received from the mobile terminal.
Wherein the message includes, when the emotion index is less than or equal to a first reference value, at which the user is judged to feel bad, information associated with at least a part of the images from time intervals in which the emotion index is greater than or equal to a second reference value, at which the user is judged to feel good, or the contact information of another user having a high relationship score based on the relationship management.
A biological signal information receiving step of receiving vital signal information from a first electronic device;
A control signal generating step of determining a user's emotional state based on the bio-signal information and generating a control signal for controlling the second electronic device based on the determined emotional state;
A captured image receiving step of receiving, from the second electronic device, an image of the user captured using the control signal; And
And an emotion index recalculation and determination step of recalculating the emotion index based on a facial expression of the user included in the image.
A life log information extraction process for extracting the person, voice, and place information included in the image; And
And performing a lifelog based on a plurality of criteria for each time interval based on at least one of the extracted person, voice, and place information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150132685A KR101777609B1 (en) | 2015-09-18 | 2015-09-18 | Mobile terminal perform a life log and controlling method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150132685A KR101777609B1 (en) | 2015-09-18 | 2015-09-18 | Mobile terminal perform a life log and controlling method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170034260A true KR20170034260A (en) | 2017-03-28 |
KR101777609B1 KR101777609B1 (en) | 2017-09-13 |
Family
ID=58495663
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150132685A KR101777609B1 (en) | 2015-09-18 | 2015-09-18 | Mobile terminal perform a life log and controlling method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101777609B1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101878155B1 (en) * | 2017-12-19 | 2018-07-13 | 허수범 | Method for controlling of mobile terminal |
KR20190023852A (en) * | 2017-08-30 | 2019-03-08 | (주)와이브레인 | Method of predicting mentality by associating response data and context data and device implementing thereof |
KR20190038958A (en) * | 2017-10-01 | 2019-04-10 | (주)씨어스테크놀로지 | Apparatus for presenting Life-Log, method thereof and computer recordable medium storing program to perform the method |
KR20190057236A (en) * | 2017-08-30 | 2019-05-28 | (주)와이브레인 | Method of predicting mentality by associating response data and context data and device implementing thereof |
KR20210007749A (en) | 2019-07-12 | 2021-01-20 | 이성애 | Inflatable balloon hanger |
WO2021256889A1 (en) * | 2020-06-19 | 2021-12-23 | 주식회사 코클리어닷에이아이 | Lifelog device utilizing audio recognition, and method therefor |
EP4202578A1 (en) * | 2021-12-22 | 2023-06-28 | Swatch Ag | Method and system for keeping a first user and a second user continuously informed of their respective emotional states |
US11974250B2 (en) | 2021-12-22 | 2024-04-30 | Swatch Ag | Method and system for keeping a first user and a second user continuously informed of their respective emotional states |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4200370B2 (en) * | 2003-08-12 | 2008-12-24 | ソニー株式会社 | Recording apparatus, recording / reproducing apparatus, reproducing apparatus, recording method, recording / reproducing method, and reproducing method |
JP5515192B2 (en) * | 2005-02-17 | 2014-06-11 | セイコーエプソン株式会社 | Image recording apparatus, image recording method, and control program |
KR100903348B1 (en) * | 2007-11-28 | 2009-06-23 | 중앙대학교 산학협력단 | Emotion recognition mothod and system based on feature fusion |
2015-09-18: KR application KR1020150132685A granted as patent KR101777609B1 (active, IP Right Grant)
Also Published As
Publication number | Publication date |
---|---|
KR101777609B1 (en) | 2017-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101777609B1 (en) | Mobile terminal perform a life log and controlling method thereof | |
US20180097925A1 (en) | Mobile terminal | |
KR101971736B1 (en) | Mobile terminal and method for controlling the same | |
KR20170029978A (en) | Mobile terminal and method for controlling the same | |
KR20170081391A (en) | Mobile terminal and method for controlling the same | |
KR20170002038A (en) | Mobile terminal | |
KR20160071263A (en) | Mobile terminal and method for controlling the same | |
KR20180012751A (en) | Wearable terminal that displays optimized screen according to the situation | |
US20180249056A1 (en) | Mobile terminal and method for controlling same | |
KR20170006014A (en) | Mobile terminal and method for controlling the same | |
KR102157598B1 (en) | Mobile terminal and method for controlling the same | |
KR20170082036A (en) | Mobile terminal | |
KR20160007051A (en) | Mobile terminal and method for controlling the same | |
KR20170024445A (en) | Mobile terminal and method for controlling the same | |
KR20170022690A (en) | Mobile terminal and method for controlling the same | |
KR20170071017A (en) | Mobile terminal and method for controlling the same | |
KR20160125647A (en) | Mobile terminal and method for controlling the same | |
KR20160043842A (en) | Mobile terminal | |
KR20160142671A (en) | Watch type mobile terminal and method for controlling the same | |
KR102179818B1 (en) | Mobile terminal and method for controlling the same | |
KR20180079051A (en) | Mobile terninal and method for controlling the same | |
KR20170141847A (en) | Wearable device and method for controlling the same | |
KR20170049116A (en) | Mobile terminal and method for controlling the same | |
KR20170059684A (en) | Mobile terminal | |
KR20160068390A (en) | Mobile terminal and method for controlling the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |