KR101748670B1 - Mobile terminal and emotion sharing method thereof - Google Patents
- Publication number
- KR101748670B1 (application KR1020150164952A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- mobile terminal
- mode
- touch
- emotional
- Prior art date
Classifications
- H04N5/23216
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
- G06Q50/30
- H04N5/23293
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Human Resources & Organizations (AREA)
- Primary Health Care (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Marketing (AREA)
- Operations Research (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to a mobile terminal and an emotion sharing method thereof that can share emotion and atmosphere among users in real time. The method comprises: recognizing a user included in a preview screen in a shooting mode; analyzing previously stored pictures based on the recognized user to determine a person to share with, and outputting emotional content asking whether to share the current situation with the determined person; and performing a real-time connection with the sharing target according to the user's response to the emotional content.
Description
The present invention relates to a mobile terminal and its emotional sharing method that can share emotion and atmosphere among users in real time.
A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.
The functions of mobile terminals are diversified. For example, there are data and voice communication, photographing and video shooting through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals are equipped with an electronic game play function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcast and video or television programs.
As such functions become diversified, the terminal is implemented in the form of a multimedia player having complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.
In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and / or software parts of the terminal.
Typically, users save photos taken with the camera and upload them to a blog or social network service (SNS) to share with others when needed. In addition, users share text, photos, music, videos, and favorite sites with each other through social network services that keep them in touch with colleagues, friends, and loved ones. In particular, with the proliferation of smartphones, many people take pictures of their daily life anytime and anywhere and share them through SNS services.
However, while the conventional sharing method described above is more advanced than sharing pictures by e-mail or MMS (Multimedia Messaging Service), it has not moved far beyond a simple photo-sharing function: it merely shares a fragmentary image rather than sharing mood and emotion in real time.
It is an object of the present invention to provide a mobile terminal and an emotion sharing method thereof that can share the atmosphere and emotion felt by a user in real time.
It is another object of the present invention to provide a mobile terminal and an emotion sharing method thereof that can analyze the data history of a user and recommend emotional content according to the user's situation.
According to an aspect of the present invention, there is provided a mobile terminal including: a display unit configured to display a preview screen; and a control unit configured to recognize a user included in the preview screen upon entering a shooting mode, analyze previously stored pictures to determine a sharing target, and output emotional content for sharing the emotion felt by the user at the current location with the sharing target.
According to another aspect of the present invention, there is provided a method of sharing emotion in a mobile terminal, the method including: recognizing a user included in a preview screen in a shooting mode; analyzing previously stored pictures based on the recognized user to determine a person to share with, and outputting emotional content asking whether to share the current situation with the determined person; and performing a real-time connection with the sharing target according to the user's response to the emotional content.
According to the present invention, when a predetermined event (photographing, chatting, and the like) occurs, the emotional agent function is activated and emotional content for sharing the content generated by the event in real time or non-real time is provided, so that emotion can be shared with other users in real time, and the content generated by the event can be processed more conveniently by inputting a response to the provided emotional content by voice.
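For illustration only, the event-to-connection flow summarized above can be sketched in plain Kotlin. The class and helper names (EmotionAgent, recognizeUsersInPreview, determineSharingTarget, askToShare, connectRealTime) are hypothetical and not taken from the patent; the sketch merely mirrors the stated sequence of steps under those assumptions.

```kotlin
// Minimal sketch of the described flow, assuming hypothetical helpers for face
// recognition, history analysis, emotional-content output and real-time connection.
enum class Event { PHOTO_SHOOTING, CHATTING }

data class User(val name: String)

class EmotionAgent(
    private val recognizeUsersInPreview: () -> List<User>,         // e.g. face recognition on the preview screen
    private val determineSharingTarget: (List<User>) -> User?,     // analyze stored photos / chat history
    private val askToShare: (User) -> Boolean,                      // output emotional content, read voice/touch reply
    private val connectRealTime: (User) -> Unit                     // video call, phone call or chat
) {
    fun onEvent(event: Event) {
        val participants = recognizeUsersInPreview()                // step 1: recognize users in the current mode
        val target = determineSharingTarget(participants) ?: return // step 2: pick an absent but intimate user
        if (askToShare(target)) {                                   // step 3: emotional content and user's response
            connectRealTime(target)                                 // step 4: establish the real-time connection
        }
    }
}
```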
FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention, viewed from different directions.
FIG. 2 is a flowchart illustrating an emotion sharing method of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating sharing of emotion in a shooting mode according to an embodiment of the emotion sharing method of a mobile terminal according to the present invention.
FIG. 4 is an example of previously photographed and stored pictures.
FIG. 5 illustrates an embodiment of providing emotional content in the form of text in the present invention.
FIG. 6 illustrates an embodiment in which emotion is shared in real time using a video call in the present invention.
FIG. 7 illustrates an embodiment of sharing photographs taken during emotion sharing in the present invention.
FIG. 8 illustrates an embodiment of sharing photographs related to the captured image during emotion sharing in the present invention.
FIG. 9 illustrates an embodiment in which additional pictures requested by the emotion-sharing counterpart are transmitted.
FIG. 10 is a diagram illustrating sharing of emotion in a chat mode according to an embodiment of the emotion sharing method of a mobile terminal according to the present invention.
FIG. 11 is a diagram illustrating sharing of emotion in a self-camera mode according to an embodiment of the emotion sharing method of a mobile terminal according to the present invention.
FIG. 12 illustrates an embodiment of providing shared content in the emotion sharing method according to the present invention.
Hereinafter, embodiments disclosed herein will be described in detail with reference to the accompanying drawings, and the same or similar elements are given the same reference numerals regardless of the figure numbers, with redundant description thereof omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not in themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known technologies will be omitted when it is determined that such description may obscure the gist of the embodiments disclosed herein. The accompanying drawings are provided only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" or "having" are intended to specify the presence of a stated feature, number, step, operation, element, component, or combination thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
However, it will be readily appreciated by those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage, except for cases applicable only to a mobile terminal.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention, viewed from different directions.
The mobile terminal 100 may include components such as a wireless communication unit, an input unit (including a camera and a microphone), a sensing unit, an output unit (including the display unit and a sound output unit), an interface unit, a memory, a control unit, and a power supply unit. The components shown in FIG. 1A are not essential to implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.
At least some of these components may operate in cooperation with one another to implement an operation, control, or control method of a mobile terminal according to the various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory.
Hereinafter, the various components of the mobile terminal will be described in more detail.
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
Wireless Internet technologies include, for example, wireless LAN (WLAN), wireless fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A); the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
For convenience of explanation, the act of recognizing that an object is positioned on the touch screen in proximity without contacting it is referred to as a "proximity touch", and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch". The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched.
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) by using at least one of various touch methods such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor, and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 180.
The touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches applied to the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
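As a rough illustration only (the thresholds, type names, and data fields below are assumptions for this sketch, not values from the patent), such touch types could be distinguished in software from the contact duration, travel distance, and number of pointers:

```kotlin
// Illustrative classification of a completed touch by duration, travel and pointer count.
// All thresholds are arbitrary assumptions used only for this sketch.
data class TouchSample(
    val downTimeMs: Long,     // time the pointer went down
    val upTimeMs: Long,       // time the pointer was lifted
    val travelPx: Float,      // total distance moved while down
    val pointerCount: Int     // number of simultaneous pointers
)

enum class TouchType { SHORT_TOUCH, LONG_TOUCH, DRAG_OR_SWIPE, MULTI_TOUCH }

fun classify(t: TouchSample): TouchType = when {
    t.pointerCount > 1 -> TouchType.MULTI_TOUCH              // pinch-in/out would refine this by pointer spacing
    t.travelPx > 50f -> TouchType.DRAG_OR_SWIPE              // flick vs. drag could be split by release velocity
    t.upTimeMs - t.downTimeMs >= 500 -> TouchType.LONG_TOUCH
    else -> TouchType.SHORT_TOUCH
}
```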
The ultrasonic sensor can recognize position information of a sensing object by using ultrasonic waves.
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The identification module is a chip that stores various kinds of information for authenticating the usage authority of the mobile terminal.
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Referring to FIGS. 1B and 1C, the disclosed mobile terminal 100 has a bar-shaped terminal body.
Here, the terminal body can be understood as a concept referring to the mobile terminal 100 regarded as at least one assembly.
In some cases, electronic components may also be mounted on the rear case.
However, these components are not limited to this arrangement; they may be omitted or replaced as needed, or disposed on other surfaces of the terminal body.
The touch sensor may be in the form of a film having a touch pattern, disposed between the window and the display on the rear surface of the window, or may be a metal wire patterned directly on the rear surface of the window.
On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the user input unit.
The rear input unit may be disposed to overlap with the front display unit in the thickness direction of the terminal body.
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized.
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be retractable from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the rear cover.
The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
The present invention proposes a method of sharing, in real time, the mood and emotion a user feels due to a specific event (photographing, chatting) with other users who have a high degree of intimacy with the user.
The intimacy may be determined by analyzing pre-stored pictures or chat history, and indicates a user who has been photographed together with the user, or has exchanged messages with the user, a predetermined number of times or more.
The present invention also provides a method of recommending emotional content suitable for the situation of an event when the user performs the event. The emotional content may include voice, video, and text as intelligent information obtained by analyzing images related to shooting (general shooting, self-camera shooting) and the captured images, and is provided to the user.
The emotional content may also include voice, video, and text as intelligent information obtained by analyzing the chat history and provided to the user.
In the present invention, an emotional agent function may be executed to perform emotion sharing when an event occurs. The emotional agent function may be set by the user in a menu or activated automatically when an event occurs.
The emotional agent function may be performed in voice or text form.
In order to perform the emotional agent function, the present invention may include a voice recognition function and a pattern analysis function for analyzing user information (information input by the user and actions performed by the user).
Also, in order to perform the emotional agent function, the present invention may use a voice/text conversion function, that is, a function of converting text input into voice or converting voice into text.
FIG. 2 is a flowchart illustrating an emotion sharing method of a mobile terminal according to an embodiment of the present invention.
As shown in FIG. 2, when the user enters a predetermined operation mode for a predetermined event (e.g., photographing or chatting), the control unit 180 may activate the emotional agent function.
The operation mode may be a shooting mode or a chat mode, and the shooting mode may include a general shooting mode and a self-camera (selfie) mode.
When the emotional agent function is activated, the control unit 180 recognizes at least one user participating in the corresponding operation mode. In one embodiment, when the operation mode is the photographing mode, the control unit 180 recognizes the users included in the preview screen and analyzes previously stored photographs or chat content related to the recognized users.
As a result of the analysis, if there are a predetermined number or more of photographs or chat contents that include the recognized users, and there is a user (absentee) who frequently appears in them but is missing from the current event, the control unit 180 determines the absentee as the sharing target and outputs emotional content recommending a real-time connection with the absentee.
For example, the emotional content may include voice or text as emotional recommendation information suggesting a 'real-time connection' in order to share the current event situation with the absentee. In response to the output of the emotional content, the user can permit sharing by a voice or touch selection.
Accordingly, the control unit 180 performs a real-time connection with the determined sharing target according to the user's response to the emotional content.
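A minimal sketch of this selection step, assuming a hypothetical data model in which each stored photo or chat record carries the set of people appearing in it (the names, the default threshold, and the tie-breaking rule are all assumptions, not taken from the patent):

```kotlin
// Sketch: among people who appear together with the recognized participants in at least
// `minCount` stored photos or chats, pick the most frequent one who is absent right now.
data class StoredItem(val people: Set<String>)   // one stored photo or chat record

fun findSharingTarget(
    participants: Set<String>,                   // users recognized in the current event
    history: List<StoredItem>,                   // previously stored photos / chat records
    minCount: Int = 10                           // "predetermined number" threshold (assumed)
): String? {
    val counts = mutableMapOf<String, Int>()
    for (item in history) {
        if (item.people.containsAll(participants)) {       // record involves everyone now present
            for (p in item.people - participants) {         // count the additional (absent) people
                counts[p] = (counts[p] ?: 0) + 1
            }
        }
    }
    return counts.filterValues { it >= minCount }           // frequent companions only...
        .maxByOrNull { it.value }                            // ...take the most frequent one
        ?.key                                                // absent by construction
}
```

For instance, with participants A, B, and C and a gallery in which a user D appears in ten or more of their group photos, the sketch would return D, in line with the example of FIG. 4 described below.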
Hereinafter, various embodiments of the emotion sharing method of a mobile terminal according to the present invention will be described in detail with reference to the accompanying drawings.
FIG. 3 shows an embodiment of a method for sharing emotion of a mobile terminal according to the present invention, which shows a method of sharing emotion in a shooting mode, and FIG. 4 shows an example of a previously photographed image.
As shown in FIG. 3, when the user enters the photographing mode for a predetermined event (e.g., a ceremony), the control unit 180 may activate the emotional agent function.
When the emotional agent function is activated, the control unit 180 recognizes the users (e.g., A, B, and C) included in the preview screen.
As shown in FIG. 4, when a plurality of pictures (for example, 10 or more) have previously been taken by the users A, B, and C included in the current preview screen together with a user D who is not present at the current event, the control unit 180 determines the user D as the sharing target.
FIG. 5 illustrates an embodiment of providing emotional content in the form of text in the present invention.
Referring to FIG. 5, if there are a plurality of photographs taken by the users A, B, and C participating in the current ceremony together with the user D who missed the ceremony, the control unit 180 outputs emotional content asking whether to share the current situation with the user D in real time.
As shown in FIG. 5, the emotional content may be output in the form of a text pop-up on the preview screen, or may be output as voice.
When the user A permits, by voice, sharing with the user D, the control unit 180 performs a real-time connection with the user D according to the sharing method selected by the user A.
Hereinafter, an example will be described in which the terminal of the user A is mounted on a cradle and photographing is performed through the voice of the user A. However, the present invention is not limited thereto, and one of the participants may hold the terminal of the user A and perform photographing through the photographer's operation or the voice of the user A.
FIG. 6 shows an embodiment of sharing emotion in real time using a video call in the present invention.
As shown in FIG. 6, when the user A selects, by voice, sharing with the user D through a video call, the control unit 180 attempts a video call connection to the terminal of the user D.
Accordingly, when the video call is connected, the users A, B, and C and the user D can share the atmosphere and emotion of the current event in real time through the video call screen.
In addition, even when a normal telephone call is selected as the sharing method, the control unit 180 connects the call so that the current atmosphere can be shared with the user D.
In another embodiment, when the user selects chat as the sharing method, the control unit 180 invites the user D to a chat room so that the emotion can be shared through chatting.
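Purely as an illustration of dispatching on the spoken sharing method (the keyword matching and the print statements are placeholders; the patent only states that a video call, telephone, or chat connection is performed according to the voice response), a sketch might look like:

```kotlin
// Illustrative dispatch on the sharing method chosen by voice. Keyword matching and the
// connection actions are stand-ins; a real terminal would invoke its call/chat services.
enum class ShareMethod { VIDEO_CALL, PHONE_CALL, CHAT }

fun parseShareMethod(utterance: String): ShareMethod? {
    val u = utterance.lowercase()
    return when {
        "video" in u -> ShareMethod.VIDEO_CALL
        "call" in u || "phone" in u -> ShareMethod.PHONE_CALL
        "chat" in u -> ShareMethod.CHAT
        else -> null
    }
}

fun connect(target: String, method: ShareMethod) = when (method) {
    ShareMethod.VIDEO_CALL -> println("starting a video call with $target")
    ShareMethod.PHONE_CALL -> println("placing a phone call to $target")
    ShareMethod.CHAT -> println("inviting $target to the chat room")
}
```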
FIG. 7 illustrates an embodiment of sharing photographs taken during emotion sharing in the present invention.
As shown in FIG. 7, while the users A, B, and C and the user D share the emotion (an enjoyable and pleasant atmosphere) of the event through the video call, photographing may be performed by pressing the photographing button or by a voice command of the user A.
When the photographing is completed, the control unit 180 outputs emotional content asking whether to share the captured photograph with all of the users, and transmits the photograph according to the user's response.
FIG. 8 shows an embodiment of sharing photographs related to the captured image during emotion sharing in the present invention, and FIG. 9 shows an embodiment of transmitting additional photographs requested by the emotion-sharing counterpart.
As shown in FIGS. 8 and 9, when a photograph taken during emotion sharing is shared with the other users B, C, and D, the control unit 180 transmits the photograph to the terminals of the users B, C, and D.
In response to the photo transfer, the user B can request the user A to transfer photographs related to the users A, B, C, and D that he does not have.
In response to the photograph transfer request of the user B, the control unit 180 analyzes the previously stored photographs, selects photographs related to the users A, B, C, and D that the user B does not have, and transmits them to the user B.
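A possible way to pick those photographs, sketched under the assumption that each stored photo records who appears in it and who already has a copy (the ownedBy bookkeeping is a hypothetical simplification, not something the patent specifies):

```kotlin
// Sketch: select stored photos that show the requested people but that the requester
// does not already have, so only the missing ones are transmitted.
data class Photo(
    val id: String,
    val people: Set<String>,     // who appears in the photo
    val ownedBy: Set<String>     // who already holds a copy (assumed bookkeeping)
)

fun photosToSend(requester: String, requestedPeople: Set<String>, gallery: List<Photo>): List<Photo> =
    gallery.filter { photo ->
        photo.people.containsAll(requestedPeople) &&   // photo shows the requested users
            requester !in photo.ownedBy                // requester does not have it yet
    }
```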
FIG. 10 illustrates a method of sharing emotion in a chat mode according to an exemplary embodiment of a method for sharing emotion of a mobile terminal according to the present invention.
As shown in FIG. 10, when the user A enters the chat mode and conducts a chat with the user B, the control unit 180 may activate the emotional agent function.
When the emotional agent function is activated, the control unit 180 analyzes the chat content between the users A and B together with the previously stored chat history, determines a user C related to the current chat topic (e.g., a birthday party) as the sharing target, and outputs emotional content recommending that the user C be invited to the chat room. When the user accepts the recommendation, the control unit 180 invites the user C to the chat room.
Accordingly, since the user C enters the chat room in real time, the users A, B, and C can share the same emotion (the enjoyable atmosphere of the birthday party) through chatting.
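As a loose sketch of choosing someone to invite in chat mode (paralleling the shooting-mode selection above; the counting rule and the threshold are assumptions, not taken from the patent), the stored chat history could be scanned for a frequent chat partner who is not in the current room:

```kotlin
// Sketch: from stored chat records, suggest a frequent chat partner who is absent
// from the current chat room. The threshold is an arbitrary assumption.
data class ChatRecord(val participants: Set<String>)

fun suggestInvitee(currentRoom: Set<String>, history: List<ChatRecord>, minChats: Int = 10): String? =
    history.flatMap { it.participants - currentRoom }   // people chatted with before, absent now
        .groupingBy { it }
        .eachCount()                                     // how often each absent person appears
        .filterValues { it >= minChats }
        .maxByOrNull { it.value }
        ?.key
```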
The emotional content sharing method according to the present invention can be applied not only to shooting through a camera mounted on the rear side of a terminal but also to shooting a self camera (hereinafter referred to as a self-portrait) through a camera provided on the front side.
FIG. 11 illustrates a method of sharing emotion in a self-camera mode according to an embodiment of the emotion sharing method of a mobile terminal according to the present invention.
As shown in FIG. 11, when the user B enters the self-portrait mode to take a self-portrait to commemorate a new hairstyle (perm + cut), the control unit 180 activates the emotional agent function.
The control unit 180 compares the captured self-portrait with a stored photograph of the user B (for example, the profile photograph). When a singular point (for example, a change in hair length, makeup, or clothes) is found between the two photographs, the control unit 180 outputs emotional content asking whether to change the profile photograph, and changes the profile photograph to the new self-portrait when the user B accepts.
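A simplified sketch of that comparison step, assuming an image-analysis stage (not shown) has already reduced each photo to a few appearance attributes; the attribute set follows the examples given in the claims (hair length, makeup, clothes), while the data types and prompt text are assumptions:

```kotlin
// Sketch of the "singular point" check between a new selfie and the stored profile photo.
// The Appearance fields would come from an image-analysis step not modeled here.
data class Appearance(val hairLength: String, val makeup: String, val outfit: String)

fun changedAttributes(profile: Appearance, selfie: Appearance): List<String> = listOfNotNull(
    "hair length".takeIf { profile.hairLength != selfie.hairLength },
    "makeup".takeIf { profile.makeup != selfie.makeup },
    "clothes".takeIf { profile.outfit != selfie.outfit }
)

fun maybeSuggestProfileUpdate(
    profile: Appearance,
    selfie: Appearance,
    askUser: (String) -> Boolean          // emotional content plus the user's yes/no reply
): Boolean {
    val diffs = changedAttributes(profile, selfie)
    if (diffs.isEmpty()) return false     // no singular point, nothing to suggest
    return askUser("Your ${diffs.joinToString()} changed. Update your profile photo?")
}
```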
In general, to take better pictures, the user has to adjust the shooting mode (portrait mode/landscape mode) and various options while shooting. However, continuously adjusting the shooting mode and options at every shot is a difficult, complicated, and cumbersome task for the average user. Therefore, the present invention can activate the emotional agent function so that camera setting changes can be made simply by voice.
FIG. 12 illustrates an embodiment of providing shared content in the emotion sharing method according to the present invention.
As shown in FIG. 12, the user can activate the emotional agent function in order to take a landscape photograph in the photographing mode. The emotional agent function may be activated manually by the user upon entering the shooting mode or automatically when the mode is entered. In addition, the emotional agent function may be activated by tapping one side of the preview screen.
When the emotional agent function is activated, the control unit 180 may output an emotional comment about the current shooting situation in voice or text form.
If, in response to the emotional comment, the user requests at least one camera setting by voice, for example adjustment of the screen brightness, the control unit 180 analyzes a plurality of previously stored photographs, selects the camera setting values of the photographs the user has taken most often, and automatically applies them.
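One way to read "the setting values of the photographs taken most often", sketched under the assumption that each stored photo carries its capture settings as metadata (the CaptureSettings fields mirror the brightness/color/focus examples in the claims; the landscape filter and the storage format are assumptions):

```kotlin
// Sketch: group stored photos by their recorded capture settings and reuse the most
// frequently used combination, optionally restricted to landscape shots.
data class CaptureSettings(val brightness: Int, val colorMode: String, val focusMode: String)

data class StoredPhoto(val isLandscape: Boolean, val settings: CaptureSettings)

fun mostUsedSettings(gallery: List<StoredPhoto>, landscapeOnly: Boolean = true): CaptureSettings? =
    gallery.asSequence()
        .filter { !landscapeOnly || it.isLandscape }   // e.g. only consider landscape photos
        .groupingBy { it.settings }                    // data class equality groups identical settings
        .eachCount()
        .maxByOrNull { it.value }
        ?.key
```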
As described above, according to the present invention, when a predetermined event (shooting, chatting, and the like) occurs, the emotional agent function is activated and emotional content for sharing the content generated by the event in real time or non-real time is provided, so that emotion can be shared with other users in real time, and the content generated by the event can be processed more conveniently by inputting a response to the provided emotional content by voice.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal.
50: emotional agent function icon 51: pop-up window
60: video call screen 122: microphone
151: display unit 152: sound output unit
170: memory 180: control unit
Claims (13)
A mobile terminal comprising a control unit configured to:
(a) recognize at least one user participating in a corresponding mode when a photographing mode or a chatting mode is activated for a predetermined event;
(b) analyze a history of pre-stored photographs or chat content related to the recognized user to determine a sharing target with whom to share the emotion associated with the event; and
(c) automatically output emotional content recommending a real-time connection with the determined sharing target,
wherein the control unit determines, as the sharing target, a user who is not photographing or chatting in the current session from among users who have been photographed or have chatted a predetermined number of times or more.
Wherein the photographing mode includes a normal photographing mode and a self-portrait (selfie) mode.
Wherein the emotional content is output in audio or text form as an emotional comment.
Wherein a real-time connection with another user is performed according to a voice input of the user in response to the emotional comment,
Wherein the real-time connection includes a video call, a telephone connection, and a chat connection.
Wherein, when a camera setting is requested by the user before photographing, a plurality of pre-stored photographs are analyzed and the camera setting values of the photographs taken most often by the user are selected and automatically applied.
Wherein the camera setting includes brightness, color, and focus of the mobile terminal.
Wherein shooting is performed according to a voice input of the user, and
when the photographing is completed, emotional content asking whether to share the captured photograph with all of the users is output.
Wherein, when photographing is performed in the self-portrait mode, the self-portrait photograph is compared with a stored photograph of the user, and when a singular point is found between the two photographs, emotional content asking whether to change the profile photograph is output and the profile photograph is changed according to the user's response.
Wherein the singular point includes a change in hair length, makeup, or clothes.
A method of sharing emotion in a mobile terminal, the method comprising:
recognizing at least one user participating in a corresponding mode when entry into a photographing mode or a chatting mode is detected;
analyzing a history of pre-stored photographs or chat content related to the recognized user to determine a sharing target with whom to share the emotion associated with the event;
automatically outputting emotional content recommending a real-time connection with the determined sharing target; and
performing a real-time connection with the determined sharing target according to the user's response to the output emotional content,
wherein the sharing target is determined, by analyzing the stored photographs or chat data, as a user who is not currently photographing or chatting from among the users who have been photographed or have chatted a predetermined number of times or more.
Wherein the emotional content is output as audio or text in the form of an emotional comment,
Wherein the real-time connection includes a video call, a telephone connection, and a chat connection.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150164952A KR101748670B1 (en) | 2015-11-24 | 2015-11-24 | Mobile terminal and emotion sharing method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150164952A KR101748670B1 (en) | 2015-11-24 | 2015-11-24 | Mobile terminal and emotion sharing method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170060470A KR20170060470A (en) | 2017-06-01 |
KR101748670B1 (en) | 2017-07-03
Family
ID=59221801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150164952A KR101748670B1 (en) | 2015-11-24 | 2015-11-24 | Mobile terminal and emotion sharing method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101748670B1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008311749A (en) | 2007-06-12 | 2008-12-25 | Canon Inc | Image transmitter and method thereof |
JP2009200621A (en) * | 2008-02-19 | 2009-09-03 | Panasonic Corp | Imaging apparatus with image transmission/reception function |
JP2009206774A (en) | 2008-02-27 | 2009-09-10 | Canon Inc | System and device for transmitting image, and control method |
JP2010252374A (en) * | 2010-06-16 | 2010-11-04 | Casio Computer Co Ltd | Camera, art of shooting great photographs, and programs |
- 2015-11-24: KR application KR1020150164952A filed; granted as patent KR101748670B1 (active, IP right grant)
Also Published As
Publication number | Publication date |
---|---|
KR20170060470A (en) | 2017-06-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GRNT | Written decision to grant |