KR20130001826A - Mobile terminal and control method thereof - Google Patents

Mobile terminal and control method thereof

Info

Publication number
KR20130001826A
Authority
KR
South Korea
Prior art keywords
electronic device
content
display
mobile terminal
area
Prior art date
Application number
KR1020110062704A
Other languages
Korean (ko)
Inventor
이지선
이정빈
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020110062704A
Priority to US13/335,187 (US9207853B2)
Publication of KR20130001826A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/02Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10Small scale networks; Flat hierarchical networks
    • H04W84/12WLAN [Wireless Local Area Networks]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W88/00Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02Terminal devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W92/00Interfaces specially adapted for wireless communication networks
    • H04W92/16Interfaces between hierarchically similar devices
    • H04W92/18Interfaces between hierarchically similar devices between terminal devices

Abstract

PURPOSE: A mobile terminal that a user can use intuitively, and a control method thereof, are provided by altering the display positions of a first and a second electronic device according to the exchange state of content data. CONSTITUTION: A wireless communication unit (110) exchanges content data with at least one other electronic device. A control unit (180) displays information about the other electronic devices communicating through the wireless communication unit, distinguishing between a first electronic device, which has a management attribute for the content, and a second electronic device, which has a rendering attribute for the content. The control unit alters the display positions of the first and second electronic devices according to the exchange state of the content data. [Reference numerals] (110) Wireless communication unit; (111) Broadcast receiving module; (112) Mobile communication module; (113) Wireless Internet module; (114) Local area communication module; (115) Location information module; (120) A/V input unit; (121) Camera; (122) Microphone; (130) User input unit; (140) Sensing unit; (150) Output unit; (151) Display module; (152) Sound output module; (153) Alarm unit; (154) Haptic module; (160) Memory; (170) Interface unit; (180) Control unit; (181) Multimedia module; (190) Power supply unit

Description

MOBILE TERMINAL AND CONTROL METHOD THEREOF

The present invention relates to a mobile terminal and a control method thereof. More specifically, it relates to a mobile terminal that a user can use intuitively because the display positions of first and second electronic devices are changed according to the exchange state of content data, and to a control method thereof.

Terminals such as personal computers, notebook computers, and mobile phones provide a variety of functions and may be regarded as multimedia players with complex capabilities such as capturing still images or video and playing music or video files.

Terminals may be divided into mobile terminals and stationary terminals depending on whether they are movable. Mobile terminals may further be divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

In order to support and enhance the functionality of the terminal, it is contemplated to improve the structural and / or software portion of the terminal.

Recently, terminals including mobile terminals have come to provide complex and varied functions, and their menu structures have become correspondingly complicated. In addition, the ability to display various digital documents, including web pages, has been added to mobile terminals.

The present invention relates to a mobile terminal and a control method thereof in which a user can intuitively use the mobile terminal by changing the display position of the first and second electronic devices according to the exchange state of the content data.

A mobile terminal according to an embodiment of the present invention for realizing the above object includes a display; a wireless communication unit communicating with at least one other electronic device to exchange content data; and a control unit that classifies information about the at least one other electronic device communicating through the wireless communication unit into a first electronic device having a management attribute for content and a second electronic device having an attribute for rendering the content, displays the first and second electronic devices, and changes their display positions according to the exchange state of the content data.

The exchange state of the content data may include at least one of a play state of playing the content, a download state of acquiring and storing the content, and an upload state of transmitting and storing the content.

The exchange state of the content data may include a transmission direction of the content data from at least one of the first and second electronic devices to at least another one, and the controller may change the display positions of the first and second electronic devices based on the transmission direction of the content data.

The controller may divide the display into a first area, which is an upper area of the display, and a second area, which is a lower area of the display, and may display each of the first electronic device and the second electronic device in one of the first and second areas selected based on the transmission direction of the content data.

The control unit may display an indicator indicating a transmission direction of the content data on a boundary between the first and second areas.
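
For illustration only, the following Java sketch shows one way the behavior described above could be organized; the names (TransferDirection, areaForDms, and the convention that the transfer source occupies the upper area) are assumptions made for this example and are not taken from the disclosure.

```java
// Hypothetical sketch: choosing display areas for the DMS (first electronic device)
// and DMR (second electronic device) from the content transfer direction.
public class SharingLayout {

    enum TransferDirection { DMS_TO_DMR, DMR_TO_DMS }   // e.g. play/download vs. upload
    enum Area { UPPER, LOWER }                          // the first and second display areas

    // Assumed convention: the source of the transfer is shown in the upper area,
    // the destination in the lower area.
    static Area areaForDms(TransferDirection d) {
        return (d == TransferDirection.DMS_TO_DMR) ? Area.UPPER : Area.LOWER;
    }

    static Area areaForDmr(TransferDirection d) {
        return (d == TransferDirection.DMS_TO_DMR) ? Area.LOWER : Area.UPPER;
    }

    public static void main(String[] args) {
        TransferDirection d = TransferDirection.DMS_TO_DMR;   // e.g. a download or playback
        System.out.println("DMS area: " + areaForDms(d) + ", DMR area: " + areaForDmr(d));
        // An indicator (e.g. an arrow) pointing from the source area to the destination
        // area could then be drawn on the boundary between the two areas.
    }
}
```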

The controller may display a list of contents stored in a selected electronic device among the first electronic devices on the display.

The controller may change at least one of a size and a resolution of content data displayed on the display according to the type of the second electronic device.

The control unit may display at least one of an exchange state of the content data and a type of the exchanged content data.

The controller may display at least one of the colors and shapes of the first and second objects differently according to at least one of whether the first and second electronic devices are selected and the communication state with the first and second electronic devices.

When the control unit acquires the selection signal for the first and second electronic devices, the controller may change the display of the first and second objects by reflecting the attributes of the selected first and second electronic devices.

When the controller acquires a selection signal for the first object or the second object, the controller may search for an electronic device corresponding to the object from which the selection signal is obtained through the wireless communication unit.

The apparatus may further include a memory in which index information of the first and second electronic devices capable of communicating at a specific location is stored, and the controller may search for the first and second communicable electronic devices based on the stored index information.

The controller may display a list of contents stored in the first electronic device on the display and, when a selection signal for at least one item of the displayed content list is obtained, may cause the second electronic device to play the content corresponding to the item for which the selection signal was obtained.

The selection signal may be a touch operation of touching and dragging at least one of the displayed contents list to a region where the second object is displayed.

The apparatus may further include a memory in which access information about the at least one other electronic device that has communicated through the wireless communication unit at a specific location is stored, and communication may be initiated based on the stored access information.

In addition, a mobile terminal according to an embodiment of the present invention for realizing the above object includes a display; a wireless communication unit communicating with at least one other electronic device; and a control unit that displays a first object for selecting a first electronic device having a management attribute for content and a second object for selecting a second electronic device having an attribute for rendering the content, and that changes the display positions of the first and second objects in accordance with the exchange state of the content.

The exchange state of the content data may include at least one of a play state of playing the content, a download state of acquiring and storing the content, and an upload state of transmitting and storing the content.

The control unit may divide the display into a plurality of areas and display the first and second objects in the same area among the divided areas.

The controller may display at least one of the colors and shapes of the first and second objects differently according to at least one of whether the first and second electronic devices are selected and the communication state with the first and second electronic devices.

When the control unit acquires the selection signal for the first and second electronic devices, the controller may change the display of the first and second objects by reflecting the attributes of the selected first and second electronic devices.

In addition, a control method of a mobile terminal according to an embodiment of the present invention for realizing the above object includes displaying a first object corresponding to a first electronic device having a management attribute for content and a second object corresponding to a second electronic device having an attribute for rendering the content; transferring the content from the first electronic device to the second electronic device in response to an input to the first and second objects; and changing the display positions of the first and second objects according to the exchange state of the content.

The changing may include dividing the display into a first area and a second area; And displaying the first object and the second object in one of the first area and the second area based on a transmission direction of the content.

The displaying may include obtaining a first selection signal for the first object; And searching the first electronic device to display a list of the first electronic device.

The method may further include acquiring a second selection signal for at least one item of the displayed list of first electronic devices, and displaying a list of the contents stored in the first electronic device corresponding to the obtained second selection signal.

The method may further include changing the display of the first object by reflecting the property of the first electronic device corresponding to the acquired second selection signal.

The transmitting may include obtaining a third selection signal for the second object; Searching for the second electronic device to display a list of the second electronic device; And acquiring a fourth selection signal for at least one of the displayed list of second electronic devices.

The method may further include changing the display of the second object by reflecting the attributes of the second electronic device corresponding to the acquired fourth selection signal.

The mobile terminal and its control method according to the present invention have an effect that the user can intuitively use the mobile terminal by changing the display position of the first and second electronic devices according to the exchange state of the content data.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a mobile terminal according to another embodiment of the present invention.
FIG. 4 is a conceptual diagram illustrating the proximity depth of a proximity sensor.
FIG. 5 is a structural diagram illustrating a service network associated with the mobile terminal of FIG. 1.
FIG. 6 is a conceptual diagram of a DLNA network.
FIG. 7 illustrates the functional layers of the DLNA.
FIGS. 8 to 11 are flowcharts illustrating an operation process of a mobile terminal according to an embodiment of the present invention.
FIGS. 12 to 15 are diagrams illustrating a DMS selection process of a mobile terminal according to an embodiment of the present invention.
FIG. 16 illustrates a DMS selection process of a mobile terminal according to an embodiment of the present invention.
FIGS. 17 and 18 illustrate a first object display form of a mobile terminal according to an embodiment of the present invention.
FIGS. 19 to 25 are diagrams illustrating a DMR selection process of a mobile terminal according to an embodiment of the present invention.
FIGS. 26 and 27 illustrate a content selection process of a mobile terminal according to an embodiment of the present invention.
FIGS. 28 and 29 are diagrams illustrating a DMS selection process of a mobile terminal according to an embodiment of the present invention.
FIGS. 30 to 32 illustrate a DMS or DMR selection process of a mobile terminal according to an embodiment of the present invention.
FIG. 33 illustrates an operation of a mobile terminal according to an embodiment of the present invention.
FIGS. 34 to 36 are diagrams illustrating a process of selecting a DMS and a DMP of a mobile terminal according to an embodiment of the present invention.
FIGS. 37 and 38 are diagrams illustrating a process of downloading content between terminals by a control operation of a mobile terminal according to an embodiment of the present invention.
FIG. 39 is a diagram illustrating a content upload process between terminals by a control operation of a mobile terminal according to an embodiment of the present invention.
FIG. 40 is a diagram illustrating a playlist display process of a mobile terminal according to an embodiment of the present invention.
FIG. 41 is a diagram illustrating a content reproduction process of a mobile terminal according to an embodiment of the present invention.
FIG. 42 is a diagram illustrating a process of setting a content download location of a mobile terminal according to an embodiment of the present invention.
FIGS. 43 and 44 are diagrams illustrating a content transmission process of a mobile terminal according to an embodiment of the present invention.
FIG. 45 is a diagram illustrating a multi-content transmission process of a mobile terminal according to an embodiment of the present invention.
FIG. 46 is a diagram illustrating a control process through a widget of a mobile terminal according to an embodiment of the present invention.

The above objects, features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. It is to be understood, however, that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and similarities. Like reference numerals designate like elements throughout the specification. In the following description, well-known functions or constructions are not described in detail since they would obscure the invention in unnecessary detail. In addition, the numbers (eg, first, second, etc.) used in the description process of the present specification are merely identification symbols for distinguishing one component from another component.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably merely for ease of drafting the specification, and do not in themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistants), a PMP (Portable Multimedia Player), and navigation. However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, etc., except when applicable only to mobile terminals.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 includes a wireless communication unit 110, an audio / video input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory unit 160, An interface unit 170, a control unit 180, a power supply unit 190, and the like. The components shown in Fig. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules for enabling wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server for generating and transmitting broadcast signals and / or broadcast related information, or a server for receiving broadcast signals and / or broadcast related information generated by the broadcast management server and transmitting the generated broadcast signals and / or broadcast related information. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H (Digital Video Broadcast-Handheld), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may also be adapted to other broadcasting systems that provide broadcast signals, in addition to the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory unit 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 refers to a module for wireless Internet access, and the wireless Internet module 113 can be embedded in the mobile terminal 100 or externally. WLAN (WiFi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short range communication module 114 refers to a module for short range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The location information module 115 is a module for identifying or obtaining the location of the mobile terminal. A typical example of the location information module is a GPS (Global Positioning System) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites to one point (object) and information on the time at which the distance information was measured, and then applies trigonometry to the calculated information to obtain three-dimensional position information of the point according to latitude, longitude, and altitude at a given time. Furthermore, a method of calculating position and time information using three satellites and correcting the error of the calculated position and time information using another satellite is also used. The GPS module 115 continuously calculates the current position in real time and uses it to calculate speed information.
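
For reference, the trilateration step described in the preceding paragraph can be written out as follows; this is the standard GPS relation rather than a formula taken from the disclosure. For satellite $i$ at known position $(x_i, y_i, z_i)$ and measured signal travel time $\Delta t_i$ (with $c$ the speed of light), the position $(x, y, z)$ of the point satisfies

$$(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2 = \left(c\,\Delta t_i\right)^2, \qquad i = 1, 2, 3,$$

and the additional satellite mentioned above supplies a fourth equation of the same form, used to estimate and remove the receiver clock bias $b$ by replacing $c\,\Delta t_i$ with $c(\Delta t_i - b)$.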

Referring to FIG. 1, an A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode. The processed image frame can be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory unit 160 or may be transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the terminal.

The microphone 122 receives an external sound signal through a microphone in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it as electrical voice data. The processed voice data can be converted into a form that can be transmitted to the mobile communication base station through the mobile communication module 112 when the voice data is in the call mode, and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in receiving an external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal. The user input unit 130 may include a key pad dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/close state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, and the orientation of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, it may sense whether the slide phone is opened or closed. In addition, it may be responsible for sensing functions related to whether the power supply unit 190 supplies power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor.

The output unit 150 is for generating output related to the visual, auditory or tactile sense and includes a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154 .

The display unit 151 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light emitting diode display, a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This may be referred to as a transparent display. A typical example of the transparent display is a transparent LCD or the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another, or may be disposed integrally with one another, and may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, referred to as a touch sensor) form a mutual layer structure (hereinafter, abbreviated as “touch screen”), the display unit 151 is an output device. It can also be used as an input device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 is touched.
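
As a minimal illustration of the signal path just described (touch sensor to touch controller to controller 180), the following Java sketch uses hypothetical interfaces that stand in for those components; none of the names below come from the disclosure.

```java
// Hypothetical sketch of the touch signal path: sensor -> touch controller -> controller (180).
public class TouchPath {

    // A raw measurement from the touch sensor: position, contact area and pressure.
    record TouchSample(int x, int y, int area, float pressure) {}

    // Stand-in for the controller 180; here it only needs to know where the touch occurred.
    interface Controller {
        void onTouch(int x, int y, float pressure);
    }

    // The touch controller converts the raw pressure/capacitance change into
    // coordinate data and forwards it to the controller.
    static class TouchController {
        private final Controller controller;
        TouchController(Controller controller) { this.controller = controller; }

        void process(TouchSample s) {
            // A real touch controller would filter and debounce here; this sketch forwards directly.
            controller.onTouch(s.x(), s.y(), s.pressure());
        }
    }

    public static void main(String[] args) {
        TouchController tc = new TouchController(
                (x, y, p) -> System.out.println("Touched (" + x + ", " + y + ") pressure=" + p));
        tc.process(new TouchSample(120, 340, 5, 0.7f));
    }
}
```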

Referring to FIG. 1, a proximity sensor may be disposed in an inner region of the mobile terminal enclosed by the touch screen or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact sensor and its utilization is also high.

Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

When the touch screen is capacitive, it is configured to detect the proximity of the pointer by the change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is called a "contact touch". The position at which a proximity touch is made on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory unit 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs acoustic signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than the video signal or the audio signal, for example, vibration. The video signal or the audio signal can also be output through the display unit 151 or the audio output module 152.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 154 can generate a variety of tactile effects, such as the effect of stimulation by a pin arrangement moving vertically against the contacted skin surface, the effect of stimulation by air spraying force or suction force through a jet opening or a suction opening, the effect of stimulation through contact with an electrode, the effect of stimulation by an electrostatic force, and the effect of reproducing a cold sensation using a heat-absorbing or exothermic element.

The haptic module 154 may not only deliver the haptic effect through direct contact, but may also implement the haptic effect through the muscle sense of the user's finger or arm. Two or more haptic modules 154 may be provided according to the configuration of the portable terminal 100.

The memory unit 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory unit 160 may store data related to vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory unit 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), and a programmable read-only memory (PROM). The mobile terminal 100 may operate in association with a web storage that performs the storage function of the memory unit 160 on the Internet.

The interface unit 170 serves as a path for communication with all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device or receives power from the external device to transfer the data to each component in the mobile terminal 100 or to transmit data in the mobile terminal 100 to an external device. For example, a wired / wireless headset port, an external charger port, a wired / wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I / O port, A video input / output (I / O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100 and includes a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. Devices provided with an identification module (hereinafter referred to as "identification devices") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user through the cradle are transmitted to the mobile terminal. The various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The control unit 180 typically controls the overall operation of the mobile terminal, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the control unit 180 or may be implemented separately from the control unit 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electrical units for performing functions. In some cases, such embodiments may be implemented by the control unit 180.

According to a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that perform at least one function or operation. The software code may be implemented by a software application written in a suitable programming language. Also, the software codes may be stored in the memory unit 160 and executed by the control unit 180.

FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.

The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) which forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. A variety of electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injecting synthetic resin or may be formed of a metal material, for example, a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the audio output unit 152, the camera 121, the user input units 130/131 and 132, the microphone 122, and the interface 170 may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. A sound output unit 152 and a camera 121 are disposed in an area adjacent to one end of both ends of the display unit 151 and a user input unit 131 and a microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like are disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132.

The manipulation units 131 and 132 may also be collectively referred to as a manipulating portion, and any manner of operation may be employed as long as the user can operate them with a tactile feel.

The content input through the manipulation units 131 and 132 may be set in various ways. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to the touch recognition mode of the touch screen.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121 'may be further mounted on the rear surface of the terminal body, that is, the rear case 102. The camera 121 'may have a photographing direction substantially opposite to the camera 121 (see FIG. 2A), and may be a camera having different pixels from the camera 121.

For example, the camera 121 preferably has a low pixel count so that the user's face can be captured and transmitted to the other party during a video call or the like, while the camera 121' preferably has a high pixel count. The cameras 121 and 121' may be installed in the terminal body such that they can be rotated or popped up.

A flash 123 and a mirror 124 are further disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to view his or her own face when taking a self-portrait using the camera 121'.

An acoustic output 152 'may be additionally disposed on the rear surface of the terminal body. The sound output unit 152 'may implement the stereo function together with the sound output unit 152 (see FIG. 2A), and may be used for the implementation of the speakerphone mode during a call.

In addition to the antenna for a call or the like, a broadcast signal reception antenna 124 may be additionally disposed on the side of the terminal body. The antenna 124 constituting a part of the broadcast receiving module 111 (see FIG. 1) may be installed to be able to be drawn out from the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be embedded in the terminal body or may be directly detachable from the outside of the terminal body.

A touch pad 135 for sensing a touch may be additionally mounted on the rear case 102. The touch pad 135 may also be of a light-transmitting type like the display unit 151. In this case, if the display unit 151 is configured to output visual information on both sides, that information can also be recognized through the touch pad 135, and the information output on both sides may all be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135, and a touch screen may also be disposed on the rear case 102.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to the rear of the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

FIG. 3 is a diagram illustrating a mobile terminal according to another embodiment of the present invention.

As shown in the drawing, the present invention can be applied to various types of mobile terminals 100. That is, the present invention is not limited to a specific type of mobile terminal 100 and can be applied to any type of mobile terminal 100 capable of communication.

As shown in (a) of FIG. 3, the mobile terminal 100 according to the present invention may be a tablet PC. The tablet PC may be an electronic device that may receive an input from a user through a touch operation on the display 151 of the large screen without a separate keyboard.

As shown in (b) of FIG. 3, the mobile terminal 100 according to the present invention may be a mobile terminal 100 in the form of an e-book reader capable of displaying e-books.

FIG. 4 is a conceptual diagram illustrating the proximity depth of a proximity sensor.

As shown in FIG. 4, when a pointer such as a user's finger is close to the touch screen, the proximity sensor disposed in or near the touch screen detects this and outputs a proximity signal.

The proximity sensor may be configured to output different proximity signals according to a distance between the proximity-touched pointer and the touch screen (hereinafter referred to as "proximity depth").

The distance at which the proximity signal is output when the pointer approaches the touch screen is referred to as the detection distance. In short, the proximity depth can be determined by using a plurality of proximity sensors having different detection distances and comparing the proximity signals output from the respective proximity sensors.

In FIG. 4, for example, a cross section of a touch screen on which proximity sensors capable of sensing three proximity depths are disposed is illustrated. Of course, proximity sensors that detect fewer than three, or four or more, proximity depths are also possible.

Specifically, when the pointer is completely in contact with the touch screen (d0), it is recognized as a contact touch. If the pointer is located on the touch screen at a distance less than the distance d1, it is recognized as a proximity touch of the first proximity depth.

If the pointer is located above the touch screen at a distance of d1 or more and less than d2, it is recognized as a proximity touch of the second proximity depth. If the pointer is located at a distance of d2 or more and less than d3, it is recognized as a proximity touch of the third proximity depth. If the pointer is located at a distance of d3 or more from the touch screen, it is recognized that the proximity touch has been released.

Accordingly, the controller 180 can recognize the proximity touch as various input signals according to the proximity and proximity positions of the pointer to the touch screen, and perform various operation controls according to the various input signals.
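
The depth classification described above can be summarized in a short sketch; the threshold values below are arbitrary examples, and the class and method names are hypothetical rather than taken from the disclosure.

```java
// Hypothetical sketch classifying proximity depth from the pointer-to-screen distance,
// following the thresholds d0..d3 described above (values here are arbitrary examples).
public class ProximityDepth {

    static final float D1 = 10f, D2 = 20f, D3 = 30f;   // example thresholds in millimetres

    static String classify(float distance) {
        if (distance <= 0f)  return "contact touch";            // d0: pointer touches the screen
        if (distance < D1)   return "proximity touch, depth 1";
        if (distance < D2)   return "proximity touch, depth 2";
        if (distance < D3)   return "proximity touch, depth 3";
        return "proximity touch released";                      // at or beyond d3
    }

    public static void main(String[] args) {
        for (float d : new float[] {0f, 5f, 15f, 25f, 40f}) {
            System.out.println(d + " mm -> " + classify(d));
        }
    }
}
```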

FIG. 5 is a structural diagram illustrating a service network according to an embodiment of the present invention, namely a service network for sharing content among electronic devices.

Referring to FIG. 5, the mobile terminal 100 is connected through a network to at least one external electronic device 200 having an image display function, and shares content with the external electronic device 200 either by delivering content to the external electronic device 200 so that it is displayed there, or by receiving content from the external electronic device 200 and displaying it on its own screen.

In FIG. 5, the case where the mobile terminal 100 is a mobile phone and the external electronic devices 200 are a television and a laptop computer is described as an example, but the present invention is not limited thereto. According to the present invention, the mobile terminal 100 and the external electronic device 200 may each be implemented as a mobile phone, a television, a laptop computer, a smart phone, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a desktop computer, a set-top box, a personal video recorder (PVR), an electronic picture frame, or the like.

Referring back to FIG. 5, in order for the mobile terminal 100 to share content with the external electronic device 200, it is necessary to configure the platforms of the mobile terminal 100 and the external electronic device 200 so that they are compatible with each other. To this end, the electronic devices 100 and 200 related to the embodiment of the present invention may configure their platforms based on the Digital Living Network Alliance (DLNA).

According to the DLNA, IPv4 can be used as the network stack, and Ethernet, Wireless Local Area Network (WLAN) (802.11a/b/g), Wireless Fidelity (Wi-Fi), Bluetooth, and any other communication method capable of IP connection can be used for network connectivity.

In addition, according to the DLNA, the discovery and control of the electronic device may be based on UPnP, in particular, UPnP AV Architecture and UPnP Device Architecture. For example, a simple service discovery protocol (SSDP) may be used for discovery of an electronic device. In addition, a simple object access protocol (SOAP) may be used for controlling the electronic device.
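
As a concrete illustration of the SSDP discovery step mentioned above, the following Java sketch multicasts a standard M-SEARCH request for UPnP MediaServer (DMS) devices and prints any responses received within the search window; it is a simplified example, not the implementation of the disclosure.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Simplified SSDP discovery: multicast an M-SEARCH and print responses until the timeout.
public class SsdpDiscovery {

    public static void main(String[] args) throws Exception {
        String search =
                "M-SEARCH * HTTP/1.1\r\n" +
                "HOST: 239.255.255.250:1900\r\n" +
                "MAN: \"ssdp:discover\"\r\n" +
                "MX: 3\r\n" +
                "ST: urn:schemas-upnp-org:device:MediaServer:1\r\n" +   // search target: DMS devices
                "\r\n";
        byte[] payload = search.getBytes(StandardCharsets.US_ASCII);

        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setSoTimeout(3000);                                   // wait roughly the MX period
            socket.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getByName("239.255.255.250"), 1900));

            byte[] buffer = new byte[2048];
            while (true) {
                DatagramPacket response = new DatagramPacket(buffer, buffer.length);
                try {
                    socket.receive(response);
                } catch (java.net.SocketTimeoutException timeout) {
                    break;                                               // no more responses
                }
                System.out.println("Found device at " + response.getAddress() + ":\n"
                        + new String(response.getData(), 0, response.getLength(),
                                     StandardCharsets.US_ASCII));
            }
        }
    }
}
```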

In addition, according to DLNA, HTTP, RTP, and the like can be used for media transmission, and JPEG, LPCM, MPEG2, MP3, MPEG4, and the like can be used as a media format.

In addition, the DLNA can support electronic devices of the digital media server (DMS), digital media player (DMP), digital media renderer (DMR), and digital media controller (DMC) types.

FIG. 6 shows a conceptual diagram of a DLNA network.

The DLNA is a representative name of a standardization mechanism that enables sharing of contents such as music, moving pictures, and still pictures among electronic devices through a network.

The DLNA is based on the Universal Plug and Play (UPnP) protocol.

The DLNA network 200 may include a digital media server (DMS) 210, a digital media player (DMP) 220, a digital media renderer (DMR) 230, and a digital media controller (DMC) 240.

The DLNA network may include one or more of the DMS 210, the DMP 220, the DMR 230, and the DMC 240, respectively. In this case, the DLNA may provide a standard so that each of the devices are compatible with each other. In addition, the DLNA network may provide a standard for interoperability between the DMS 210, the DMP 220, the DMR 230, and the DMC 240.

The DMS 210 may provide digital media content. That is, the DMS 210 may store and manage content. The DMS 210 may receive various commands from the DMC 240 and perform a command. For example, when the DMS 210 receives a play command, the DMS 210 may search for content to be played and provide the content to the DMR 230. The DMS 210 may include, for example, a PC, a personal video recorder (PVR), and a set top box.

The DMP 220 may control content or electronic devices and allow content to be played back. That is, the DMP 220 may perform the functions of the DMR 230 for playback and of the DMC 240 for control. The DMP 220 may include, for example, a TV, a DTV, and a home theater.

The DMR 230 may play content, for example content provided from the DMS 210. The DMR 230 may include, for example, an electronic picture frame.

The DMC 240 may provide a control function. The DMC 240 may include, for example, a mobile phone and a PDA.

In addition, the DLNA network may consist of the DMS 210, the DMR 230, and the DMC 240, or the DMP 220 and the DMR 230.

In addition, the DMS 210, the DMP 220, the DMR 230, and the DMC 240 may be terms that distinguish electronic devices functionally rather than physically. For example, when a mobile phone has a playback function as well as a control function, it may correspond to the DMP 220; and when a DTV manages content, it may correspond to the DMS 210 as well as the DMP 220.
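
A minimal sketch of this functional (rather than physical) classification is shown below; the Device record, the role set, and the example device names are hypothetical and used only for illustration.

```java
import java.util.EnumSet;
import java.util.Set;

// Hypothetical sketch: one physical device may take on several DLNA roles at once.
public class DlnaRoles {

    enum Role { DMS, DMP, DMR, DMC }

    record Device(String name, Set<Role> roles) {
        boolean canServe()  { return roles.contains(Role.DMS); }
        boolean canRender() { return roles.contains(Role.DMR) || roles.contains(Role.DMP); }
    }

    public static void main(String[] args) {
        // A mobile phone that both plays content and controls other devices.
        Device phone = new Device("mobile terminal 100", EnumSet.of(Role.DMP, Role.DMC));
        // A DTV that manages its own content as well as playing it.
        Device dtv = new Device("DTV", EnumSet.of(Role.DMP, Role.DMS));

        System.out.println(phone.name() + " renders content: " + phone.canRender());
        System.out.println(dtv.name() + " serves content: " + dtv.canServe());
    }
}
```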

FIG. 7 illustrates the functional components in accordance with the DLNA.

The functional components according to the DLNA may include a media formats layer, a media transport layer, a device discovery & control and media management layer, a network stack layer, and a network connectivity layer.

The network connection layer may include a physical layer and a link layer of the network. The network connection layer may include Ethernet, Wi-Fi, and Bluetooth. In addition, a communication medium capable of IP connection can be used.

The network stack layer can use the IPv4 protocol.

Device discovery and control and media management layer may be based on UPnP, in particular UPnP AV Architecture and UPnP Device Architecture. For example, a simple service discovery protocol (SSDP) may be used for device discovery. You can also use the simple object access protocol (SOAP) for control.

The media transport layer may use HTTP 1.0/1.1 for streaming playback; alternatively, the real-time transport protocol (RTP) can be used.

The media format layer may use images, audio, AV media, and XHTML (Extensible Hypertext Markup Language) documents.

FIGS. 8 to 11 are flowcharts illustrating an operation process of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the mobile terminal 100 according to an embodiment of the present invention may allow the control related to the rendering of content stored in the other electronic device 200 to be intuitively performed.

In operation S10, a content sharing application may be driven.

The content may be data stored in the mobile terminal 100 or in another electronic device 200. For example, the content may be a digitized still image, a video, various documents, or the like. In order to share or exchange data between the mobile terminal 100 and another electronic device 200, or between other electronic devices 200, a content sharing application may be required. When the content sharing application is executed, an environment in which data can be shared with other electronic devices in a DLNA environment or the like may be provided.

When the content sharing application is driven, an operation (S20) of displaying an object corresponding to a digital media server (DMS) and a digital media renderer (DMR) may be performed.

The DMS may be an electronic device having a managing property for content. The management attribute for the content may mean an attribute for generating, storing, and managing index data for storing the content itself or quickly and easily accessing the content. An electronic device having a management attribute for content may be, for example, a server that contains a storage medium. The controller (180 of FIG. 1) may display a first object (O1 of FIG. 12) corresponding to the DMS on the display 151.

The DMR may be an electronic device having a property of rendering content. An attribute for rendering content may mean an attribute for displaying and playing content. The electronic device having the property of rendering the content may be, for example, a display 151, a sound output module (152 of FIG. 1), or the like. The controller (180 of FIG. 1) may display a second object (O2 of FIG. 12) corresponding to the DMR on the display 151.

By displaying the first object (O1 of FIG. 12) corresponding to the DMS and the second object (O2 of FIG. 12) corresponding to the DMR on the display 151, the user can intuitively select the DMS and the DMR. For example, if the user wants to select a DMS, the first object (O1 of FIG. 12) may be touched, and if the user wants to select a DMR, the second object (O2 of FIG. 12) may be touched. Furthermore, content can be easily transferred from a specific DMS to a specific DMR through a touch operation of selecting content stored in the specific DMS and dragging and dropping it onto the second object (O2 of FIG. 12).
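
For illustration, the drag-and-drop interaction described above might be organized as in the following Java sketch; the Renderer interface, the SecondObject class, and the example content URI are assumptions made for this example only and are not taken from the disclosure.

```java
// Hypothetical sketch: dropping a content item (from the selected DMS) onto the
// second object asks the selected DMR to play it.
public class DragDropSharing {

    interface Renderer {                       // stands in for the selected DMR
        void play(String contentUri);
    }

    static class SecondObject {                // the on-screen object representing the DMR
        private final Renderer renderer;
        SecondObject(Renderer renderer) { this.renderer = renderer; }

        // Called when a dragged content item is released over this object.
        void onContentDropped(String contentUri) {
            renderer.play(contentUri);
        }
    }

    public static void main(String[] args) {
        SecondObject o2 = new SecondObject(
                uri -> System.out.println("DMR now rendering " + uri));
        // The URI below is an illustrative placeholder for an item in the DMS content list.
        o2.onContentDropped("http://dms.local/content/photo-001.jpg");
    }
}
```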

In operation S30, a user input for the displayed object (O1 or O2 of FIG. 12) may be acquired.

As described above, the objects O1 and O2 of FIG. 12 may be icons corresponding to the DMS and the DMR. When the objects O1 and O2 of FIG. 12 are displayed, the user may perform a touch operation thereto. The touch gesture of the user may include a proximity touch gesture.

In operation S40, it may be determined whether information of the DMS and the DMR exists.

Upon obtaining a user input for the displayed objects (O1, O2 of FIG. 12), the controller (180 of FIG. 1) may perform a process of searching, through the wireless communication unit (110 of FIG. 1), for the DMS or DMR corresponding to the object (O1, O2 of FIG. 12) selected by the user. In order to search for the DMS and the DMR, the controller (180 of FIG. 1) may transmit a radio wave of a predetermined frequency band and wait for a response for a predetermined time. Therefore, searching for the DMS and DMR may take more than a certain amount of time.

The controller (180 of FIG. 1) of the mobile terminal 100 according to an embodiment of the present invention may store information on DMSs and DMRs that have been found in the past in the memory (160 of FIG. 1). If the information on the DMS and the DMR is stored in the memory (160 of FIG. 1), the controller (180 of FIG. 1) can know of the existence of the DMS and the DMR based on the stored information. Therefore, the search process described above can be omitted or the search time can be shortened.
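
A minimal sketch of this caching behavior, with hypothetical names and a placeholder search result, could look like the following; it is not taken from the disclosure.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: previously discovered DMS/DMR entries are kept per location so a
// fresh network search can be skipped or shortened.
public class DeviceCache {

    record DeviceInfo(String name, String address) {}

    private final Map<String, List<DeviceInfo>> byLocation = new ConcurrentHashMap<>();

    List<DeviceInfo> devicesAt(String locationKey) {
        List<DeviceInfo> cached = byLocation.get(locationKey);
        if (cached != null && !cached.isEmpty()) {
            return cached;                          // reuse stored results, no radio search needed
        }
        List<DeviceInfo> found = searchNetwork();   // slow path: network search with a timeout
        byLocation.put(locationKey, found);
        return found;
    }

    private List<DeviceInfo> searchNetwork() {
        // A real implementation would multicast an M-SEARCH here (see the earlier sketch).
        List<DeviceInfo> result = new ArrayList<>();
        result.add(new DeviceInfo("Living-room TV", "192.168.0.20"));
        return result;
    }

    public static void main(String[] args) {
        DeviceCache cache = new DeviceCache();
        System.out.println("First lookup : " + cache.devicesAt("home-wifi"));
        System.out.println("Second lookup: " + cache.devicesAt("home-wifi") + " (served from cache)");
    }
}
```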

If the information of the DMS does not exist, step S50 of searching for the DMS may proceed.

If the DMS is found, step S61 of displaying a list of the found DMSs may proceed.

There may be a plurality of DMSs around the mobile terminal 100. For example, in the vicinity of the user carrying the mobile terminal 100, there may be various storage media capable of wirelessly communicating with the mobile terminal 100.

The controller (180 of FIG. 1) may display the searched list of DMSs on the display 151.

When the selection (S62) of the list of the displayed DMS is made, a step (S63) of changing and displaying the first object (O1 of FIG. 12) corresponding to the DMS to correspond to the selected DMS may be performed.

The controller (180 of FIG. 1) may change the display of the first object (O1 of FIG. 12) to reflect the current state. For example, the display of the first object (O1 of FIG. 12) may differ between the case where no DMS is selected and the case where a specific DMS is selected. In addition, the display of the first object (O1 of FIG. 12) may differ according to the type of the selected DMS. Therefore, just by looking at the shape of the first object (O1 of FIG. 12), the user can easily tell whether a connection to a specific DMS has been made and/or what the connection state is.

In operation S64, the content list stored in the selected DMS may be displayed.

As described above, the DMS may be an electronic device having a management property for content. That is, the DMS may store various contents. The controller (180 of FIG. 1) may display the content list acquired through the wireless communication unit (110 of FIG. 1) on the display 151.

If there is no information on the DMR (S40), the step of retrieving the DMR (S70) may proceed.

If the DMR is found, step S81 of displaying a list of the found DMRs may proceed.

There may be a plurality of DMRs around the mobile terminal 100. For example, in the vicinity of the user carrying the mobile terminal 100, there may be a TV, an audio system, a computer, or the like that can wirelessly communicate with the mobile terminal 100.

The controller (180 of FIG. 1) may display the list of found DMRs on the display 151.

When a selection (S82) is made from the displayed DMR list, a step (S83) of changing the display of the second object (O2 of FIG. 12) so that it corresponds to the selected DMR may be performed.

The controller (180 of FIG. 1) may change the display of the second object (O2 of FIG. 12) to reflect the current state. For example, the display of the second object (O2 of FIG. 12) may differ between the case where no DMR is selected and the case where a specific DMR is selected. In addition, the display of the second object (O2 of FIG. 12) may differ depending on the type of the selected DMR. Accordingly, just by looking at the shape of the second object (O2 of FIG. 12), the user can easily tell whether a connection to a specific DMR has been made and/or what the connection state is.

When the process of displaying the information of the DMS and/or the DMR has been performed, step S90 of executing a selected function may proceed.

The executed function may be a function corresponding to a touch operation input by a user based on the displayed DMS and / or DMR information, which will be described in detail with reference to FIG. 11.

In operation S91, a selection signal for at least one of the displayed contents list may be acquired.

The selection signal may be a touch operation of touching the content list and dragging it. For example, after selecting an image from the list stored in the selected DMS, a touch operation of dragging it to the area where the second object (O2 of FIG. 12) is located may serve as the above-described selection signal.

When the selection signal is obtained, a step (S92) of generating a control signal to reproduce the selected content in the selected DMR may proceed.

The controller (180 of FIG. 1) of the mobile terminal 100 may serve as a DMC for controlling the DMS and the DMR. Accordingly, when a selection signal corresponding to a specific function is input, the controller 180 of FIG. 1 may transmit a control signal to the DMS and / or the DMR to execute the function. For example, when receiving a selection signal for reproducing a specific video in a specific DMR, the controller 180 of FIG. 1 may transmit a control signal to a DMS and / or a selected DMR in which the specific video is stored. The DMS and / or DMR receiving the control signal from the control unit 180 of FIG. 1 may cause the specific video to be played in the selected DMR correspondingly.
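
The control flow described here, in which the terminal acts as a DMC that tells a DMS which content to serve and a DMR to render it, can be sketched as below. The `Dms`, `Dmr`, and `DigitalMediaController` types and their methods are hypothetical placeholders for whatever protocol stack is actually used (the text does not name one), so this is only an illustrative outline.

```kotlin
// Hypothetical DMC sketch: the terminal does not carry the media itself;
// it only tells the server which item to expose and the renderer to play it.
interface Dms {
    fun contentUri(contentId: String): String    // e.g. a location the renderer can fetch from
}

interface Dmr {
    fun setSource(uri: String)
    fun play()
}

class DigitalMediaController {
    fun playOn(server: Dms, renderer: Dmr, contentId: String) {
        val uri = server.contentUri(contentId)   // ask the DMS where the content lives
        renderer.setSource(uri)                  // point the DMR at that location
        renderer.play()                          // start rendering on the DMR
    }
}
```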

FIGS. 12 to 15 are diagrams illustrating a DMS selection process of a mobile terminal according to an embodiment of the present invention.

As shown in FIG. 12, the display 151 of the mobile terminal 100 according to an embodiment of the present invention may be divided into first, second, and third areas A1, A2, and A3.

The first area A1 is an area in which the first, second, and third objects O1, O2, and O3 are displayed. The first area A1 may be a predetermined area at the upper side of the display 151. The first, second, and third objects O1, O2, and O3 displayed in the first area A1 may visually express the type of the selected other electronic device 200, the connection state with the other electronic device 200, and whether content is being shared with the other electronic device 200.

As described above, the first object O1 may be an icon corresponding to the DMS among the other electronic devices 200. When the user selects the first object O1, a search for DMSs around the mobile terminal 100 may be performed, the found DMSs may be displayed, a DMS may be selected, or the selection of the DMS may be changed. The color or shape of the first object O1 may change according to the selection step as described above. For example, it may be achromatic when no DMS is connected or selected, and may change to a chromatic color when a DMS is connected or selected. In addition, the first object O1 may have a generic shape before a connection or selection is made, and may change to an icon corresponding to the connected or selected other electronic device 200 once the connection or selection is made. The first object O1 changing according to the current state will be described in more detail in the corresponding part.

As described above, the second object O2 may be an icon corresponding to the DMR among the other electronic devices 200. When the user selects the second object O2, a search for DMRs around the mobile terminal 100 may be performed, the found DMRs may be displayed, a DMR may be selected, or the selection of the DMR may be changed. Similarly to the first object O1, the second object O2 may change its color or shape at each stage of selection.

The third object O3 may be an icon indicating whether content is being shared between the selected DMS and the selected DMR. For example, the third object O3 may usually be expressed in an achromatic color and change to a chromatic color when content starts to be shared. Alternatively, when content starts to be shared, an animation effect may be applied so that the user can intuitively recognize that communication is in progress.

The second area A2 may be an area that displays first information I1 indicating, by text or symbol, the current state of the mobile terminal 100 or the action to be taken by the user.

The third area A3 may be an area in which information of the DMS and/or the DMR is displayed. For example, when the first object O1 is selected, the third area A3 may display a list of found DMSs or a list of contents stored in the selected DMS.

As shown in FIG. 13A, the user may select the first object O1 using the finger F or the like.

As shown in FIG. 13B, when a user input to the first object O1 is obtained, the controller (180 of FIG. 1) may perform a search for other electronic devices 200 around the mobile terminal 100. To indicate that the search for other electronic devices 200 is in progress, the controller (180 of FIG. 1) may display an animated image An1.

As shown in FIG. 14A, when the search for other electronic devices 200 is completed, the controller (180 of FIG. 1) may display a list of the found other electronic devices 200 in the third area A3. The list of other electronic devices 200 may display icons and names corresponding to the found other electronic devices 200.

The icon and name corresponding to another electronic device 200 may be obtained from that electronic device or may be pre-stored in the memory (160 of FIG. 1). For example, the mobile terminal 100 may acquire and display an icon and a name set on a specific other electronic device 200, or may display an icon and a name stored in advance in correspondence with a unique number, such as a MAC address, assigned to that specific other electronic device 200.
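
A minimal sketch of this lookup, assuming a locally stored alias table keyed by MAC address with a fallback to whatever name the remote device reports; the `RemoteDevice` type, field names, and example values are invented for illustration.

```kotlin
// Hypothetical sketch: resolve the label shown in the device list.
// A locally stored alias keyed by MAC address wins over the name the device reports.
data class RemoteDevice(val macAddress: String, val reportedName: String?)

class DeviceLabelResolver(private val storedAliases: Map<String, String>) {
    fun labelFor(device: RemoteDevice): String =
        storedAliases[device.macAddress]         // user-assigned nickname, if any
            ?: device.reportedName               // otherwise the name set on the device
            ?: device.macAddress                 // last resort: show the unique number
}

fun main() {
    val resolver = DeviceLabelResolver(mapOf("00:11:22:33:44:55" to "Shin's phone"))
    println(resolver.labelFor(RemoteDevice("00:11:22:33:44:55", "Android Phone")))  // Shin's phone
    println(resolver.labelFor(RemoteDevice("66:77:88:99:AA:BB", "Camera-X100")))    // Camera-X100
}
```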

In addition to displaying a list of other electronic devices 200, the controller (180 of FIG. 1) may display second information I2 indicating a user's next action. In addition, a rescan icon RI corresponding to a function of performing a rescan of another electronic device 200 may be displayed.

As shown in FIG. 14B, the user may select any one of the list of other electronic devices 200 displayed in the third area A3.

As shown in FIG. 15, when a camera is selected from a list of other electronic devices 200, the controller (180 of FIG. 1) may change the first object (O1 of FIG. 14) to a camera object O11. Since the first object (O1 of FIG. 14) corresponding to the DMS is changed to the camera object O11, the user can clearly recognize that the camera is selected as the DMS.

The icon representing the camera object O11 may be an icon obtained from the camera or an icon preset and stored by the user. Furthermore, the user may arbitrarily name the camera and store the name. For example, if there are a plurality of cameras, the user may give each a nickname so that they can be easily distinguished.

When a camera is selected as the DMS, the controller (180 of FIG. 1) may display a list of contents included in the camera in the third area A3. The list of contents may be displayed in different forms according to the type of content. For example, image-related content may be displayed as a list of thumbnails. Furthermore, in the case of a video, a play button indicating that the content is a video may be added to the thumbnail.

In the mobile terminal 100 according to an embodiment of the present invention, the content obtained from the DMS may be displayed regardless of the specific property of the content. For example, content having a still image property and content having a video property may be displayed simultaneously in thumbnail form.

Based on the current state where the camera is selected and the content is displayed, the third information I3 for allowing the user to select the displayed content may be further displayed.

FIG. 16 illustrates a DMS selection process of a mobile terminal according to an embodiment of the present invention.

As shown in the drawing, in the third area A3 of the display 151 of the mobile terminal 100 according to an embodiment of the present invention, a menu for collectively searching for a specific type of content may be displayed in addition to the found other electronic devices 200. For example, an 'all music' menu corresponding to a function of collectively displaying the music files included in the found DMSs and an 'all movie' menu corresponding to a function of collectively displaying the video files included in the found DMSs may be displayed. This is expected to improve usability compared to the case of sequentially opening each list corresponding to another electronic device 200 in order to find specific content.
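
The 'all music' / 'all movie' behaviour amounts to flattening the per-device listings into one type-filtered list. The sketch below shows that aggregation with invented `ContentItem` and `ContentType` types; it illustrates the idea only, not the patented implementation.

```kotlin
// Hypothetical sketch: collect one kind of content across every found DMS,
// instead of opening each device's list in turn.
enum class ContentType { MUSIC, MOVIE, IMAGE }

data class ContentItem(val deviceName: String, val title: String, val type: ContentType)

fun collectByType(perDeviceLists: Map<String, List<ContentItem>>, wanted: ContentType): List<ContentItem> =
    perDeviceLists.values.flatten().filter { it.type == wanted }

fun main() {
    val lists = mapOf(
        "Camera" to listOf(ContentItem("Camera", "holiday.mp4", ContentType.MOVIE)),
        "Notebook" to listOf(
            ContentItem("Notebook", "song.mp3", ContentType.MUSIC),
            ContentItem("Notebook", "clip.mp4", ContentType.MOVIE)
        )
    )
    println(collectByType(lists, ContentType.MOVIE).map { it.title })  // [holiday.mp4, clip.mp4]
}
```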

FIGS. 17 and 18 illustrate a first object display form of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the controller (180 of FIG. 1) of the mobile terminal 100 according to an embodiment of the present invention can change the display of the first object O1 in various ways according to the connection state with the DMS.

When the DMS is not selected, the first object O1 may be displayed with a default icon and a default name.

When the DMS is selected, the first object O1 may be replaced with an icon corresponding to the selected DMS. For example, an icon representing the camera object O11, an icon representing the notebook object O12, an icon representing another mobile terminal object O13, or the like may be displayed. By displaying an icon corresponding to the selected DMS, the user can intuitively recognize that a specific DMS is selected.

The name of the displayed icon can be set by the user as described above. For example, the user of the mobile terminal 100 may modify the name of a specific other electronic device 200, such as the name of the other mobile terminal object O13 indicated as 'Shin's phone'.

As illustrated in FIG. 18, an animation effect may be applied to the first object O1 to clearly reflect the current state.

As shown in FIGS. 18A to 18D, an animation effect may be applied in which only the icon core CO is displayed, or in which the first to third rings RO1 to RO3 are sequentially displayed on and removed from the outer circumference of the icon core CO. For example, when no DMS is selected, the first object O1 may be displayed in the form shown in FIG. 18A, and while a DMS is being searched for or a connection is being attempted, the first object O1 may be displayed in the forms shown in FIGS. 18A to 18D in sequence.

FIGS. 19 to 25 are diagrams illustrating a DMR selection process of a mobile terminal according to an embodiment of the present invention.

As shown in these drawings, the controller (180 of FIG. 1) of the mobile terminal 100 according to an embodiment of the present invention allows another electronic device 200 capable of performing the DMR function to be easily selected through selection of the second object O2.

As shown in FIG. 19A, the user may select the second object O2 using the finger F or the like.

As shown in FIG. 19B, when the user selects the second object O2, the controller (180 of FIG. 1) may search for other electronic devices 200 around the mobile terminal 100 and display them on the display 151.

As shown in (a) of FIG. 20, the user may select, from the displayed DMR list, the device on which the content is to be rendered.

As shown in FIG. 20B, when the user selects a list of specific devices, the controller 180 of FIG. 1 may change the second object O2 to an icon corresponding to the selected device. For example, when the user selects TV as the DMR, the second object O2 may be changed to a TV object O21. Thus, the user can intuitively recognize the currently selected device.

As shown in (a) of FIG. 21, in a state where the camera object O11 corresponding to the selected DMS and the TV object O21 corresponding to the selected DMR are displayed, the user may select specific content to have it executed on the DMR. That is, the user may perform an operation of touching any one item in the content list of the camera, which is the selected DMS, with the finger F or the like and dragging it to the first area A1. When the user performs this drag-and-drop touch operation, the controller (180 of FIG. 1) may cause the content to be executed on the TV, which is the selected DMR.

As shown in (b) of FIG. 21, execution of specific content may be performed through a touch operation on the content. That is, the user may perform an operation of touching a specific content with the finger F or the like to allow the content to be executed.

As illustrated in (a) of FIG. 22, the user may perform an operation of selecting specific content by using the finger F or the like. However, the content may not be immediately executed by an operation of touching a specific content. That is, the selection of the content is made by the touch operation.

As shown in FIG. 22B, the user may perform an operation of touching the third object O3 of the first area A1. When the user touches the third object O3, the selected content may be transmitted from the camera to the TV.

As shown in FIG. 23A, the user can select desired content with the finger F or the like.

As shown in (b) of FIG. 23, a selection identification SI may be added to content selected by a user. The user may select a plurality of contents desired by the user using the finger F or the like.

As illustrated in (a) of FIG. 24, when a user selects a plurality of contents, the controller (180 of FIG. 1) may display the selected contents in the form of a thumbnail ST superimposed on a specific position.

As shown in FIG. 24B, the user may perform a touch operation of dragging and dropping the overlapped thumbnail ST to the first area A1. When the overlapped thumbnail ST is dragged and dropped to the first area A1, the controller (180 of FIG. 1) may allow a plurality of selected contents to be played on a TV that is a DMR.

As shown in FIG. 25, the mobile terminal 100 according to an embodiment of the present invention may serve as a DMC for the camera 200a and the TV 200b, which are other electronic devices 200. That is, the mobile terminal 100 may control the camera 200a and the TV 200b so that the content of the camera 200a, which is the DMS, is transmitted to and reproduced by the TV 200b, which is the DMR. Content to be played on the TV 200b may be transmitted directly from the camera 200a to the TV 200b, or transmitted from the camera 200a to the TV 200b via the mobile terminal 100.
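
The two delivery paths mentioned here (direct server-to-renderer versus relayed through the terminal) can be expressed as a simple strategy choice. All names below (`TransferPath`, `sendContent`, the callbacks) are invented for illustration; the patent does not prescribe how the relay is implemented.

```kotlin
// Hypothetical sketch: the DMC decides whether the DMS streams straight to the
// DMR or whether the bytes pass through the mobile terminal on the way.
enum class TransferPath { DIRECT, VIA_TERMINAL }

fun sendContent(
    path: TransferPath,
    fetchFromDms: () -> ByteArray,        // assumed callback that reads the content from the camera
    pushToDmr: (ByteArray) -> Unit,       // assumed callback that delivers bytes to the TV
    tellDmsToPushDirectly: () -> Unit     // assumed command asking the DMS to contact the DMR itself
) {
    when (path) {
        TransferPath.DIRECT -> tellDmsToPushDirectly()          // terminal only orchestrates
        TransferPath.VIA_TERMINAL -> pushToDmr(fetchFromDms())  // terminal relays the data
    }
}
```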

FIGS. 26 to 28 illustrate a content selection process of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the mobile terminal 100 according to an embodiment of the present invention may selectively store a list of preferred contents.

As shown in FIG. 26A, a camera may be selected as the DMS. In the third area A3, a list of contents included in the camera, which is a DMS, may be displayed.

The fourth area A4 may be an area for dragging and dropping a list of content that the user prefers. That is, if the user drags and drops specific content to the fourth area A4, the content list can be stored separately and played back when necessary.

As illustrated in FIG. 26B, the user may touch specific content and drag it to the fourth area A4. The user may drag and drop a plurality of contents to the fourth area A4 while the camera is selected as the DMS. Furthermore, even after selecting another DMS, content may be dragged and dropped to the fourth area A4. The content the user drags and drops to the fourth area A4 may be stored in a separate DMS or in the memory (160 of FIG. 1) of the mobile terminal 100. For example, the index information of the content selected by the user may be stored in a separate DMS, so that the specific content can be accessed based on the index information when necessary.
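
A small sketch of such a preference list that stores only index information (which device holds which item) rather than the content data itself; `FavoriteEntry` and `FavoriteStore` are invented names used only to illustrate the idea.

```kotlin
// Hypothetical sketch: the preference list keeps (device, content) index entries,
// so the actual data stays on whichever DMS holds it.
data class FavoriteEntry(val deviceId: String, val contentId: String)

class FavoriteStore {
    private val entries = mutableListOf<FavoriteEntry>()

    fun add(deviceId: String, contentId: String) {
        entries.add(FavoriteEntry(deviceId, contentId))   // dropped into the fourth area A4
    }

    // Later, each index entry can be resolved back to playable content on its source device.
    fun all(): List<FavoriteEntry> = entries.toList()
}
```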

As shown in FIG. 27A, the user may touch the fourth area A4 with the finger F or the like.

As illustrated in FIG. 27B, when the user touches the fourth area A4, the controller (180 of FIG. 1) may display the content included in the preference list in the third area A3. In addition, the preference list icon O14 may be displayed to visually express that the user has touched the fourth area A4.

FIGS. 28 and 29 are diagrams illustrating a DMR selection process of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the mobile terminal 100 according to an embodiment of the present invention can select a plurality of DMRs to reproduce content. In addition, the controller 180 of FIG. 1 may allow the content to be played back appropriately for the attribute of the selected DMR.

As illustrated in (a) of FIG. 28, the DMR list may be displayed in the third area A3. The user may select a TV with the finger F or the like. When the user selects a TV, a TV object O21 corresponding to the TV may be displayed.

As shown in FIG. 28B, the user may further select a speaker from the DMR list of the third region A3. When the user selects a speaker, the controller 180 of FIG. 1 may further display a speaker object O22 corresponding thereto.

As illustrated in FIG. 29, the controller (180 of FIG. 1) may cause the selected content to be played on the plurality of DMRs, that is, the TV 200b and the speaker 200c. At this time, the content may be reproduced appropriately according to the attributes of each DMR. For example, when the selected content is a video, the controller (180 of FIG. 1) may cause the image to be played on the TV 200b and the sound to be played on the speaker 200c. Therefore, rendering optimized for the selected DMRs may be possible.
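
As a rough sketch of playing one piece of content "appropriately for each DMR", the snippet below routes the image to a display-capable renderer and the sound to a speaker-capable one. The capability flags and the `Renderer` type are hypothetical.

```kotlin
// Hypothetical sketch: split a video between renderers according to what each can do.
data class Renderer(val name: String, val hasDisplay: Boolean, val hasSpeaker: Boolean)

fun routeVideo(renderers: List<Renderer>): Pair<Renderer?, Renderer?> {
    val videoTarget = renderers.firstOrNull { it.hasDisplay }   // e.g. the TV 200b
    val audioTarget = renderers.firstOrNull { it.hasSpeaker }   // e.g. the speaker 200c
    return videoTarget to audioTarget
}

fun main() {
    val selected = listOf(
        Renderer("TV", hasDisplay = true, hasSpeaker = false),
        Renderer("Speaker", hasDisplay = false, hasSpeaker = true)
    )
    val (video, audio) = routeVideo(selected)
    println("image -> ${video?.name}, sound -> ${audio?.name}")  // image -> TV, sound -> Speaker
}
```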

When the selected content is a video, the controller (180 of FIG. 1) may enable continuous viewing even when the video being played is transferred to another electronic device 200. That is, seamless playback of the video can be made possible by passing the index of the playback point together with the video.
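
The seamless-handoff idea, carrying the playback position along with the video so the new renderer resumes where the old one stopped, might look like the following; the renderer interface and method names are assumptions for this sketch only.

```kotlin
// Hypothetical sketch: transfer a video mid-playback without losing the position.
interface PlaybackRenderer {
    fun currentPositionMs(): Long
    fun stop()
    fun play(uri: String, startAtMs: Long)
}

fun handOver(from: PlaybackRenderer, to: PlaybackRenderer, uri: String) {
    val resumePoint = from.currentPositionMs()  // the "index of the playback point"
    from.stop()
    to.play(uri, startAtMs = resumePoint)       // the new device continues from the same point
}
```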

FIGS. 30 to 32 illustrate a DMS or a DMR selection process of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the mobile terminal 100 according to an embodiment of the present invention may display the first and second objects O1 and O2 corresponding to the DMS and the DMR in various ways.

As illustrated in FIG. 30, the first object O1 may be located in the first area A1, and the second object O2 may be located in the fourth area A4. That is, the first and second objects O1 and O2 may be located at the top and bottom of the display 151, respectively.

As illustrated in FIG. 31A, the controller (180 of FIG. 1) may list icons corresponding to the first and second objects O1 and O2 in the third area A3. For example, the found DMSs may be listed on the left side of the third area A3 and the found DMRs on the right side of the third area A3. The user may select one of the listed DMSs using the finger F or the like.

As shown in (b) of FIG. 31, the user may perform a drag touch operation connecting one of the DMSs and one of the DMRs. When the user connects the DMS and the DMR, the information can be exchanged between the connected devices.

FIG. 33 illustrates an operation of a mobile terminal according to an embodiment of the present invention.

As shown in the drawing, the mobile terminal 100 according to an embodiment of the present invention may store information of another electronic device 200 connected at each place.

The mobile terminal 100 may be located in Area A or Area B. For example, Area A may be the home of the user of the mobile terminal 100, and Area B may be the user's workplace.

The mobile terminal 100 may store information of the camera 200a and the TV 200b connected in Area A. In addition, the mobile terminal 100 may store information of the notebook computer 200c and another mobile terminal 200d connected in Area B. By storing the information of the other electronic devices 200 connected at each place, when the mobile terminal enters that place again, the search process for the accessible other electronic devices 200 may be omitted or the search time may be shortened.

The controller (180 of FIG. 1) of the mobile terminal 100 may automatically connect to the other electronic device 200 that was last connected at a specific place. For example, if the camera 200a was the last device connected in Area A, the mobile terminal 100 may connect to the camera 200a first the next time it enters Area A.
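
A compact sketch of this per-place memory, assuming the terminal can tell which area it is currently in (the mechanism is not specified in the text); the `LocationDeviceMemory` class and its method names are invented for illustration.

```kotlin
// Hypothetical sketch: remember the devices seen at each place and which one was used last,
// so re-entering the place can skip the scan and reconnect immediately.
class LocationDeviceMemory {
    private val knownDevices = mutableMapOf<String, MutableSet<String>>()  // area -> device ids
    private val lastConnected = mutableMapOf<String, String>()             // area -> last device id

    fun remember(area: String, deviceId: String) {
        knownDevices.getOrPut(area) { mutableSetOf() }.add(deviceId)
        lastConnected[area] = deviceId
    }

    fun devicesAt(area: String): Set<String> = knownDevices[area] ?: emptySet()

    // Candidate to auto-connect when the terminal enters this area again.
    fun preferredAt(area: String): String? = lastConnected[area]
}

fun main() {
    val memory = LocationDeviceMemory()
    memory.remember("Area A", "TV 200b")
    memory.remember("Area A", "camera 200a")       // connected last in Area A
    println(memory.preferredAt("Area A"))          // camera 200a
}
```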

FIGS. 34 through 36 are diagrams illustrating a process of selecting a DMS and a DMP of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the mobile terminal 100 according to an embodiment of the present invention may allow the selection of DMS and DMP to be made more intuitive.

As illustrated in (a) of FIG. 34, the controller 150 of the mobile terminal 100 may proceed with a search for other electronic devices. That is, other electronic devices located around the mobile terminal 100 and capable of communicating with the mobile terminal 100 may be searched for.

The status selection bar SA may be displayed on the upper side of the display 151. The status selection bar SA may include a play menu, a download menu, and an upload menu.

The play menu may be a menu for executing a function of causing the content of the first electronic device to be played on the second electronic device. The function of the play menu may be described by taking an example where the first electronic device is a camera and the second electronic device is an electronic picture frame. That is, when the video content is stored in the camera and the video content is to be played back in the electronic picture frame, the user can select a play menu.

When the play menu is selected, the controller 150 may display the electronic devices capable of serving as a DMP, which plays content, on the upper side of the display 151, and display the electronic devices capable of storing content and serving as a DMS on the lower side of the display 151. Accordingly, the user can intuitively recognize that the content is transmitted from an electronic device displayed on the lower side of the display 151 to an electronic device displayed on the upper side and displayed there.

The download menu and the upload menu may cause the content of the first electronic device to be transmitted to the second electronic device, or vice versa. When the first electronic device is a camera and the second electronic device is an electronic picture frame, selecting the download menu or the upload menu may transmit image content of the camera to the electronic picture frame, or vice versa. Hereinafter, the operation when the play menu is selected will be described.
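
The play, download, and upload menus differ mainly in which role is shown in the upper area and which in the lower area, and hence in the implied direction of transfer, as described here and in the figures that follow. The mapping might be expressed as below; the enum and data class names are hypothetical and only restate those layout rules.

```kotlin
// Hypothetical sketch: each status-bar menu fixes which device role sits in the
// upper first area A1 and which sits in the lower second area A2.
enum class TransferMode { PLAY, DOWNLOAD, UPLOAD }
enum class Endpoint { SOURCE /* device holding the content */, DESTINATION /* device playing or receiving it */ }

data class ScreenLayout(val upperAreaA1: Endpoint, val lowerAreaA2: Endpoint)

// Which endpoint is drawn on top follows the direction the menu name suggests:
// play and upload move content upward on screen, download moves it downward.
fun layoutFor(mode: TransferMode): ScreenLayout = when (mode) {
    TransferMode.PLAY -> ScreenLayout(upperAreaA1 = Endpoint.DESTINATION, lowerAreaA2 = Endpoint.SOURCE)
    TransferMode.UPLOAD -> ScreenLayout(upperAreaA1 = Endpoint.DESTINATION, lowerAreaA2 = Endpoint.SOURCE)
    TransferMode.DOWNLOAD -> ScreenLayout(upperAreaA1 = Endpoint.SOURCE, lowerAreaA2 = Endpoint.DESTINATION)
}
```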

As shown in (b) of FIG. 34, when the user selects the play menu of the status selection bar SA, the controller 150 may divide the display 151 into the first and second areas A1 and A2. In this case, the first area A1 may be the upper area of the display 151 and the second area A2 may be the lower area of the display 151.

The first area A1 may be an area for displaying an electronic device for playing content. For example, an electronic device having a DMP and / or DMR attribute may be displayed in the first area A1. The electronic device may be displayed in the play device area PD of the first area A1.

The play device area PD may display the electronic devices having the DMP and/or DMR attribute according to certain rules. For example, an electronic device having the found DMP and/or DMR attribute may be displayed with its name, or may be displayed with an icon representing the device. In such a case, the portion marked A may have a TV-shaped icon, the portion marked B an electronic-picture-frame-shaped icon, and the portion marked C a monitor-shaped icon.

The second area A2 may be an area for displaying electronic devices storing content. For example, electronic devices having the DMS attribute may be displayed in the second area A2. FIG. 34B shows a state where the search for electronic devices storing content is in progress. The controller 150 may display an indication that the search is in progress, so that the user can immediately recognize the current state.

As shown in FIG. 34C, when the search for the electronic devices having the DMS attribute to be displayed in the second area A2 is completed, the electronic devices corresponding to the first and second areas A1 and A2 may be displayed. For example, the found electronic devices having the DMS attribute may be displayed in the server device area RD of the second area A2.

Since the first area A1 is displayed at the top of the display 151 and the second area A2 at the bottom of the display 151, the user can intuitively recognize which attribute of electronic device is displayed in each area. That is, the user can intuitively know where the source of the content data is and where the data moves from that source. For example, the user can intuitively recognize that the content moves from the K, L, and M electronic devices shown in the second area A2 to the A, B, and C electronic devices shown in the first area A1. The movement of the content data can be expressed more clearly through the direction indicator DI located between the first and second areas A1 and A2.

The direction indicator DI allows the first and second areas A1 and A2 to be distinguished from each other, and can directly indicate the direction in which the data moves through the triangle shape displayed in its middle. That is, as shown in FIG. 34C, when the play menu is selected, the direction indicator DI may directly indicate that content moves from the K, L, and M electronic devices in the second area A2 to the A, B, and C electronic devices in the first area A1.

As shown in (a) of FIG. 35, the play device area PD of the first area A1 and the server device area RD of the second area A2 may be displayed while the play menu is selected. As described above, since the play device area PD is displayed on the upper side and the server device area RD on the lower side, the user can intuitively recognize the direction in which the content moves.

The user may select at least one of the electronic devices displayed in the first area A1 by using the finger F or the like. For example, a touch operation for selecting the B electronic device may be performed. Although A to C and K to M are shown as rectangles, this is for convenience of understanding. That is, as described above, each electronic device may be displayed with its name and/or an icon.

As illustrated in (b) of FIG. 35, the electronic device selected by the user may be displayed in a different color to indicate that it is selected. The user may then select at least one of the electronic devices displayed in the second area A2 in order to select a specific DMS.

As shown in (c) of FIG. 35, when the user selects a specific DMS in the second area A2, the second area A2 may be expanded and a list of contents stored in the specific DMS may be displayed. That is, the second area A2 is expanded, and as the second area A2 is expanded, the first area A1 may be reduced. Reduction and expansion of the first and second regions A1 and A2 may be performed continuously by an animation effect.

In the expanded second area A2, a list of the contents stored in the selected DMS may be displayed. For example, when the M electronic device is selected as the DMS, the content included in the M electronic device may be displayed.

The content displayed in the expanded second area A2 may be in the form of folders such as F1 to F4 and in the form of files such as P1 to P12. For example, if a specific folder among F1 to F4 is selected, the files included in that folder may be displayed in the second area A2, and if a specific file among P1 to P12 is selected, that file may be displayed.

As shown in FIG. 35D, the user may select at least one of the contents displayed in the second area A2 by using the finger F or the like.

As illustrated in (a) of FIG. 36, the content selected by the user may be transmitted to B, which is the electronic device having the pre-selected DMP and/or DMR attribute. The transmission may take the form of transmitting the entire data of the selected content to the B electronic device, streaming the selected content to the B electronic device, or the like.

The controller 150 may express, in the form of an animation, that the selected content is being transmitted to the B electronic device. For example, an effect in which the selected P6 file flies to the location of the B electronic device may be applied.

As shown in FIG. 36B, the display 151 may display a P6 file being transmitted. Thus, the user can intuitively recognize which file is currently being transmitted.

FIGS. 37 and 38 are diagrams illustrating a process of downloading contents between terminals by a control operation of a mobile terminal according to an embodiment of the present invention.

As shown in the drawing, when the user selects the download menu of the status selection bar SA, the controller 150 of the mobile terminal 100 may change the screen so that it is optimized for the selected download menu.

As shown in (a) of FIG. 37, the play menu of the status selection bar SA may be selected.

As shown in (b) of FIG. 37, the user may select a download menu from the status selection bar SA.

As shown in (c) of FIG. 37, when the user selects the download menu, the screen displayed on the display 151 may be changed. In this case, the layout of the found electronic devices may be changed appropriately to correspond to the term download. For example, the electronic devices in which content is stored may be displayed in the first area A1 on the upper side of the display 151, and the electronic devices that will download the content may be displayed in the second area A2 on the lower side of the display 151. That is, the electronic devices having the DMS attribute may be displayed in the transmission device area FD, which is the first area A1, and the electronic devices having the DMP and/or DMR attribute may be displayed in the download device area DD, which is the second area A2.

As shown in FIG. 37D, the user may select the L electronic device in the second area A2 and select the B electronic device in the first area A1. By displaying the electronic device storing the content data on the upper side and the electronic device that will download the content data on the lower side, the screen configuration matches the direction suggested by the term download and thus remains intuitive.

As illustrated in (a) of FIG. 38, when the user selects the B electronic device in the first area A1, the first area A1 may be enlarged and a list of the contents stored in the B electronic device may be displayed.

As shown in FIG. 38 (b), the user can select the P4 content with the finger F or the like.

As illustrated in (c) of FIG. 38, when a user selects a specific content, an animation effect of moving the content to the L electronic device of the second area A2 may be expressed.

FIG. 39 is a diagram illustrating a content upload process between terminals by a control operation of a mobile terminal according to one embodiment of the present invention.

As shown, the controller 150 of the mobile terminal 100 according to an embodiment of the present invention may change the configuration of the screen to correspond to the user selecting the upload menu.

As illustrated in (a) of FIG. 39, the mobile terminal 100 may be in a state where a play menu is currently selected.

As shown in (b) of FIG. 39, the user may select the upload menu from the status selection bar SA. When the user selects the upload menu, an electronic device that will transmit content data may be displayed in the second area A2 below the display 151, and an electronic device that will receive the content data may be displayed in the first area A1 above the display 151. That is, since the screen of the display 151 is divided into upper and lower sides to correspond to the meaning of the term upload, intuitive operation of the mobile terminal 100 may be possible.

As shown in (c) of FIG. 39, the user may perform an operation of selecting the B electronic device in the upload device area UD of the first area A1 and then selecting the M electronic device in the transmission device area FD of the second area A2.

As shown in FIG. 39D, when the user selects the M electronic device in the second area A2, the second area A2 may be expanded and a list of the contents stored in the M electronic device may be displayed. The user can select the desired content from the displayed list so that it is transmitted to the B electronic device.

FIG. 40 is a diagram illustrating a playlist display process of a mobile terminal according to one embodiment of the present invention.

As shown in the drawing, the controller 150 of the mobile terminal 100 according to an embodiment of the present invention may display a list of contents currently being played or played back in the past.

As shown in FIG. 40A, the play menu may be selected. With the play menu selected, the user may perform an operation of selecting P6 content and transmitting it to the B electronic device.

As shown in FIG. 40B, the display 151 may display P6 content currently being transmitted and reproduced.

As shown in FIG. 40C, the display 151 may display the electronic devices having DMR and/or DMP attributes in the first area A1 and a list of contents in the second area A2, regardless of the content transfer. In addition, a playlist menu may be displayed in the third area A3. The user may select the playlist menu displayed in the third area A3.

As shown in (d) of FIG. 40, when the user selects the playlist menu, a playlist pop-up window PP may be displayed on the display 151. In the playlist pop-up window PP, information about the file currently being played and/or about files that have been played in the past may be displayed.

The user may perform an operation of selecting a specific content displayed in the playlist pop-up window PP so that transmission and / or reproduction of the specific content is performed without a separate operation.

FIG. 41 is a view illustrating a content reproduction process of a mobile terminal according to an embodiment of the present invention.

As shown in the drawing, the controller 150 of the mobile terminal 100 according to an embodiment of the present invention may change the state displayed on the display 151 according to the role of the mobile terminal 100.

As shown in FIG. 41A, the user can select P6 content to be transmitted to the B electronic device. In this case, the B electronic device may be the mobile terminal 100 itself, which received the input, or may be another electronic device. The mobile terminal 100 according to an embodiment of the present invention may display the screen differently depending on whether the B electronic device is the mobile terminal 100 itself or another electronic device.

As illustrated in (b) of FIG. 41, when the electronic device B is the mobile terminal 100 itself, the P6 content may be displayed relatively large.

When the mobile terminal 100 simultaneously plays the role of the DMS and the role of the DMP, it may not be necessary to use an external communication channel to transmit the selected content. Accordingly, since there is little limitation on the transmission speed, a resolution high enough to display the selected content across the entire display 151 may be obtained.

As illustrated in (c) of FIG. 41, when the electronic device B is an electronic device other than the mobile terminal 100, the P6 content may be displayed relatively small.

When transmitting content to an electronic device other than the mobile terminal 100, an external communication channel must be used, so the transmission speed may be limited. Therefore, when the content being transmitted also needs to be displayed on the mobile terminal 100, its resolution and the like can be lowered to minimize the load placed on the communication channel.
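
The decision described in FIGS. 41(b) and 41(c), rendering large when the terminal itself plays the content and small or at lower resolution when the content is simultaneously being sent over an external channel, can be sketched as follows; the quality levels and the function name are invented for illustration.

```kotlin
// Hypothetical sketch: pick how the terminal previews the content depending on
// whether playback is local or the content is also travelling to another device.
enum class PreviewQuality { FULL_SCREEN_HIGH_RES, SMALL_LOW_RES }

fun previewQuality(targetIsThisTerminal: Boolean): PreviewQuality =
    if (targetIsThisTerminal) {
        PreviewQuality.FULL_SCREEN_HIGH_RES   // no external channel, so no bandwidth penalty
    } else {
        PreviewQuality.SMALL_LOW_RES          // keep the communication load low while transmitting
    }
```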

FIG. 42 is a diagram illustrating a process of setting a content download location of a mobile terminal according to one embodiment of the present invention.

As shown in the drawing, the mobile terminal 100 according to an embodiment of the present invention can designate a specific location where the content to be transmitted is stored.

As illustrated in (a) of FIG. 42, the user may perform an operation of selecting and transmitting P4 content to the L electronic device.

As shown in (b) of FIG. 42, when the user's finger F is positioned on the L electronic device during the operation of transmitting to the L electronic device, the controller 150 may display a folder pop-up window PF on the display 151.

The controller 150 may display, in the folder pop-up window PF, information for selecting where the selected content is to be stored. For example, as shown, the AA, BB, and CC folders may be present in the L electronic device. Information about the folders can be obtained from the L electronic device.

As illustrated in (c) of FIG. 42, the user may move the finger F onto a specific folder among the folders displayed in the folder pop-up window PF. When the user drops the content on a specific folder in this way, the selected content may be stored in that folder. That is, the P4 content may be stored in the AA folder of the L electronic device.
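
A brief sketch of choosing a storage target on the receiving device, assuming the receiver exposes its folder names and accepts a store request; `RemoteStorage` and its methods are hypothetical names used only for this illustration.

```kotlin
// Hypothetical sketch: list folders advertised by the receiving device and
// store the dropped content into whichever folder the user picked.
interface RemoteStorage {
    fun folders(): List<String>                          // e.g. ["AA", "BB", "CC"]
    fun store(contentId: String, folder: String)
}

fun dropIntoFolder(target: RemoteStorage, contentId: String, chosenFolder: String) {
    require(chosenFolder in target.folders()) { "Folder not offered by the device" }
    target.store(contentId, chosenFolder)                // e.g. P4 into the AA folder
}
```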

FIGS. 43 and 44 are diagrams illustrating a content transmission process of a mobile terminal according to an embodiment of the present invention.

As shown in these figures, the mobile terminal 100 according to an embodiment of the present invention may display a content delivery process or a delivery state.

As illustrated in FIG. 43A, the user may perform a touch operation for selecting P4 content in the first area A1 and transmitting the P4 content to the L electronic device in the second area A2.

As illustrated in (b) of FIG. 43, when the user selects specific content and transmits it, the controller 150 may display, in the third area A3, a progress bar PB indicating the transmission state of that content.

The display of the progress bar PB may vary depending on the delivery state of the selected content. Therefore, the user can intuitively recognize the transmission state of the content.

As illustrated in (a) of FIG. 44, the display 151 may display a functional area SI at one side of its upper end.

As shown in (b) of FIG. 44, the user can select the functional area SI with the finger F or the like.

As shown in (c) of FIG. 44, when the user selects the functional area SI, the controller 150 may display the progress information popup window SP on the display 151.

The progress information popup window SP may display the status of the content currently being played, downloaded, or uploaded. For example, the progress, the name of the content in progress, and the like may be displayed.
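
Both the progress bar PB and the progress information popup window SP need the same underlying bookkeeping: for each transfer, what it is and how far along it is. A minimal sketch, with invented names, might be:

```kotlin
// Hypothetical sketch: track ongoing transfers so both the progress bar and the
// progress pop-up can show the name and completion ratio of each one.
enum class TransferKind { PLAY, DOWNLOAD, UPLOAD }

data class TransferStatus(val contentName: String, val kind: TransferKind, val bytesDone: Long, val bytesTotal: Long) {
    val percent: Int get() = if (bytesTotal == 0L) 0 else (bytesDone * 100 / bytesTotal).toInt()
}

class TransferTracker {
    private val active = mutableMapOf<String, TransferStatus>()

    fun update(status: TransferStatus) { active[status.contentName] = status }

    fun summary(): List<String> =
        active.values.map { "${it.contentName} (${it.kind}): ${it.percent}%" }
}

fun main() {
    val tracker = TransferTracker()
    tracker.update(TransferStatus("P6", TransferKind.PLAY, bytesDone = 30, bytesTotal = 120))
    println(tracker.summary())   // [P6 (PLAY): 25%]
}
```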

FIG. 45 is a diagram illustrating a multi-content transmission process of a mobile terminal according to one embodiment of the present invention.

As shown in the drawing, the mobile terminal 100 according to an embodiment of the present invention may select and transmit a plurality of contents.

As illustrated in FIG. 45A, the user may perform a touch operation of selecting P4 content displayed in the first area A1 and transmitting it to the L electronic device displayed in the second area A2.

As illustrated in FIG. 45B, when the P4 content is selected and transmitted to the L electronic device, the controller 150 may display, on the L electronic device, an indicator showing that the P4 content is being transmitted.

With that indicator displayed on the L electronic device, the user may perform a touch operation of additionally transmitting P6 content to the L electronic device.

As illustrated in (c) of FIG. 45, when a touch operation of selecting P6 content and transmitting it to the L electronic device is performed, the controller 150 may further display, on the L electronic device, an indicator showing that the P6 content is being transmitted.

In a state where a plurality of indicators are displayed on the L electronic device, the user may perform a touch operation of additionally transmitting P9 content to the L electronic device.

As illustrated in (d) of FIG. 45, when a touch operation of selecting P9 content and transmitting it to the L electronic device is performed, the controller 150 may further display, on the L electronic device, an indicator showing that the P9 content is being transmitted.

Since information about the content being transmitted to a particular electronic device is displayed in this way, the user can intuitively recognize the delivery status of the content.

FIG. 46 is a diagram illustrating a control process through a widget of a mobile terminal according to one embodiment of the present invention.

As shown in the drawing, the mobile terminal 100 according to an embodiment of the present invention may display a transmission state of content through a widget.

As illustrated in (a) of FIG. 46, the controller 150 may display icons or widgets capable of executing specific functions on the background screen WD of the mobile terminal 100. Among the widgets displayed on the background screen WD, there may be a content widget SWD capable of transmitting or sharing content.

As illustrated in (b) of FIG. 46, the content widget SWD may visually indicate that a specific operation is performed by a user's touch operation. That is, the display of the content widget SWD may be changed by the user's touch operation.

As shown in FIG. 46B, the content widget SWD may include a first widget area SWD1 and a second widget area SWD2.

The first widget area SWD1 may be an area for selecting whether to activate the content sharing or transmission function and for indicating whether it is activated. The user may toggle activation and deactivation by touching the first widget area SWD1. In addition, the display of the first widget area SWD1 may be switched between ON and OFF to intuitively show the current state.

The second widget area SWD2 may be an area for allowing specific settings for content sharing or transmission function to be performed. The user may set a specific function of the content sharing or transmission function according to the present invention by touching the second widget area SWD2.
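
As a tiny sketch of the two widget areas, the class below keeps an on/off flag for SWD1 and a settings map for SWD2; the names, fields, and example setting key are invented purely to illustrate the split described above.

```kotlin
// Hypothetical sketch: SWD1 toggles the sharing function on and off,
// SWD2 holds the detailed settings the user can adjust.
class ContentWidgetState {
    var enabled: Boolean = false            // shown as ON / OFF in the first widget area SWD1
        private set

    private val settings = mutableMapOf<String, String>()  // second widget area SWD2

    fun toggle(): Boolean { enabled = !enabled; return enabled }

    fun configure(key: String, value: String) { settings[key] = value }

    fun setting(key: String): String? = settings[key]
}

fun main() {
    val widget = ContentWidgetState()
    println(widget.toggle())                       // true -> widget shows ON
    widget.configure("defaultRenderer", "TV")      // example setting; the key name is hypothetical
    println(widget.setting("defaultRenderer"))     // TV
}
```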

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit and scope of the invention. Accordingly, such modifications or variations are intended to fall within the scope of the appended claims.

100: mobile terminal 110: wireless communication unit
140: sensing unit 151: display unit
180: control unit

Claims (27)

display;
A wireless communication unit communicating with at least one other electronic device to exchange content data; And
A control unit which displays information on the at least one other electronic device communicating through the wireless communication unit by distinguishing a first electronic device having a management attribute for content from a second electronic device having an attribute for rendering the content, and which changes a display position of the first and second electronic devices according to an exchange state of the content data.
The mobile terminal according to claim 1,
The exchange state of the content data is
And a play state for reproducing the content, a download state for acquiring and storing the content, and an upload state for transmitting and storing the content.
The mobile terminal according to claim 1,
The exchange state of the content data is
A transmission direction of the content data from at least one of the first and second electronic devices to at least another one,
The control unit,
And changing a display position of the first and second electronic devices based on a transmission direction of the content data.
The mobile terminal of claim 3,
The control unit,
The display is divided into a first area, which is an upper area of the display, and a second area, which is a lower area of the display,
And displaying the first electronic device and the second electronic device in either one of the first area and the second area based on a transmission direction of the content data.
5. The mobile terminal of claim 4,
The control unit,
And an indicator indicating a transmission direction of the content data on a boundary between the first and second areas.
The mobile terminal according to claim 1,
The control unit,
And a list of contents stored in the selected electronic device among the first electronic devices on the display.
The mobile terminal according to claim 1,
The control unit,
And at least one of a size and a resolution of content data displayed on the display in accordance with the type of the second electronic device.
The mobile terminal according to claim 1,
The control unit,
And at least one of an exchange state of the content data and a type of the content data to be exchanged.
The mobile terminal according to claim 1,
The control unit,
And at least one of colors and shapes of the first and second electronic devices are displayed differently according to whether at least one of the first and second electronic devices is selected and a communication state with the first and second electronic devices.
10. The mobile terminal of claim 9,
The control unit,
When acquiring selection signals for the first and second electronic devices,
A mobile terminal for changing the display to reflect the properties of the selected first and second electronic devices.
The mobile terminal according to claim 1,
The control unit,
And when the selection signal for the first electronic device or the second electronic device is acquired, searching for the electronic device that has obtained the selection signal through the wireless communication unit.
The mobile terminal according to claim 1,
Further comprising a memory in which index information on the first and second electronic devices that can communicate at a specific location is stored,
The control unit,
And searching for the first and second communicable electronic devices based on the stored index information.
The mobile terminal according to claim 1,
The control unit,
Displays a list of content stored in the first electronic device on the display,
And obtaining a selection signal for at least one of the list of displayed contents, so that the content corresponding to the list of contents for which the selection signal is obtained is reproduced by the second electronic device.
The mobile terminal of claim 13,
The selection signal is,
A touch operation of touching at least one of the displayed content list and dragging and dropping it to an area where the second electronic device is displayed.
The mobile terminal according to claim 1,
The apparatus may further include a memory in which access information about the at least one other electronic device communicating through the wireless communication unit at a specific location is stored.
The control unit,
The mobile terminal, upon entering the specific location, initiates communication with another electronic device corresponding to the connection information.
display;
A wireless communication unit communicating with at least one other electronic device; And
A control unit which displays a first object for selection of a first electronic device having a management attribute for content and a second object for selection of a second electronic device having an attribute for rendering the content, transfers the content from the first electronic device to the second electronic device in response to an input to the first and second objects, and changes a display position of the first and second objects according to an exchange state of the content.
17. The mobile terminal of claim 16,
The exchange state of the content data is
And a play state for reproducing the content, a download state for acquiring and storing the content, and an upload state for transmitting and storing the content.
18. The mobile terminal of claim 16,
The control unit,
And dividing the display into a plurality of regions, and displaying the first and second objects in the same region among the divided regions.
19. The mobile terminal of claim 16,
The control unit,
And displaying at least one of colors and shapes of the first and second objects differently depending on at least one of whether the first and second electronic devices are selected and a communication state with the first and second electronic devices.
20. The mobile terminal of claim 16,
The control unit,
When acquiring selection signals for the first and second electronic devices,
A mobile terminal for changing the display of the first and second objects by reflecting the attributes of the selected first and second electronic devices.
Displaying a first object corresponding to a first electronic device having a management attribute for a content and a second object corresponding to a second electronic device having a property of rendering the content;
Transferring the content from the first electronic device to the second electronic device in response to an input to the first and second objects; And
And changing the display position of the first and second objects according to the exchange state of the contents.
The method of claim 21,
Wherein the modifying comprises:
Dividing the display into a first area and a second area; And
And displaying the first object and the second object in one of the first area and the second area, respectively, based on a transmission direction of the content.
The method of claim 21,
The displaying step,
Acquiring a first selection signal for the first object; And
And searching for the first electronic device and displaying a list of the first electronic devices.
24. The method of claim 23,
Acquiring a second selection signal for at least one of the displayed list of first electronic devices; And
And displaying a list of the contents stored in a first electronic device corresponding to the acquired second selection signal.
25. The method of claim 24,
And when the second selection signal is acquired, changing the display of the first object by reflecting the attributes of the first electronic device corresponding to the acquired second selection signal.
The method of claim 21,
The delivering step,
Acquiring a third selection signal for the second object;
Searching for the second electronic device to display a list of the second electronic device; And
And obtaining a fourth selection signal for at least one of the displayed list of second electronic devices.
27. The method of claim 26,
And when the fourth selection signal is acquired, changing the display of the second object by reflecting the attributes of the second electronic device corresponding to the acquired fourth selection signal.
KR1020110062704A 2011-06-28 2011-06-28 Mobile terminal and control method therof KR20130001826A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020110062704A KR20130001826A (en) 2011-06-28 2011-06-28 Mobile terminal and control method therof
US13/335,187 US9207853B2 (en) 2011-06-28 2011-12-22 Mobile terminal and method of controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110062704A KR20130001826A (en) 2011-06-28 2011-06-28 Mobile terminal and control method therof

Publications (1)

Publication Number Publication Date
KR20130001826A true KR20130001826A (en) 2013-01-07

Family

ID=47834695

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110062704A KR20130001826A (en) 2011-06-28 2011-06-28 Mobile terminal and control method therof

Country Status (1)

Country Link
KR (1) KR20130001826A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150047684A (en) * 2013-10-24 2015-05-06 에스케이플래닛 주식회사 Method for managing contents based on cloud computing, system and apparatus thereof
KR20150048260A (en) * 2013-10-24 2015-05-07 에스케이플래닛 주식회사 Method for sharing contents based on cloud computing, system and service apparatus thereof
CN112083987A (en) * 2014-06-23 2020-12-15 谷歌有限责任公司 Remote invocation of mobile device actions
KR20160070571A (en) * 2014-12-10 2016-06-20 삼성전자주식회사 Method for controlling and an electronic device thereof
KR20170030988A (en) * 2015-09-10 2017-03-20 엘지전자 주식회사 Mobile terminal and method of controlling the same
WO2017122920A1 (en) * 2016-01-13 2017-07-20 삼성전자 주식회사 Content display method and electronic device for performing same
US10960295B2 (en) 2016-01-13 2021-03-30 Samsung Electronics Co., Ltd. Content display method and electronic device for performing same
KR20190054295A (en) * 2017-11-13 2019-05-22 삼성전자주식회사 Display apparauts and control method thereof

Similar Documents

Publication Publication Date Title
KR101757870B1 (en) Mobile terminal and control method therof
US9207853B2 (en) Mobile terminal and method of controlling the same
US8675024B2 (en) Mobile terminal and displaying method thereof
KR101690232B1 (en) Electronic Device And Method Of Controlling The Same
US9678650B2 (en) Method and device for controlling streaming of media data
KR101917696B1 (en) Mobile terminal and control method thereof
KR20130050762A (en) Mobile terminal and method for controlling operation thereof
KR20110123099A (en) Mobile terminal and control method thereof
KR20130032192A (en) Mobile device and method for controlling play of contents in mobile device
KR20130001826A (en) Mobile terminal and control method therof
KR20150049900A (en) Control device and operating method thereof
KR101875744B1 (en) Electonic device and method for controlling of the same
KR20110128487A (en) Electronic device and contents sharing method for electronic device
KR20130062437A (en) Mobile terminal and control method thereof
KR20120026189A (en) Mobile terminal and control method therof
KR20140044659A (en) Mobile terminal and control method thereof
KR20120105318A (en) Method for sharing of presentation data and mobile terminal using this method
KR101973641B1 (en) Mobile terminal and control method for mobile terminal
KR20110064628A (en) Method for controlling display of multimedia data through mobile terminal and mobile terminal thereof
KR20130030691A (en) Mobile terminal and electronic device control system using the same
KR101748153B1 (en) Method for displaying information in home network and mobile terminal using this method
KR101771458B1 (en) Method for transmitting data and mobile terminal using this method
KR20140133160A (en) Terminal and method for controlling the same
KR20120061615A (en) Method for controlling user interface through home network and mobile terminal using this method
KR20120029545A (en) Mobile terminal and control method therof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination