KR20130119172A - Mobile terminal and method for controlling thereof - Google Patents


Info

Publication number
KR20130119172A
Authority
KR
South Korea
Prior art keywords
mobile terminal
application
area
display
displayed
Prior art date
Application number
KR1020120042124A
Other languages
Korean (ko)
Other versions
KR101952682B1 (en)
Inventor
황금성
김지은
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020120042124A
Publication of KR20130119172A
Application granted
Publication of KR101952682B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1431 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72 Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725 Cordless telephones
    • H04M1/72519 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522 With means for supporting locally a plurality of applications to increase the functionality
    • H04M1/72527 With means for supporting locally a plurality of applications to increase the functionality provided by interfacing with an external accessory
    • H04M1/7253 With means for supporting locally a plurality of applications to increase the functionality provided by interfacing with an external accessory using a two-way short-range wireless interface
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 Aspects of data communication
    • G09G2370/16 Use of wireless transmission of display information
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14 Display of multiple viewports

Abstract

PURPOSE: A mobile terminal and a controlling method thereof are provided to display execution screens of a plurality of applications at the same time while sharing content images of the mobile terminal through an external device.

CONSTITUTION: An application is executed in a mobile terminal (S520). An execution screen of the application is displayed in a first area of a first display (S530). An application addition command is input through the mobile terminal (S540). The execution screen of the application in the first area moves to a second area of the first display (S550). An additional application is executed in the mobile terminal (S560). The execution screen of the additional application is displayed in the first area (S570).

Reference numerals: (AA) Start; (BB) End; (S510) Activate a sharing mode and connect an external device; (S520) Execute an application; (S530) Display an execution screen of the application in a first area of a first display; (S540) Input an application addition command; (S550) Move the execution screen of the application in the first area to a second area of the first display; (S560) Execute an additional application; (S570) Display the execution screen of the additional application in the first area; (S580) Is a command to add another application input?; (S590) Execute an additional function through an application displayed in the first and second areas
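The flowchart steps S510 to S590 can be illustrated with a minimal, self-contained sketch. The class and method names below are hypothetical; the code models only the area-swapping logic of the flowchart, not any real device API:

```python
class SharedDisplay:
    """Models the external (first) display with two screen areas."""

    def __init__(self):
        self.first_area = None   # most recently launched application
        self.second_area = []    # earlier applications, moved aside

    def show(self, app):
        # S550: move the current first-area app into the second area
        if self.first_area is not None:
            self.second_area.append(self.first_area)
        # S530/S570: display the (additional) app's screen in the first area
        self.first_area = app


display = SharedDisplay()        # S510: sharing mode active, device connected
display.show("video player")     # S520-S530: first application
display.show("web browser")      # S540-S570: addition command, screens rearranged
display.show("messenger")        # S580: another addition command

assert display.first_area == "messenger"
assert display.second_area == ["video player", "web browser"]
```

Each new application takes over the first area, while earlier execution screens accumulate in the second area, matching the loop through S580 in the flowchart.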

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THEREOF

The present invention relates to a mobile terminal, and more particularly, to a mobile terminal and a control method thereof capable of displaying at least some of the contents of the mobile terminal through one or more external digital devices.

Terminals can be divided into mobile/portable terminals and stationary terminals depending on whether they can be moved. Mobile terminals can further be divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

Such terminals have various functions and, for example, take the form of multimedia devices with composite functions such as capturing still images or video, playing music or video files, gaming, and receiving broadcasts.

To support and enhance the functionality of such terminals, improvements to the structural and/or software parts of the terminal may be considered.

Recently, an image sharing function has been provided that displays an image shown on a mobile terminal through a display device connected to the terminal. Because this function generally reproduces the mobile terminal's screen on the external device as-is, it is also called mirroring. As a result, when a general mirroring method displays one of several running applications in full screen, it is difficult to simultaneously display the screens of the other running applications.

An object of the present invention is to provide a mobile terminal and a control method thereof capable of simultaneously displaying the execution screens of a plurality of applications while sharing a content image of the mobile terminal through an external device.

Another object of the present invention is to provide a mobile terminal and a control method thereof that allow a plurality of applications whose execution images are to be shared through an external device to be selected more conveniently.

Still another object of the present invention is to provide a mobile terminal and a control method thereof capable of more conveniently controlling a plurality of contents shared through an external device.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and that various modifications and variations are possible without departing from the spirit and scope of the invention as defined by the appended claims.

According to an aspect of the present invention, there is provided a method of controlling a mobile terminal, the method including: setting up a data path with an external device including a first display; executing an application in the mobile terminal; displaying an execution screen of the application on a first area of the first display; inputting an application addition command through the mobile terminal; moving the application execution screen of the first area to a second area of the first display; executing an additional application on the mobile terminal; and displaying an execution screen of the additional application in the first area.

In addition, a control method of a mobile terminal according to another aspect of an embodiment of the present invention for realizing the above objects comprises: setting up a data path with an external device including a first display; executing a first application on the mobile terminal; displaying an execution screen of the first application on a first area of the first display; manipulating a specific key button on the mobile terminal; executing a second application on the mobile terminal; and displaying an execution screen of the second application on a second area of the first display.

In addition, a control method of a mobile terminal according to still another aspect of an embodiment of the present invention for realizing the above objects comprises: setting up a data path with a first external device including a first display; executing a first application on the mobile terminal; displaying an execution screen of the first application on a first area of the first display; receiving execution information of a second application from a second external device; and displaying an execution screen corresponding to the execution information of the second application in a second area of the first display.
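As a rough sketch of this last aspect (all names and the message format below are hypothetical), execution information received from the second external device over the data path could be composed into the second area alongside the locally executed first application:

```python
def render_shared_screen(local_app, remote_message):
    """Compose the first display: the local application in the first area,
    and the remote application's execution information in the second area.
    `remote_message` is a hypothetical serialized execution-information record
    received from the second external device."""
    screen = {"first_area": f"{local_app} (local)"}
    app = remote_message["app"]      # which application the second device runs
    state = remote_message["state"]  # its current execution state
    screen["second_area"] = f"{app} ({state}, remote)"
    return screen


# Example: a navigation app runs locally while a second device
# sends execution information for a photo viewer.
msg = {"app": "photo viewer", "state": "slideshow"}
screen = render_shared_screen("navigation", msg)

assert screen["first_area"] == "navigation (local)"
assert screen["second_area"] == "photo viewer (slideshow, remote)"
```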

Through a mobile terminal according to at least one embodiment of the present invention configured as described above, a user can conveniently share the execution screens of a plurality of applications with an external device.

In addition, an application whose execution screen is to be displayed through the external device can be selected with a simple operation.

In addition, the display state of the plurality of applications shared through the external device can be conveniently changed or controlled.

The effects obtained by the present invention are not limited to the above-mentioned effects, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a front perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a front view of a mobile terminal for explaining an operation state of the mobile terminal according to the present invention.
FIG. 4 illustrates an example of a connection type between digital devices applicable to embodiments of the present invention.
FIG. 5 is a flowchart illustrating an example of a process of displaying execution screens of a plurality of applications through an external device according to an embodiment of the present invention.
FIGS. 6A to 6D illustrate an example of a process of sharing execution images of a plurality of applications with an external device in a mobile terminal according to an embodiment of the present invention.
FIGS. 7A and 7B illustrate an example of a process of changing the arrangement state between execution screens when execution images of a plurality of applications are shared with an external device in a mobile terminal according to one embodiment of the present invention.
FIG. 8 illustrates an example of a process of changing the display state of a central execution screen when execution images of a plurality of applications are shared with an external device in a mobile terminal according to an embodiment of the present invention.
FIG. 9 shows an example of an execution image sharing method according to another aspect of an embodiment of the present invention.
FIG. 10 illustrates an example of an overlay window control method in a mobile terminal according to another aspect of an embodiment of the present invention.
FIG. 11 illustrates an example of a method of changing the arrangement state of an overlay window through a predetermined user interface in a mobile terminal according to another aspect of an embodiment of the present invention.
FIG. 12 illustrates an example of a method of changing an overlay window arrangement state using a sensing unit in a mobile terminal according to another embodiment of the present invention.
FIG. 13 illustrates an example of a method of sharing an image of another terminal with an external device in the form of an overlay window in a mobile terminal according to another aspect of an embodiment of the present invention.
FIG. 14 illustrates an example of an operation according to an event occurrence while performing a sharing function in a mobile terminal according to another embodiment of the present invention.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably merely for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except for cases applicable only to mobile terminals.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and also a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signal may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless Internet module 113 is a module for wireless Internet access and may be built into or externally attached to the mobile terminal 100. Wireless Internet technologies that can be used include WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and LTE (Long Term Evolution).

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for obtaining the location of the mobile terminal, and a representative example is the Global Positioning System (GPS) module.

Referring to FIG. 1, the A/V (audio/video) input unit 120 is for inputting an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, its position, the presence or absence of user contact, its orientation, and its acceleration or deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is open or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. Meanwhile, the sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to the visual, auditory, or tactile senses, and may include a display unit 151, a sound output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be configured as a transparent or light-transmissive type so that the outside can be seen through them. Such a display may be called a transparent display, and a typical example is the TOLED (Transparent OLED). The rear structure of the display unit 151 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

Two or more display units 151 may exist according to the implementation form of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display units may be disposed on one surface, spaced apart or integrated, or may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, "touch sensor") form a mutual layer structure (hereinafter, "touch screen"), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in the pressure applied to a specific portion of the display unit 151, or a change in the capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor may be configured to detect not only the touched position and area but also the pressure at the time of the touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can determine which area of the display unit 151 has been touched.
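The signal path just described (touch sensor to touch controller to controller 180) can be sketched as follows. The pressure threshold, event format, and display layout are hypothetical; a real implementation would sit at the driver level:

```python
def touch_controller(raw_events, threshold=0.1):
    """Touch-controller side: filter raw sensor readings into input
    signals, discarding readings whose pressure is below the threshold
    (the sensor reports position, area, and pressure)."""
    return [e for e in raw_events if e["pressure"] >= threshold]


def locate_touch(event, display_width=480):
    """Controller (180) side: decide which area of the display was touched,
    assuming a simple left/right split of the screen."""
    return "first_area" if event["x"] < display_width // 2 else "second_area"


raw = [
    {"x": 100, "y": 200, "pressure": 0.5},
    {"x": 400, "y": 120, "pressure": 0.05},  # below threshold: treated as noise
    {"x": 300, "y": 180, "pressure": 0.7},
]
signals = touch_controller(raw)

assert [locate_touch(e) for e in signals] == ["first_area", "second_area"]
```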

The proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan and higher utilization than a contact sensor.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contact so that the pointer is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which a proximity touch of the pointer occurs on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
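The distinction between a proximity touch and a contact touch, and the proximity touch pattern mentioned above, can be sketched as follows (the units, sample format, and thresholds are hypothetical):

```python
def classify_touch(distance_mm):
    """Distance 0 means the pointer is physically on the screen
    (contact touch); any positive distance is a proximity touch."""
    return "contact touch" if distance_mm == 0 else "proximity touch"


def proximity_pattern(samples):
    """Derive a simple proximity touch pattern (distance, direction, speed)
    from successive (time_s, distance_mm) sensor samples."""
    (t0, d0), (t1, d1) = samples[0], samples[-1]
    speed = (d0 - d1) / (t1 - t0)  # mm/s toward the screen
    direction = "approaching" if d1 < d0 else "receding"
    return {"distance": d1, "direction": direction, "speed": speed}


assert classify_touch(0) == "contact touch"
assert classify_touch(8) == "proximity touch"

# Pointer moves from 10 mm to 5 mm above the screen in half a second.
pattern = proximity_pattern([(0.0, 10.0), (0.5, 5.0)])
assert pattern == {"distance": 5.0, "direction": "approaching", "speed": 10.0}
```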

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, in a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, vibration. Since the video or audio signal may also be output through the display unit 151 or the sound output module 152, these may be classified as part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as the effect of a pin arrangement moving vertically with respect to the contacted skin surface, the spraying or suction force of air through an injection or suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an endothermic or exothermic element.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact, but also to allow the user to feel a tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The projector module 155 is a component for performing an image projection function using the mobile terminal 100, and can display, on an external screen or wall, an image identical to or at least partially different from the image displayed on the display unit 151, according to a control signal of the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image producing means (not shown) for producing the image to be output using the light generated by the light source, and a lens (not shown) for enlarging and outputting the image at a predetermined focal distance. The projector module 155 may further include a device (not shown) capable of mechanically moving the lens or the entire module to adjust the image projection direction.

The projector module 155 can be divided into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module enlarges and projects an image generated by reflecting the light from the light source onto a DMD (Digital Micromirror Device) chip, and may be advantageous for miniaturizing the projector module 155.

Preferably, the projector module 155 may be provided in the longitudinal direction on the side, front, or rear of the mobile terminal 100. Of course, the projector module 155 may be provided at any position of the mobile terminal 100 as needed.

The memory 160 may store programs for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a phone book, messages, audio, still images, and moving pictures). The memory 160 may also store the frequency of use of each item of data (for example, the frequency of use of each phone number, each message, and each multimedia item). In addition, the memory 160 may store data on the vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits internal data of the mobile terminal 100 to the external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter, "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal, for example, performing control and processing related to voice calls, data communication, and video calls. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code may be implemented in a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2 is a perspective view of a mobile terminal or a portable terminal according to the present invention, as viewed from the front.

The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) that forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may additionally be disposed between the front case 101 and the rear case 102.

The cases may be formed by injection-molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, the interface 170, and the like may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive commands for controlling the operation of the portable terminal 100, and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may collectively be referred to as a manipulating portion, and any manner of operation may be employed as long as the user operates them with a tactile feel.

The content input by the first or second operation unit 131 or 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch-recognition mode.

In addition to an antenna for calls and the like, a broadcast-signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), may be installed so that it can be pulled out of the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

Hereinafter, a related operation of the display unit 151 and the touch pad 135 will be described with reference to FIG.

FIG. 3 is a front view of a portable terminal for explaining an operation state of the portable terminal according to the present invention.

Various kinds of visual information can be displayed on the display unit 151. These pieces of information can be displayed in the form of letters, numbers, symbols, graphics, or icons.

At least one of the letters, numbers, symbols, graphics, or icons may be displayed in a predetermined arrangement for inputting such information, thereby being implemented as a keypad. Such a keypad may be referred to as a so-called " virtual keypad ".

FIG. 3 illustrates input of a touch applied to the virtual keypad through the front surface of the terminal body.

The display unit 151 may operate as a single whole area or may be divided into a plurality of areas. In the latter case, the plurality of areas may be configured to operate in association with each other.

For example, an output window 151a and an input window 151b are displayed on the upper and lower portions of the display unit 151, respectively. The output window 151a and the input window 151b are areas allocated for the output and input of information, respectively. In the input window 151b, a virtual keypad 151c on which numbers for inputting a telephone number or the like are displayed is output. When the virtual keypad 151c is touched, the numbers and the like corresponding to the touched keys are displayed on the output window 151a. When the first operation unit 131 is operated, a call connection to the telephone number displayed on the output window 151a is attempted.

In addition, the display unit 151 or the touch pad 135 may be configured to receive touch input by scrolling. By scrolling the display unit 151 or the touch pad 135, the user can move a cursor or pointer located on an object displayed on the display unit 151, for example, an icon. Further, when a finger is moved across the display unit 151 or the touch pad 135, the path along which the finger moves may be visually displayed on the display unit 151. This is useful for editing an image displayed on the display unit 151.

One function of the terminal may be executed when the display unit 151 (touch screen) and the touch pad 135 are touched together within a predetermined time range. An example of their being touched together is when the user clamps the terminal body using a thumb and index finger. The function executed in this case may be, for example, activation or deactivation of the display unit 151 or the touch pad 135.

For convenience of explanation, it is assumed that the mobile terminal referred to below includes at least one of the components shown in FIG. 1. Also, an arrow- or finger-shaped graphic for pointing at a specific object or selecting a menu on the display unit 151 is generally called a pointer or a cursor. However, "pointer" often also means a finger or a stylus pen used for a touch operation. Therefore, in this specification, the graphic displayed on the display unit is referred to as a cursor, and a physical means for performing a touch, a proximity touch, or a gesture, such as a finger or a stylus pen, is referred to as a pointer.

An application is often understood as software that is separately installed and executed. However, an application referred to in the present invention is a broader concept indicating any object that visually displays information in a predetermined area when a specific function is executed. The controller 180 of the mobile terminal according to the present invention can simultaneously control two or more applications, and the executed applications may be displayed on the display unit 151, on other image display means provided in the mobile terminal, and/or on the display unit of another external device connected to the mobile terminal, either simultaneously in a split-screen form or one at a time in full screen; alternatively, one application may be displayed so as to cover at least part of an area related to another application. In addition, it is assumed that the controller 180 of the mobile terminal 100 according to the present invention can perform a multitasking function of simultaneously executing and controlling two or more of the above-described applications. Further, even when the execution screen of only one application is displayed on the display unit, it is assumed that each application that is a target of sharing through an external device is updated in real time so that its execution image is transmitted to the external device.

Control by Connection of External Devices, and Content Playback/Exchange

Recently, as the performance of the processor of the mobile terminal, i.e., the controller 180, has improved dramatically, advanced computation has become possible. In addition, the performance of the wireless communication unit 110 has improved to enable high-speed data communication through various air interfaces. As a result, data — in particular, displayed images/content — can be shared between the mobile terminal and other digital devices such as other mobile terminals or display devices. Of course, data sharing between the devices may be performed through a wired connection as well as wirelessly.

Interoperability technology for content exchange between digital devices is currently being standardized internationally, and one such standard is the Digital Living Network Alliance (DLNA). The DLNA standard proposes various conditions and methods for data exchange between various digital devices. In the embodiments of the present invention, the connection method, standard, and the like may be supplemented by the DLNA standard documents, but the present invention is not limited thereto, and various communication interfaces may be used, for example, Wi-Fi, Bluetooth, IEEE 1394, Universal Serial Bus (USB), infrared communication (IrDA), and Universal Plug and Play (UPnP). The delivery of content from the server to the renderer may take the form of delivering the content source itself for playback, of transmitting still-cut screenshots obtained by sampling the displayed content image at a specific period or a predetermined number of frames, or of real-time video streaming. In addition, a dedicated sharing application for content sharing may be installed on the server and the renderer, respectively, and data exchange may then be performed in a form defined by that sharing application.
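The three delivery forms described above can be sketched in code. The following Java snippet is an illustrative sketch only: the names `DeliveryMode` and `FrameSampler` are our own, not terms from the specification, and the timing logic simply models the "sampling at a specific period" option.

```java
// Illustrative sketch of the three content-delivery modes described above.
// DeliveryMode and FrameSampler are hypothetical names for explanation only.
enum DeliveryMode { SOURCE_FILE, STILL_SCREENSHOTS, REALTIME_STREAM }

class FrameSampler {
    private final long periodMs;      // sampling period for still-cut screenshots
    private long lastCaptureMs;
    private boolean started = false;  // first call always captures

    FrameSampler(long periodMs) { this.periodMs = periodMs; }

    /** Returns true when a new screenshot should be captured at time nowMs. */
    boolean shouldCapture(long nowMs) {
        if (!started || nowMs - lastCaptureMs >= periodMs) {
            started = true;
            lastCaptureMs = nowMs;
            return true;
        }
        return false;
    }
}
```

With a 100 ms period, a frame is captured at most roughly ten times per second regardless of how often the display is redrawn, which is the bandwidth-saving point of the screenshot mode compared with full streaming.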

FIG. 4 illustrates examples of connection types between digital devices applicable to embodiments of the present invention.

Referring to FIG. 4(a), the mobile terminal 100 according to the present invention may be connected to a computer 410 and a television 430 by wire or wirelessly to exchange content/data among them. In this case, one device may serve as a content server supplying content to the other devices, and content sharing may take the form of the other devices receiving and displaying that content. Alternatively, one device may control the device serving as the content server while the remaining devices perform only a content display function. For example, the computer 410 may be the content server, and the mobile terminal 100 may control the computer 410 so that specific content of the computer is displayed through the television 430. In the connection between devices according to the present invention, the content server may also serve the control function, or the renderer in charge of displaying content may also perform a control function.

In this case, the content sharing method includes a method in which the server device transmits to the renderer device the image information itself (for example, a frame buffer) that can be recognized by a display means, and a method in which an application of a previously agreed form is executed on each device and only control data for changing the image/audio information output by the renderer device is exchanged. In addition, if the operating system or platform used for sharing is compatible between the devices, an application installation file (for example, an APK file on Android) may be transferred from one device to another; once the installation on each device is complete, the server device or the controller device may transmit only control data to the renderer device. Furthermore, when playing multimedia content encoded with a given compression scheme, a codec for decoding it may be provided to the renderer device prior to playback. Of course, the multimedia file itself may be transmitted to the renderer device so that it can be played through a playback application held by the renderer device itself.

The connection between the devices may be configured to further include a digital camera 450 and a digital camcorder 470, as shown in FIG. 4(b).

Simultaneous Sharing of Execution Screens of a Plurality of Applications

According to an embodiment of the present invention, there is provided a method in which, when the mobile terminal is connected to at least one external device having a display device, execution screens of a plurality of applications executed on the mobile terminal are sequentially displayed through the display device of the external device. In this embodiment, it is assumed that the mobile terminal performs the controller and content-server functions and the external device performs the renderer function. It is further assumed that the mobile terminal performs all operations for generating the image and delivers only the resulting image to the external device.

First, the screen sharing method according to the present embodiment will be described with reference to FIG. 5, which is a flowchart illustrating an example of a process of displaying execution screens of a plurality of applications through an external device according to an embodiment of the present invention.

Referring to FIG. 5, first, in response to a user's command input or the occurrence of an event, the controller 180 may activate a sharing mode for displaying content of the mobile terminal on an external device and set up a data path for sharing images with a connectable external device (S510). The user's command input includes selection of a menu or icon through the user input unit 130, a touch input of a predetermined pattern, manipulation of a specific key button, and the like. Examples of events include detection of a connectable external device, reception of a connection request from a connectable external device, and detection by the interface unit 170 of a connection according to a predetermined protocol (e.g., detection of a cable plug for wired communication).

In the process of setting up the data path for sharing images with an external device, the controller 180 may search for connectable external devices through the wireless communication unit 110 and/or the interface unit 170, display the search results on the display unit 151, and, when the user inputs a selection command through the user input unit, connect to the external device corresponding to the selection command. Of course, the controller 180 may instead automatically select a device to connect to according to a preset criterion, without a user selection command.
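The selection logic of step S510 — prefer the user's explicit choice, otherwise fall back to a preset criterion — can be sketched as follows. The names `DeviceSelector` and `Device`, and the "previously paired" criterion, are illustrative assumptions, not details from the specification.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

// Hypothetical sketch of S510's device selection: the user's explicit
// selection wins; otherwise auto-select by a preset criterion (here,
// "previously paired" devices first — an illustrative criterion only).
class DeviceSelector {
    record Device(String name, boolean previouslyPaired) {}

    static Optional<Device> select(List<Device> found, Optional<Device> userChoice) {
        if (userChoice.isPresent()) return userChoice;   // explicit selection wins
        return found.stream()                            // otherwise auto-select
                .max(Comparator.comparing(Device::previouslyPaired));
    }
}
```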

After the sharing mode is activated, an application to be shared may be executed according to a user's command input or a default setting (S520).

As the application to be shared is executed, the execution screen of the corresponding application may be displayed in a first area of the external device (S530). The first area is preferably a fixed area set in advance on the display means provided in the external device, and more preferably may be the center of that display means.

Then, when an application addition command is input through the user input unit 130 (S540), a screen for additionally selecting an application (for example, a home screen or a main menu) may be displayed on the display unit of the mobile terminal. In addition, the execution image of the previously executed application displayed in the first area may be moved to a second area (S550). The second area may be the area of the display means of the external device excluding the first area, and more preferably a peripheral area of the first area. The process of selecting the additional application may also be displayed through the first area of the external device.

Thereafter, when an application to be additionally shared is selected and executed (S560), the execution screen of the additionally selected application may be displayed in the first area of the external device (S570).

While an additional application is running, an application addition command may be input again (S580). In this case, the execution screen of the application displayed in the first area is moved to the second area, so that the execution screen of the first executed application and that of the additionally executed application are displayed together in the second area. That is, each time an application addition command is input, steps S550 to S570 are repeated, so that the execution screen of the newly executed application is displayed in the first area while the execution screens of previously executed applications accumulate in the second area.
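The S550–S570 loop amounts to a simple state machine: the newest shared application always occupies the first (center) area, and each addition command pushes the previous occupant into the second area. A minimal sketch, with names of our own choosing:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch of the S550-S570 loop (names are ours, not the
// patent's): the newest shared application occupies the first area, and
// each "add application" command moves the previous one to the second area.
class SharedScreenLayout {
    private String firstArea;                                    // newest execution screen
    private final Deque<String> secondArea = new ArrayDeque<>(); // accumulated screens

    /** S550 + S570: move the current screen aside, show the new one centered. */
    void addApplication(String app) {
        if (firstArea != null) secondArea.addLast(firstArea);
        firstArea = app;
    }

    String firstArea() { return firstArea; }
    Deque<String> secondArea() { return secondArea; }
}
```

After three additions, the third application sits in the first area and the first two have accumulated, in order, in the second area — matching the behavior of FIGS. 6B–6D below.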

When no more applications are to be added for sharing on the external device, various additional functions may be performed according to commands input through the user input unit 130, such as controlling the application displayed in the first area, or changing the position or display state of the applications displayed in the first and second areas (S590).

In the method described above with reference to FIG. 5, the image displayed on the display unit 151 before each application is executed may also be displayed in the first area. For example, when an external device is connected, the home screen provided by the operating system of the mobile terminal, or a main menu containing icons corresponding to the applications, may first be displayed in the first area of the external device, and the entire process of selecting and launching an application may then be shown in the first area. Likewise, while the application displayed in the first area is moved to the second area in response to an application addition command, the first area may again display the process of selecting an additional application (starting from the home screen or the main menu).

In addition, the application addition command may be issued using the sensing unit 140 instead of a key button input. For example, whenever the mobile terminal is shaken, the application displayed on the display unit may be displayed in the first area or the second area of the external device.

Hereinafter, the method of sharing execution images of a plurality of applications described above will be described in more detail with reference to FIGS. 6A to 6D.

FIGS. 6A to 6D illustrate an example of a process of sharing execution images of a plurality of applications with an external device in a mobile terminal according to an embodiment of the present invention.

In the following drawings, including FIG. 6A, for convenience of explanation, it is assumed that the external device with which the content of the mobile terminal is to be shared is a smart TV 430.

When the sharing mode is activated in the mobile terminal 100 and the data path to the TV 430 is established, the home screen may be displayed on the display unit 151 as the initial screen, as shown in FIG. 6A(a). The home screen image 611 currently displayed on the display unit 151 may then be displayed on the TV 430 connected to the mobile terminal 100, as shown in FIG. 6A(b). In addition, images 612 and 613 of the other pages constituting the home screen, which are not currently displayed on the display unit 151, may be displayed together in a split-screen form on either side of the home screen image 611. Alternatively, as illustrated in FIG. 6A(c), a frame shaped like a mobile terminal may be displayed in the center of the TV 430, and the home screen image 621 currently displayed on the display unit 151 may be displayed inside it. In the following description, it is assumed that the initial screen is displayed as shown in FIG. 6A(c); that is, the image currently displayed on the display unit of the mobile terminal is displayed inside the mobile-terminal-shaped frame provided in the center of the connected external device.

In the situation of FIG. 6A(a) and (c), when an application whose execution screen is to be shared is selected and executed on the mobile terminal according to a user's command input, as shown in FIG. 6B(a), the execution image 622 of the selected application may also be displayed in the center of the connected TV 430, as shown in FIG. 6B(b). In addition, a predetermined visual effect 630, indicating that the image of the executed application is being displayed and can be controlled (i.e., is activated) through the mobile terminal, may be given around the mobile-terminal-shaped frame displayed in the center of the TV. Although the visual effect in FIG. 6B takes the form of a color applied around the frame, this is exemplary; the visual effect is not limited to any particular form as long as it is visually distinguishable from the surrounding execution screens and indicates the currently activated screen.

Subsequently, when an application addition command is input through the user's manipulation of the user input unit 130, the home screen may be displayed on the display unit 151 of the mobile terminal so that an application whose execution screen is to be additionally shared can be selected, as shown in FIG. 6B(c). As one example of the application addition command, any one of the key buttons 135 provided on the main body of the mobile terminal may be operated; more specifically, that key button may be a home key. As another example, the mobile terminal may be shaken in a specific pattern so that the shaking is sensed by the sensing unit 140, as described above.

As the application addition command is input, on the TV 430 the execution screen 622 of the first shared application moves from the center (i.e., the first area) to the left edge (i.e., the second area), and the home screen image 623 currently displayed on the mobile terminal may be displayed inside the mobile-terminal-shaped frame in the center, as shown in FIG. 6B(d). At this time, the execution screen 622 moved to the left may not be given a mobile-terminal-shaped frame. In addition, the execution screen 622 moved to the left may be updated in real time even though it is not currently displayed on the mobile terminal. That is, even when an application is not displayed on the display unit 151, the controller 180 continues to execute it, and the resulting changes in its execution screen can be continuously transmitted to the external device.

In the situation where the mobile terminal and the TV are as shown in FIG. 6B(c) and (d), respectively, when an application whose execution screen is to be additionally shared is selected and executed on the mobile terminal according to a user's command input, as shown in FIG. 6C(a), the execution image 624 of the selected application may also be displayed in the center of the connected TV 430, as shown in FIG. 6C(b). In addition, the predetermined visual effect 630 indicating that the image of the executed application is being displayed may be given around the mobile-terminal-shaped frame displayed in the center of the TV.

Subsequently, when an application addition command is input again through the user's manipulation of the user input unit 130, the home screen may again be displayed as the initial screen on the display unit 151 of the mobile terminal so that yet another application to be shared can be selected, as shown in FIG. 6C(c). Since the application addition command is similar to that described above with reference to FIG. 6B, a duplicate description is omitted.

As the application addition command is input, on the TV 430 the execution screen 624 of the second shared application moves from the center (i.e., the first area) to the right edge (i.e., the second area), and the home screen image 625 displayed on the mobile terminal may be displayed inside the mobile-terminal-shaped frame in the center, as shown in FIG. 6C(d). At this time, like the execution screen 622 moved to the left, the execution screen 624 moved to the right may not be given a mobile-terminal-shaped frame.

In the situation where the mobile terminal and the TV are as shown in FIG. 6C(c) and (d), respectively, when a third application whose execution screen is to be shared is selected and executed on the mobile terminal according to a user's command input, as shown in FIG. 6D(a), the execution image 626 of the selected application may be displayed in the center of the connected TV 430, as shown in FIG. 6D(b). In addition, as described above, the predetermined visual effect 630 indicating that the image of the executed application is being displayed may be given around the mobile-terminal-shaped frame displayed in the center of the TV.

The application corresponding to the execution screen 626 displayed in the center can be directly manipulated by the user through the mobile terminal, while the execution screens 622 and 624 displayed on both sides are not displayed on the display unit of the mobile terminal but can still be updated in real time in the second area of the TV. If the user wants to control an application whose execution screen is located in the second area, the execution screen corresponding to that application may first be moved to the first area (i.e., the center).

Hereinafter, as an example of an additional function that may be performed while execution images of a plurality of applications are shared through an external device, a method of changing the arrangement of the execution screens will be described with reference to FIGS. 7A and 7B. FIGS. 7A and 7B illustrate an example of a process of changing the arrangement of execution screens when execution images of a plurality of applications are shared with an external device in a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 7A(a), execution screens 711 to 714 of four applications are shared through the TV 430 by the method described above. In this case, only the execution screen 713 located in the center may be displayed inside the mobile-terminal-shaped frame. Here, when the user inputs an area-switching command (for example, a long touch of the home key button, a long touch of the search key button, or shaking in a specific pattern) through the user input unit 130, the terminal may enter a state in which the arrangement of the execution screens can be changed. To indicate that the arrangement of the execution screens can be changed, the controller 180 may remove the mobile-terminal-shaped frame from the execution screen 713' displayed in the center, as shown in FIG. 7A(b).

Thereafter, when the user inputs a leftward scroll command (for example, a drag or flicking touch) on the touch screen 151 as shown in FIG. 7B(a), the execution images may be cyclically rotated in the direction in which the drag or flicking touch is input.

When the flicking touch is input only once, the execution screen 714 that was displayed on the right side of the TV 430 moves to the center, that is, the first area, as shown in FIG. 7B(b), and the application corresponding to the execution screen 714 is displayed on the mobile terminal, as shown in FIG. 7B(c). Subsequently, when the user inputs a command through the user input unit 130 to end the state in which the arrangement of the execution screens can be changed, the arrangement of the execution screens is no longer changed even if the user inputs a flicking touch on the touch screen of the mobile terminal. Instead, the controller 180 may interpret that flicking touch as a command for controlling the application currently displayed on the touch screen.

In addition, to indicate that the state in which the arrangement of the execution screens can be changed has ended, the mobile-terminal-shaped frame may again be displayed on the execution screen 714' now located in the center, as shown in FIG. 7B(d).
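The cyclic rotation of FIG. 7B can be modeled as a ring of screens whose "center" index shifts per flick. This is a sketch under our own naming (`ScreenCarousel` is not a term from the specification), with the direction convention assumed from the figure:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the cyclic rotation in FIG. 7B: a flick rotates
// the ring of shared execution screens so a different one lands in the
// center (first area). Names and direction convention are assumptions.
class ScreenCarousel {
    private final List<String> screens;   // ordered left-to-right on the TV
    private int centerIndex;              // which screen occupies the first area

    ScreenCarousel(List<String> screens, int centerIndex) {
        this.screens = new ArrayList<>(screens);
        this.centerIndex = centerIndex;
    }

    /** A leftward flick brings the screen on the right into the center. */
    void flickLeft()  { centerIndex = (centerIndex + 1) % screens.size(); }
    /** A rightward flick brings the screen on the left into the center. */
    void flickRight() { centerIndex = (centerIndex - 1 + screens.size()) % screens.size(); }

    String center() { return screens.get(centerIndex); }
}
```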

Next, as another example of an additional function that may be performed while execution images of a plurality of applications are shared through an external device, a method of changing the display state of the execution screen displayed in the first area will be described with reference to FIG. 8.

FIG. 8 illustrates an example of a process of changing the display state of the central execution screen when execution images of a plurality of applications are shared with an external device in a mobile terminal according to an embodiment of the present invention.

FIG. 8 assumes a situation in which, as in FIG. 7A, execution screens of four applications are shared on the TV 430 and the application corresponding to the centrally located execution screen is displayed on the touch screen of the mobile terminal. In this case, when the mobile terminal in portrait view mode is rotated 90 degrees to the left and changes to landscape view mode, as shown in FIG. 8(a), the user interface 810 of the application may also change to landscape mode (810').

Accordingly, on the connected TV 430, the execution screen of the corresponding application may be changed to landscape mode and displayed as a full screen 820, as shown in FIG. 8(b). Alternatively, as shown in FIG. 8(c), the execution screen of the corresponding application may be displayed in landscape mode while being rotated together with the mobile-terminal-shaped frame (820').

Instead of rotating the mobile terminal to landscape, the execution screen displayed in the center of the TV 430 may be enlarged by a predetermined operation through the user input unit 130, as shown in FIG. 8(d). When the execution screen is enlarged, the mobile-terminal-shaped frame may disappear, and as the execution screen displayed in the first area is enlarged, the execution screens displayed in the second area (e.g., 840) may be reduced or simplified into icons.
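The display-state change of FIG. 8(d) can be sketched as a small piece of layout state: enlarging the center hides the frame, and past some threshold the side screens collapse into icons. The class name and the threshold value are assumptions for illustration only; the specification does not fix when iconification occurs.

```java
// Hypothetical sketch of FIG. 8(d): enlarging the first-area screen hides
// the mobile-terminal-shaped frame, and past an (assumed) threshold the
// second-area screens are simplified into icons.
class AreaDisplayState {
    static final double ICONIFY_THRESHOLD = 1.5;  // assumed enlargement factor

    double centerScale = 1.0;
    boolean frameVisible = true;
    boolean sidesIconified = false;

    void enlargeCenter(double factor) {
        centerScale *= factor;
        if (centerScale > 1.0) frameVisible = false;                  // frame disappears
        if (centerScale >= ICONIFY_THRESHOLD) sidesIconified = true;  // sides shrink to icons
    }
}
```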

Meanwhile, according to another aspect of the present embodiment, a method of controlling execution image sharing based on a specific key button is provided. The specific key button mentioned in the present embodiment may be a hardware key button (including a push button and a touch button) provided on the external housing of the mobile terminal, or a virtual key button displayed on the touch screen. The hardware key button may be provided separately for this function, or the function may be mapped to one of the existing key buttons. The specific key button may, of course, also be configured as a combination of a hardware key button and a touch-selected menu. For example, in the Android OS, the specific key button may be operated as a menu item in the task manager (multitasking management) window that appears when the user touches the home key.

Hereinafter, for convenience of description, the specific key button described above is referred to as the "M button".

FIG. 9 shows an example of an execution image sharing method according to another aspect of an embodiment of the present invention.

Referring to FIG. 9(a), after the sharing mode is executed, the execution screen of the first application selected on the mobile terminal 100 is displayed full screen on the connected TV 430. Then, when the M button is operated, a menu (home screen or main menu) for additionally selecting an application is displayed on the touch screen of the mobile terminal as shown in FIGS. 9(b) and 9(c), and the image displayed on the touch screen may be displayed on the TV 430 in the form of overlay windows 910 and 920 over the previously shared execution screen. In this case, the shape of the overlay window may change according to the orientation in which the mobile terminal is held. For example, when the mobile terminal is held vertically as shown in FIG. 9(b), the overlay window 910 is also vertical, and when the mobile terminal is held horizontally as shown in FIG. 9(c), the overlay window 920 may also be displayed horizontally. Although the execution image of the first-executed application displayed on the full screen is no longer shown on the display unit of the mobile terminal, the controller 180 continues to run that application and may provide the resulting updated execution image to the TV in real time. In addition, if the additionally executed application is terminated on the mobile terminal, the overlay window on the external device also disappears, and the display returns to sharing only the first-executed application on the full screen.
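
As a sketch of the M-button flow above: operating the button overlays the handset's current screen on the shared full screen, with the overlay's orientation following how the terminal is held, and terminating the additional application removes the overlay again. The names below are illustrative assumptions, not disclosed code.

```java
// Illustrative model of FIG. 9: the M button adds an overlay window whose
// shape follows the handset's orientation; ending the extra app removes it.
class OverlaySharing {
    enum Orientation { PORTRAIT, LANDSCAPE }

    private Orientation overlay = null; // null: only the full screen is shared

    // M button: mirror the handset's current screen as an overlay
    // (vertical as in FIG. 9(b), horizontal as in FIG. 9(c)).
    void pressM(Orientation handsetOrientation) { overlay = handsetOrientation; }

    // Terminating the additionally executed application returns to
    // sharing only the first application on the full screen.
    void terminateAdditionalApp() { overlay = null; }

    Orientation overlayOrientation() { return overlay; }
}
```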

Meanwhile, whether or not the overlay window is displayed on the external device may be changed by a key button toggle method. This will be described with reference to FIG. 10.

FIG. 10 illustrates an example of an overlay window control method in a mobile terminal according to another aspect of an embodiment of the present invention.

FIG. 10(a) is assumed to follow the situation of FIG. 9(b). In other words, the first shared application is displayed full screen on the TV 430, and the home screen is displayed in the form of an overlay window. At this point, when the M button is operated (clicked or double-clicked), the overlay window disappears from the area 1010 where it was displayed on the TV 430, as shown in FIG. 10(a). However, the image displayed on the touch screen of the mobile terminal does not change. If the user then executes another application as shown in FIG. 10(b), returns to the home screen as shown in FIG. 10(c), and operates the M button again, the overlay window 1020 may be displayed again in the area 1010 where the overlay window was previously displayed, as shown in FIG. 10(d).

Through the method shown in FIG. 10, the user may decide in a toggle manner whether the overlay window is displayed. This can be particularly useful when the user temporarily needs to perform a personal task that he or she does not wish to share through the external device while the sharing function is in progress.
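
The toggle behavior described above amounts to a single bit of state on the external display; the handset's own screen is unaffected. The following few lines are a hypothetical sketch, not disclosed code.

```java
// Illustrative model of FIG. 10: each M-button operation toggles whether the
// overlay window appears on the external device; the handset screen is unchanged.
class OverlayToggle {
    private boolean overlayVisibleOnTv = true; // starts from the FIG. 9(b) state

    // Returns the new visibility after the M button is operated.
    boolean pressM() {
        overlayVisibleOnTv = !overlayVisibleOnTv;
        return overlayVisibleOnTv;
    }

    boolean overlayVisibleOnTv() { return overlayVisibleOnTv; }
}
```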

Hereinafter, a method of changing the arrangement state of an overlay window will be described with reference to FIG. 11. FIG. 11 illustrates an example of a method of changing the arrangement state of an overlay window through a predetermined user interface in a mobile terminal according to another aspect of an embodiment of the present invention.

Referring to FIG. 11(a), according to a predetermined menu operation (for example, operating the M button for a predetermined time, or a long touch), a layout of the execution screen currently displayed on the TV 430 is displayed on the left side of the touch screen 151, and menu buttons are displayed on the right side. In the layout, a figure 1110 representing the full screen and a figure 1120 representing the overlay window are displayed. The menu buttons include a mode button 1131, an add button 1133, and a control target selection button 1135.

Here, the mode button 1131 provides a function of changing the screen configuration from the full-screen-plus-overlay-window form into a split screen, and the add button 1133 provides a function of creating an additional overlay window. The control target selection button 1135 provides a function of determining which object is to be moved by touch-drag input. FIG. 11(a) illustrates a case where the control target selection button 1135 is set to the phone screen, that is, the overlay window.

In this case, when the user wants to change the position of the overlay window, the user touches the control target selection button 1135, after which a predetermined visual effect indicating selection may be applied to the figure 1120' representing the overlay window, as shown in FIG. 11(b). The position of the figure 1120' representing the overlay window may then be changed within the figure 1110 representing the full screen according to the user's touch-drag input.

Meanwhile, if the mode button 1131 is selected as shown in FIG. 11(c), the layout changes into a split-screen form as shown in FIG. 11(d), and as the user applies a touch-drag input to the boundary between the two figures 1110 and 1130, their relative size ratio may change.
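
The mode switch and boundary drag above can be sketched as follows. The clamp range on the split ratio and all identifiers are assumptions made for the example; the disclosure does not specify them.

```java
// Illustrative model of FIG. 11: the mode button turns the
// full-screen-plus-overlay layout into a split screen, and dragging the
// boundary between figures 1110 and 1130 re-balances the two panes.
class LayoutEditor {
    enum Mode { FULL_PLUS_OVERLAY, SPLIT }

    private Mode mode = Mode.FULL_PLUS_OVERLAY;
    private double splitRatio = 0.5; // left pane's share of the width in SPLIT mode

    void pressModeButton() { mode = Mode.SPLIT; } // mode button 1131

    // A touch-drag of dxPx pixels on a display widthPx wide shifts the
    // size ratio; the [0.1, 0.9] clamp is an assumed sanity limit.
    void dragBoundary(double dxPx, double widthPx) {
        splitRatio = Math.max(0.1, Math.min(0.9, splitRatio + dxPx / widthPx));
    }

    Mode mode() { return mode; }
    double splitRatio() { return splitRatio; }
}
```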

Instead of the menu manipulation method of FIG. 11, the arrangement state of the overlay window may be changed using the sensing unit 140. This will be described with reference to FIG. 12.

FIG. 12 illustrates an example of a method of changing an overlay window arrangement state using a sensing unit in a mobile terminal according to another embodiment of the present invention.

When the screen displayed on the mobile terminal 100 as shown in FIG. 12(a) is displayed in the center of the TV 430 in the form of an overlay window 1210, the user may move the overlay window 1210 to the right by tilting or shaking the mobile terminal 100 to the right while operating the M button, as shown in FIG. 12(b). Conversely, when the user tilts or shakes the mobile terminal 100 to the left while operating the M button as shown in FIG. 12(c), the overlay window 1210 may be moved to the left. This method can also be applied to the split-screen form. For example, when the execution screen corresponding to the first-executed application and the execution screen corresponding to the additionally executed application are displayed side by side on the external device, tilting the mobile terminal in one direction while pressing the M button may swap the positions of the two execution screens. In addition, when a movement of a specific pattern is applied to the mobile terminal while the M button is operated, the layout consisting of the overlay window and the full screen on the external device may be changed to a split-screen form, or vice versa.
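
The gesture handling above has one key property: tilt input is only honored while the M button is held, and its effect depends on the current layout. The sketch below models that dispatch; the class, the position encoding, and the clamp are assumptions for illustration.

```java
// Illustrative model of FIG. 12: tilt/shake gestures are honored only while
// the M button is held; they move the overlay or swap split-screen panes.
class TiltController {
    enum Layout { OVERLAY, SPLIT }

    private final Layout layout;
    private int overlayPosition = 0;     // -1 left, 0 center, +1 right
    private boolean panesSwapped = false;

    TiltController(Layout layout) { this.layout = layout; }

    // direction: +1 for a tilt/shake to the right, -1 to the left.
    void onTilt(int direction, boolean mButtonHeld) {
        if (!mButtonHeld) return; // tilts without the M button are ignored
        if (layout == Layout.OVERLAY) {
            overlayPosition = Math.max(-1, Math.min(1, overlayPosition + direction));
        } else {
            panesSwapped = !panesSwapped; // split screen: positions are reversed
        }
    }

    int overlayPosition() { return overlayPosition; }
    boolean panesSwapped() { return panesSwapped; }
}
```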

Meanwhile, the execution image sharing method according to another aspect of the present embodiment may be performed in the form of relaying the screen of another mobile terminal to the external device. This will be described with reference to FIG. 13.

FIG. 13 illustrates an example of a method of sharing an image of another terminal to an external device in the form of an overlay window in a mobile terminal according to another aspect of an embodiment of the present invention.

Referring to FIG. 13(a), in a situation in which the execution screen of an application executed on the mobile terminal 100 is displayed full screen on the TV 430, another mobile terminal 200 may request screen sharing from the mobile terminal 100. Accordingly, as shown in FIG. 13(b), a pop-up window 1320 for determining whether to accept the request is displayed on the touch screen 151 of the mobile terminal 100. If acceptance is selected in the pop-up window 1320, the image provided by the other terminal may be displayed on the touch screen of the mobile terminal (not shown). If the M button is operated instead of selecting a menu item in the pop-up window, the controller 180 may display the image transmitted from the other terminal 200 in the form of an overlay window 1330 on the TV 430, as shown in FIG. 13(c). Of course, the corresponding image 1330' may also be displayed in a split-screen layout as shown in FIG. 13(d).

In this case, the controller 180 of the mobile terminal 100 may directly render the image data transmitted from the other terminal 200 and provide it to the TV, while the application whose execution image was previously shared may continue to be displayed on the touch screen of the mobile terminal. The user of the mobile terminal may then control the application displayed on the full screen by manipulating the mobile terminal, and the user of the other terminal may likewise control the application displayed in the overlay window by manipulating that terminal.

Meanwhile, when an event occurs on the mobile terminal while the sharing function according to the present invention is performed, an image related to the event may be withheld from the external device. For example, when a new text message arrives while the execution screen of the SMS application is being shared with an external device, the controller 180 may display the arrival alarm image (pop-up window or notification window) of the text message only on the display of the mobile terminal, so that it is not displayed on the external device. Alternatively, when an event occurs, an application corresponding to the event may be additionally displayed on the external device in the form of a new execution screen. In this case, the size of the execution screen may change according to the type of event. This will be described in more detail with reference to FIG. 14.
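
The per-event policy above (keep private alarms local, optionally add other events as new execution screens) can be sketched as a lookup. The event-type strings are assumptions introduced for the example, not identifiers from the disclosure.

```java
// Illustrative sketch of the event-handling policy described above: a text
// message arrival alarm stays on the handset only, while other events may be
// added to the external display as a new execution screen.
class EventSharingPolicy {
    static boolean showOnExternalDisplay(String eventType) {
        // Privacy case from the description: the SMS arrival alarm image
        // is displayed only on the mobile terminal's own display.
        if ("SMS_ALARM".equals(eventType)) return false;
        // e.g. an incoming call, shown as a new execution screen (FIG. 14).
        return true;
    }
}
```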

FIG. 14 illustrates an example of an operation according to an event occurrence while performing a sharing function in a mobile terminal according to another embodiment of the present invention.

FIG. 14 assumes a process subsequent to the situation shown in FIG. 9(a), with an incoming call signal as the event.

Referring to FIG. 14(a), when a call signal is received by the mobile terminal 100 while the sharing function is performed, an incoming-call-related screen may be displayed on the touch screen 151. In this case, as shown in FIG. 14(b), an incoming call screen 1410 in the form of an overlay window may be displayed on the connected TV 430. When the user rejects the incoming call, the overlay window may disappear from the predetermined area 1410', and depending on the setting, the incoming-call-related screen may not be displayed at all. If the user accepts the call signal, a call connection screen 1420 larger than the incoming-call-related screen 1410 may be displayed, as shown in FIG. 14(c). When the call ends, the call connection screen 1420 may disappear again.
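
The call-screen lifecycle above (small incoming-call overlay, larger connected-call screen, removal on reject or call end) can be modeled as three symbolic sizes. The names and the size encoding are illustrative assumptions.

```java
// Illustrative model of FIG. 14: an incoming call adds a small overlay
// (1410); accepting enlarges it to the call connection screen (1420);
// rejecting or ending the call removes the overlay again (1410').
class CallOverlay {
    enum Size { NONE, SMALL, LARGE }

    private Size size = Size.NONE;

    void onIncomingCall() { size = Size.SMALL; } // incoming-call screen 1410
    void onAccept()       { size = Size.LARGE; } // larger call screen 1420
    void onRejectOrEnd()  { size = Size.NONE; }  // overlay disappears

    Size size() { return size; }
}
```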

Further, according to an embodiment of the present invention, the above-described method can be implemented as processor-readable code on a medium on which a program is recorded. Examples of the processor-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the method may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

The above-described mobile terminal and its control method are not limited to the configurations and methods of the above-described embodiments; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Claims (17)

  1. Establishing a data path with an external device including a first display;
    Executing an application on the mobile terminal;
    Displaying an execution screen of the application on a first area of the first display;
    Inputting an application addition command through the mobile terminal;
    Moving the execution screen of the application from the first area to a second area of the first display;
    Executing an additional application on the mobile terminal; And
    Displaying an execution screen of the additional application on the first area.
  2. The method of claim 1,
    And if the application addition command is input, displaying a menu for selecting the additional application in the first area.
  3. The method of claim 1,
    Wherein each of the execution screens displayed in the first area is
    also displayed on a second display provided on the mobile terminal.
  4. The method of claim 1,
    Receiving an area switch command;
    Receiving a scroll command through a user input unit of a mobile terminal; And
    And replacing the execution screen displayed in the first area with the execution screen displayed in the second area in response to the scroll command.
  5. The method of claim 1,
    Changing an arrangement direction of the mobile terminal; And
    And changing a form of an execution screen of the first area in correspondence to the arrangement direction.
  6. The method of claim 5,
    The changing step,
    Changing an execution screen of the first area from a landscape mode to a portrait mode according to an arrangement direction of the mobile terminal; or
    A method of controlling a mobile terminal, the method comprising changing from a portrait mode to a landscape mode.
  7. The method of claim 1,
    The step of inputting the application addition command,
    Manipulating a key button provided in the mobile terminal; or,
    And detecting a movement of a specific pattern by a sensing unit detecting the movement of the mobile terminal.
  8. The method of claim 1,
    The execution screen displayed on the second area is updated in real time regardless of whether it is displayed on the second display provided in the mobile terminal.
  9. Establishing a data path with an external device including a first display;
    Executing a first application on the mobile terminal;
    Displaying an execution screen of the first application on a first area of the first display;
    Manipulating a specific key button in the mobile terminal;
    Executing a second application on the mobile terminal; And
    And displaying an execution screen of the second application on a second area of the first display.
  10. The method of claim 9,
    The first area includes a full screen of the first display,
    And the second area is displayed in an overlay form in the first area.
  11. The method of claim 9,
    The first region and the second region,
    A control method of a mobile terminal, characterized in that they do not overlap each other on the first display.
  12. The method of claim 10,
    A second operation of the specific key button;
    Removing the execution screen of the second application displayed on the second area;
    Third operation of the specific key button; And
    And displaying the execution screen of the second application on the second area again.
  13. The method of claim 12,
    And the execution screen of the second application is maintained on the second display provided in the mobile terminal even if the execution screen of the second application disappears according to the second operation.
  14. The method of claim 10,
    Changing a slope of the mobile terminal; And
    And changing the position of the second area within the first area in response to the changed slope.
  15. The method of claim 11,
    Changing a slope of the mobile terminal; And
    And mutually changing positions of the first area and the second area in response to the changed slope.
  16. The method of claim 14,
    The step of changing the position,
    The control method of the mobile terminal, characterized in that performed only in the state that the specific key button is operated.
  17. Establishing a data path with a first external device including a first display;
    Executing the first application on the mobile terminal;
    Displaying an execution screen of the first application on a first area of the first display;
    Receiving execution information of a second application from a second external device;
    And displaying an execution screen corresponding to execution information of the second application in a second area of the first display.
KR1020120042124A 2012-04-23 2012-04-23 Mobile terminal and method for controlling thereof KR101952682B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120042124A KR101952682B1 (en) 2012-04-23 2012-04-23 Mobile terminal and method for controlling thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020120042124A KR101952682B1 (en) 2012-04-23 2012-04-23 Mobile terminal and method for controlling thereof
US13/654,949 US20130278484A1 (en) 2012-04-23 2012-10-18 Mobile terminal and controlling method thereof
EP13001752.8A EP2658228B1 (en) 2012-04-23 2013-04-05 Mobile terminal adapted to be connected to an external display and a method of controlling the same
CN201310144058.9A CN103379221B (en) 2012-04-23 2013-04-23 Mobile terminal and control method thereof

Publications (2)

Publication Number Publication Date
KR20130119172A true KR20130119172A (en) 2013-10-31
KR101952682B1 KR101952682B1 (en) 2019-02-27

Family

ID=48087350

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120042124A KR101952682B1 (en) 2012-04-23 2012-04-23 Mobile terminal and method for controlling thereof

Country Status (4)

Country Link
US (1) US20130278484A1 (en)
EP (1) EP2658228B1 (en)
KR (1) KR101952682B1 (en)
CN (1) CN103379221B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101582048B1 (en) * 2014-09-11 2015-12-31 엘지전자 주식회사 Method for controlling device
WO2019088793A1 (en) * 2017-11-06 2019-05-09 삼성전자 주식회사 Electronic device and screen sharing method using same


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100122207A1 (en) * 2008-11-10 2010-05-13 Samsung Electronics Co., Ltd. Broadcast display apparatus and control method thereof
US20120060109A1 (en) * 2010-09-08 2012-03-08 Han Hyoyoung Terminal and contents sharing method for terminal
US20120088548A1 (en) * 2010-10-06 2012-04-12 Chanphill Yun Mobile terminal, display device and controlling method thereof



Also Published As

Publication number Publication date
CN103379221A (en) 2013-10-30
EP2658228B1 (en) 2018-06-06
US20130278484A1 (en) 2013-10-24
KR101952682B1 (en) 2019-02-27
EP2658228A1 (en) 2013-10-30
CN103379221B (en) 2016-08-31


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant