KR101693690B1 - Mobile terminal and control method thereof - Google Patents


Info

Publication number
KR101693690B1
Authority
KR
South Korea
Prior art keywords: application, window, displayed, information, sub
Application number
KR1020100049297A
Other languages
Korean (ko)
Other versions
KR20110129750A (en)
Inventor
황인용 (Hwang In-yong)
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020100049297A
Priority claimed from EP20110150908 (EP2354914A1)
Publication of KR20110129750A
Application granted
Publication of KR101693690B1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 1/00 Details of transmission systems, not covered by a single one of groups H04B 3/00 - H04B 13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/40 Circuits
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/54 Interprogram communication
    • G06F 9/542 Event management; Broadcasting; Multicasting; Notifications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M 1/72 Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M 1/725 Cordless telephones
    • H04M 1/72519 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M 1/72522 With means for supporting locally a plurality of applications to increase the functionality
    • H04M 1/72561 With means for supporting locally a plurality of applications to increase the functionality for supporting an internet browser application
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M 1/72 Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M 1/725 Cordless telephones
    • H04M 1/72519 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M 1/72597 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status wherein handling of applications is triggered by incoming communication events
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/12 Messaging; Mailboxes; Announcements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2215/00 Metering arrangements; Time controlling arrangements; Time indicating arrangements
    • H04M 2215/81 Notifying aspects, e.g. notifications or displays to the user
    • H04M 2215/815 Notification when a specific condition, service or event is met

Abstract

A mobile terminal of the present invention includes: a display unit displaying a first application; an alarm unit for notifying the occurrence of a predetermined event while the first application is displayed; and a control unit that reduces a first display area, in which the first application is displayed, according to a first user input, and displays a second application related to the event in a second display area generated by the reduction of the first display area.

Description

MOBILE TERMINAL AND CONTROL METHOD THEREOF

The present invention relates to a mobile terminal providing a multitasking environment by adjusting a screen size and a control method thereof.

A terminal can be divided into a mobile (portable) terminal and a stationary terminal according to whether it can be moved. A mobile terminal can be further divided into a handheld terminal and a vehicle-mounted terminal according to whether the user can carry it directly.

Such terminals have come to support various functions and take the form of multimedia devices with multiple capabilities, such as capturing still images and video, playing music or video files, gaming, and receiving broadcasts.

In order to support and enhance the functionality of such terminals, improvements to the structural and/or software parts of the terminal may be considered.

However, according to the related art, when the user receives a telephone call or a text message while using an additional function of the mobile communication terminal, the user must stop or pause the operation or display of that function in order to respond. Further, according to the related art, even when multitasking is possible, only one application can display information through the entire screen, and the user can merely change which application occupies the entire screen. In addition, according to the related art, it is inconvenient to use information input to or output from a running application in another application running at the same time.

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems. It is an object of the present invention to provide a mobile terminal and a control method thereof that can display information processed by an event occurring during execution of an application, or by another application executed by the user, without interrupting the current task, and that can facilitate the exchange of information between applications.

A mobile terminal according to an embodiment of the present invention includes: a display unit displaying a first application; an alarm unit for notifying the occurrence of a predetermined event while the first application is displayed; and a control unit that reduces a first display area, in which the first application is displayed, according to a first user input, and displays a second application related to the event in a second display area generated by the reduction of the first display area.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: displaying a first application; notifying, when a predetermined event occurs during the display of the first application, the occurrence of the event; reducing a first display area, in which the first application is displayed, according to a predetermined first user input; and displaying a second application related to the event in a second display area generated according to the reduction of the first display area.
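As a rough illustration, the claimed control flow can be sketched in plain code. The class name `SplitScreenController`, the string-based application names, and the 50/50 split ratio are all illustrative assumptions, not part of the claims.

```java
// Hypothetical sketch of the claimed control method; names and the split
// ratio are invented for illustration.
public class SplitScreenController {
    private String mainApp;          // the first application (main window)
    private String subApp;           // the second application (sub window), if any
    private double mainRatio = 1.0;  // fraction of the screen given to the main window
    private String pendingEvent;     // event announced but not yet opened by the user

    public SplitScreenController(String firstApplication) {
        this.mainApp = firstApplication;
    }

    /** Step 2: a predetermined event occurs; only notify, do not interrupt. */
    public void notifyEvent(String event) {
        this.pendingEvent = event;
    }

    /** Steps 3-4: on the first user input, shrink the main window and
     *  display the event-related application in the freed display area. */
    public void onFirstUserInput(String secondApplication) {
        if (pendingEvent == null) return;  // no event to open; leave layout alone
        mainRatio = 0.5;                   // reduce the first display area
        subApp = secondApplication;        // second app fills the new second area
        pendingEvent = null;
    }

    public double mainRatio() { return mainRatio; }

    public String layout() {
        return subApp == null ? mainApp : mainApp + " | " + subApp;
    }
}
```

Note that the event notification and the layout change are deliberately decoupled: the notification alone never shrinks the main window, matching the stated goal of not disturbing the current task.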

According to an embodiment of the present invention, a multitasking environment that the user can easily adjust is provided. An event occurring during execution of the main application does not interrupt the user's current task, while the user can still easily check the event at a desired time. In addition, when the user wants to perform another task alongside the main application, the application to be executed can be selected easily without disturbing the current work, and the various applications running simultaneously can be checked easily.

In addition, according to an embodiment of the present invention, information input, output, or used in any one of the simultaneously running applications can easily be used in the other running applications, providing a more convenient multitasking environment for the user.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of a portable terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of a portable terminal according to an embodiment of the present invention.
FIGS. 3A and 3B are views showing a main window and a sub window according to an embodiment of the present invention.
FIGS. 4A through 4E are views showing a main window and a sub window actually displayed on a display unit according to an embodiment of the present invention.
FIGS. 5A to 5E are views showing the arrangement of a plurality of sub windows.
FIGS. 6A and 6B are views showing the sub windows of FIGS. 5A through 5E together with the main window.
FIG. 7 is a flowchart illustrating a process of controlling a main window and a sub window when an event occurs during execution of an application, according to an embodiment of the present invention.
FIGS. 8A to 8F are views showing the main window and the sub window displayed when an event occurs during execution of an application, as illustrated in FIG. 7.
FIG. 9 is a flowchart illustrating a process of controlling a main window and a sub window when a user attempts to execute another application during execution of an application, according to an embodiment of the present invention.
FIGS. 10A to 10G are views showing the main window and the sub window displayed when a user attempts to execute another application during execution of an application, as described with reference to FIG. 9.
FIG. 11 is a flowchart illustrating a process of displaying a sub window when a plurality of applications are being executed through a main window and sub windows.
FIGS. 12A and 12B are views showing an embodiment in which the applications output through the main window and a sub window are switched.
FIGS. 13A to 13D are views showing an embodiment of transmitting and receiving a message through a sub window while operating a web browser through the main window, according to an embodiment of the present invention.
FIGS. 14A to 14C are views showing an embodiment of transmitting and receiving a message through a sub window while using navigation through the main window, according to an embodiment of the present invention.
FIGS. 15A to 15D are views showing an embodiment of transmitting and receiving a message through a sub window while viewing a broadcast through the main window, according to an embodiment of the present invention.
FIG. 16 is a flowchart showing the processing flow of the embodiments described in FIGS. 13A to 15D.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistants), a PMP (Portable Multimedia Player), navigation and the like. However, it will be understood by those skilled in the art that the configuration according to the embodiments described herein may be applied to a fixed terminal such as a digital TV, a desktop computer, and the like, unless the configuration is applicable only to a mobile terminal.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules enabling wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H (Digital Video Broadcast-Handheld), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signal may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The position information module 115 is a module for obtaining the position of the mobile terminal, and a representative example thereof is a Global Position System (GPS) module.

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video call mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, and the orientation of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, it can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 supplies power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another or may be disposed integrally with each other, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, 'touch sensor') form a mutual layer structure (hereinafter, a 'touch screen'), the display unit 151 can be used as an input device as well as an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. Thus, the controller 180 can determine which area of the display unit 151 has been touched.
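The path from raw touch signal to controller decision can be sketched as follows. The `TouchPipeline` class, the pressure threshold, and the two-region split are invented for illustration; they are not part of the specification.

```java
// Hedged sketch of the touch path: the touch sensor converts a pressure or
// capacitance change into an electrical signal, the "touch controller" turns
// it into position data, and the controller learns which area was touched.
public class TouchPipeline {
    /** Screen regions the controller might distinguish (invented for the sketch). */
    public enum Region { MAIN_WINDOW, SUB_WINDOW }

    private final int splitX; // x coordinate where the main window ends

    public TouchPipeline(int splitX) {
        this.splitX = splitX;
    }

    /** Plays the role of the touch controller: filters raw signals and maps
     *  the touched position to a screen region; returns null for non-touches. */
    public Region onRawSignal(int x, int y, double pressure) {
        if (pressure <= 0.1) return null;            // below threshold: noise, not a touch
        return x < splitX ? Region.MAIN_WINDOW : Region.SUB_WINDOW;
    }
}
```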

Referring to FIG. 1, the proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact sensor, and its utility is also high.

Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen means the position vertically below the pointer when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
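The proximity-touch / contact-touch distinction described above can be sketched as a small classifier. The detection range and the zero-distance convention are assumptions for illustration, not figures from the patent.

```java
// Hedged sketch: the sensor reports a hover distance, and the terminal
// classifies it as a contact touch, a proximity touch, or no touch at all.
// The 30 mm detection range is an invented example value.
public class ProximityClassifier {
    public enum TouchKind { NONE, PROXIMITY_TOUCH, CONTACT_TOUCH }

    private static final double DETECT_RANGE_MM = 30.0; // assumed sensor range

    public static TouchKind classify(double distanceMm) {
        if (distanceMm <= 0.0) return TouchKind.CONTACT_TOUCH;      // pointer touches the screen
        if (distanceMm <= DETECT_RANGE_MM) return TouchKind.PROXIMITY_TOUCH; // hovering in range
        return TouchKind.NONE;                                      // out of detection range
    }
}
```

A proximity touch pattern (distance, direction, speed, and so on) would then be a sequence of such classified samples over time.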

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than the video signal or the audio signal, for example, vibration. The video signal or the audio signal may be output through the display unit 151 or the audio output module 152 so that they may be classified as a part of the alarm unit 153.
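A minimal sketch of the alarm unit's choice among video, audio, and vibration signals might look like the following. The state flags (`silentMode`, `screenOn`) and the selection rules are illustrative assumptions; the patent only says the alarm unit may signal an event in these forms.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of alarm-unit behaviour: an event may be signalled as video,
// audio, or vibration depending on terminal state (state model is invented).
public class AlarmUnit {
    /** Returns the output channels used to announce the given event. */
    public List<String> signalsFor(String event, boolean silentMode, boolean screenOn) {
        List<String> out = new ArrayList<>();
        if (screenOn) out.add("video");    // via the display unit 151
        if (!silentMode) out.add("audio"); // via the audio output module 152
        else out.add("vibration");         // a form other than video or audio
        return out;
    }
}
```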

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be combined and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection or suction port, brushing against the skin surface, contact with an electrode, and an effect of reproducing a cold sensation using an endothermic or exothermic element.

The haptic module 154 can be implemented not only to transmit the tactile effect through direct contact but also to allow the user to feel the tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 154 may be provided according to the configuration of the portable terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 100, or transmits internal data to the external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip storing various kinds of information for authenticating the usage right of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device having an identification module (hereinafter, an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user to the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing the functions described herein. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code can be implemented in a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.

The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) forming its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding of synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, the interface 170, and the like may be disposed in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion and may be employed in any manner as long as the user operates in a tactile manner.

The contents input by the first or second operation unit 131 or 132 may be variously set. For example, the first operation unit 131 receives commands such as start, end, and scroll, and the second operation unit 132 receives commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode of the touch screen.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121 'may be further mounted on the rear surface of the terminal body, that is, the rear case 102. The camera 121 'may have a photographing direction substantially opposite to the camera 121 (see FIG. 2A), and may be a camera having different pixels from the camera 121.

For example, the camera 121 may have a low pixel count so that the user's face can be photographed and transmitted to the other party during a video call or the like, while the camera 121' photographs a general subject that is often not transmitted immediately, and therefore preferably has a high pixel count. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to view his or her own face or the like when taking a self-portrait using the camera 121'.

A sound output unit 152' may be additionally disposed on the rear surface of the terminal body. The sound output unit 152' may implement a stereo function together with the sound output unit 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

In addition to an antenna for calls and the like, a broadcast signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), may be installed so that it can be drawn out of the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing a touch. Like the display unit 151, the touch pad 135 may also be of a light-transmitting type. In this case, if the display unit 151 is configured to output visual information on both of its sides, that information can be recognized through the touch pad 135 as well, and the information output on both sides may all be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135 so that a touch screen is disposed on the rear case 102 as well.

The touch pad 135 operates in correlation with the display portion 151 of the front case 101. The touch pad 135 may be disposed parallel to the rear of the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

Hereinafter, embodiments related to a control method that can be implemented in the terminal configured as above will be described with reference to the accompanying drawings. The following embodiments can be used alone or in combination with each other. In addition, the embodiments described below may be used in combination with the above-described user interface (UI).

FIG. 3A shows a main window, FIG. 3B shows a sub window, and FIGS. 4A to 4E show a main window and sub windows actually displayed on the display unit according to an embodiment of the present invention.

According to an embodiment of the present invention, the display unit may have at least two layers. In one embodiment of the present invention, the main window means a window of an upper layer, and a sub window means a window of a layer lower than that of the main window. Here, a window means one screen for displaying input/output information through the operation screen of the mobile terminal or the execution of an application, and it can occupy the entire screen of the display unit or only a part of it.
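The two-layer window arrangement above can be sketched in code. This is an illustrative sketch only; the class and field names (`Window`, `layer`) and the 480-pixel screen width are assumptions, not part of the disclosure:

```python
# Illustrative model of the two-layer display: the main window sits on the
# upper layer (1), sub windows on the lower layer (0).
from dataclasses import dataclass

SCREEN_WIDTH = 480  # assumed screen width in pixels

@dataclass
class Window:
    name: str
    layer: int   # 0 = lower layer (sub window), 1 = upper layer (main window)
    x: int       # left edge of the window, in pixels
    width: int   # window width, in pixels

main = Window("main", layer=1, x=0, width=SCREEN_WIDTH)
sub1 = Window("sub1", layer=0, x=0, width=160)

def is_fullscreen(w: Window) -> bool:
    """A window occupies the entire screen when it spans the full width."""
    return w.x == 0 and w.width == SCREEN_WIDTH
```

Here the main window starts out occupying the whole screen, matching the standby-mode behavior described below.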

Separate applications, or the same application, can be executed in each window. The functions executable in each window include voice or video calls, receiving or sending text messages, text chatting, sound source playback, video playback, games, mobile Internet, DMB, navigation, e-book reading, document creation, and the like.

When the mobile terminal is in a standby mode (a state in which no specific application is being executed and the terminal waits for an event or user input), or when a single application is running, information is displayed only through the main window occupying the entire screen.

The size of the main window 200 may be changed according to a user's input. For example, if a bar 201 or 202 is displayed at the edge of the main window and the user drags the bar through the touch screen, the control unit changes the size of the main window according to the position to which the user drags. Alternatively, or in addition, an indicator 204 may be displayed that allows the user to drag a corner of the main window to change its size. When the main window is resized, the controller causes the information output through the main window to be output anew according to the changed size.
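The drag-to-resize behavior can be shown with a minimal sketch, assuming the dragged bar position maps directly to the new main-window width (the function name and pixel values are hypothetical):

```python
# Sketch: clamp the main-window width implied by a drag of the edge bar
# (201/202) or indicator (204) to the valid on-screen range.
SCREEN_WIDTH = 480  # assumed screen width in pixels

def resize_main(drag_x: int) -> int:
    """Return the new main-window width after the edge is dragged to drag_x."""
    return max(0, min(SCREEN_WIDTH, drag_x))
```

Dragging past either screen edge simply pins the main window to minimized (width 0, as in FIG. 4D) or full screen.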

When two or more applications are executed (including the case of one application executing while one event occurs), one is output through the main window and the others are output through sub windows. There may be more than one sub window, depending on the number of applications being executed. The sub windows have no layer hierarchy among themselves; they divide the entire screen of the lower layer (301, 302, 303) and can be displayed simultaneously.

As described above, the mobile terminal according to an embodiment of the present invention includes at least one display buffer that stores the image data to be displayed on the display unit 151 so that the main window and at least one sub window can be displayed, and a screen controller (not shown) that outputs the image data in the display buffer to the display unit 151 and controls the display buffer. The display buffer and the screen controller may be implemented as part of the display unit 151. The screen controller determines which screen is to be displayed on the display unit 151 under the control of the controller 180, and controls the division and arrangement of the screen so that the main window and/or the sub windows are output. In addition, the screen controller outputs or updates image data in each display buffer according to the layer hierarchy or position of the window to be output.

Referring to FIGS. 4A to 4E, only the portion of a lower-layer sub window that is not covered by the upper-layer main window is displayed on the display unit; the upper-layer main window is never covered by a lower-layer sub window. Accordingly, by adjusting the size of the main window in various ways, the main window and the portions of the sub windows that do not overlap it can be displayed together.
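The visibility rule above can be sketched with simple interval arithmetic, under the assumption (consistent with the figures) that the main window is anchored at the left screen edge and covers the span [0, main_width):

```python
# Sketch of the upper-layer occlusion rule: a lower-layer sub window at
# [sub_x, sub_x + sub_w) is visible only where the main window, which
# covers [0, main_w), does not overlap it.
def visible_span(sub_x: int, sub_w: int, main_w: int):
    """Return the (start, end) of the sub window's visible part, or None."""
    start = max(sub_x, main_w)   # visibility begins where the main window ends
    end = sub_x + sub_w
    return (start, end) if start < end else None
```

A sub window fully under the main window yields `None` — it exists and its application keeps running, but nothing of it is drawn.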

FIG. 4A shows a case where the size of the main window is adjusted so that two sub windows are displayed, FIG. 4B a case where one sub window is displayed, FIG. 4C a case where one sub window and parts of two others are displayed, and FIG. 4D a case where three sub windows are displayed by minimizing the main window so that it is not shown. As in FIG. 4E, the names 401, 402, and 403 of the applications executed in the respective sub windows may be displayed at the bottom of each sub window; for a sub window covered by the main window only the application name (401, 402) may be displayed, while for a visible sub window the name 403 and the sub window 303 are displayed together.

The size of the main window may be selected from fixed sizes, as in the embodiments of FIGS. 4A to 4D, or may be changed continuously by the user's input regardless of the arrangement of the sub windows.

FIGS. 5A to 5E are views showing the arrangement of a plurality of sub windows.

The size of each sub window may be variable or fixed depending on the number of sub windows. Referring to FIG. 5A, when there are four sub windows, each sub window has a size that divides the screen of the entire display unit into four equal parts; if there are two sub windows, each has half the screen.
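The equal-division sizing of FIG. 5A can be expressed as a one-line sketch (names and integer division are assumptions of this illustration):

```python
# Sketch of variable sub-window sizing: with n sub windows, each receives
# an equal share of the screen width (FIG. 5A).
def sub_window_width(screen_width: int, n: int) -> int:
    if n <= 0:
        raise ValueError("need at least one sub window")
    return screen_width // n
```

The fixed-size alternative of FIGS. 5B to 5E would instead always return one third of the screen width regardless of `n`.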

Referring to FIGS. 5B to 5E, each sub window may instead have a fixed size regardless of the number of sub windows, for example one third of the entire screen of the display unit. FIG. 5B shows the arrangement when there are two sub windows, with the leftmost area empty. FIGS. 5C and 5D show arrangements of four sub windows. By moving the sub windows, which are arranged in a line, through user input, the sub windows displayed on the display unit can be changed. In the case of FIG. 5C, dragging the sub windows to the left displays sub2, sub3, and sub4 (only where they are not covered by the main window, of course), while in FIG. 5D sub1, sub2, and sub3 are displayed. Of course, the row of sub windows can also be moved to the right so that only sub1 is displayed and the remaining space is left empty (i.e., the sub windows are arranged in a line), or so that sub3 and sub4 are displayed after it (i.e., the sub windows are arranged in a circle).
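The row-of-slots behavior of FIGS. 5C and 5D can be sketched as a simple carousel. The three-slot count follows the fixed one-third sizing above; the function name and the "offset" representation of the drag are assumptions:

```python
# Sketch: sub windows sit in a row of fixed-width slots; dragging shifts
# which consecutive slots are on-screen (linear arrangement, not circular).
def visible_subs(subs, offset, slots=3):
    """Return the sub windows occupying the on-screen slots after scrolling
    by `offset` positions (positive offset = row dragged to the left)."""
    return subs[offset:offset + slots]
```

With four sub windows, offset 1 reproduces FIG. 5C (sub2..sub4 on-screen) and offset 0 reproduces FIG. 5D (sub1..sub3); a circular arrangement would wrap the slice around instead.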

FIG. 6A is a view showing the sub windows of FIGS. 5A to 5E together with the main window. FIG. 6A shows the sub windows arranged as in FIG. 5C and the main window resized to two thirds of the screen. As shown in the drawing, only sub4 is displayed, while sub2 and sub3 are covered by the main window. The user can move the arranged sub windows by dragging the displayed sub-window region (601) and thereby change which sub window is displayed.

Referring to FIG. 6B, in the state of FIG. 6A, indicators 602, 603, and 604 for the sub windows hidden behind the main window may be displayed on the main-window side, and an indicator 605 for a sub window not displayed may be shown to the right of the displayed sub windows.

Referring to FIGS. 5A to 6B, a sub window is created, while an application runs in the main window, by an event such as execution of an application by the user, reception of a call signal, reception of a text message, or the ringing of an alarm, and it disappears when its application terminates. The plurality of sub windows can be arranged in a line according to the order in which they were created, the order last displayed, the order most frequently displayed, an order set by the user, and the like. For example, the first created sub window sub1 may be located at the far left, and the most recently created sub window sub4 at the far right.

For example, a virtual sub window (new, 501) to be newly created may be positioned to the right of the most recently created sub window. In the state shown in FIG. 5C, when the user drags the sub windows further leftward, a new sub window (sub5, 502) is created. Since no particular application is yet executed in it, sub5 can display a standby or operation screen of the mobile terminal, or a list of applications that can be multitasked. If the user drags the sub windows back to the right without any operation, sub5 can disappear; alternatively, as shown in FIG. 5A (503), it can remain displayed in the same manner as the already created sub windows.

FIG. 7 is a flowchart illustrating a process of controlling the main window and a sub window when an event occurs during execution of an application, according to an embodiment of the present invention.

When the mobile terminal is turned on, a standby mode screen is displayed on the main window of the display unit of the mobile terminal (S100). As described above, the standby mode is a state in which various function keys and menu keys are displayed and waiting for user input or event generation.

When the user selects an application (hereinafter referred to as App0) to be executed by using a menu key or the like displayed on the display unit (S110), the control unit loads and executes the selected application and controls the information processed by the application to be output through the main window of the display unit (S120).

If the control unit detects the occurrence of an event during execution of App0 (S130), an indicator for notifying the occurrence of the event is displayed (S140). The event may be a reception of a call signal, a reception of a text message, an alarm, etc., and the indicator may be an icon, a character box, or the like for displaying each of the events according to the generated event.

In addition to displaying the indicator, a sub-window for the generated event is generated (S150). The generated subwindow is not displayed until the user input is made, and the main window remains the same as before. However, the information about the event or the information related to the event may be stored in the display buffer for the sub window.

To check the event that occurred, the user drags the bar or indicator at the edge of the main window to reduce the main window (S160).

When the size of the reduced main window is input from the user, the controller causes the output information of App0 to continue to be output through the main window of the changed size according to the size of the input main window (S170).

At the same time, a sub window that is not covered by the main window as the main window is reduced is displayed on the display unit, and event information is displayed through the sub window (S180).
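The flow of FIG. 7 (steps S130 to S180) can be condensed into a small state sketch. All names here are illustrative, and the visibility flag stands in for "not covered by the main window":

```python
# Sketch of the FIG. 7 flow: an event during App0 creates an indicator and
# a hidden sub window (S130-S150); shrinking the main window reveals the
# sub window and its event information (S160-S180).
class Terminal:
    def __init__(self):
        self.main_app = "App0"
        self.indicators = []
        self.sub_windows = []      # [event_name, visible] pairs

    def on_event(self, event):
        """S130-S150: show an indicator; create the sub window, still hidden."""
        self.indicators.append(event)
        self.sub_windows.append([event, False])

    def shrink_main(self):
        """S160-S180: the reduced main window uncovers the sub windows."""
        for sw in self.sub_windows:
            sw[1] = True
```

Note that the sub window (and its display buffer contents) exists before the user shrinks the main window; the drag only changes what is drawn.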

FIGS. 8A to 8F are diagrams showing the main window and sub windows displayed when an event occurs during execution of an application, as illustrated in FIG. 7.

Referring to FIG. 8A, information processed by an arbitrary application App0 is output through a main window occupying the entire screen of the display unit.

Referring to FIG. 8B, when a text message is received during the execution of App0, an indicator 801 of an icon shape indicating the reception of a text message is displayed together with or without a beep for informing the reception of the text message. The indicator may blink until the user confirms this or for a predetermined time. In addition, the indicator may serve as an indicator for adjusting the size of the main window. The user drags the indicator to the left to check the received text message.

Referring to FIG. 8C, the size of the main window is reduced according to the user's drag, and the sub window 810 uncovered by the shrinking main window is displayed. If there is no sub window hidden by the main window, the indicator may not be displayed, or a blank indicator 802 may be displayed.

When the user confirms the text message, the user can touch the OK key 803 to close the sub window 810 displayed with the text message. If another sub window is not created, the main window may be returned to the full screen or a sub window may be displayed in which a standby screen is displayed.

Referring to FIG. 8D, when the reply key 804 of the received-message sub window 810 of FIG. 8C is touched, a sub window 820 for composing a text message is displayed. The display information of the sub window 810 of FIG. 8C may simply be changed, but a sub window 820 separate from the message-confirmation sub window may also be created; as described above, the same application can be executed in two or more windows. When the sub window 820 for the outgoing message is generated, the previously displayed sub window 810 for confirming the received message shifts to the left, is hidden behind the main window, and is no longer displayed. An indicator 801 indicating this can be displayed in the main window.

Referring to FIG. 8E, when the user drags the indicator 801 of FIG. 8D to the left, a sub window 810 for confirming the hidden reception message is displayed. The user can simultaneously create and send a text message while viewing the received text message.

Referring to FIG. 8F, when the user drags the indicator 802 displayed on the main window (or a bar displayed at the main-window edge, or simply the edge of the main window itself) during confirmation and transmission of a text message, the main window can be enlarged to cover one sub window or all of the sub windows. The covered sub windows are not closed; they are simply not displayed, and the applications running in them are unaffected. Indicators 801 and 805 indicating the applications executed in the covered sub windows can be displayed in the main window occupying the entire screen.

FIG. 9 is a flowchart illustrating a process of controlling the main window and a sub window when the user attempts to execute another application during execution of an application, according to an embodiment of the present invention.

When the mobile terminal is turned on, a standby mode screen is displayed on the main window of the display unit of the mobile terminal (S200). As described above, the standby mode is a state in which various function keys and menu keys are displayed and waiting for user input or event generation.

When the user selects an application (hereinafter referred to as App0) to be executed using a menu key or the like displayed on the display unit (S210), the control unit loads and executes the selected application and controls the information processed by the application to be output through the main window of the display unit (S220).

If the user wishes to simultaneously execute another application while executing App0, the main window is reduced by dragging a bar or an indicator at the edge of the main window (S230).

When the size of the reduced main window is input from the user, the controller causes the information output from the App0 to continue to be output through the main window of the changed size according to the size of the input main window (S240).

In addition, since there is no subwindow already created, the subwindow is newly generated and displayed according to the reduction of the main window (S250). Alternatively, when the main window is reduced, an empty space is displayed. As described above, when the user drags the empty space to the left, a new sub window can be created and displayed.

In the newly created sub window, a standby mode screen in which a list of applications to be executed, various function keys, and menu keys are displayed may be displayed (S260).

The user selects an application (hereinafter referred to as App1) to be executed through the standby mode screen of the sub window (S270). The control unit loads and executes the application selected by the user, and controls the information processed by the application to be output through the sub-window of the display unit (S280).

When App1 executed in the sub window is terminated by the user's input or the like, the control unit closes the sub window (S300). As a result, the sub window disappears and the main window can be enlarged to the entire screen. When the user enlarges the main window without terminating App1 (S310), the sub window is maintained without disappearing, and the information processed by App0 is output through the main window occupying the entire screen (S320).
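The FIG. 9 flow (shrink → standby sub window → launch → terminate) can be sketched as a tiny event interpreter; the event strings and function name are assumptions of this illustration:

```python
# Sketch of the FIG. 9 flow: shrinking the full-screen main window creates
# a sub window showing a standby screen (S250-S260); selecting an app runs
# it in that sub window (S270-S280); terminating the app closes it (S300).
def run_flow(events):
    subs = []
    for ev in events:
        if ev == "shrink":
            subs.append("standby")           # new sub window, standby screen
        elif ev.startswith("launch:"):
            subs[-1] = ev.split(":", 1)[1]   # app replaces the standby screen
        elif ev == "terminate":
            subs.pop()                       # sub window closes with its app
    return subs
```

Enlarging the main window without terminating App1 (S310-S320) would leave `subs` untouched — the sub window is merely covered, not closed.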

FIGS. 10A to 10G are diagrams illustrating the main window and sub windows displayed when the user attempts to execute another application during execution of an application, as described with reference to FIG. 9.

Referring to FIGS. 10A and 10B, information processed by an arbitrary application App0 is output through the main window occupying the entire screen of the display unit (FIG. 10A). If the user wishes to execute another application simultaneously during execution of App0, the bar or indicator 1001 at the edge of the main window is dragged to reduce the main window. The indicator may take the form of an icon indicating that a new sub window will be created.

Referring to FIG. 10C, as the user shrinks the main window, a newly created sub window 1010 is displayed. A list of executable applications may be displayed in the newly created sub window 1010.

Referring to FIG. 10D, when the user selects one item from the application list, the control unit executes the selected application (App1) and outputs its information through the sub window 1010. When the user wishes to execute yet another application while App1 runs in the sub window 1010, the sub window 1010 is dragged to the left as described above (1011).

Referring to FIG. 10E, when the above-mentioned drag input occurs during execution of App1 in the sub window, a new sub window 1020 is created and a standby mode screen is displayed in the sub window 1020. The sub window 1010 in which App1 is running moves to the left and is hidden behind the main window, and an indicator 1012 is displayed in the main window. Of course, the user can further reduce the main window to the left to display both sub windows 1010 and 1020.

Referring to FIGS. 10F and 10G, when an event occurs during the execution of App1 in the sub window 1010, an indicator 1013 indicating occurrence of an event is displayed and blinking can be performed. When the user drags the indicator or the sub window to the left, the sub window 1010 in which App1 is running moves to the left and is hidden, and a sub window 1030 in which information about the generated event is displayed is displayed.

FIG. 11 is a flowchart illustrating a process of displaying sub windows when a plurality of applications are being executed through the main window and sub windows.

In a state in which the information output by the application App0 is displayed on the main window occupying the entire screen (S400), the user drags a bar or indicator of the main window to reduce the main window and check the sub windows (S410).

In response to the input of the user, the controller reduces the main window and displays the output information of App0 through the reduced main window (S420).

In addition, the controller displays the sub window n in the remaining space as the main window is reduced (S430). As described above, one or more sub-windows may be displayed according to the size of the reduced main window. The displayed subwindow may be the most recently created subwindow, the most recently displayed subwindow, or the like.

If the user wishes to change the sub window being displayed (S430), the displayed sub window may, for example, be dragged left or right to move the sub windows (S440). Alternatively, the displayed sub window may be changed by clicking or dragging the indicator for each sub window shown in the main window or a sub window.

When a drag in a first direction, for example a rightward drag, is input, the previously created sub window n-1 is displayed (S450); when a drag in a second direction, for example a leftward drag, is input, the subsequently created sub window n+1 may be displayed (S460). If no sub window was created after it, a new sub window can be created and a standby mode screen displayed in it.
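The FIG. 11 navigation (S450/S460) can be sketched as an index walk over the row of sub windows. The direction strings, the returned tuple, and the "standby" placeholder are assumptions of this illustration:

```python
# Sketch of FIG. 11 navigation: a right drag shows the previously created
# sub window n-1 (S450); a left drag shows the next one n+1 (S460), and a
# drag past the newest creates a fresh standby sub window.
def navigate(subs, index, direction):
    """Return the (possibly extended) sub-window list and the new index."""
    if direction == "right":
        return subs, max(0, index - 1)       # S450: back toward sub window 1
    if index + 1 >= len(subs):
        subs = subs + ["standby"]            # no later sub window: create one
    return subs, index + 1                   # S460: forward to sub window n+1
```

A circular arrangement (FIG. 5 variant) would wrap the index modulo `len(subs)` instead of creating a standby window at the end.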

FIGS. 12A and 12B are diagrams showing a method of switching the applications output through the main window and a sub window. When two or more applications are being executed, the information processed by the first executed application is output through the main window, and subsequently executed applications can be output through sub windows. When the user wants to output an application running in a sub window through the main window, or vice versa, the application executed in the main window can be changed according to the user's input.

Referring to FIG. 12A, when the user drags the sub window sub3 into the main window while the main window and the sub windows are displayed, or drags the name (App1) of an application executed in a sub window into the main window, the application being executed in the main window is changed from App0 to App3 or App1, and App0 is executed in the sub window.

Referring to FIG. 12B, when the main window occupies the entire screen and indicators of the applications running in the sub windows hidden behind it are displayed, the user can drag or click one of the displayed indicators (App1); the application running in the main window is then changed from App0 to App1, and App0 is executed in the sub window.
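The exchange in FIGS. 12A and 12B is a simple swap of which application each window hosts; this sketch uses hypothetical names and an index to stand in for the dragged sub window or indicator:

```python
# Sketch of the FIG. 12A/12B switch: dragging a sub window (or its
# indicator) into the main window exchanges the two running applications,
# without terminating either of them.
def swap(main_app, subs, pick):
    """Swap the app at index `pick` in `subs` with the main-window app."""
    subs = list(subs)                       # keep the caller's list intact
    main_app, subs[pick] = subs[pick], main_app
    return main_app, subs
```

Both applications keep running throughout; only their output windows are exchanged.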

In the above description, the main window and the sub windows are located in an upper layer and a lower layer, respectively, and the portions of the lower-layer sub windows not covered by the main window are displayed as the main window is resized; however, the present invention is not limited thereto. The present invention can be applied to any case in which an event, or another application launched under the user's control, can be displayed simultaneously (in a sub window) while the display of a given application is maintained through the main window.

For example, the main window and a sub window can be displayed at the same time by dividing the entire screen into a main-window display area and a sub-window display area according to the user's resizing input, regardless of any layer hierarchy.

In addition, if a predetermined input (e.g., a touch input on an indicator) is detected without resizing the main window (which still occupies the entire screen), the sub window may be displayed in picture-in-picture (PIP) form over the main window. The divided sub-window display area or PIP corresponds to the portion not covered by the upper-layer main window described above, and the methods described in detail with reference to FIGS. 3A to 12B can be applied in the same way. The present invention can thus be applied to various cases in which two or more windows are displayed on the same screen.

Hereinafter, with reference to the drawings, a method of easily using information to be input / output or processed in each application in a running application in a multi-tasking environment in which two or more windows are displayed on the same screen will be described in detail.

FIGS. 13A to 13D are views showing an embodiment of transmitting and receiving a message through a sub window during a web browser operation through the main window, according to an embodiment of the present invention.

When the user operates the web browser of the mobile terminal, the web browser is displayed through the main window occupying the entire screen. When the user wishes to transmit a specific web page to the partner terminal using a text message service or the like, the text message creation application is driven and displayed on the sub window as described above (see FIG. 13A).

When a predetermined signal is input, the control unit 180 inputs the information processed by the application being executed in the main window, that is, the web browser, into the application running in the sub window, that is, the text message creating application (see FIG. 13B).

The predetermined signal may be a signal for dragging a web page displayed by the user in the main window and dropping the web page into the sub window in which the message generating application is executing.

The information processed by the web browser may be an Internet access address such as the URL of the web page currently displayed, or image information in which the currently output web page is converted into an image format, such as a screen capture.

When the above information is input into the text message creating application of the sub window, the URL and/or the image of the web page is displayed in the message creating window; the user then inputs the remaining message text, the counterpart's telephone number, and the like, and transmits the URL and/or web-page image information to the counterpart terminal.

Upon receiving the message including the URL information and/or image information, the counterpart terminal identifies and displays the URL information and/or image information in the message (see FIG. 13C). When the user selects the displayed URL information and/or image information by touch or the like, the control unit 180, sensing the selection, drives the web browser, accesses the corresponding web page, and displays it on the main window or a sub window (see FIG. 13D).
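The drag-and-drop hand-off of FIGS. 13A to 13D amounts to packaging the browser's state and appending it to the outgoing message. This sketch is purely illustrative; the payload field names and function names are assumptions, not a disclosed format:

```python
# Sketch of the FIG. 13 hand-off: dropping the web page onto the message
# sub window passes the page's URL (and optionally a screen capture) to
# the text message creating application.
def build_share_payload(url, screenshot=None):
    """Package the information processed by the web browser for sharing."""
    payload = {"type": "webpage", "url": url}
    if screenshot is not None:
        payload["image"] = screenshot
    return payload

def compose_message(body, payload):
    """Append the shared page's URL to the outgoing message text."""
    return f"{body}\n{payload['url']}"
```

The same shape fits the navigation (FIG. 14) and broadcast (FIG. 15) embodiments: only the payload fields change (coordinates and route, or program and frame information).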

FIGS. 14A to 14C are diagrams illustrating an embodiment of transmitting and receiving a message through a sub window during navigation through the main window, according to an embodiment of the present invention.

When the user drives the navigation of the mobile terminal, a map is displayed through the main window occupying the entire screen. When the user wants to transmit information on the currently displayed map, the current user's location, destination, travel path, etc. to the other terminal using a text message service or the like, the text message creating application is driven and displayed on the sub window (See Fig. 14A).

When a predetermined signal is input, the control unit 180 inputs the information processed by the application being executed in the main window, that is, the navigation application, into the application running in the sub window, that is, the text message creating application (see FIG. 14B).

The predetermined signal may be a signal in which the user drags the map displayed in the main window and drops it into the sub window in which the message creating application is running.

The information processed by the navigation application includes coordinate information of the currently displayed map, location information on the user's current position, the destination, moving-route information, information on the surrounding area, and/or image information in which the map displayed in the main window is converted into an image format by a method such as screen capture.

When the information is input to the text message creating application of the sub window, the coordinate information, position information, moving-route information, surrounding-area information, and/or image information input to the message creating window is displayed; the user then inputs the remaining message text, the counterpart's telephone number, and the like, and transmits the information to the counterpart terminal.

The terminal receiving the message containing the information identifies and displays the coordinate information, position information, moving-route information, surrounding-area information, and/or image information in the message (see FIG. 14C). When the user selects the displayed location information or the like by touch, the control unit 180, sensing the selection, drives the navigation application and displays a map of the relevant point on the main window or a sub window. On the displayed map, the user's current position, destination, moving route, and surrounding-area information may be shown according to the received information.

FIGS. 15A to 15D are views illustrating an embodiment of transmitting and receiving a message through a sub window during broadcast viewing through the main window, according to an embodiment of the present invention.

When a user operates a broadcast viewing or various video playing application of a mobile terminal, a moving picture such as a broadcast is played through a main window occupying the entire screen. When a user wishes to transmit information on a currently playing video or the like to a correspondent terminal using a text messaging service or the like, the text message creating application is driven and displayed on the sub window as described above (see FIG. 15A).

When a predetermined signal is input, the control unit 180 inputs information processed by an application running in the main window, that is, a broadcast viewing application, into an application running in a subwindow, that is, a text message creating application (see FIG. 15B).

The predetermined signal may be a signal by which a user drags a moving picture being played back in the main window and drops the moving picture into a subwindow in which the message creating application is running.

The information processed by the broadcast viewing application may include the channel information, title, and broadcast time of the currently played program, image information captured during playback, and the like. In the case of a moving picture playback application, it may include the file name of the moving picture being played, its content address, a connection address related to a VOD service, the currently played frame position, or an image captured during playback.
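A minimal sketch of the transferable state described above (the field names and the `PlaybackInfo` class are assumptions for illustration; the disclosure does not specify a format) might collect the playback information as a key-value payload:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: the information a playback application exposes for
// transfer to another application, as described above (channel/title/time
// for broadcast viewing; file name and frame position for video playback).
public class PlaybackInfo {
    public static Map<String, String> broadcastInfo(String channel, String title,
                                                    String broadcastTime, String capturedImage) {
        Map<String, String> info = new LinkedHashMap<>();
        info.put("channel", channel);
        info.put("title", title);
        info.put("broadcastTime", broadcastTime);
        if (capturedImage != null) info.put("capturedImage", capturedImage); // optional capture
        return info;
    }

    public static Map<String, String> videoInfo(String fileName, String contentAddress,
                                                long framePosition) {
        Map<String, String> info = new LinkedHashMap<>();
        info.put("fileName", fileName);
        info.put("contentAddress", contentAddress);
        info.put("framePosition", Long.toString(framePosition)); // allows resuming at a frame
        return info;
    }
}
```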

When the information is input to the text message creating application of the sub window, the input program information and/or image information are displayed in the text message creating window, and when the user additionally inputs message text, the other party's telephone number, and the like, the message including the information is transmitted to the counterpart terminal.

Upon receiving the message containing the information, the terminal identifies and displays the program information and/or image information in the message (see FIG. 15C). When the user selects the displayed program information or the like by a touch, the control unit 180, sensing the selection, drives a broadcast viewing or moving picture playback application and plays the program or moving picture related to the received information. If frame information of a specific position of the moving picture is received, playback can be started from that frame (see FIG. 15D).

The above-described web browser, navigation, broadcast viewing or moving picture playback, and message creating applications are merely examples; the present invention is not limited thereto. For example, applications that can exchange processed information while executed at the same time may include camera related applications, voice/video calls, text/video chat, e-book readers, text editors, various games, and the like.

The information delivered from a running application (e.g., one displayed through the main window) to another running application (e.g., one displayed through a sub window) according to a user's input may include captured image information.

In addition, the information to be transferred may be specific information input, output, or processed by the running application. For example, in the case of a camera related application, it may be an image taken by the camera or a preview image. In the case of an e-book reader, it may be the title, author, currently displayed page or sentence of the e-book, or information on the displayed e-book file. In the case of a phone book related application, it may be a name, telephone number, e-mail address, image, or the like selected by the user. Each piece of information may have a predetermined identifier attached to identify its type, or may be converted into a predetermined format before transmission.
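The "predetermined identifier" and "predetermined format" mentioned above are not specified in the disclosure; one plausible minimal encoding (the tag names and the `;`/`=` delimiters are assumptions) is a tagged key-value string:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: each piece of transferred information is prefixed
// with a type identifier so the receiving application can recognize it,
// e.g. "LOC=37.50,127.03;IMG=capture01.jpg".
public class TaggedInfo {
    public static String encode(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (sb.length() > 0) sb.append(';');
            sb.append(e.getKey()).append('=').append(e.getValue());
        }
        return sb.toString();
    }

    public static Map<String, String> decode(String payload) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String part : payload.split(";")) {
            int i = part.indexOf('=');
            if (i > 0) fields.put(part.substring(0, i), part.substring(i + 1));
        }
        return fields;
    }
}
```

The encoding round-trips, so the receiving terminal can recover each typed field and dispatch it to the appropriate application.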

In one embodiment according to the present invention, the information to be transferred between the applications may be determined according to the applications being executed in the main window and/or the sub window, or may be selected according to the user's choice.

In the above description, the application executed in the sub window has mainly been described as a message creating application, but the present invention is not limited thereto. For example, an application that receives information from another running application may be not only a text message creating application but also a video call application, a video/text chat application, a text editor, and the like. The information received from the application running in the main window can be processed by the application running in the sub window. For example, in the case of a video call related application, a transferred image can be transmitted to the other party as a substitute image, or various information can be transmitted to the other party through the data channel.

FIG. 16 is a flowchart showing the processing flow of the embodiments described in FIGS. 13A to 15D.

When the user selects an application to execute, the control unit 180 drives the application (APP1) and displays it through the main window occupying the entire screen (S510). When a resizing command for the main window is input by the user due to the occurrence of an event during the operation of APP1, or when another application (APP2) to be executed simultaneously is selected by the user, the main window is reduced and the sub window in which APP2 is executed is displayed together with the main window (S520).

If a predetermined input, for example, a touch input dragging from the main window to the sub window, is detected while the main window and the sub window are displayed simultaneously, the control unit 180 transfers information related to APP1, such as information input to be processed by APP1 or information processed by APP1, so that it can be processed by APP2 (S540).

The control unit 180 processes the input information by the APP2 and performs a predetermined function according to the APP2 (S550).
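The S510-S550 flow can be modeled abstractly as follows (a hypothetical sketch; the class and method names are invented for illustration and carry none of the platform specifics):

```java
// Hypothetical sketch of the flow in FIG. 16: APP1 runs in the main window
// (S510), APP2 is opened in a sub window (S520), and the predetermined
// drag signal transfers APP1's processed information into APP2 (S540, S550).
public class WindowFlow {
    public static class App {
        public final String name;
        public String received;
        public App(String name) { this.name = name; }
        public String processedInfo() { return name + ":info"; }   // info transferred in S540
        public void process(String info) { this.received = info; } // S550: APP2 handles it
    }

    public App mainWindowApp;
    public App subWindowApp;

    // S510: APP1 is driven and displayed through the main window (full screen).
    public void launch(App app1) { mainWindowApp = app1; }

    // S520: the main window is reduced and APP2 appears in a sub window.
    public void openSubWindow(App app2) { subWindowApp = app2; }

    // Predetermined signal (drag from the main window to the sub window):
    // transfer APP1's processed information to APP2 (S540), which processes it (S550).
    public void onDragMainToSub() {
        subWindowApp.process(mainWindowApp.processedInfo());
    }
}
```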

Further, according to an embodiment of the present invention, the above-described method can be implemented as processor-readable code on a medium on which a program is recorded. Examples of the processor-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

The above-described mobile terminal and its control method are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

In the foregoing, preferred embodiments of the present invention have been described with reference to the accompanying drawings.

Here, the terms and words used in the present specification and claims should not be construed as limited to their ordinary or dictionary meanings, but should be construed as having meanings and concepts consistent with the technical idea of the present invention.

Therefore, the embodiments described in the present specification and the configurations shown in the drawings are merely the most preferred embodiments of the present invention and do not represent all of the technical ideas of the present invention. It should therefore be understood that various equivalents and modifications capable of replacing them may exist.

100: mobile terminal 110: wireless communication unit
120: A / V input unit 130: user input unit
140: sensing unit 150: output unit
160: memory 170: interface section
180: control unit 190: power supply unit

Claims (22)

  1. Displaying, on a touch screen, a first window including an execution screen of a first application, and an indicator associated with a list including information of an executable application;
    Outputting a list including information of the executable application based on a touch input to the indicator;
    Receiving a selection of a second application from the list;
    Outputting the execution screen of the first application to the first window and outputting an execution screen of the second application to a second window in response to the selection;
    Sensing a predetermined signal applied to the first window and the second window;
    Inputting information processed in the first application executed in the first window as an input of the second application executed in the second window when the predetermined signal is detected; And
    Driving the second application such that the second application processes the input information of the first application,
    Wherein the first and second windows are included in different areas of the touch screen so as not to overlap each other,
    Wherein the first application and the second application are different types.
  2. The method according to claim 1,
    Wherein the list disappears from the touch screen when an execution screen of the second application is output to the second window.
  3. The method according to claim 1,
    Wherein the indicator is an icon and/or a bar indicating the presence of the list, and the touch input for the indicator is a touch input for clicking or dragging the icon or bar.
  4. The method according to claim 1,
    Wherein the first window is a main window and the second window is a sub window.
  5. The method according to claim 1,
    Wherein the list includes at least one of a voice call application, a video call application, a text messaging application, a chat application, a game application, a web browser application, a DMB application, a navigation application, an e-book reader application, a camera application, and a phone book application.
  6. The method according to claim 1,
    Wherein the indicator is displayed in an edge area of the first window,
    And when the indicator is dragged, the size of the first window is adjusted.
  7. The method according to claim 1,
    Wherein the position of the first and second windows is changed on the touch screen based on a user's request.
  8. The method according to claim 1,
    In response to the first request, when the second window disappears, the size of the first window is enlarged to a size corresponding to the size of the touch screen,
    Wherein the size of the second window is enlarged to a size corresponding to the size of the touch screen when the first window disappears in response to a second request.
  9. The method according to claim 1,
    The first application is a broadcast viewing or moving picture playback application,
    Wherein the execution screen of the first application includes information on a channel, a title, or a broadcast time of the program being played back by the broadcast viewing or moving picture playback application, or a currently displayed image.
  10. The method according to claim 1,
    The first application is a camera related application,
    Wherein the execution screen of the first application includes an image or a moving image taken by a camera, or a preview image.
  11. The method according to claim 1,
    Wherein the first application is an e-book reader,
    Wherein the execution screen of the first application includes a title, an author, a page being displayed, a sentence or an image of the E-book being displayed.
  12. A touch screen for displaying a first window including an execution screen of a first application and an indicator related to a list including information of an executable application;
    a control unit that controls the touch screen to output the list including the information of the executable application based on a touch input to the indicator,
    and controls the touch screen so that an execution screen of the first application is output to the first window and an execution screen of a second application is output to a second window in response to the selection of the second application from the list,
    Wherein the control unit,
    when a predetermined signal applied to the first window and the second window is sensed, inputs information processed in the first application executed in the first window as an input of the second application executed in the second window, and drives the second application so that the second application processes the input information of the first application,
    Wherein the first and second windows are included in different areas of the touch screen so as not to overlap each other, and the first and second applications are different types.
  13. delete
  14. delete
  15. delete
  16. delete
  17. delete
  18. delete
  19. delete
  20. delete
  21. delete
  22. delete
KR1020100049297A 2010-05-26 2010-05-26 Mobile terminal and control method thereof KR101693690B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100049297A KR101693690B1 (en) 2010-05-26 2010-05-26 Mobile terminal and control method thereof

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR1020100049297A KR101693690B1 (en) 2010-05-26 2010-05-26 Mobile terminal and control method thereof
EP20110150908 EP2354914A1 (en) 2010-01-19 2011-01-14 Mobile terminal and control method thereof
US13/006,925 US9116594B2 (en) 2010-01-19 2011-01-14 Mobile terminal and control method thereof
CN2011100300939A CN102129345A (en) 2010-01-19 2011-01-19 Mobile terminal and control method thereof
US14/582,860 US9569063B2 (en) 2010-01-19 2014-12-24 Mobile terminal and control method thereof
US15/395,621 US10185484B2 (en) 2010-01-19 2016-12-30 Mobile terminal and control method thereof

Publications (2)

Publication Number Publication Date
KR20110129750A KR20110129750A (en) 2011-12-02
KR101693690B1 true KR101693690B1 (en) 2017-01-06

Family

ID=45498822

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100049297A KR101693690B1 (en) 2010-05-26 2010-05-26 Mobile terminal and control method thereof

Country Status (1)

Country Link
KR (1) KR101693690B1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130126428A (en) * 2012-05-11 2013-11-20 삼성전자주식회사 Apparatus for processing multiple applications and method thereof
KR101943357B1 (en) * 2012-06-01 2019-01-29 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
KR102044829B1 (en) * 2012-09-25 2019-11-15 삼성전자 주식회사 Apparatus and method for processing split view in portable device
US10282088B2 (en) 2012-12-06 2019-05-07 Samsung Electronics Co., Ltd. Configuration of application execution spaces and sub-spaces for sharing data on a mobile tough screen device
KR20140073399A (en) 2012-12-06 2014-06-16 삼성전자주식회사 Display apparatus and method for controlling thereof
KR102088911B1 (en) * 2013-04-18 2020-03-13 엘지전자 주식회사 Mobile terminal and control method thereof
KR20170116883A (en) * 2016-04-12 2017-10-20 삼성전자주식회사 A flexible device and operating method thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101315953B1 (en) * 2006-11-23 2013-10-08 엘지전자 주식회사 Mobile station and method for disposal multi scene thereof
KR20080051573A (en) * 2006-12-06 2008-06-11 엘지전자 주식회사 Method of performing multi-tasking in mobile communication terminal
KR101548958B1 (en) * 2008-09-18 2015-09-01 삼성전자주식회사 A method for operating control in mobile terminal with touch screen and apparatus thereof.

Also Published As

Publication number Publication date
KR20110129750A (en) 2011-12-02

Similar Documents

Publication Publication Date Title
US9804763B2 (en) Mobile terminal and user interface of mobile terminal
KR101504236B1 (en) Mobile terminal
US10185484B2 (en) Mobile terminal and control method thereof
KR102049855B1 (en) Mobile terminal and controlling method thereof
KR101990035B1 (en) Mobile terminal and control method for the mobile terminal
KR101859100B1 (en) Mobile device and control method for the same
KR101873413B1 (en) Mobile terminal and control method for the mobile terminal
US8548528B2 (en) Mobile terminal and control method thereof
KR101868352B1 (en) Mobile terminal and control method thereof
KR101952682B1 (en) Mobile terminal and method for controlling thereof
KR101729523B1 (en) Mobile terminal and operation control method thereof
KR101911251B1 (en) Terminal and method for controlling the same
EP2306290B1 (en) Mobile terminal and method of controlling application execution in a mobile terminal
KR101631958B1 (en) Input device and mobile terminal having the same
KR101990036B1 (en) Mobile terminal and control method thereof
KR101651135B1 (en) Mobile terminal and method for controlling the same
KR101470543B1 (en) Mobile terminal including touch screen and operation control method thereof
KR101657122B1 (en) Mobile terminal and method for controlling the same
KR101561703B1 (en) The method for executing menu and mobile terminal using the same
KR101572892B1 (en) Mobile terminal and Method for displying image thereof
US8423904B2 (en) Mobile terminal and control method thereof
US9159298B2 (en) Terminal and contents sharing method for terminal
KR101609162B1 (en) Mobile Terminal With Touch Screen And Method Of Processing Data Using Same
KR101633332B1 (en) Mobile terminal and Method of controlling the same
KR101701834B1 (en) Mobile terminal and Control Methods for transmitting communication data and displaying communication list thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20191224

Year of fee payment: 4