KR101871711B1 - Mobile terminal and method for controlling the same - Google Patents


Info

Publication number
KR101871711B1
Authority
KR
South Korea
Prior art keywords
content
execution screen
function
indicator information
functions
Prior art date
Application number
KR1020110118049A
Other languages
Korean (ko)
Other versions
KR20130052801A (en)
Inventor
최병윤
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020110118049A
Priority claimed from EP12007562.7A (EP2592548B1)
Publication of KR20130052801A
Application granted
Publication of KR101871711B1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46: Multiprogramming arrangements
    • G06F9/48: Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806: Task transfer initiation or dispatching
    • G06F9/4843: Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/485: Task life-cycle, e.g. stopping, restarting, resuming execution
    • G06F9/4856: Task life-cycle with resumption on a different machine, e.g. task migration, virtual machine migration
    • G06F9/4862: Task life-cycle where the task is a mobile agent, i.e. specifically designed to migrate
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to an output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques using icons
    • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F3/147: Digital output to display device using display panels
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces

Abstract

The present invention relates to a mobile terminal that, while displaying a first function screen among simultaneously executing functions, searches those functions for a function capable of interworking with content included in the first function screen and notifies the user of the search result, and to a control method thereof.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a portable terminal and a control method thereof in which use of the terminal is implemented with greater consideration for user convenience.

A terminal such as a personal computer, a notebook computer, or a mobile phone can be configured to perform various functions. Examples of such functions include data and voice communication, capturing still images or video through a camera, voice recording, music file playback through a speaker system, and image or video display. Some terminals include additional functionality for playing games, and some other terminals are implemented as multimedia devices. Moreover, recent terminals can receive a broadcast or multicast signal to view video or television programs.

In general, terminals can be divided into mobile terminals and stationary terminals depending on whether they are portable, and mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals.

BACKGROUND ART

At present, due to the increased use of terminals such as smartphones, many applications providing various kinds of functions available on smartphones have been developed.

Accordingly, the terminal provides a multi-tasking function capable of simultaneously executing a plurality of applications. For example, a user can listen to music through the terminal while simultaneously viewing a web page, an image, or a document.

However, in order to switch from the currently displayed first function screen to a second function screen that is also being multitasked, the user must first switch to the home screen and then select the second function there, which is inconvenient.

An object of the present invention is to provide a portable terminal and a control method thereof that, while displaying a first function screen among simultaneously executing functions, search for a function capable of interworking with content included in the first function screen, notify the user of the search result, and allow the user to directly execute the content through the searched function.

According to an aspect of the present invention, there is provided a mobile terminal supporting multi-tasking, comprising: a touch screen that displays a screen of a specific function among two or more simultaneously executing functions; and a control unit that, when at least one content item included in the specific function screen is selected, displays information indicating at least one of the simultaneously executing functions capable of executing the selected content, and, when the information is selected, executes the selected content through the function corresponding to the information.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal supporting multi-tasking, the method comprising: simultaneously executing two or more functions; displaying a screen of a specific function among the simultaneously executing functions; when a specific content item is selected from among the content included in the specific function screen, searching the simultaneously executing functions for at least one function capable of executing the selected content; displaying information indicating the searched at least one function; and, when the information is selected, executing the selected content through the function corresponding to the information.
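The method steps above can be sketched as follows. This is a hypothetical illustration, not code from the patent: the class names, content types, and function names are all invented for the example, with each running "function" (application) declaring which content types it can execute.

```python
from dataclasses import dataclass, field

@dataclass
class Function:
    name: str
    accepts: set                      # content types this function can execute
    received: list = field(default_factory=list)

    def execute(self, content):
        self.received.append(content)

@dataclass
class Controller:
    running: list                     # simultaneously executing functions

    def search_interworkable(self, content_type):
        # Step: search the running functions for those able to execute the content
        return [f for f in self.running if content_type in f.accepts]

    def on_content_selected(self, content, content_type):
        # Step: build indicator information for each matching function
        matches = self.search_interworkable(content_type)
        return matches, [f.name for f in matches]

    def on_indicator_selected(self, matches, index, content):
        # Step: execute the content through the function the user selected
        matches[index].execute(content)

browser = Function("web browser", {"url", "image", "text"})
player = Function("music player", {"audio"})
mailer = Function("email", {"text", "image"})

ctl = Controller(running=[browser, player, mailer])
matches, indicators = ctl.on_content_selected("photo.jpg", "image")
print(indicators)                     # functions able to interwork with an image
ctl.on_indicator_selected(matches, 1, "photo.jpg")
print(mailer.received)
```

The point of the sketch is the dispatch order: search happens among the already-running functions only, so no home-screen round trip is needed.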

According to the portable terminal and control method of the present invention, while a first function screen among simultaneously executing functions is displayed, a function capable of interworking with content included in the first function screen is searched for among those functions, the user is informed of the search result, and the user can immediately execute the content through the retrieved function.

FIG. 1 is a block diagram illustrating a portable terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of a portable terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
FIG. 3 is a flowchart of a first embodiment of the present invention, illustrating a process of interworking the simultaneously executing functions with content in the current function screen.
FIGS. 4 to 10 are explanatory diagrams of the first embodiment of the interworking process according to the present invention.
FIG. 11 is a flowchart of a second embodiment of the interworking process according to the present invention.
FIGS. 12 and 13 are explanatory diagrams of the second embodiment of the interworking process according to the present invention.
FIG. 14 is a flowchart of a third embodiment of the interworking process according to the present invention.
FIG. 15 is an explanatory diagram of the third embodiment of the interworking process according to the present invention.

Hereinafter, a portable terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

The portable terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV or a desktop computer, except for configurations applicable only to portable terminals.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.

The portable terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a portable terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable terminal 100 and a wireless communication system, or between the portable terminal 100 and a network in which the portable terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, as an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

For example, the broadcast receiving module 111 may receive digital broadcast signals using a digital broadcasting system such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H, OMA-BCAST, or ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include voice call signals, video call signals, or various types of data for text/multimedia message transmission and reception.

The wireless Internet module 113 refers to a module for wireless Internet access, and may be built in or externally mounted in the mobile terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for acquiring the position of the portable terminal; a representative example is the GPS (Global Positioning System) module. With current technology, the GPS module 115 calculates distance information from three or more satellites along with accurate time information, and then applies trigonometry to the calculated information to accurately compute three-dimensional location information according to latitude, longitude, and altitude. At present, a method of calculating position and time information using three satellites and then correcting the error of the calculated position and time information using one more satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously computing the current position in real time.
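As a hedged illustration of the trigonometric principle mentioned above, the following sketch solves the simpler 2-D case with three anchors at known positions and measured distances; real GPS receivers solve in three dimensions and additionally estimate a receiver clock bias, which is why a fourth satellite is used in practice. All coordinates here are invented.

```python
import math

def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Return (x, y) from three anchor points and their distances."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3
    # Subtracting the circle equations pairwise yields a linear system
    # a*x + b*y = c, which we solve with Cramer's rule.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Receiver actually at (3, 4); distances measured from three known anchors.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(a, (3.0, 4.0)) for a in anchors]
x, y = trilaterate_2d(anchors[0], dists[0], anchors[1], dists[1],
                      anchors[2], dists[2])
print(x, y)  # recovers the position (3.0, 4.0)
```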

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or video, obtained by the image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted externally through the wireless communication unit 110. Depending on the usage environment, two or more cameras 121 may be provided, for example a pair of cameras for producing left-eye and right-eye images to generate a 3D preview image, and a camera for self-photographing.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal.

The user input unit 130 may receive from the user a signal designating two or more content items among the displayed content according to the present invention. The signal designating two or more content items may be received via touch input, or via hard-key and soft-key input.

The user input unit 130 may receive an input from the user for selecting one or more content items. It may also receive from the user an input for generating an icon related to a function the portable terminal 100 can perform.

The user input unit 130 may include a direction keypad, a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100, its position, whether the user is in contact with it, its orientation, and its acceleration/deceleration, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is a slide phone, the sensing unit 140 can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141. The proximity sensor 141 will be described later in relation to the touch screen.
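As a toy illustration of the sensing unit's role described above (sampling terminal states and generating a sensing signal the controller can act on), consider the following sketch; the state names and signal names are assumptions made for this example, not terms from the patent.

```python
def sensing_signal(state):
    """Map sampled hardware states to sensing signals for the controller."""
    signals = []
    if state.get("slide_open") is not None:
        # Slide-phone form factor: report open/closed transitions.
        signals.append("slide_opened" if state["slide_open"] else "slide_closed")
    if state.get("cradle_connected"):
        # Interface unit 170 connected to an external device.
        signals.append("external_device_connected")
    if state.get("powered"):
        # Power supply unit 190 is supplying power.
        signals.append("power_on")
    return signals

out = sensing_signal({"slide_open": True, "cradle_connected": True,
                      "powered": True})
print(out)
```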

The output unit 150 generates output related to the visual, auditory, or tactile senses, and may include a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) the information processed in the portable terminal 100. For example, when the portable terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received image, UI, or GUI is displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional display (3D display).

Some of these displays may be of a transparent or light-transmissive type so that the outside can be seen through them. Such a display may be called a transparent display, a typical example being the TOLED (Transparent OLED). The rear structure of the display unit 151 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

There may be two or more display units 151 depending on the implementation of the portable terminal 100. For example, a plurality of display units may be disposed on one surface of the portable terminal 100, spaced apart or formed integrally, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 can be used as an input device as well as an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151, or a change in capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area but also the pressure at the time of the touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. In this way, the controller 180 can determine which area of the display unit 151 has been touched.
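The signal path just described (touch sensor to touch controller to controller 180 determining the touched area) can be sketched as below; the region table, function names, and coordinate values are illustrative assumptions, not from the patent.

```python
# Hypothetical screen layout: each region is (x, y, width, height) in pixels.
SCREEN_REGIONS = {
    "status_bar": (0, 0, 480, 40),
    "content":    (0, 40, 480, 680),
    "soft_keys":  (0, 720, 480, 80),
}

def touch_controller(raw):
    # Convert the raw pressure/capacitance change into an input event.
    return {"x": raw["x"], "y": raw["y"], "pressure": raw["delta"]}

def resolve_region(event):
    # The controller determines which area of the display was touched.
    for name, (rx, ry, rw, rh) in SCREEN_REGIONS.items():
        if rx <= event["x"] < rx + rw and ry <= event["y"] < ry + rh:
            return name
    return None

event = touch_controller({"x": 240, "y": 750, "delta": 0.8})
print(resolve_region(event))  # the touch lands in the soft-key row
```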

The proximity sensor 141 may be disposed in an inner area of the portable terminal enclosed by the touch screen, or in the vicinity of the touch screen. The proximity sensor is a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object in the vicinity, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan than a contact sensor and also has higher utility.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by the change in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position of a proximity touch on the touch screen is the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, and proximity touch movement state). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.
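Following the definitions above, a minimal sketch of distinguishing a proximity touch from a contact touch, and of deriving a simple proximity-touch pattern (distance, approach speed, duration), might look like this; the thresholds and sample values are invented for illustration.

```python
CONTACT_THRESHOLD_MM = 0.0   # at or below this, the pointer touches the screen
PROXIMITY_RANGE_MM = 30.0    # beyond this, the pointer is not sensed at all

def classify(distance_mm):
    """Classify a sensed pointer distance as contact touch / proximity touch."""
    if distance_mm <= CONTACT_THRESHOLD_MM:
        return "contact touch"
    if distance_mm <= PROXIMITY_RANGE_MM:
        return "proximity touch"
    return "none"

def proximity_pattern(samples):
    """samples: list of (time_s, distance_mm) while the pointer hovers."""
    (t0, d0), (t1, d1) = samples[0], samples[-1]
    return {
        "distance_mm": d1,                    # final proximity touch distance
        "speed_mm_s": (d0 - d1) / (t1 - t0),  # approach speed toward the screen
        "duration_s": t1 - t0,                # proximity touch time
    }

print(classify(12.0))
hover = [(0.0, 25.0), (0.1, 18.0), (0.2, 10.0)]
print(proximity_pattern(hover))
```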

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the portable terminal 100 (e.g., a call signal reception tone or a message reception tone). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event in the portable terminal 100. Examples of events occurring in the portable terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, as vibration. Since the video or audio signal can also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be classified as a kind of alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of a tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled; for example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet port or suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit tactile effects through direct contact but also to allow the user to feel tactile effects through the kinesthetic sense of a finger or arm. Two or more haptic modules 154 may be provided depending on the configuration of the portable terminal 100.

The projector module 155 is a component for performing an image projection function using the portable terminal 100, and can display, on an external screen or wall, an image identical to or at least partially different from the image displayed on the display unit 151, in accordance with a control signal from the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image generating means (not shown) that produces the image to be output using the light generated by the light source, and a lens (not shown) that enlarges and outputs the image at a predetermined focal distance. Further, the projector module 155 may include a device (not shown) capable of mechanically moving the lens or the entire module to adjust the image projection direction.

The projector module 155 can be classified into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module enlarges and projects an image generated by reflecting light from a light source off a DMD (Digital Micromirror Device) chip, which can be advantageous for miniaturization of the projector module 155.

Preferably, the projector module 155 may be provided on the side, front, or rear surface of the portable terminal 100 in the longitudinal direction. Needless to say, the projector module 155 may be provided at any position of the portable terminal 100 as needed.

The memory 160 may store programs for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a phone book, messages, audio files, still images, electronic books, and the like). The memory 160 may also store the frequency of use of each item of data (for example, the frequency of use of each phone number, each message, and each piece of multimedia). In addition, the memory 160 may store data on the various patterns of vibration and sound output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The portable terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the portable terminal 100. The interface unit 170 receives data or power from an external device and delivers it to each component inside the portable terminal 100, or allows data inside the portable terminal 100 to be transmitted to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip storing various information for authenticating the right to use the portable terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the portable terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input by the user at the cradle are transmitted to the portable terminal. The various command signals or the power input from the cradle may serve as signals for recognizing that the portable terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the portable terminal, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from it.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

2A is a perspective view of a portable terminal according to an embodiment of the present invention.

The disclosed mobile terminal 100 has a bar-shaped main body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) which forms the appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, the interface 170, and the like may be disposed in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion.

The commands input through the first or second operation unit 131 or 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 into or out of a touch recognition mode.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may be further mounted on the rear surface of the terminal body, that is, on the rear case 102. The camera 121' may have a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A), and may have the same number of pixels as, or a different number of pixels from, the camera 121.

For example, the camera 121 preferably has a low pixel count, since it photographs the user's face for transmission to the other party during a video call or the like, while the camera 121' preferably has a high pixel count, since it typically photographs ordinary subjects that are not immediately transmitted. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or capable of popping up.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to look at his or her own face when the user intends to photograph himself or herself (self-photographing) using the camera 121'.

An acoustic output module 152 'may be additionally disposed on the rear side of the terminal body. The sound output unit 152 'may implement the stereo function together with the sound output module 152 (see FIG. 2A), and may be used for the implementation of the speakerphone mode during a call.

In addition to the antenna for communication, a broadcast signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be retractable from the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing touch. Like the display unit 151, the touch pad 135 may be of a light-transmitting type. In this case, if the display unit 151 is configured to output visual information on both of its sides (that is, toward both the front and the rear of the portable terminal), the visual information can also be recognized through the touch pad 135. The information output on both sides may all be controlled by the touch pad 135.

Meanwhile, a display dedicated to the touch pad 135 may be separately installed, so that a touch screen may be disposed in the rear case 102 as well.

The touch pad 135 operates in correlation with the display unit 151 of the front case 101. The touch pad 135 may be disposed in parallel behind the display unit 151, and may have a size equal to or smaller than that of the display unit 151.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

The portable terminal referred to herein may include at least one of the components shown in FIG. 1. In addition, the controller 180 can control the individual operation of a component (e.g., a touch screen, a wireless communication unit, a memory, and the like), or the linked operation of a plurality of components, in order to perform an operation that uses that component.

Hereinafter, with reference to FIGS. 3 to 15, a process will be described in detail in which, while the execution screen of a first function among simultaneously executing functions is displayed, functions that can interwork with content included in the first function's execution screen are retrieved from among the simultaneously executing functions and reported to the user, and the content is executed directly through the retrieved functions.

First Embodiment

According to the first embodiment of the present invention, when specific content in the execution screen of a first function is selected while that execution screen is displayed among functions being simultaneously executed, functions that can interwork with the selected content are identified among the simultaneously executing functions, the user is informed of those functions, and the selected content is executed directly through the interworking functions.

Hereinafter, the first embodiment of the present invention will be described in detail with reference to FIGS. 3 to 10.

FIG. 3 is a flowchart of the first embodiment of the present invention, illustrating a process of interworking the functions currently being executed simultaneously with content in the current function screen.

FIGS. 4 to 10 are explanatory diagrams of the first embodiment, showing a process of interworking the functions currently being executed simultaneously with content in the current function screen according to the present invention.

Referring to FIG. 3, the controller 180 of the portable terminal 100, while simultaneously executing two or more functions selected by the user, displays on the display unit 151 the execution screen of a specific first function selected by the user from among the simultaneously executing functions (S31). Here, the simultaneously executing functions include all functions executable in the portable terminal, for example, an application providing a specific function, a widget, a menu function, a data file, and the like.

The controller 180 detects whether at least one piece of content included in the execution screen of the first function is selected (S32). Here, the content refers to objects included in or attached to the execution screen of the first function, for example, text, a file, an image, an icon to which a specific function is assigned, a specific web page address, and the like.

If at least one piece of content is selected (S33), the controller 180 identifies, among the remaining functions currently being executed other than the first function, one or more functions that can interwork with the selected content (S34).

Here, the functions capable of interworking with the selected content may be functions capable of executing the selected content, functions associated with the selected content, or functions capable of searching, through the memory 160 or the web, for information related to the selected content.

For example, assuming that the selected content is the text "LG Optimus Black" and the functions currently being executed simultaneously are a web browser function and a YouTube function that provides UCC (User Created Contents) videos, the web browser and YouTube functions can search for information related to the "LG Optimus Black" text, so the text and these two functions can interwork with each other.

As another example, assuming that the selected content is an "LG Optimus Black.MP3" sound file and the functions currently being executed simultaneously are a sound playback function and a video playback function, both of those functions can play the "LG Optimus Black.MP3" file, so the selected "LG Optimus Black.MP3" sound file and the sound and video playback functions can interwork with each other.
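Purely as an illustrative sketch and not part of the disclosed implementation, the matching in S34 can be expressed as a capability lookup. The function names and capability sets below are assumptions introduced only for the example:

```python
# Sketch of S34: among the concurrently running functions, find those
# that can interwork with the selected content. The registry below is
# a hypothetical stand-in for the terminal's running-function state.
RUNNING_FUNCTIONS = {
    "web browser":  {"text", "url"},   # can search text, open URLs
    "YouTube":      {"text"},          # can search videos by text
    "sound player": {"mp3", "wav"},    # can play sound files
    "video player": {"mp4", "mp3"},    # can also play sound tracks
}

def interworkable_functions(content_type, current_function):
    """Return the functions, other than the one whose screen is shown,
    whose capabilities cover the selected content's type."""
    return sorted(
        name for name, types in RUNNING_FUNCTIONS.items()
        if name != current_function and content_type in types
    )

# Selected text in a document viewer matches the web browser and YouTube;
# a selected MP3 file matches the sound player and video player.
print(interworkable_functions("text", "document viewer"))
print(interworkable_functions("mp3", "document viewer"))
```

The key design point mirrored here is the exclusion of the first function itself: only the *remaining* running functions are candidates.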

Meanwhile, the controller 180 may identify the functions belonging to a group preset by the user among the simultaneously executing functions, and may then identify, among those grouped functions, the ones capable of interworking with the selected content. This process will be described later in detail with reference to FIG. 7.

As described above, when one or more functions capable of interworking with the selected content are identified among the currently executing functions, the controller 180 generates indicator information indicating the identified functions and displays the generated indicator information in the execution screen of the first function (S35).

At this time, the indicator information may take the form of a thumbnail image, text, or an icon representing each of the identified functions. For example, if an identified function is an application, its indicator information may be the icon of that application.

Meanwhile, the controller 180 may display the indicator information by aligning the indicator information with the upper or lower end of the first function execution screen.

In addition, the controller 180 may display the indicator information on the execution screen of the first function in a transparent manner or in an overlapped manner.

In addition, the controller 180 may display the indicator information at or around the selected content.

In addition, the controller 180 may display the indicator information in a list form on the execution screen of the selected first function.

When a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch and display the indicator information in the area between the initial position of the first function's execution screen and the shifted position. This process will be described later in detail with reference to FIG. 5 below.
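The shift behavior above can be sketched as simple rectangle arithmetic. The coordinate convention and the fixed shift amount are illustrative assumptions, not values taken from the disclosure:

```python
# Sketch of the flick-to-reveal behavior: the execution screen slides
# by a fixed offset, and the vacated strip between the original and
# shifted positions is used to draw the indicator information.
SHIFT = 120  # pixels the screen slides on a flick (assumed value)

def shifted_layout(screen_rect, direction):
    """Return (shifted screen rect, indicator strip rect) for a flick.
    screen_rect is (x, y, width, height); direction is 'down' or 'up'."""
    x, y, w, h = screen_rect
    if direction == "down":
        # Screen slides down; indicators occupy the strip exposed on top.
        return (x, y + SHIFT, w, h), (x, y, w, SHIFT)
    # Screen slides up; indicators occupy the strip exposed at the bottom.
    return (x, y - SHIFT, w, h), (x, y + h - SHIFT, w, SHIFT)

screen, strip = shifted_layout((0, 0, 480, 800), "down")
print(screen, strip)
```

Keeping the indicators in the vacated strip means they never cover the document itself, which is the motivation given later for this display style.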

The controller 180 detects whether specific indicator information is selected from among the displayed indicator information (S36). If specific indicator information is selected (S37), the controller 180 executes the selected content through the function corresponding to the selected indicator information (S38).

Then, the control unit 180 displays the execution screen of the content (S39).

At this time, if the specific indicator information among the indicator information is directly touched by the user, the controller 180 can execute the selected content through the function corresponding to the touched indicator information.

Also, if the selected content is dragged and dropped onto the position where specific indicator information is displayed, or the specific indicator information is dragged and dropped onto the position where the selected content is displayed, the controller 180 can execute the selected content through the function corresponding to that indicator information.

Also, when the content and the specific indicator information are simultaneously multi-touched, the controller 180 can execute the selected content through a function corresponding to the specific indicator information.
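The three selection gestures in S36 to S38 (a direct touch on an indicator, a drag-and-drop in either direction between content and indicator, and a simultaneous multi-touch of both) can be sketched as a single dispatch function. The event encoding is a hypothetical assumption for illustration:

```python
def resolve_execution(event, content, indicator):
    """Decide whether the selected content should be executed through
    the function the given indicator information represents."""
    if event["type"] == "touch":
        # Direct touch on the indicator itself.
        return event["target"] == indicator
    if event["type"] == "drag_drop":
        # Content dropped on indicator, or indicator dropped on content:
        # both directions trigger execution.
        return {event["source"], event["target"]} == {content, indicator}
    if event["type"] == "multi_touch":
        # Content and indicator touched at the same time.
        return set(event["targets"]) == {content, indicator}
    return False

print(resolve_execution({"type": "touch", "target": "web_icon"},
                        "LG Optimus Black", "web_icon"))
```

Treating the drag-and-drop pair as an unordered set captures the "either direction" behavior the text describes with one comparison.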

Next, when displaying the execution screen of the content, the controller 180 can switch from the execution screen of the first function to the execution screen of the content.

In addition, when displaying the execution screen of the content, the control unit 180 may further display an execution screen of the content in a new window form on the execution screen of the first function.

In addition, when displaying the execution screen of the content, the controller 180 may divide the screen of the display unit 151 into first and second areas and display the execution screen of the first function and the execution screen of the content in the first and second areas, respectively.

In addition, when displaying the execution screen of the content, the control unit 180 may reduce the execution screen of the content to a thumbnail form at the display position of the content in the execution screen of the first function.
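The four presentation options just listed (full switch, new window overlay, split screen, and thumbnail) can be sketched as a mode selector. The mode names and return encoding are assumptions made only for this example:

```python
def present(mode, first_screen, content_screen, content_pos=None):
    """Return the display stack for S39, given one of the four
    presentation modes described for the content's execution screen."""
    if mode == "switch":       # content screen replaces the first screen
        return [content_screen]
    if mode == "window":       # new window layered over the first screen
        return [first_screen, content_screen]
    if mode == "split":        # display divided into two regions
        return [("region1", first_screen), ("region2", content_screen)]
    if mode == "thumbnail":    # reduced form at the content's position
        return [first_screen, (content_pos, "thumb:" + content_screen)]
    raise ValueError("unknown presentation mode: " + mode)

print(present("split", "document viewer", "web results"))
```

In the thumbnail mode the first function's screen stays fully visible, which matches the later description of pulling the thumbnail up to full screen only when it is selected.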

When a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch and display the execution screen of the content in the area between the initial position of the first function's execution screen and the shifted position. This process will be described later in detail with reference to FIG. 10.

Next, FIG. 4 illustrates the process of S31 to S39 of FIG. 3 as an example.

For example, FIG. 4 (a) shows that the currently running functions are the web, document viewer, music playback, and YouTube functions, and that the function currently displayed on the screen is the document viewer function, whose screen includes two or more sentences 311 and 312.

As shown in FIG. 4 (b), when all or part of the first sentence 311 is designated on the document viewer screen 310, the controller 180 identifies, among the currently executing web, music playback, and YouTube functions, the functions that can interwork with the first sentence 311, and displays indicator information 320 and 330 indicating the identified functions.

For example, FIG. 4 illustrates the web function and the YouTube function, which can search for information or multimedia identical or related to the first sentence 311, as the functions that can interwork with the first sentence 311.

As shown in FIG. 4 (c), when the indicator information 320 corresponding to the web function is selected from among the indicator information, the controller 180 searches for information 311A associated with the first sentence 311 through the web function and displays the retrieved information 311A.

Next, after generating the indicator information 320 and 330 through the above-described process, when a drag or flicking touch having a specific direction is input on the document viewer screen 310 as shown in FIG. 5 (a), the controller 180 shifts the document viewer screen 310 in the direction corresponding to the drag or flicking touch, as shown in FIG. 5 (b), and may display the indicator information 320 and 330 in the area between the initial position of the document viewer screen 310 and the shifted position.

That is, if the indicator information were always displayed on the document viewer screen 310, the indicator information might prevent the user from viewing the document.

Accordingly, the indicator information 320 and 330 is kept in a hidden state and displayed only when the document viewer screen 310 is dragged, so that the screen of the display unit 151 can be used efficiently.

Next, Fig. 6 illustrates the process of S31 to S39 in Fig. 3 as an example.

As shown in FIGS. 6 (a) and (b), when all or part of the first sentence 311 is designated on the document viewer screen 310, the controller 180 displays indicator information 320 and 330 indicating the web and YouTube functions that can interwork with the first sentence 311, among the currently executing web, music playback, and YouTube functions.

As shown in FIG. 6 (c), when the indicator information 330 corresponding to the YouTube function is selected from among the indicator information, the controller 180 searches for multimedia associated with the first sentence 311 through the YouTube function and displays a list of the retrieved multimedia.

Next, as shown in FIG. 7 (a), when a command for setting a group of two or more of the simultaneously executing functions is input by the user, the controller 180 identifies the functions currently being executed simultaneously and displays a list 360 of the simultaneously executing functions.

As shown in FIG. 7 (b), when one or more functions 320 and 330 are selected in the list 360, the controller 180 groups the selected functions 320 and 330 and stores the group in the memory 160.
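The group setting of FIG. 7 can be sketched as follows; a plain Python set stands in for the group stored in the memory 160, and the fallback behavior when no group is set is an assumption for the example:

```python
stored_group = set()  # stands in for the group kept in memory 160

def set_group(selected_functions):
    """Store the user's chosen subset of the running functions."""
    stored_group.clear()
    stored_group.update(selected_functions)

def candidates(running_functions):
    """Functions considered when matching selected content: only the
    grouped ones, or every running function when no group was set."""
    if not stored_group:
        return set(running_functions)
    return stored_group & set(running_functions)

set_group({"web", "YouTube"})
print(sorted(candidates({"web", "YouTube", "music", "document viewer"})))
```

Intersecting the stored group with the running functions keeps the result valid even if a grouped function has since been closed.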

As shown in FIG. 7 (c), when specific content 311 is selected in the document viewer screen 310, the controller 180 identifies the functions grouped by the user, identifies among them the functions that can interwork with the selected content 311, and, as shown in FIG. 7 (d), generates and displays indicator information 320 and 330 indicating the identified functions.

Next, FIG. 8 illustrates the process of S35 to S39 of FIG. 3 as an example.

As shown in FIG. 8 (a), the controller 180 displays indicator information 320 and 330 indicating the functions that can interwork with the selected content 311. When the content 311 and the first indicator information 320 are simultaneously multi-touched, the controller 180 can execute the content 311 through the function (web) corresponding to the first indicator information 320, as shown in FIG. 8 (c).

For example, in FIG. 8, the content 311 is the text "LG Optimus Black" in the document viewer screen 310, and the function corresponding to the multi-touched first indicator information 320 is the web, so the controller 180 searches for information associated with the "LG Optimus Black" text via the web and displays the retrieved result.

As shown in FIG. 8 (b), the controller 180 displays indicator information 320 and 330 indicating the functions that can interwork with the selected content 311. When the content 311 is dragged and dropped onto the display position of the first indicator information 320, or the first indicator information 320 is dragged and dropped onto the display position of the content, the controller 180 can execute the content 311 through the function (web) corresponding to the first indicator information 320.

Next, FIG. 9 illustrates the step S39 of FIG. 3 as an example.

As shown in FIG. 9 (a), when the text "LG Optimus Black" 311 is selected in the document viewer screen 310 and the first indicator information 320 indicating the web function that can interwork with the "LG Optimus Black" text 311 is selected, the controller 180 searches for information associated with the "LG Optimus Black" text 311 through the web function, and may further display the screen of the retrieved information 311A on the document viewer screen 310 in a new window format.

Alternatively, as shown in FIG. 9 (c), the controller 180 may divide the screen of the display unit 151 into first and second areas and display the document viewer screen 310 and the screen of the retrieved information 311A in the first and second areas, respectively.

Alternatively, as shown in FIG. 9 (d), the controller 180 may reduce the retrieved information 311A to a thumbnail form and display it at or near the display position of the "LG Optimus Black" text 311. In this case, if the information 311A displayed in the thumbnail form is selected, the information 311A may be brought up on the full screen.

As shown in FIG. 10 (a), when the text "LG Optimus Black" 311 is selected in the document viewer screen 310 and the first indicator information 320 indicating the web function that can interwork with the "LG Optimus Black" text 311 is selected, the controller 180 searches for information related to the "LG Optimus Black" text 311 through the web function.

As shown in FIG. 10 (b), when a drag or flicking touch having a specific direction is input on the document viewer screen 310, the controller 180 may shift the document viewer screen 310 in the direction corresponding to the drag or flicking touch and display the retrieved information 311A in the area between the initial position of the document viewer screen 310 and the shifted position.

Second Embodiment

The second embodiment of the present invention relates to a process of identifying, before any specific content in the execution screen of the first function is selected, the content in the execution screen of the first function that can interwork with the simultaneously executing functions, informing the user of that content and those functions, and directly executing the selected content through the interworking functions.

Hereinafter, the second embodiment of the present invention will be described in detail with reference to FIGS. 11 to 13.

FIG. 11 is a flowchart of the second embodiment of the present invention, illustrating a process of interworking the functions currently being executed simultaneously with content in the current function screen.

FIGS. 12 and 13 are explanatory diagrams of the second embodiment, illustrating a process of interworking the functions currently being executed simultaneously with content in the current function screen.

Referring to FIG. 11, the controller 180 of the portable terminal 100, while simultaneously executing two or more functions selected by the user, displays on the display unit 151 the execution screen of a specific first function selected by the user from among the simultaneously executing functions (S41).

The controller 180 determines whether any of the content included in the execution screen of the first function can interwork with the currently executing functions (S42).

The controller 180 displays the identified content so as to be identifiable in the execution screen of the first function (S43).

At this time, the controller 180 makes the display style of the identified content different from that of the execution screen of the first function, so that the user can easily recognize the identified content when viewing the execution screen of the first function.

For example, the controller 180 may display the identified content with a blinking highlight, or may display the identified content in a color different from that of other content or of the first function's execution screen. In addition, the controller 180 may display the identified content in a 3D (three-dimensional) form while displaying other content in a 2D form. The controller 180 may also underline only the identified content, or may enlarge the identified content.

Next, the controller 180 generates indicator information indicating the functions that can interwork with the identified content, and displays the generated indicator information (S44).

At this time, if first content and first indicator information interwork with each other and second content and second indicator information interwork with each other, the controller 180 applies one shared display style to the first content and the first indicator information so that the pair is identifiable against other content and other indicator information within the execution screen of the first function, and likewise applies a shared display style to the second content and the second indicator information so that that pair is also identifiable within the execution screen of the first function.

As described above, the reason the same display style is applied to a content item and its indicator information is that the user, seeing the shared display style of the content and the indicator information in the execution screen of the first function, can easily recognize that they interwork with each other.

For example, the controller 180 may display the first content and the first indicator information with a "red" background color, and the second content and the second indicator information with a "blue" background color.
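The style pairing can be sketched as assigning one style per content/indicator pair. The style list and the round-robin assignment are assumptions made for the example; the disclosure only requires that each pair share a style distinct from the others:

```python
# Sketch of the second embodiment's pairing: each interworkable content
# item and its indicator(s) receive the same display style so the user
# can match them at a glance.
STYLES = ["red_background", "blue_background", "bold", "underline"]

def assign_styles(pairs):
    """pairs maps each content item to its list of indicators; return a
    flat map from every element to its pair's shared style."""
    styling = {}
    for i, (content, indicators) in enumerate(pairs.items()):
        style = STYLES[i % len(STYLES)]  # one style per pair
        styling[content] = style
        for indicator in indicators:
            styling[indicator] = style
    return styling

styles = assign_styles({"sentence1": ["web", "YouTube"],
                        "sentence2": ["music"]})
print(styles)
```

Note that one content item may have several indicators (as sentence 311 does with the web and YouTube functions); all of them inherit the content's style.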

As in FIGS. 3 to 10, the controller 180 may display the indicator information aligned with the upper or lower end of the first function's execution screen, or may display it transparently or in an overlapping manner on the execution screen of the first function. In addition, the controller 180 may display the indicator information at or near the corresponding content, or may display the indicator information in a list form on the execution screen of the first function. Further, when a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch and display the indicator information in the area between the initial position of the first function's execution screen and the shifted position.

The controller 180 detects whether specific indicator information is selected from among the displayed indicator information (S45). If specific indicator information is selected (S46), the controller 180 executes the corresponding content through the function corresponding to the selected indicator information (S47).

Then, the control unit 180 displays the execution screen of the content (S48).

As in FIGS. 3 to 10, when specific indicator information among the displayed indicator information is directly touched by the user, the controller 180 can execute the corresponding content through the function corresponding to the touched indicator information.

As in FIGS. 3 to 10, when specific content is dragged and dropped onto the position where specific indicator information is displayed, or the specific indicator information is dragged and dropped onto the position where the specific content is displayed, the controller 180 can execute the content through the function corresponding to the specific indicator information.

As in FIGS. 3 to 10, when content and specific indicator information are simultaneously multi-touched, the controller 180 can execute the content through the function corresponding to the specific indicator information.

As in FIGS. 3 to 10, when displaying the execution screen of the content, the controller 180 can switch from the execution screen of the first function to the execution screen of the content.

As in FIGS. 3 to 10, when displaying the execution screen of the content, the controller 180 may further display the execution screen of the content in a new window form on the execution screen of the first function.

As in FIGS. 3 to 10, when displaying the execution screen of the content, the controller 180 may divide the screen of the display unit 151 into first and second areas and display the execution screen of the first function and the execution screen of the content in the first and second areas, respectively.

As in FIGS. 3 to 10, when displaying the execution screen of the content, the controller 180 may display the execution screen of the content reduced to a thumbnail form at the display position of the content in the execution screen of the first function.

As in FIGS. 3 to 10, when a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch and display the execution screen of the content in the area between the initial position of the first function's execution screen and the shifted position.

Next, Fig. 12 illustrates the process of S41 to S48 of Fig. 11 as an example.

For example, FIG. 12 (a) shows that the functions currently being executed simultaneously are the web, document viewer, music playback, and YouTube functions, and that the function displayed on the current screen is the document viewer function, whose screen includes one or more sentences 311.

If the first sentence 311 in the document viewer screen 310 can interwork with one or more of the functions currently being executed simultaneously, the controller 180 displays the first sentence 311 so as to be identifiable within the document viewer screen 310.

For example, in FIG. 12 (b), a highlight is applied to the first sentence 311 so that it is identifiable in the document viewer screen 310.

The controller 180 displays indicator information 320 and 330 indicating functions that can be interlocked with the first sentence 311 among functions currently being simultaneously executed.

For example, FIG. 12 illustrates the web function and the YouTube function, which can search for information or multimedia identical or related to the first sentence 311, as the functions that can interwork with the first sentence 311.

If the indicator information 320 corresponding to the web function is selected from among the indicator information, the controller 180 searches for information 311A associated with the first sentence 311 through the web function and displays the retrieved information 311A, as shown in FIG. 12 (c).

Next, FIG. 13A shows a document viewer screen 310 including two or more first and second sentences 311 and 312.

In this case, the controller 180 identifies the functions that can interwork with the first and second sentences 311 and 312, respectively, among the functions currently being executed simultaneously.

For example, in FIG. 13, functions that can be interlocked with the first sentence 311 are Web and YouTube functions, and a function capable of interlocking with the second sentence 312 is a music playback function.

As shown in FIGS. 13 (b) to (d), the controller 180 displays indicator information 320 and 330 indicating the functions that can interwork with the first sentence 311 in the document viewer screen 310, and applies the same display style to the first sentence 311 and the corresponding indicator information 320 and 330 so that they are identified together.

Also, as shown in FIGS. 13(b) to (d), the controller 180 displays, in the document viewer screen 310, indicator information 330 indicating the function that can be interlocked with the second sentence 312, and gives the second sentence 312 and the corresponding indicator information 330 the same display style so that they are identifiable together.

For example, as shown in FIG. 13(b), the controller 180 displays a highlight of a first color on the first sentence 311 and the corresponding indicator information 320 and 330, and displays a highlight of a second color, different from the first color, on the second sentence 312 and the corresponding indicator information 330.

Also, as shown in FIG. 13(c), the controller 180 displays the first sentence 311 and the corresponding indicator information 320 and 330 in 3D form, and displays the second sentence 312 and the corresponding indicator information 330 in 2D form.

Also, as shown in FIG. 13(d), the controller 180 displays the first sentence 311 and the corresponding indicator information 320 and 330 in bold, and displays the second sentence 312 and the corresponding indicator information 330 with an underline.
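The matched display styles of FIGS. 13(b) to (d) amount to assigning one style per sentence and reusing that style for the sentence's own indicators. A minimal sketch of that assignment follows; the style names and the cycling scheme are assumptions, not the actual rendering code of the controller 180:

```java
import java.util.*;

// Illustrative sketch: gives each sentence and its interlockable-function
// indicators the same display style, as in FIG. 13(b)-(d).
public class StyleAssigner {
    /** Assigns one style per sentence; the sentence and all of its indicators share it. */
    public static Map<String, String> assignStyles(
            Map<String, List<String>> sentenceToIndicators, List<String> styles) {
        Map<String, String> styleOf = new LinkedHashMap<>();
        int i = 0;
        for (Map.Entry<String, List<String>> e : sentenceToIndicators.entrySet()) {
            String style = styles.get(i++ % styles.size()); // cycle through available styles
            styleOf.put(e.getKey(), style);                 // the sentence itself
            for (String indicator : e.getValue()) {
                styleOf.put(indicator, style);              // its matching indicators
            }
        }
        return styleOf;
    }
}
```

Passing two highlight styles for the two sentences of FIG. 13(b) yields one shared style for the first sentence and its web/YouTube indicators, and a different shared style for the second sentence and its music indicator.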

Third Embodiment

The third embodiment of the present invention relates to a process in which, when the execution screen of a first function among the simultaneously executing functions is displayed, indicator information indicating the functions currently being executed simultaneously is displayed, and when specific content in the execution screen of the first function is selected, the indicator information corresponding to the functions that can be interlocked with the selected content is displayed so as to be distinguished from the other indicator information, thereby informing the user, through the indicator information, of the functions that can be interlocked with the selected content.

Hereinafter, the third embodiment of the present invention will be described in detail with reference to FIGS. 14 and 15.

FIG. 14 is a flowchart of the third embodiment of the present invention, illustrating a process of interlocking the functions currently executing simultaneously with content in the current function screen.

FIG. 15 is an explanatory diagram of the third embodiment of the present invention, showing a process of interlocking the functions currently executing simultaneously with content in the current function screen.

Referring to FIG. 14, the controller 180 of the mobile terminal 100, while simultaneously executing two or more functions selected by the user, displays on the display unit 151 the execution screen of a specific first function selected by the user from among the simultaneously executing functions [S51].

The controller 180 displays, in the execution screen of the first function, indicator information indicating the functions currently being executed simultaneously [S52], and determines whether at least one content among the contents included in the execution screen of the first function is selected [S53].

If the at least one content is selected [S54], the controller 180 recognizes one or more functions that can be interlocked with the selected content among the remaining functions currently being executed, excluding the first function [S55].

When one or more functions capable of interlocking with the selected content are identified among the functions currently being executed simultaneously, the controller 180 displays the indicator information corresponding to the identified functions so as to be distinguished from the other indicator information [S56].

Preferably, the controller 180 makes the display style of the indicator information corresponding to the identified functions different from the display style of the other indicator information.

For example, the controller 180 may make the indicator information corresponding to the identified functions identifiable by blinking it or by displaying a highlight of a specific color on it.

In addition, the controller 180 may display indicator information corresponding to the identified functions in a 3D form, and display the other indicator information in a 2D form to be distinguished from each other.

In addition, the controller 180 can enlarge the indicator information corresponding to the identified functions so that it is displayed distinguishably from the other indicator information.

In addition, the controller 180 may generate guide information for guiding the user to select the indicator information corresponding to the identified functions, and display the generated guide information to induce the user to select the indicator information corresponding to the functions that can be interlocked with the currently selected content. For example, the guide information may be an arrow pointing from the display position of the selected content to the indicator information.

The controller 180 detects whether specific indicator information is selected from among the identifiably displayed indicator information [S57], executes the selected content through the function corresponding to the selected indicator information, and displays an execution screen of the content [S60].
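Steps S51 to S60 above can be condensed into a small state sketch: show one indicator per running function, mark the interlockable indicators once content is selected, then execute through a marked indicator. The class and method names below are illustrative assumptions, not the actual interface of the controller 180:

```java
import java.util.*;

// Condensed sketch of the S51-S60 flow of FIG. 14 (names are illustrative).
public class IndicatorFlow {
    public final Set<String> running;             // functions executing simultaneously [S51]
    public final Set<String> indicators;          // one indicator per running function [S52]
    public Set<String> distinguished = Set.of();  // indicators highlighted after selection [S56]

    public IndicatorFlow(Set<String> running, String firstFunction) {
        this.running = running;
        // Indicators represent every running function except the one on screen.
        Set<String> shown = new LinkedHashSet<>(running);
        shown.remove(firstFunction);
        this.indicators = shown;
    }

    /** [S53-S56] On content selection, distinguish indicators of interlockable functions. */
    public void selectContent(Set<String> interlockableWithContent) {
        Set<String> marked = new LinkedHashSet<>(indicators);
        marked.retainAll(interlockableWithContent);
        this.distinguished = marked;
    }

    /** [S57-S60] Executing via an indicator only succeeds if it was distinguished. */
    public boolean execute(String indicator) {
        return distinguished.contains(indicator);
    }
}
```

For the FIG. 15 scenario, constructing the flow with the document viewer as the first function leaves web, YouTube, and music indicators on screen; selecting a text sentence distinguishes only the web and YouTube indicators.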

FIG. 15 illustrates the process of S51 to S60 of FIG. 14 as an example.

For example, FIG. 15(a) shows that the functions currently being executed simultaneously are the web, document viewer, music playback, and YouTube functions, and that the function displayed on the current screen is the document viewer function whose screen includes two or more sentences 311 and 312.

When the document viewer screen 310 is displayed, the control unit 180 displays indicator information 320, 330, and 340 indicating the currently executing functions in the document viewer screen 310.

While the indicator information 320, 330, and 340 is displayed, if the first sentence 311 in the document viewer screen 310 is selected as shown in FIG. 15(b), the controller 180 displays the indicator information 320 and 330 corresponding to the functions that can be interlocked with the selected first sentence 311, from among the indicator information 320, 330, and 340, so as to be distinguished from the other indicator information 340, as shown in FIG. 15(c).

That is, by viewing the identifiably displayed indicator information 320 and 330, the user can grasp the functions that can be interlocked with the content the user has selected.

When the indicator information 320 corresponding to the web function is selected from among the identifiably displayed indicator information 320 and 330 as shown in FIG. 15(d), the controller 180 searches, through the web function corresponding to the selected indicator information 320, for information 311A associated with the selected first sentence 311 and displays the retrieved information 311A.

It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the controller 180 of the terminal.

Accordingly, the above description should not be construed as limiting in all respects but should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

The above-described mobile terminal and its control method are not limited to the configurations and methods of the above-described embodiments; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

100: mobile terminal 110: wireless communication unit
111: broadcast receiver 112: mobile communication module
113: wireless Internet module 114: short-range communication module
115: Position information module 120: A / V input section
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output section
151: Display unit 152: Acoustic output module
153: Alarm module 154: Haptic module
155: projector module 160: memory
170: interface unit 180: control unit
181: Multimedia module 190: Power supply

Claims (16)

  1. A touch screen for displaying an execution screen of a specific application among two or more currently executing applications in a multi-tasking manner; And
    a controller configured to, if at least one content among two or more contents included in the execution screen is selected, identify at least one application capable of interlocking with the selected at least one content among the currently executing applications other than the specific application, generate and display an information indicator indicating the identified at least one application, and, when the information indicator is selected, execute the selected content through the application corresponding to the information indicator,
    Wherein the identified at least one application comprises:
    And a web application capable of searching for information associated with the selected content via the web,
    Wherein the controller:
    retrieves information associated with the content through the web application,
    shifts, when a drag-touch action of a user having a specific direction is input on the execution screen, the execution screen in a direction corresponding to the drag-touch action, and
    controls to display the retrieved information in a display area between an initial position of the execution screen and a shifted position of the execution screen.
  2. The mobile terminal according to claim 1,
    Wherein the information indicator includes at least one of a thumbnail image, an icon, and text indicating a corresponding application.
  3. The mobile terminal according to claim 1,
    Wherein the control unit displays the information indicator around the selected content or displays the information indicator transparently on the execution screen.
  4. The mobile terminal according to claim 1,
    Wherein the control unit executes the content through the application corresponding to the information indicator when the content is touched and dragged to the position where the information indicator is displayed, or when the information indicator is touched and dragged to the position where the content is displayed.
  5. The mobile terminal according to claim 1,
    Wherein the control unit recognizes at least one application belonging to a predetermined group among the running applications and displays an information indicator indicating an application capable of interlocking with the selected content among the identified applications.
  6. The mobile terminal according to claim 1,
    Wherein the control unit executes the selected content through the application corresponding to the information indicator when the information indicator is selected, and switches the execution screen to an execution screen of the content or displays the execution screen of the content on the execution screen.
  7. The mobile terminal according to claim 1,
    Wherein the control unit executes the selected content through the application corresponding to the information indicator when the information indicator is selected, and displays an execution screen of the content in thumbnail form at the display position of the selected content in the execution screen.
  8. The mobile terminal according to claim 1,
    Wherein the control unit executes the selected content through the application corresponding to the information indicator when the information indicator is selected, divides the screen into first and second areas, and displays the execution screen and an execution screen of the content in the first and second areas, respectively.
  9. The mobile terminal according to claim 1,
    Wherein the control unit identifies, before the content is selected, at least one content among the contents in the execution screen that can be interlocked through at least one of the running applications, and displays the identified at least one content so as to be distinguished from the other contents.
  10. delete
  11. delete
  12. delete
  13. delete
  14. delete
  15. delete
  16. delete
KR1020110118049A 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same KR101871711B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110118049A KR101871711B1 (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020110118049A KR101871711B1 (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same
EP12007562.7A EP2592548B1 (en) 2011-11-14 2012-11-07 Mobile terminal and controlling method thereof
US13/676,969 US9769299B2 (en) 2011-11-14 2012-11-14 Mobile terminal capable of recognizing at least one application inter-workable with another executed application and controlling method thereof

Publications (2)

Publication Number Publication Date
KR20130052801A KR20130052801A (en) 2013-05-23
KR101871711B1 true KR101871711B1 (en) 2018-06-27

Family

ID=48662174

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110118049A KR101871711B1 (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR101871711B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170019789A (en) 2015-08-12 2017-02-22 삼성전자주식회사 Deveice and method for executing application

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100688046B1 (en) 2005-09-01 2007-03-02 (주) 엘지텔레콤 Display device for portable equipment and method thereof
US20100313156A1 (en) * 2009-06-08 2010-12-09 John Louch User interface for multiple display regions
US20110167339A1 (en) * 2010-01-06 2011-07-07 Lemay Stephen O Device, Method, and Graphical User Interface for Attachment Viewing and Editing

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070031836A (en) * 2004-07-21 2007-03-20 소니 가부시끼 가이샤 Content processing device, content processing method, and computer program
KR101488389B1 (en) * 2008-12-19 2015-01-30 엘지전자 주식회사 Mobile terminal and operation method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100688046B1 (en) 2005-09-01 2007-03-02 (주) 엘지텔레콤 Display device for portable equipment and method thereof
US20100313156A1 (en) * 2009-06-08 2010-12-09 John Louch User interface for multiple display regions
US20110167339A1 (en) * 2010-01-06 2011-07-07 Lemay Stephen O Device, Method, and Graphical User Interface for Attachment Viewing and Editing

Also Published As

Publication number Publication date
KR20130052801A (en) 2013-05-23

Similar Documents

Publication Publication Date Title
KR101873744B1 (en) Mobile terminal and method for controlling the same
KR101859100B1 (en) Mobile device and control method for the same
KR101873413B1 (en) Mobile terminal and control method for the mobile terminal
KR101919788B1 (en) Mobile terminal and method for controlling thereof
KR101595029B1 (en) Mobile terminal and method for controlling the same
KR101740436B1 (en) Mobile terminal and method for controlling thereof
KR101500741B1 (en) Mobile terminal having a camera and method for photographing picture thereof
KR101526998B1 (en) a mobile telecommunication device and a power saving method thereof
KR101863926B1 (en) Mobile terminal and method for controlling thereof
KR101860341B1 (en) Mobile terminal and control method for the same
KR101561703B1 (en) The method for executing menu and mobile terminal using the same
KR101481556B1 (en) A mobile telecommunication terminal and a method of displying an object using the same
KR101623783B1 (en) Mobile terminal and method for extracting data thereof
KR101608532B1 (en) Method for displaying data and mobile terminal thereof
KR101520689B1 (en) a mobile telecommunication device and a method of scrolling a screen using the same
KR20140089245A (en) Method for controlling using double touch jesture and the terminal thereof
KR101984592B1 (en) Mobile terminal and method for controlling the same
KR101572892B1 (en) Mobile terminal and Method for displying image thereof
KR101504210B1 (en) Terminal and method for controlling the same
KR101781852B1 (en) Mobile terminal and method for controlling the same
KR20140141269A (en) Mobile terminal and controlling method thereof
KR101576292B1 (en) The method for executing menu in mobile terminal and mobile terminal using the same
KR101788046B1 (en) Mobile terminal and method for controlling the same
KR101451667B1 (en) Terminal and method for controlling the same
KR101559178B1 (en) Method for inputting command and mobile terminal using the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant