KR20130052801A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same

Info

Publication number
KR20130052801A
Authority
KR
South Korea
Prior art keywords
content
information
function
screen
functions
Prior art date
Application number
KR1020110118049A
Other languages
Korean (ko)
Other versions
KR101871711B1 (en)
Inventor
최병윤
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020110118049A (granted as KR101871711B1)
Priority claimed from EP12007562.7A
Publication of KR20130052801A
Application granted
Publication of KR101871711B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 Task transfer initiation or dispatching
    • G06F 9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/485 Task life-cycle, e.g. stopping, restarting, resuming execution
    • G06F 9/4856 Task life-cycle, e.g. stopping, restarting, resuming execution resumption being on a different machine, e.g. task migration, virtual machine migration
    • G06F 9/4862 Task life-cycle, e.g. stopping, restarting, resuming execution resumption being on a different machine, e.g. task migration, virtual machine migration the task being a mobile agent, i.e. specifically designed to migrate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Abstract

The present invention relates to a mobile terminal, and a control method thereof, that, while displaying a first function screen among functions being executed simultaneously, searches those functions for a function capable of interworking with content included in the first function screen, informs the user of the retrieved function, and allows the content to be executed immediately through that function.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a portable terminal, and a control method thereof, implemented with further consideration of user convenience.

A terminal such as a personal computer, a laptop computer, or a mobile phone can be configured to perform various functions. Examples of such functions include data and voice communication, photographing still images or moving images through a camera, voice recording, music file playback through a speaker system, and displaying images or videos. Some terminals include additional functions for executing games, and some others are implemented as multimedia devices. Moreover, recent terminals can receive broadcast or multicast signals so that video or television programs can be viewed.

In general, terminals can be divided into mobile terminals and stationary terminals depending on whether they are portable. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals.

Currently, owing to the increasing use of terminals such as smartphones, many applications providing the various kinds of functions available on smartphones have been developed.

Accordingly, terminals provide a multitasking function that can execute several applications simultaneously. For example, a user may listen to music on the terminal while simultaneously viewing a web page, an image, a document, and the like.

However, for the user to switch from the currently displayed first function screen to a second function screen that is currently being multitasked, the user must first switch from the first function screen to the home screen and then select the second function on the home screen, which is inconvenient.

An object of the present invention is to provide a mobile terminal, and a control method thereof, that, while displaying a first function screen among functions being executed simultaneously, searches those functions for a function capable of interworking with content included in the first function screen, informs the user of the retrieved function, and enables the content to be executed immediately through the retrieved function.

According to an aspect of the present invention, there is provided a portable terminal supporting multitasking, comprising: a touch screen that displays a screen of a specific function among two or more simultaneously executed functions; and a controller that, when at least one content item is selected from among the content included in the specific function screen, displays information indicating at least one of the concurrently executed functions capable of executing the selected content and, when the information is selected, executes the selected content through the function corresponding to the selected information.

In addition, the present invention provides a method of controlling a portable terminal supporting multitasking, comprising: executing two or more functions simultaneously; displaying a screen of a specific function among the concurrently executing functions; searching, when a specific content item is selected from among the content included in the specific function screen, for at least one of the concurrently executing functions capable of executing the selected content; displaying information indicating the retrieved at least one function; and, when the information is selected, executing the selected content through the function corresponding to the information.
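The claimed control method can be sketched in plain Python. This is a minimal illustrative model, not the patent's actual implementation: the class names, the content-type matching rule, and the example functions are all assumptions made for the sketch.

```python
# Hypothetical sketch of the claimed control method: while one function's
# screen is displayed, find the other concurrently executing functions that
# can handle a selected content item, show them to the user, and hand the
# content over to the chosen one.

from dataclasses import dataclass


@dataclass
class Function:
    name: str
    accepted_types: set          # content types this function can execute
    screen_visible: bool = False # True for the currently displayed function


@dataclass
class Content:
    name: str
    content_type: str


def search_interworking_functions(running, selected):
    """Return the concurrently executing functions (other than the one whose
    screen is shown) that can execute the selected content."""
    return [f for f in running
            if not f.screen_visible and selected.content_type in f.accepted_types]


def execute_through(function, content):
    """Execute the selected content through the function the user picked."""
    return f"{function.name} opens {content.name}"


# Usage: web browser in the foreground; music player and e-mail multitasked.
running = [
    Function("web browser", {"url", "image"}, screen_visible=True),
    Function("music player", {"audio"}),
    Function("e-mail", {"image", "document", "url"}),
]
photo = Content("photo.jpg", "image")
candidates = search_interworking_functions(running, photo)
print([f.name for f in candidates])          # the information shown to the user
print(execute_through(candidates[0], photo)) # executed when the info is selected
```

The key step is that the search is restricted to functions already running, which is what distinguishes the claim from an ordinary "open with…" chooser over all installed applications.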

A portable terminal and a control method thereof according to the present invention search, while displaying a first function screen among functions being executed simultaneously, for a function capable of interworking with content included in the first function screen, notify the user of the retrieved function, and thereby allow the user to execute the content directly through the retrieved function.

FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.
FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
FIG. 3 is a flowchart of a first embodiment showing a process of interworking simultaneously executing functions with content in a current function screen according to the present invention.
FIGS. 4 to 10 are diagrams illustrating the first embodiment of the process of interworking simultaneously executing functions with content in a current function screen according to the present invention.
FIG. 11 is a flowchart of a second embodiment showing the process of interworking simultaneously executing functions with content in a current function screen according to the present invention.
FIGS. 12 and 13 are diagrams illustrating the second embodiment of the process of interworking simultaneously executing functions with content in a current function screen according to the present invention.
FIG. 14 is a flowchart of a third embodiment showing the process of interworking simultaneously executing functions with content in a current function screen according to the present invention.
FIG. 15 is an explanatory diagram of the third embodiment of the process of interworking simultaneously executing functions with content in a current function screen according to the present invention.

Hereinafter, a portable terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of writing the specification, and do not themselves have distinct meanings or roles.

The portable terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), and a navigation device. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV and a desktop computer, except for configurations applicable only to portable terminals.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.

The portable terminal 100 includes a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a portable terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable terminal 100 and a wireless communication system, or between the portable terminal 100 and a network in which the portable terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast related information, or a server that receives a previously generated broadcast signal and/or broadcast related information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive a digital broadcast signal using a digital broadcasting system such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), DVB-CBMS, OMA-BCAST, or Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may also be adapted to other broadcasting systems in addition to the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 refers to a module for wireless Internet access, and may be built into or externally mounted on the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for acquiring the location of the portable terminal, and a representative example thereof is a Global Positioning System (GPS) module. Using current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and then applies triangulation to the calculated information, so that three-dimensional current location information according to latitude, longitude, and altitude can be accurately calculated. At present, a method of calculating position and time information using three satellites and correcting an error of the calculated position and time information using another satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position in real time.
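The triangulation described above can be illustrated with a simplified planar sketch: given the distances from three known points, the position follows from linearizing the three circle equations. Real GPS solves the equivalent problem in 3D with a receiver clock-bias term and at least four satellites; the coordinates and ranges below are made-up example values.

```python
# Simplified 2D sketch of position fixing from ranges to three known points,
# the principle behind the GPS calculation described above.

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) with |p_i - (x, y)| = r_i.

    Subtracting circle 1's equation from circles 2 and 3 cancels the
    quadratic terms, leaving two linear equations a_i*x + b_i*y = c_i."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only if the three points are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)


# A receiver at (3, 4), with exact distances to three reference points.
pos = trilaterate_2d((0, 0), 5.0, (10, 0), 65**0.5, (0, 10), 45**0.5)
print(pos)
```

The fourth satellite mentioned in the text plays the role of resolving the unknown receiver clock offset, which this noise-free planar sketch omits.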

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by its image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the usage environment, such as left- and right-eye cameras for generating a 3D preview image and a camera for self-photographing.

The microphone 122 receives an external sound signal through a microphone in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it as electrical voice data. The processed voice data can be converted into a form that can be transmitted to the mobile communication base station through the mobile communication module 112 when the voice data is in the call mode, and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in receiving an external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal.

The user input unit 130 may receive from the user a signal designating two or more contents among the displayed contents according to the present invention. A signal for designating two or more contents may be received via the touch input, or may be received via the hard key and soft key input.

The user input unit 130 may receive an input from the user for selecting the one or more contents. In addition, the user may receive an input for generating an icon related to a function that the portable terminal 100 can perform.

The user input unit 130 may include a directional keypad, a keypad, a dome switch, a touchpad (static / static), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100, the position of the portable terminal 100, whether a user is in contact with it, and the orientation and acceleration/deceleration of the portable terminal, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is a slide phone, the sensing unit may sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. Meanwhile, the sensing unit 140 may include a proximity sensor 141. The proximity sensor 141 will be described later in relation to the touch screen.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses, and may include a display unit 151, a sound output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) the information processed in the portable terminal 100. For example, when the portable terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received image, UI, or GUI is displayed.

As described above, the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 depending on the implementation of the portable terminal 100. For example, a plurality of display units may be disposed on one surface of the portable terminal 100, spaced apart from one another or integrated with each other, or may be disposed on different surfaces.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a touch sensor) form a mutual layer structure (hereinafter referred to as a touch screen), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller (not shown). The touch controller processes the signal (s) and transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 is touched.
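The signal path above, from sensor to touch controller to controller 180, can be sketched as two stages: one that packages the raw sensor reading into coordinate data, and one that resolves which display area was touched. The region names and screen dimensions are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the described touch path: the touch sensor reports a
# raw (x, y, pressure) signal, the touch-controller stage packages it, and the
# main controller resolves which on-screen area contains the touch point.

def touch_controller(raw_signal):
    """Touch controller: convert the sensor's raw reading into coordinate
    data to be transmitted to controller 180."""
    x, y, pressure = raw_signal
    return {"x": x, "y": y, "pressure": pressure}


def resolve_region(event, regions):
    """Controller 180: determine which area of the display was touched.

    Each region is (left, top, right, bottom) in screen pixels."""
    for name, (left, top, right, bottom) in regions.items():
        if left <= event["x"] < right and top <= event["y"] < bottom:
            return name
    return None


# Assumed 480x800 layout with three on-screen areas.
regions = {
    "status bar": (0, 0, 480, 40),
    "content": (0, 40, 480, 760),
    "soft keys": (0, 760, 480, 800),
}
event = touch_controller((240, 780, 0.6))
print(resolve_region(event, regions))  # which area of display unit 151 was hit
```

Carrying the pressure value through matches the text's note that the sensor can detect not only position and area but also touch pressure.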

The proximity sensor 141 may be disposed in an inner area of the portable terminal enclosed by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan than a contact sensor, and its utility is also high.

Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by the change in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of positioning the pointer near the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The proximity-touch position of the pointer on the touch screen means the position at which the pointer vertically corresponds to the touch screen during a proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
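The proximity/contact distinction and one of the listed proximity-touch patterns can be illustrated with a short sketch: a sample at zero distance is a "contact touch", a sample within a detection threshold is a "proximity touch", and successive distance samples yield an approach speed. The threshold value and sample format are assumptions made for the example.

```python
# Illustrative sketch of the proximity-touch concepts above. The detection
# range is an assumed value; real panels expose it as a hardware property.

DETECT_DISTANCE_MM = 30.0  # assumed proximity detection range


def classify(distance_mm):
    """Classify a pointer sample by its distance from the touch screen."""
    if distance_mm <= 0.0:
        return "contact touch"    # pointer actually on the screen
    if distance_mm <= DETECT_DISTANCE_MM:
        return "proximity touch"  # pointer recognized without contact
    return "no touch"


def approach_speed(samples):
    """One proximity-touch pattern: approach speed in mm/s, computed from
    (time_s, distance_mm) samples; positive means moving toward the screen."""
    (t0, d0), (t1, d1) = samples[0], samples[-1]
    return (d0 - d1) / (t1 - t0)


print(classify(12.0))  # hovering within the detection range
print(classify(0.0))   # touching the screen
print(approach_speed([(0.0, 20.0), (0.5, 5.0)]))
```

The other patterns named in the text (proximity touch direction, time, position, movement state) would be derived from the same stream of timestamped samples in the same way.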

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 also outputs sound signals related to functions (e.g., call signal reception tones, message reception tones, etc.) performed in the portable terminal 100. The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events occurring in the portable terminal include reception of a call signal, reception of a message, input of a key signal, and a touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, as vibration. Since a video signal or an audio signal may also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be regarded as a kind of alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as an effect of a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through a jet or inlet port, grazing of the skin surface, contact of an electrode, an electrostatic force, and an effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sensation of the finger or arm. The haptic module 154 may include two or more haptic modules 154 according to the configuration of the portable terminal 100.

The projector module 155 is a component for performing an image projection function using the portable terminal 100, and displays, on an external screen or wall, an image identical to or at least partially different from the image displayed on the display unit 151, according to a control signal of the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image generating means (not shown) for generating the image to be output using the light generated by the light source, and a lens (not shown) for enlarging and outputting the image to the outside at a predetermined focal distance. Further, the projector module 155 may include a device (not shown) capable of adjusting the image projection direction by mechanically moving the lens or the entire module.

The projector module 155 may be divided into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module, which enlarges and projects an image generated by reflecting light from the light source off a DMD (Digital Micromirror Device) chip, may be advantageous for miniaturization of the projector module 155.

Preferably, the projector module 155 may be provided in the longitudinal direction on the side, front, or back of the portable terminal 100. Of course, the projector module 155 may be provided at any position of the mobile terminal 100 as necessary.

The memory 160 may store programs for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a phonebook, messages, audio, still images, electronic books, history, and the like). The memory 160 may also store the frequency of use of each item of data (for example, of each telephone number, each message, and each piece of multimedia). In addition, the memory 160 may store data on the various patterns of vibration and sound output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The portable terminal 100 may operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the portable terminal 100. The interface unit 170 receives data or power from an external device and transmits it to each component in the portable terminal 100, or allows data in the portable terminal 100 to be transmitted to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip that stores various kinds of information for authenticating the usage authority of the portable terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the portable terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input by the user from the cradle are transmitted to the portable terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the portable terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the portable terminal, performing, for example, control and processing related to voice calls, data communication, and video calls. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from it.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

FIG. 2A is a front perspective view of an example of a portable terminal according to the present invention.

The disclosed portable terminal 100 has a bar-shaped main body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) which forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection-molding synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130/131 and 132, the microphone 122, the interface 170, and the like may be disposed in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of both ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion.

The content input by the first or second operation units 131 and 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or activating/deactivating the touch recognition mode of the display unit 151.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may be additionally mounted on the rear surface of the terminal body, that is, on the rear case 102. The camera 121' may have a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A), and may have a pixel count equal to or different from that of the camera 121.

For example, the camera 121 preferably has a low pixel count, so that an image of the user's face can be captured and transmitted to the other party during a video call or the like, while the camera 121' preferably has a high pixel count. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or capable of popping up.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to see his or her own face or the like when capturing an image of himself or herself (self-photographing) using the camera 121'.

A sound output module 152' may be additionally disposed on the rear side of the terminal body. The sound output module 152' may implement a stereo function together with the sound output module 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

In addition to the antenna for communication, a broadcast signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116 constituting a part of the broadcast receiving unit 111 (refer to FIG. 1) may be installed to be pulled out from the terminal body.

The terminal body is equipped with a power supply unit 190 for supplying power to the portable terminal 100. The power supply unit 190 may be built in the terminal body or may be directly detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing a touch. The touch pad 135 may be of a light-transmitting type, like the display unit 151. In this case, if the display unit 151 is configured to output visual information on both sides (i.e., on both the front and rear sides of the portable terminal), the visual information can also be recognized through the touch pad 135. The information output on both sides may all be controlled by the touch pad 135.

Meanwhile, a display dedicated to the touch pad 135 may be separately installed, so that a touch screen may also be disposed in the rear case 102.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to the rear of the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

The portable terminal referred to herein may include at least one of the components shown in FIG. 1. In addition, the controller 180 may control an individual operation of an element or a link operation between a plurality of elements in order to perform an operation using a component (for example, a touch screen, a wireless communication unit, or a memory).

Hereinafter, with reference to FIGS. 3 to 15, a process of searching, while the execution screen of a first function among functions being executed simultaneously according to the present invention is displayed, for functions capable of interworking with content included in the first function screen, informing the user of the found functions, and immediately executing the content through a found function will be described in detail.

First Embodiment

The first embodiment of the present invention relates to a process of, when specific content in a first function screen is selected while the first function screen is displayed among functions being executed simultaneously, informing the user of the functions capable of interworking with the selected content among the simultaneously executed functions, and immediately executing the selected content through one of the interoperable functions.

Hereinafter, a first embodiment of the present invention will be described in detail with reference to FIGS. 3 to 10.

FIG. 3 is a flowchart of a first embodiment showing a process of interworking functions being executed simultaneously with content in a current function screen according to the present invention.

FIGS. 4 to 10 are diagrams illustrating a first embodiment of a process of interworking functions currently being executed simultaneously with content in a current function screen according to the present invention.

First, referring to FIG. 3, the controller 180 of the portable terminal 100 simultaneously executes two or more functions selected by a user, and displays the execution screen of a specific first function, selected by the user among the concurrently executed functions, on the display unit 151 [S31]. In this case, the concurrently executed functions include all functions that can be executed in the portable terminal, for example, an application that provides a specific function, a widget, a menu function, a data file, and the like.

The controller 180 detects whether at least one content item is selected from among the contents included in the execution screen of the first function [S32]. In this case, the contents refer to objects included in or attached to the execution screen of the first function, for example, text, a file, an image, an icon to which a specific function is assigned, a specific webpage address, a hyperlink to which a specific webpage is linked, and the like.

When the at least one content item is selected [S33], the controller 180 identifies one or more functions that can interwork with the selected content among the remaining concurrently executed functions other than the first function [S34].

In this case, the functions capable of interworking with the selected content may be functions capable of executing the selected content, functions associated with the selected content, or functions capable of searching for information related to the selected content through the memory 160 or the web.

For example, assuming that the selected content is the text "LG Optimus Black" and the functions currently being executed simultaneously are a web browser function and a YouTube function that provides user-created content (UCC) videos, both the web browser and YouTube functions can search for information related to "LG Optimus Black", so the selected "LG Optimus Black" text can interwork with the web browser and YouTube functions.

As another example, assuming that the selected content is an "LG Optimus Black.MP3" sound file and the functions currently being executed simultaneously are a sound play function and a video play function, both functions can play the "LG Optimus Black.MP3" music file, so the selected "LG Optimus Black.MP3" sound file can interwork with the sound play function and the video play function.
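As a rough illustration only (not part of the original disclosure), the matching in the two examples above can be modeled as a lookup from a content item's type to the concurrently running functions able to handle it. The function names, type rules, and capability table below are hypothetical assumptions:

```python
# Hypothetical sketch of the content-to-function matching described above.
# Function names and capability rules are illustrative assumptions, not
# part of the original disclosure.

def content_type(content: str) -> str:
    """Classify a selected content item by a simple suffix rule."""
    lowered = content.lower()
    if lowered.endswith((".mp3", ".wav")):
        return "sound"
    if lowered.endswith((".mp4", ".avi")):
        return "video"
    return "text"

# Which content types each (hypothetical) running function can handle.
CAPABILITIES = {
    "web browser": {"text"},          # can search for related information
    "youtube": {"text", "video"},     # can search/play related multimedia
    "sound player": {"sound"},
    "video player": {"sound", "video"},
}

def interoperable_functions(content: str, running: list) -> list:
    """Return the concurrently running functions able to handle the content."""
    ctype = content_type(content)
    return [f for f in running if ctype in CAPABILITIES.get(f, set())]

# The two examples from the text:
print(interoperable_functions("LG Optimus Black",
                              ["web browser", "youtube"]))
# → ['web browser', 'youtube']
print(interoperable_functions("LG Optimus Black.MP3",
                              ["sound player", "video player"]))
# → ['sound player', 'video player']
```

A real terminal would consult each running application for its supported content types rather than a static table; the sketch only shows the shape of the lookup.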

Meanwhile, the controller 180 may identify functions belonging to a group preset by the user among the simultaneously executed functions, and may identify, among those functions, the functions which can interwork with the selected content. This process will be described later in detail with reference to FIG. 7.
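The group restriction just described can be sketched as a simple filter applied before the interworking check. This is a hypothetical illustration; the function names are illustrative:

```python
# Hypothetical sketch of restricting candidate functions to a user-preset
# group before interworking checks (function names are illustrative).

def candidate_functions(running, preset_group=None):
    """Return the functions to consider for interworking.

    If the user saved a group (as in FIG. 7), only group members among the
    currently running functions are considered; otherwise all of them are.
    """
    if preset_group is None:
        return list(running)
    return [f for f in running if f in preset_group]

running = ["web", "document viewer", "music playback", "youtube"]
# With a saved group, only its running members remain candidates:
print(candidate_functions(running, preset_group={"web", "youtube"}))
# → ['web', 'youtube']
```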

As described above, when one or more functions capable of interworking with the selected content are identified among the functions currently being executed simultaneously, the controller 180 generates indicator information representing each of the identified functions, and displays the generated indicator information in the execution screen of the first function [S35].

In this case, the indicator information may take the form of a thumbnail image, text, an icon, or the like representing each of the identified functions. For example, if the identified functions are applications, the indicator may be the icon of the corresponding application.

The controller 180 may display the indicator information aligned with the top or bottom of the first function execution screen.

In addition, the controller 180 may display the indicator information transparently or overlap the execution screen of the first function.

In addition, the controller 180 may display the indicator information at or around the selected content.

In addition, the controller 180 may configure and display the indicator information in a list form on the execution screen of the selected first function.

In addition, when a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch, and display the indicator information in the area between the initial position of the first function execution screen and the shifted position. This process will be described later in detail with reference to FIG. 5.

The controller 180 detects whether specific indicator information is selected among the displayed indicator information [S36], and if the specific indicator information is selected [S37], executes the selected content through the function corresponding to the selected indicator information [S38].

The controller 180 displays an execution screen of the content [S39].

In this case, when specific indicator information among the indicator information is directly touched by the user, the controller 180 may execute the selected content through a function corresponding to the touched indicator information.

In addition, when the selected content is dragged and dropped to the position where specific indicator information is displayed, or when the specific indicator information is dragged and dropped to the position where the selected content is displayed, the controller 180 may execute the selected content through the function corresponding to the specific indicator information.

In addition, when the content and the specific indicator information are simultaneously multi-touched, the controller 180 can execute the selected content through the function corresponding to the specific indicator information.
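The three selection gestures above (direct touch on an indicator, drag-and-drop in either direction, and simultaneous multi-touch) all resolve to the same result: execute the content via the indicated function. A hypothetical sketch of that dispatch, with illustrative event shapes rather than a real touch API:

```python
# Hypothetical sketch of resolving the selection gestures described above.
# Event dictionary shapes are illustrative assumptions, not a real API.

def resolve_gesture(event: dict):
    """Return the (function, content) pair to execute, or None."""
    kind = event["kind"]
    if kind == "touch_indicator":
        # Direct touch on indicator info while content is already selected.
        return event["indicator"], event["selected_content"]
    if kind in ("drag_drop", "multi_touch"):
        # Drag-and-drop in either direction, or simultaneous multi-touch,
        # both pair the content with the indicator's function.
        return event["indicator"], event["content"]
    return None

pair = resolve_gesture({"kind": "multi_touch",
                        "indicator": "web",
                        "content": "LG Optimus Black"})
print(pair)  # → ('web', 'LG Optimus Black')
```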

Next, when the execution screen of the content is displayed, the controller 180 may switch the execution screen of the first function to the execution screen of the content and display the same.

When the execution screen of the content is displayed, the controller 180 may further display the execution screen of the content in a new window form on the execution screen of the first function.

In addition, when displaying the execution screen of the content, the controller 180 may divide the screen of the display unit 151 into first and second regions, and display the execution screen of the first function and the execution screen of the content in the first and second regions, respectively.

In addition, when the execution screen of the content is displayed, the controller 180 may reduce and display the execution screen of the content in a thumbnail form at a display position of the content in the execution screen of the first function.

In addition, when a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch, and display the execution screen of the content in the area between the initial position of the first function execution screen and the shifted position. This process will be described later in detail with reference to FIG. 10.
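The alternatives above amount to a small set of presentation modes for the content's execution screen. As a hypothetical sketch (mode names and screen strings are illustrative, not from the original disclosure):

```python
# Hypothetical sketch of the presentation modes listed above for the
# content's execution screen. Mode names and string tags are illustrative.

def present(mode: str, first_screen: str, content_screen: str) -> list:
    """Return the list of screens shown for a given presentation mode."""
    if mode == "switch":        # replace the first function's screen
        return [content_screen]
    if mode == "new_window":    # overlay the content screen as a new window
        return [first_screen, "window:" + content_screen]
    if mode == "split":         # divide the display into two regions
        return ["region1:" + first_screen, "region2:" + content_screen]
    if mode == "thumbnail":     # shrink at the content's display position
        return [first_screen, "thumb:" + content_screen]
    if mode == "shift":         # shift first screen, reveal in vacated area
        return ["shifted:" + first_screen, "revealed:" + content_screen]
    raise ValueError("unknown mode: " + mode)

print(present("split", "document viewer", "search result"))
# → ['region1:document viewer', 'region2:search result']
```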

Next, FIG. 4 illustrates the processes of S31 to S39 of FIG. 3 as an example.

For example, (a) of FIG. 4 shows that the functions currently being executed simultaneously are web, document viewer, music playback, and YouTube functions, and that the function displayed on the current screen is a document viewer function including two or more sentences 311 and 312.

If all or a portion of the first sentence 311 is designated on the document viewer screen 310, the controller 180 identifies, among the web, music playback, and YouTube functions currently being executed simultaneously, the functions interoperable with the designated first sentence 311, and displays indicator information 320 and 330 indicating the identified functions, as shown in FIG. 4.

For example, FIG. 4 illustrates, as functions that can interwork with the first sentence 311, a web function and a YouTube function that can search for information or multimedia that is the same as or related to the first sentence 311.

When the indicator information 320 corresponding to the web function is selected from among the indicator information, the controller 180 retrieves information 311A associated with the first sentence 311 through the web function and displays the retrieved information 311A, as shown in FIG. 4.

Next, after generating the indicator information 320 and 330 through the above-described process, when a drag or flicking touch having a specific direction is input on the document viewer screen 310 as shown in FIG. 5A, the controller 180 shifts the document viewer screen 310 in the direction corresponding to the drag or flicking touch, as illustrated in FIG. 5B, and may display the indicator information 320 and 330 in the area between the initial position of the document viewer screen 310 and the shifted position.

That is, when the indicator information is always displayed on the document viewer screen 310, the indicator information may interfere with the user's viewing of the document.

Accordingly, the indicator information 320 and 330 is normally hidden, and the screen of the display unit 151 can be used efficiently by displaying the indicator information 320 and 330 only when the document viewer screen 310 is dragged.

Next, FIG. 6 illustrates the processes of S31 to S39 of FIG. 3 as an example.

For example, as illustrated in FIGS. 6A and 6B, when all or part of the first sentence 311 is designated on the document viewer screen 310, the controller 180 displays, among the web, music playback, and YouTube functions currently being executed simultaneously, indicator information 320 and 330 indicating the web and YouTube functions that can interwork with the designated first sentence 311.

When the indicator information 330 corresponding to the YouTube function is selected among the indicator information, the controller 180 searches for multimedia associated with the first sentence 311 through the YouTube function and displays a list of the retrieved multimedia, as shown in FIG. 6C.

Next, as shown in (a) of FIG. 7, when the controller 180 receives from the user a command for setting a group of two or more functions among the concurrently executed functions, the controller 180 identifies the functions currently being executed simultaneously and displays a list 360 of the simultaneously executed functions, as shown in FIG. 7.

As shown in FIG. 7B, when one or more functions 320 and 330 are selected in the list 360, the controller 180 groups the selected functions 320 and 330 and stores the group in the memory 160.

As illustrated in (c) of FIG. 7, when specific content 311 is selected in the document viewer screen 310, the controller 180 identifies the functions belonging to the group set by the user among the functions currently being executed simultaneously, identifies among them the functions that can interwork with the selected content 311, and generates and displays indicator information 320 and 330 indicating the identified functions, as shown in (d) of FIG. 7.

Next, FIG. 8 illustrates the processes of S35 to S39 of FIG. 3 as an example.

For example, as illustrated in FIG. 8A, the controller 180 displays indicator information 320 and 330 indicating the functions that can interwork with the selected content 311. When the content 311 and the first indicator information 320, corresponding to the function through which the content 311 is to be executed, are simultaneously multi-touched, the controller 180 may execute the content 311 through the function (web) corresponding to the first indicator information 320, as shown in FIG. 8C.

For example, in FIG. 8, the content 311 is the text "LG Optimus Black" in the document viewer screen 310, and the function corresponding to the first indicator information 320 multi-touched together with the "LG Optimus Black" text is the web. Accordingly, the controller 180 searches for information related to the text "LG Optimus Black" through the web and displays the search result.

In addition, as shown in (b) of FIG. 8, while the controller 180 displays the indicator information 320 and 330 indicating the functions capable of interworking with the selected content 311, when the content 311 is dragged and dropped to the display position of the first indicator information 320, or when the first indicator information 320 is dragged and dropped to the display position of the content, the controller 180 may execute the content 311 through the function (web) corresponding to the first indicator information 320, as shown in (c) of FIG. 8.

Next, FIG. 9 illustrates the process S39 of FIG. 3 as an example.

For example, as illustrated in FIG. 9A, when the text "LG Optimus Black" 311 is selected in the document viewer screen 310 and the first indicator information 320, indicating a web function interoperable with the selected text, is selected, the controller 180 retrieves information associated with the text "LG Optimus Black" 311 through the web function, and may further display the screen of the retrieved information 311A in a new window form on the document viewer screen 310, as shown in FIG. 9B.

In addition, as illustrated in FIG. 9C, the controller 180 may divide the screen of the display unit 151 into first and second areas, and display the document viewer screen 310 and the screen of the retrieved information 311A in the first and second areas, respectively.

In addition, as illustrated in FIG. 9D, the controller 180 may reduce the retrieved information 311A to a thumbnail form and display it at or around the display position of the text "LG Optimus Black" 311. In this case, when the information 311A displayed in the thumbnail form is selected, the information 311A may be enlarged and displayed on the screen.

In addition, as illustrated in FIG. 10A, when the text "LG Optimus Black" 311 is selected in the document viewer screen 310 and the first indicator information 320, indicating the web function interoperable with the "LG Optimus Black" text 311, is selected, the controller 180 searches for information associated with the text "LG Optimus Black" 311 through the web function.

As illustrated in FIG. 10B, when a drag or flicking touch having a specific direction is input on the document viewer screen 310, the controller 180 may shift the document viewer screen 310 in the direction corresponding to the drag or flicking touch, and display the retrieved information 311A in the area between the initial position of the document viewer screen 310 and the shifted position.

Second Embodiment

The second embodiment of the present invention relates to a process of, while the execution screen of a first function among functions being executed simultaneously is displayed and before specific content in the execution screen is selected, informing the user of the contents in the execution screen that can interwork with the simultaneously executed functions and of the corresponding functions, and immediately executing selected content through one of the interoperable functions.

Hereinafter, a second embodiment of the present invention will be described in detail with reference to FIGS. 11 to 13.

FIG. 11 is a flowchart of a second embodiment showing a process of interworking functions being executed simultaneously with content in a current function screen according to the present invention.

FIGS. 12 and 13 are diagrams illustrating a second embodiment of a process of interworking functions being executed simultaneously with content in a current function screen according to the present invention.

First, referring to FIG. 11, while simultaneously executing two or more functions selected by a user, the controller 180 of the portable terminal 100 displays the execution screen of a specific first function, selected by the user among the concurrently executed functions, on the display unit 151 [S41].

The controller 180 identifies whether there are contents which can interwork with the functions currently being executed simultaneously among the contents included in the execution screen of the first function [S42].

The controller 180 displays the identified contents so as to be distinguishable in the execution screen of the first function [S43].

In this case, the controller 180 makes the display style of the identified contents different from the display style of the rest of the execution screen of the first function, so that the identified contents are visually distinguishable when the user views the execution screen of the first function.

For example, the controller 180 may display a highlight on the identified contents, apply a blinking effect so that they are displayed flickering, or display the identified contents in a color different from that of the other contents or of the execution screen of the first function. In addition, the controller 180 may display the identified contents in 3D (three-dimensional) form and the other contents in 2D form, underline only the identified contents, or enlarge and display the identified contents.

Subsequently, the controller 180 generates indicator information indicating functions capable of interworking with the identified contents, and displays the generated indicator information [S44].

In this case, if, among the identified contents and the indicator information, first content can interwork with first indicator information and second content can interwork with second indicator information, the controller 180 applies the same display style to the first content and the first indicator information so that they are distinguishable from the other contents and other indicator information in the execution screen of the first function, and likewise applies another common display style to the second content and the second indicator information so that they too are distinguishable from the other contents and other indicator information in the execution screen of the first function.

The reason for applying the same display style to content and its indicator information in this way is to enable the user, by looking at the display styles of the contents and the indicator information in the execution screen of the first function, to easily grasp which content and which indicator information, given the same display style, can interwork with each other.

For example, the controller 180 may display the first content and the first indicator information with a "red" background color, and display the second content and the second indicator information with a "blue" background color, so that the pairs can be distinguished from each other.

Meanwhile, as described with reference to FIGS. 3 to 10, the controller 180 may display the indicator information aligned with the top or bottom of the first function execution screen, or display it transparently or overlapping the execution screen of the first function. In addition, the controller 180 may display the indicator information at or around the corresponding contents, or configure and display the indicator information in a list form on the execution screen of the first function. In addition, when a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch, and display the indicator information in the area between the initial position of the first function execution screen and the shifted position.

The controller 180 detects whether specific indicator information is selected among the indicator information [S45], and when the specific indicator information is selected [S46], executes the corresponding content through the function corresponding to the selected indicator information [S47].

The controller 180 displays an execution screen of the content [S48].

As described with reference to FIGS. 3 to 10, when specific indicator information among the indicator information is directly touched by the user, the controller 180 can execute the corresponding content through the function corresponding to the touched indicator information.

In addition, as described with reference to FIGS. 3 to 10, when specific content is dragged and dropped to the position where specific indicator information is displayed, or when the specific indicator information is dragged and dropped to the position where the specific content is displayed, the controller 180 may execute the content through the function corresponding to the specific indicator information.

As described with reference to FIGS. 3 to 10, when the content and the specific indicator information are simultaneously multi-touched, the controller 180 can execute the content through the function corresponding to the specific indicator information.

As described with reference to FIGS. 3 to 10, when the execution screen of the content is displayed, the controller 180 may switch the execution screen of the first function to the execution screen of the content.

As described with reference to FIGS. 3 to 10, when the execution screen of the content is displayed, the controller 180 may further display the execution screen of the content in a new window form on the execution screen of the first function.

As described with reference to FIGS. 3 to 10, when displaying the execution screen of the content, the controller 180 may divide the screen of the display unit 151 into first and second regions, and display the execution screen of the first function and the execution screen of the content in the first and second regions, respectively.

As described with reference to FIGS. 3 to 10, when the execution screen of the content is displayed, the controller 180 may reduce and display the execution screen of the content in a thumbnail form at the display position of the content in the execution screen of the first function.

As described with reference to FIGS. 3 to 10, when a drag or flicking touch having a specific direction is input on the execution screen of the first function, the controller 180 may shift the execution screen of the first function in the direction corresponding to the drag or flicking touch, and display the execution screen of the content in the area between the initial position of the first function execution screen and the shifted position.

Next, FIG. 12 illustrates the processes of S41 to S48 of FIG. 11 as an example.

For example, (a) of FIG. 12 shows that the functions currently being executed simultaneously are web, document viewer, music playback, and YouTube functions, and that the function displayed on the current screen is a document viewer function including one or more sentences 311.

If the first sentence 311 in the document viewer screen 310 can interwork with one or more of the functions currently being executed simultaneously, the controller 180 displays the first sentence 311 so as to be distinguishable in the document viewer screen 310, as shown in (b) of FIG. 12.

For example, in FIG. 12B, a highlight is displayed on the first sentence 311 so that it is distinguishable in the document viewer screen 310.

In addition, the controller 180 displays indicator information 320 and 330 indicating the functions capable of interworking with the first sentence 311 among the functions currently being executed simultaneously.

For example, FIG. 12 illustrates the web function and the YouTube function, which search for information or multimedia identical or related to the first sentence 311, as the functions that can be linked with the first sentence 311.

When the indicator information 320 corresponding to the web function is selected from among the indicator information, the controller 180 retrieves information 311A associated with the first sentence 311 through the web function and displays the retrieved information 311A, as shown in FIG. 12.

Next, FIG. 13(a) illustrates a document viewer screen 310 including two sentences, i.e., first and second sentences 311 and 312.

In this case, the controller 180 identifies the functions that can be linked with each of the first and second sentences 311 and 312 among the functions being executed simultaneously.

For example, FIG. 13 shows that the functions capable of interworking with the first sentence 311 are the web and YouTube functions, and that the function capable of interworking with the second sentence 312 is the music playback function.
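The identification step described above can be sketched as a simple capability match. In the following model, the running-function list mirrors the FIG. 13 example, but the capability table and the content-type labels are invented purely for illustration; the patent does not specify how interworking capability is determined:

```python
# Hypothetical model of identifying, for each content item, which of the
# concurrently running functions can interwork with it.
RUNNING_FUNCTIONS = ["web", "document viewer", "music playback", "youtube"]

# Assumed mapping: which kinds of content each function can execute/search.
CAPABILITIES = {
    "web": {"text"},
    "youtube": {"text"},           # e.g., searches videos related to a sentence
    "music playback": {"song title"},
}

def interworkable_functions(content_type, current_function):
    """Return the running functions, other than the one currently displayed,
    that can interwork with content of the given type."""
    return [f for f in RUNNING_FUNCTIONS
            if f != current_function
            and content_type in CAPABILITIES.get(f, set())]

print(interworkable_functions("text", "document viewer"))        # ['web', 'youtube']
print(interworkable_functions("song title", "document viewer"))  # ['music playback']
```

With a table like this, the first sentence maps to the web and YouTube functions and the second sentence to the music playback function, matching the FIG. 13 example.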

In this case, as shown in FIGS. 13(b) to 13(d), the controller 180 displays indicator information 320 and 330 indicating the functions capable of interworking with the first sentence 311, and gives the same display style to the first sentence 311 and the corresponding indicator information 320 and 330 so that they are identified together in the document viewer screen 310.

In addition, as shown in FIGS. 13(b) to 13(d), the controller 180 displays indicator information 330 indicating the function capable of interworking with the second sentence 312 on the document viewer screen 310, and gives the same display style to the second sentence 312 and the indicator information 330 so that they are identified together.

For example, as illustrated in FIG. 13(b), the controller 180 displays a highlight of a first color on the first sentence 311 and the corresponding indicator information 320 and 330, and displays a highlight of a second color, different from the first color, on the second sentence 312 and the indicator information 330.

In addition, as shown in FIG. 13(c), the controller 180 displays the first sentence 311 and the corresponding indicator information 320 and 330 in 3D form, and displays the second sentence 312 and the indicator information 330 in 2D form.

In addition, as illustrated in FIG. 13(d), the controller 180 displays the first sentence 311 and the indicator information 320 and 330 in bold, and displays the second sentence 312 and the corresponding indicator information 330 with an underline.
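The pairing idea in FIG. 13 — each content item shares one display style with its interworkable-function indicators — can be sketched as follows. The style list and identifier strings are illustrative only, not part of the disclosure:

```python
# Hypothetical sketch: assign one shared style per (content, indicators) pair
# so the user can visually match a sentence to the functions that can
# interwork with it.
STYLES = ["highlight:yellow", "highlight:blue", "bold", "underline"]

def assign_styles(pairs):
    """pairs: list of (content_id, [indicator_ids]); returns id -> style."""
    styled = {}
    for style, (content, indicators) in zip(STYLES, pairs):
        styled[content] = style          # style the content item itself
        for ind in indicators:
            styled[ind] = style          # and its matching indicators
    return styled

styles = assign_styles([("sentence 311", ["web", "youtube"]),
                        ("sentence 312", ["music playback"])])
print(styles)
```

Here sentence 311 and its web/YouTube indicators receive one style, while sentence 312 and the music playback indicator receive another, as in FIGS. 13(b) to 13(d).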

Third Embodiment

The third embodiment of the present invention relates to a process in which, when the execution screen of a first function among the functions being executed simultaneously is displayed, indicator information indicating the functions currently being executed simultaneously is displayed; when specific content in the execution screen of the first function is selected, the indicator information corresponding to the functions capable of interworking with the selected content is displayed so as to be distinguished from the other indicator information, thereby informing the user, through the distinguished indicator information, of the functions capable of interworking with the selected content.

Hereinafter, a third embodiment of the present invention will be described in detail with reference to FIGS. 14 and 15.

FIG. 14 is a flowchart of the third embodiment, illustrating a process of interworking the functions currently being executed with content in the current function screen according to the present invention.

FIG. 15 is an explanatory diagram of the third embodiment, showing a process of interworking the functions being executed simultaneously with content in the current function screen according to the present invention.

First, referring to FIG. 14, while the controller 180 of the portable terminal 100 simultaneously executes two or more functions selected by the user, the execution screen of a specific first function selected by the user from among the concurrently executed functions is displayed on the display unit 151 [S51].

The controller 180 displays indicator information indicating the respective functions currently being executed simultaneously in the execution screen of the first function [S52], and detects whether at least one content is selected from among the contents included in the execution screen of the first function [S53].

When the at least one content is selected [S54], the controller 180 identifies one or more functions capable of interworking with the selected content among the remaining concurrently executed functions, excluding the first function [S55].

If one or more functions capable of interworking with the selected content are identified among the functions currently being executed simultaneously, the controller 180 displays the indicator information corresponding to the identified functions so as to be distinguished from the other indicator information [S56].

Preferably, the controller 180 makes the display style of the indicator information corresponding to the identified functions different from the display style of the other indicator information.

For example, the controller 180 may distinguish the indicator information corresponding to the identified functions from the other indicator information by blinking it or by displaying a highlight of a specific color on it.

In addition, the controller 180 may display the indicator information corresponding to the identified functions in 3D form and the other indicator information in 2D form.

In addition, the controller 180 may enlarge the indicator information corresponding to the identified functions so that it is distinguished from the other indicator information.

In addition, the controller 180 generates guide information guiding the user to select the indicator information corresponding to the identified functions and displays the generated guide information, thereby inducing the user to select the indicator information corresponding to the functions capable of interworking with the currently selected content. For example, the guide information may be an arrow pointing from the display position of the selected content to the corresponding indicator information.

The controller 180 detects whether specific indicator information is selected from among the distinguished indicator information [S57]; if so [S58], the controller 180 executes the selected content through the function corresponding to the selected indicator information [S59] and displays an execution screen of the content [S60].
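Steps S51 to S60 above can be walked through in a compact sketch. The class, function names, and the capability mapping below are all hypothetical stand-ins chosen for illustration; only the step structure follows the flowchart:

```python
# Hypothetical walk-through of S51–S60: show a function screen with
# indicators for the other running functions, distinguish the indicators
# that can interwork with a selected content, then execute the content
# through a chosen indicator.
class Controller:
    def __init__(self, running, capabilities):
        self.running = running            # concurrently executed functions
        self.capabilities = capabilities  # function name -> content handler

    def show_function(self, name):        # S51–S52: display screen + indicators
        self.current = name
        self.indicators = [f for f in self.running if f != name]
        return self.indicators

    def select_content(self, content):    # S53–S56: distinguish capable indicators
        self.selected = content
        self.highlighted = [f for f in self.indicators
                            if f in self.capabilities]
        return self.highlighted

    def choose_indicator(self, name):     # S57–S60: execute via chosen function
        if name not in self.highlighted:
            raise ValueError("indicator cannot execute the selected content")
        return self.capabilities[name](self.selected)

ctrl = Controller(
    running=["document viewer", "web", "youtube", "music playback"],
    capabilities={"web": lambda c: f"web results for {c!r}",
                  "youtube": lambda c: f"videos about {c!r}"},
)
ctrl.show_function("document viewer")
print(ctrl.select_content("first sentence"))   # ['web', 'youtube']
print(ctrl.choose_indicator("web"))
```

In this model, selecting a sentence highlights only the web and YouTube indicators, and choosing the web indicator executes the sentence through that function, mirroring the FIG. 15 example.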

FIG. 15 illustrates the processes of S51 to S60 of FIG. 14 as an example.

For example, FIG. 15(a) shows that the functions currently being executed simultaneously are the web, document viewer, music playback, and YouTube functions, and that the function displayed on the current screen is a document viewer function including two or more sentences 311 and 312.

When the document viewer screen 310 is displayed, the controller 180 displays, in the document viewer screen 310, indicator information 320, 330, and 340 indicating the functions currently being executed simultaneously.

As illustrated in FIG. 15(b), after displaying the indicator information 320, 330, and 340, if all or part of the first sentence 311 in the document viewer screen 310 is designated, the controller 180 displays the indicator information corresponding to the functions capable of interworking with the selected first sentence 311, among the indicator information 320, 330, and 340, so as to be distinguished from the other indicator information 340, as shown in FIG. 15(c).

That is, the user can identify the functions that can be linked with the selected content by viewing the distinguished indicator information 320 and 330.

If the indicator information 320 corresponding to the web function is selected from among the distinguished indicator information 320 and 330, the controller 180 searches, through the web function corresponding to the selected indicator information 320, for information 311A associated with the selected first sentence 311 and displays the retrieved information 311A, as shown in FIG. 15.

It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include media implemented in the form of carrier waves (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal.

Accordingly, the above detailed description should not be construed as limiting in all respects but should be considered illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

The above-described mobile terminal and its control method are not limited to the configurations and methods of the embodiments described above; rather, the embodiments may be modified by selectively combining all or some of them.

100: mobile terminal 110: wireless communication unit
111: broadcast receiver 112: mobile communication module
113: wireless Internet module 114: short-range communication module
115: position information module 120: A/V input unit
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output unit
151: display unit 152: audio output module
153: alarm module 154: haptic module
155: projector module 160: memory
170: interface unit 180: controller
181: multimedia module 190: power supply unit

Claims (16)

  1. A mobile terminal supporting multi-tasking, comprising:
    a touch screen configured to display a screen of a specific function among two or more simultaneously executed functions; and
    a controller configured to, when at least one content is selected from among contents included in the specific function screen, display information indicating at least one function capable of executing the selected content among the simultaneously executed functions, and, when the information is selected, execute the selected content through the function corresponding to the information.
  2. The mobile terminal according to claim 1,
    wherein the functions are applications installed in the mobile terminal.
  3. The mobile terminal according to claim 1,
    wherein the information includes at least one of a thumbnail image, an icon, and text indicating the corresponding function.
  4. The mobile terminal according to claim 1,
    wherein the controller displays the information around the selected content or transparently on the specific function screen.
  5. The mobile terminal according to claim 1,
    wherein the controller executes the content through the function corresponding to the information when the content is moved to the location where the information is displayed or when the information is moved to the location where the content is displayed.
  6. The mobile terminal according to claim 1,
    wherein the controller identifies functions belonging to a group preset by the user among the simultaneously executed functions, and displays information indicating at least one function capable of executing the selected content among the identified functions.
  7. The mobile terminal according to claim 1,
    wherein, when the information is selected, the controller executes the selected content through the function corresponding to the information and converts the specific function screen into an execution screen of the content.
  8. The mobile terminal according to claim 1,
    wherein, when the information is selected, the controller executes the selected content through the function corresponding to the information and further displays an execution screen of the content on the specific function screen.
  9. The mobile terminal according to claim 1,
    wherein, when the information is selected, the controller executes the selected content through the function corresponding to the information and displays an execution screen of the content in thumbnail form at the display position of the selected content in the specific function screen.
  10. The mobile terminal according to claim 1,
    wherein, when the information is selected, the controller executes the selected content through the function corresponding to the information, divides the specific function screen into first and second areas, and displays the specific function screen and an execution screen of the content in the first and second areas, respectively.
  11. The mobile terminal according to claim 1,
    wherein, before the content is selected, the controller identifies one or more contents executable through the one or more simultaneously executed functions from among the contents in the specific function screen and displays the identified one or more contents so as to be distinguished from the other contents in the specific function screen, and
    wherein the selected content is the content displayed so as to be distinguished.
  12. The mobile terminal according to claim 11,
    wherein the controller displays the display style of the identified contents differently from that of the other contents in the specific function screen.
  13. The mobile terminal according to claim 1,
    wherein the controller displays the selected content and the information in the same style so that they are identified together in the specific function screen.
  14. The mobile terminal according to claim 1,
    wherein the controller displays information indicating the simultaneously executed functions on the specific function screen, and, when the content is selected, displays the information corresponding to at least one function capable of executing the content so as to be distinguished from the other information.
  15. The mobile terminal according to claim 14,
    wherein the controller displays guide information guiding the user to select the information of a function capable of executing the selected content.
  16. A control method of a mobile terminal supporting multi-tasking, the method comprising:
    executing two or more functions simultaneously;
    displaying a screen of a specific function among the simultaneously executed functions;
    searching, when specific content is selected from among contents included in the specific function screen, for at least one function capable of executing the selected content among the simultaneously executed functions;
    displaying information indicating the searched at least one function; and
    executing, when the information is selected, the selected content through the function corresponding to the information.
KR1020110118049A 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same KR101871711B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110118049A KR101871711B1 (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020110118049A KR101871711B1 (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same
EP12007562.7A EP2592548B1 (en) 2011-11-14 2012-11-07 Mobile terminal and controlling method thereof
US13/676,969 US9769299B2 (en) 2011-11-14 2012-11-14 Mobile terminal capable of recognizing at least one application inter-workable with another executed application and controlling method thereof

Publications (2)

Publication Number Publication Date
KR20130052801A true KR20130052801A (en) 2013-05-23
KR101871711B1 KR101871711B1 (en) 2018-06-27

Family

ID=48662174

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110118049A KR101871711B1 (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR101871711B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10310866B2 (en) 2015-08-12 2019-06-04 Samsung Electronics Co., Ltd. Device and method for executing application

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100688046B1 (en) * 2005-09-01 2007-03-02 (주) 엘지텔레콤 Display device for portable equipment and method thereof
KR20070031836A (en) * 2004-07-21 2007-03-20 소니 가부시끼 가이샤 Content processing device, content processing method, and computer program
KR20100071681A (en) * 2008-12-19 2010-06-29 엘지전자 주식회사 Mobile terminal and operation method thereof
US20100313156A1 (en) * 2009-06-08 2010-12-09 John Louch User interface for multiple display regions
US20110167339A1 (en) * 2010-01-06 2011-07-07 Lemay Stephen O Device, Method, and Graphical User Interface for Attachment Viewing and Editing



Also Published As

Publication number Publication date
KR101871711B1 (en) 2018-06-27


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant