KR20100034608A - Terminal and method for controlling the same - Google Patents

Terminal and method for controlling the same

Info

Publication number: KR20100034608A
Authority: KR (South Korea)
Prior art keywords: menu, multitasking, method, displayed, displaying
Application number: KR1020080093835A
Other languages: Korean (ko)
Other versions: KR101451667B1 (en)
Inventor: 김문주, 박준석, 유혜진, 이동석
Original Assignee: 엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 엘지전자 주식회사
Priority to KR1020080093835A
Publication of KR20100034608A
Application granted
Publication of KR101451667B1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers; analogous equipment at exchanges
    • H04M1/72: Substation extension arrangements; cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725: Cordless telephones
    • H04M1/72519: Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522: With means for supporting locally a plurality of applications to increase the functionality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers; analogous equipment at exchanges
    • H04M1/72: Substation extension arrangements; cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725: Cordless telephones
    • H04M1/72519: Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72583: Operating the terminal by selecting telephonic functions from a plurality of displayed items, e.g. menus, icons
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M2250/00: Details of telephonic subscriber devices
    • H04M2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Abstract

PURPOSE: A terminal and a method for controlling the same are provided to allow additional multitasking of a function of the same kind as one currently being executed. CONSTITUTION: A user input unit (130) receives an instruction or command for entering a multitasking mode. A display unit (151) displays a menu list for multitasking. A control unit (180) controls the display of whether each menu item can be multitasked upon entry into the multitasking mode. When a first menu is already executed at the time of multitasking mode entry, only menu items that can be multitasked with the first menu are displayed in the menu list.

Description

Terminal and its control method {TERMINAL AND METHOD FOR CONTROLLING THE SAME}

The present invention relates to a terminal and a control method thereof for performing multitasking in consideration of user convenience.

Terminals may be divided into mobile (portable) terminals and stationary terminals depending on whether they can be moved. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can carry them directly.

As their functions diversify, terminals are being implemented as multimedia players with composite functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.

To support and expand such terminal functions, improvements to the structural and/or software parts of the terminal may be considered.

Recently, more and more users want to use two or more functions of a terminal at the same time. For example, a user may want to read a document or send a text message while listening to music. Accordingly, the number of terminals equipped with a multitasking function has increased, and a method that allows users to use multitasking more conveniently is required.

The present invention provides a terminal capable of displaying multitasking capability for each function and a control method thereof.

Another object of the present invention is to provide a terminal capable of displaying multitasking capability for each function in an activated or deactivated state and a control method thereof.

Another object of the present invention is to provide a terminal and a control method thereof capable of indicating whether multitasking is possible for each function by the transparency or clarity of its menu item.

Another object of the present invention is to provide a terminal and a control method thereof capable of indicating whether multitasking is possible for each function by the size of its icon.

Another object of the present invention is to provide a terminal capable of performing multitasking even for functions of the same kind, and a control method thereof.

Another object of the present invention is to provide a terminal capable of displaying functions currently being multitasked and a control method thereof.

Another object of the present invention is to provide a terminal and a control method thereof capable of selectively activating or deactivating one of several functions of the same kind being multitasked.

A terminal according to an embodiment of the present invention for achieving the above objects includes a user input unit for receiving an instruction or command to enter a multitasking mode, a display unit for displaying a menu list for multitasking, and a controller configured to control the display, in the menu list, of whether each menu item can be multitasked upon entry into the multitasking mode.

Likewise, a method according to the present invention for achieving the above objects includes entering a multitasking mode, displaying a menu list for multitasking upon the mode entry, and displaying in the menu list whether each menu item can be multitasked.

The terminal according to at least one embodiment of the present invention configured as described above can additionally multitask a function of the same type as one that is already executed, thereby improving user convenience.

In addition, since whether a function supports multitasking can be displayed before the function is executed, user convenience is further improved.

Hereinafter, a mobile terminal according to the present invention will be described in more detail with reference to the accompanying drawings. The suffixes "module" and "unit" for components used in the following description are given or used merely for ease of drafting the specification and do not carry distinct meanings or roles by themselves.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where applicable only to mobile terminals.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a pre-generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to suit not only the above-described digital broadcasting systems but also other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module 113 refers to a module for wireless Internet access and may be internal or external to the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for obtaining the location of the mobile terminal, and a representative example is the Global Positioning System (GPS) module.

Referring to FIG. 1, the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frame may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. The microphone 122 may implement various noise removal algorithms for removing noise generated in the process of receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its location, the presence or absence of user contact, its orientation, and its acceleration/deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is open or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to sight, hearing, or touch, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, it displays a user interface (UI) or graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, it displays captured and/or received images, a UI, or a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be configured as transparent or light-transmitting types so that the outside can be seen through them. These may be called transparent displays, a representative example of which is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be of a light-transmitting type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

There may be two or more display units 151 depending on the implementation of the mobile terminal 100. For example, a plurality of display units may be disposed on one surface of the mobile terminal 100, spaced apart or integrated, or may be disposed on different surfaces.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a touch sensor) form a mutual layer structure (hereinafter, a touch screen), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 was touched.
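This signal path can be summarized in a minimal sketch. The types, names, and the area-mapping rule below are illustrative assumptions, not from the patent:

```kotlin
// Minimal sketch of the touch signal path: raw sensor signal -> touch
// controller -> controller 180. All names and the area rule are assumptions.
data class RawTouchSignal(val x: Int, val y: Int, val pressure: Float)
data class TouchData(val x: Int, val y: Int, val area: String)

class TouchControllerChip {
    // Processes the raw signal and derives which display area was touched.
    fun process(s: RawTouchSignal): TouchData =
        TouchData(s.x, s.y, area = if (s.y < 400) "output window" else "input window")
}

fun main() {
    val data = TouchControllerChip().process(RawTouchSignal(x = 120, y = 650, pressure = 0.8f))
    println("Controller 180 sees a touch in the ${data.area}") // -> input window
}
```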

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object in the vicinity, using electromagnetic force or infrared rays. Proximity sensors have a longer lifespan and higher utility than contact-type sensors.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer through changes in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen so that it is recognized as being located on the touch screen without contact is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position of a proximity touch on the touch screen is the position at which the pointer is perpendicular to the touch screen when the proximity touch is made.

The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, in a call or recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound or a message reception sound). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, vibration. Since the video or audio signal may also be output through the display unit 151 or the sound output module 152, the display unit 151 and the sound output module 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as a pin array moving vertically against the skin surface, a jet or suction of air through a jet or suction port, a graze across the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 may not only deliver a haptic effect through direct contact but may also be implemented so that the user can feel a haptic effect through a muscle sense of a finger or an arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.). The memory 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.

The memory 160 may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, and optical disk. The mobile terminal 100 may also operate in connection with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include wired/wireless headset ports, an external charger port, wired/wireless data ports, a memory card port, a port for connecting a device having an identification module, audio input/output (I/O) ports, video input/output (I/O) ports, an earphone port, and the like.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter, an "identification device") may be manufactured in a smart card format. Accordingly, the identification device may be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or through which various command signals input by the user at the cradle are transferred to the mobile terminal. The various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from it.

The controller 180 may perform pattern recognition processing to recognize handwriting input or drawing input on the touch screen as text or an image, respectively.

The power supply unit 190 receives external and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more functions and operations described herein. Software code may be implemented in software applications written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2A is a front perspective view of an example of a mobile terminal or portable terminal according to the present invention.

The disclosed portable terminal 100 has a bar-type terminal body. However, the present invention is not limited thereto and may be applied to various structures, such as slide, folder, swing, and swivel types, in which two or more bodies are coupled so as to be movable relative to each other.

The body includes a casing (casing, housing, cover, etc.) that forms an exterior. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are built in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be further disposed between the front case 101 and the rear case 102.

The cases may be formed by injection-molding synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, and the interface 170 may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in the area adjacent to one end of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in the area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive commands for controlling the operation of the portable terminal 100 and may include a plurality of manipulation units 131 and 132. The manipulation units 131 and 132 may be collectively referred to as a manipulating portion, and any tactile manner in which the user operates them while experiencing a tactile feeling may be employed.

The content input by the first or second manipulation units 131 and 132 may be set in various ways. For example, the first manipulation unit 131 may receive commands such as start, end, and scroll, and the second manipulation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may be additionally mounted on the rear of the terminal body, that is, on the rear case 102. The camera 121' has a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A) and may have a different resolution from the camera 121.

For example, the camera 121 preferably has a low resolution sufficient to photograph the user's face and transmit it to a counterpart during a video call, while the camera 121' preferably has a high resolution because it often photographs a general subject that is not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

A flash 123 and a mirror 124 may be further disposed adjacent to the camera 121'. The flash 123 illuminates the subject when it is photographed by the camera 121'. The mirror 124 allows the user to see his or her own face when taking a self-portrait using the camera 121'.

The sound output unit 152 'may be further disposed on the rear surface of the terminal body. The sound output unit 152 ′ may implement a stereo function together with the sound output unit 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

In addition to an antenna for calls, a broadcast signal receiving antenna 124 may be additionally disposed on the side of the terminal body. The antenna 124, which constitutes a part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be retractable from the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be embedded in the terminal body or may be directly detachable from the outside of the terminal body.

The rear case 102 may be further equipped with a touch pad 135 for sensing touch. Like the display unit 151, the touch pad 135 may also be of a light-transmitting type. In this case, if the display unit 151 is configured to output visual information on both surfaces, that visual information can also be recognized through the touch pad 135, and the information output on both surfaces may be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135 so that a touch screen is also disposed in the rear case 102.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to and behind the display unit 151, and may be the same size as or smaller than the display unit 151.

Hereinafter, the operation of the display unit 151 and the touch pad 135 will be described with reference to FIGS. 3A and 3B.

FIGS. 3A and 3B are front views of a portable terminal for explaining operating states of the portable terminal according to the present invention.

Various types of visual information can be displayed on the display unit 151. This information can be displayed in the form of letters, numbers, symbols, graphics, or icons.

To input such information, at least one of the letters, numbers, symbols, graphics, and icons may be displayed in a predetermined arrangement to implement a keypad. Such a keypad may be called a so-called "soft key" keypad.

FIG. 3A illustrates receiving a touch applied to a soft key through the front of the terminal body.

The display unit 151 may operate over its entire area or may be divided into a plurality of areas and operated. In the latter case, the plurality of areas may be configured to operate in association with each other.

For example, an output window 151a and an input window 151b are displayed on the upper and lower portions of the display unit 151, respectively. The output window 151a and the input window 151b are areas allocated for the output and input of information, respectively. Soft keys 151c on which numbers for inputting a telephone number or the like are displayed are output in the input window 151b. When a soft key 151c is touched, the number or the like corresponding to the touched soft key is displayed on the output window 151a. When the first manipulation unit 131 is operated, a call connection to the telephone number displayed on the output window 151a is attempted.

FIG. 3B illustrates receiving a touch applied to a soft key through the rear of the terminal body. While FIG. 3A shows a portrait orientation in which the terminal body is arranged vertically, FIG. 3B shows a landscape orientation in which the terminal body is arranged horizontally. The display unit 151 may be configured to convert the output screen according to the orientation of the terminal body.

FIG. 3B shows the portable terminal operating in a text input mode. The display unit 151 displays an output window 151a' and an input window 151b'. A plurality of soft keys 151c', each displaying at least one of a letter, symbol, or number, may be arranged in the input window 151b'. The soft keys 151c' may be arranged in a QWERTY layout.

When the soft keys 151c' are touched through the touch pad 135 (see FIG. 2B), the letters, numbers, or symbols corresponding to the touched soft keys are displayed on the output window 151a'. Compared with touch input through the display unit 151, touch input through the touch pad 135 has the advantage that the soft keys 151c' are not covered by the finger when touched. When the display unit 151 and the touch pad 135 are transparent, the fingers located behind the terminal body can be seen, allowing even more accurate touch input.

In addition to the input methods disclosed in the above embodiments, the display unit 151 or the touch pad 135 may be configured to receive touch input by scrolling. By scrolling the display unit 151 or the touch pad 135, the user may move an object displayed on the display unit 151, for example, a cursor or pointer located on an icon. Further, when a finger is moved across the display unit 151 or the touch pad 135, the path along which the finger moves may be visually displayed on the display unit 151. This may be useful for editing an image displayed on the display unit 151.

One function of the terminal may be executed when the display unit 151 (touch screen) and the touch pad 135 are touched together within a predetermined time range. This may occur, for example, when the user grips the terminal body with a thumb and an index finger. The function may be, for example, the activation or deactivation of the display unit 151 or the touch pad 135.

The proximity sensor 141 described with reference to FIG. 1 will be described in more detail with reference to FIG. 4.

FIG. 4 is a conceptual diagram illustrating the proximity depth of the proximity sensor.

As shown in FIG. 4, when a pointer such as a user's finger or pen approaches the touch screen, the proximity sensor 141 disposed in or near the touch screen detects this and outputs a proximity signal.

The proximity sensor 141 may be configured to output different proximity signals according to a distance between the proximity touched pointer and the touch screen (hereinafter, referred to as “proximity depth”).

FIG. 4 illustrates, for example, a cross section of a touch screen in which proximity sensors capable of sensing three proximity depths are disposed. Of course, proximity sensors that sense fewer than three, or four or more, proximity depths are also possible.

Specifically, when the pointer completely contacts the touch screen (d0), it is recognized as a contact touch. When the pointer is located within a distance d1 of the touch screen, it is recognized as a proximity touch of a first proximity depth. When the pointer is located at a distance of d1 or more and less than d2 from the touch screen, it is recognized as a proximity touch of a second proximity depth. When the pointer is located at a distance of d2 or more and less than d3, it is recognized as a proximity touch of a third proximity depth. When the pointer is located at a distance of d3 or more from the touch screen, the proximity touch is recognized as released.
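As a minimal sketch of this classification, the following Kotlin snippet maps a pointer distance to a proximity depth. The threshold constants and all names are illustrative assumptions, not values from the patent:

```kotlin
// Sketch of proximity-depth classification; D1 < D2 < D3 are assumed values.
enum class ProximityState { CONTACT_TOUCH, DEPTH_1, DEPTH_2, DEPTH_3, RELEASED }

const val D1 = 10.0 // mm, illustrative
const val D2 = 20.0
const val D3 = 30.0

fun classifyProximity(distanceMm: Double): ProximityState = when {
    distanceMm <= 0.0 -> ProximityState.CONTACT_TOUCH // d0: pointer contacts the screen
    distanceMm < D1   -> ProximityState.DEPTH_1       // closer than d1
    distanceMm < D2   -> ProximityState.DEPTH_2       // d1 or more, less than d2
    distanceMm < D3   -> ProximityState.DEPTH_3       // d2 or more, less than d3
    else              -> ProximityState.RELEASED      // d3 or more: touch released
}
```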

Accordingly, the controller 180 may recognize the proximity touch as various input signals according to the proximity depth and the proximity position of the pointer, and perform various operation control according to the various input signals.

FIG. 5 is a conceptual diagram illustrating a control method for a touch operation when a pair of display units 155 and 156 overlap.

The terminal disclosed in this drawing is a folder-type terminal in which a folder unit is foldable with respect to a main body. The first display unit 155 mounted on the folder unit may be a light transmissive type or a transparent type such as a TOLED, but the second display unit 156 mounted on the main body may have a form in which light does not transmit, such as an LCD. The first and second display units 155 and 156 may be configured as touch screens capable of touch input, respectively.

For example, when a touch (a contact touch or a proximity touch) on the first display unit, the TOLED 155, is detected, the controller 180 may cause at least one image to be selected from an image list displayed on the TOLED 155, or to be executed, according to the type of the touch and its duration.

Hereinafter, a method of controlling information displayed on the other display unit, the LCD 156, when a touch is applied to the TOLED 155 exposed to the outside in the overlapped state will be described, with the input methods classified into touch, long touch, and long touch and drag.

In the overlapped state (the mobile terminal is closed), the TOLED 155 is disposed to overlap the lower side of the LCD 156. In this state, when a touch different from the touch for controlling the image displayed on the TOLED 155 is detected, for example, a long touch (a touch lasting 2 to 3 seconds or longer), the controller 180 causes at least one image in the image list displayed on the LCD 156 to be selected according to the sensed touch input. The result of executing the selected image is displayed on the TOLED 155.

The long touch can also be used to selectively move a desired object among those displayed on the LCD 156 to the TOLED 155 (without executing it). That is, when the user long-touches an area of the TOLED 155 corresponding to a specific object on the LCD 156, the controller 180 moves that object to the TOLED 155 and displays it there. Meanwhile, an object displayed on the TOLED 155 may also be transferred to the LCD 156 in response to a predetermined touch input on the TOLED 155, such as flicking or swirling. In the figure, menu No. 2 displayed on the LCD 156 is shown moved to the TOLED 155.

If another input, for example a drag, is additionally detected along with the long touch, the controller 180 may display on the TOLED 155 a screen related to a function for the image selected by the long touch, for example, a preview screen for the image. The figure illustrates the case where a preview (a photograph of a man) of menu No. 2 (an image file) is displayed.

When, with the preview screen output, a drag toward another image is made on the TOLED 155 while the long touch is maintained, the controller 180 moves the selection cursor (or selection bar) of the LCD 156 and displays the image selected by the cursor on the preview screen (a photograph of a woman). Thereafter, when the touch (long touch and drag) ends, the controller 180 displays the first image selected by the long touch.

The same applies to the touch operation (long touch and drag) when a slide (a proximity-touch movement corresponding to the drag) is detected together with a long proximity touch (a proximity touch lasting at least 2 to 3 seconds) on the TOLED 155.

If a touch operation other than those mentioned above is detected, the controller 180 may operate in the same manner as a general touch control method.

The control method for touch operation in the overlapped form described above may also be applied to a terminal having a single display. In addition, it may be applied to terminals other than folder-type terminals having dual displays.

Hereinafter, embodiments related to a control method that can be implemented in a terminal configured as described above will be described with reference to the accompanying drawings. Embodiments described later may be used alone or in combination with each other. In addition, embodiments described below may be used in combination with the above-described user interface (UI).

Hereinafter, a control method of a terminal according to the present invention will be described.

FIG. 6 is a flowchart illustrating an example of a method for controlling a terminal according to the present invention.

In this embodiment, it is assumed that a multitasking key (or a simultaneous operation key) is input in an arbitrary operation state. As the multitasking key is input, the terminal may enter a multitasking mode.

The arbitrary operation state may be a standby state in which no menu (or function) is executed, a state in which a menu list has been called, or a state in which a certain menu in the menu list has already been executed and is operating.

In addition, the multitasking key may be a hardware key attached to one side of the terminal body or a software key displayed on the screen. The multitasking key may also be input in the form of a user's instruction or command without a specific physical shape. For example, multitasking may be instructed using a user's voice command, a proximity touch, a gesture, or rotation or shaking of the terminal.

In the following embodiment, it is assumed that a multitasking key is input using a hardware key or a software key for convenience.

When the multitasking key is input as described above (S101), the controller 180 displays a list of menus (or functions) capable of multitasking on the display unit 151 (S102).

Depending on a preset option, the menu list may display only the menus (or functions) that can be multitasked, or it may display information indicating whether each menu can be multitasked. More specific methods of displaying the menu list will be described with reference to the other drawings.

As described above, information related to multitasking may be additionally displayed in the menu list (S103), and the user may refer to this additional information when selecting a menu (or function) to multitask.

The user may select one or more menus (or functions) from the menu list, simultaneously or in sequence (S104). Accordingly, the controller 180 multitasks the selected menus (S105).
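The S101-S105 flow can be summarized in a short sketch. The class and function names below are assumptions for illustration, and the compatibility rule is injected rather than taken from the patent:

```kotlin
// Sketch of the S101-S105 flow: key input -> menu list with availability
// info -> selection -> concurrent execution. All names are hypothetical.
class MultitaskingMode(
    private val canRunWith: (String, List<String>) -> Boolean // assumed compatibility rule
) {
    private val running = mutableListOf<String>()

    // S101 + S102/S103: build the menu list shown on the display unit,
    // annotating items that cannot currently be multitasked.
    fun onMultitaskingKey(allMenus: List<String>): List<String> =
        allMenus.map { m -> if (canRunWith(m, running)) m else "$m (not available)" }

    // S104 + S105: execute every selected menu that is still compatible.
    fun onSelect(selected: List<String>) {
        selected.filter { canRunWith(it, running) }.forEach { running += it }
    }

    fun runningMenus(): List<String> = running.toList()
}
```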

Hereinafter, an embodiment of a control method of a terminal according to the present invention will be described with reference to the accompanying drawings.

FIGS. 7A to 7C are exemplary views illustrating the menu list displayed according to the state of the terminal when the multitasking key is input.

In particular, FIG. 7A shows an example of the menu list displayed when the multitasking key is input in a standby state in which no menu is executed. FIG. 7B shows an example of the menu list displayed when the multitasking key is input in an operating state in which at least one arbitrary menu is already executed. FIG. 7C shows a screen displaying a list of the menu items that are already executed.

As shown in FIG. 7A, when the multitasking key is input in the standby state, all multitaskable menu items 310 are displayed, and among them the menu item 320 set as the default is automatically selected.

For example, assume the menu items include a phone-related menu, a message-related menu, an Internet-related menu, a video-related menu, a music-related menu, and a TV-related menu. In this case, if the phone-related menu is set as the default among the menu items, the menu list is displayed with the phone-related menu item selected when the multitasking key is input in the standby state.

Therefore, when the user inputs a confirmation key (or execution key) while the default menu item is selected, the default menu item (e.g., the phone-related menu item) is immediately executed. The user may also use the navigation keys to select and execute another menu item.

Here, the default menu item may be set automatically or changed by the user. For example, the menu item the user executes most frequently during multitasking may be automatically set as the default menu item. The user may also directly change the menu item already set as the default to another menu item. Automatic and manual selection of the default menu item may be set through a configuration option (not shown).
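A minimal sketch of this default-item rule follows, under assumed inputs (a usage-count map and an optional user override, neither of which is specified in the patent):

```kotlin
// Pick the user's manual default if set; otherwise pick the menu most
// frequently executed during multitasking. All names are assumptions.
fun defaultMenuItem(
    multitaskCounts: Map<String, Int>, // menu -> times executed while multitasking
    manualDefault: String? = null      // user-set default, if any
): String? = manualDefault ?: multitaskCounts.maxByOrNull { it.value }?.key

fun main() {
    val counts = mapOf("phone" to 12, "message" to 7, "music" to 3)
    println(defaultMenuItem(counts))            // -> phone (most used)
    println(defaultMenuItem(counts, "message")) // -> message (manual override)
}
```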

Here, the confirmation key for executing the selected menu item and the navigation keys for selecting a menu item from the menu list may also be input in the form of a user's instruction or command. This key input method (input using a user's instruction or command, such as voice, without displaying a specific key shape) may be applied in all embodiments, even where not explicitly mentioned.

Although not shown in the drawings, the menu list may be displayed in a single row or column, horizontally or vertically, or in a grid arrangement.

In addition, the menu list may be displayed as an overlay on the execution screen that was displayed before the multitasking key was input. The transparency of the execution screen or the menu list may be adjusted for the overlay display; that is, the execution screen may be displayed more transparently while the menu list is displayed more clearly.

In addition, the menu item selected from the menu list may be displayed with emphasis relative to the unselected menu items. For example, the selected menu item may be displayed larger than the unselected items, the selected item may be displayed in color while unselected items are displayed in black and white, or a highlight may be applied to the selected item.

As shown in FIG. 7B, when the multitasking key is input in an operating state in which at least one menu is already executed, a menu item 330 for displaying all the menu items that are already executed may be additionally displayed. For example, assuming the menu items include a phone-related menu, a message-related menu, an Internet-related menu, a video-related menu, a music-related menu, and a TV-related menu, if one or more menu items (e.g., the Internet-related menu and the music-related menu) are already executed, a menu item 330 for displaying the executing menu items may be additionally displayed.

When the user selects and executes the menu item 330 displaying the list of already-executed menu items, the controller 180 may display a list 340 of the menu items that are already executed, as shown in FIG. 7C. For example, assuming that three menus (e.g., an electronic dictionary menu, a timetable menu, and an alarm menu) are already executed, those three menus are displayed.

As described above, when at least two menus (or functions) are multitasked, the controller 180 may display, on one side of the screen, an indicator indicating the multitasking state together with the number of menu items being multitasked.

The indicator and the information indicating the number of multitasked items may be represented simultaneously by a single icon 350. In this case, an animation effect such as blinking may be applied to the indicator. In addition, when the indicator display area is reduced by the newly displayed icon 350, the size or shape of previously displayed indicators may be automatically changed to a smaller size or a different shape.
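A minimal sketch of this combined indicator, with assumed names and the assumption that the indicator appears only from two concurrent menus upward:

```kotlin
// One icon that both signals the multitasking state and carries the count.
data class IndicatorIcon(val count: Int, val blinking: Boolean = true)

fun multitaskingIndicator(running: List<String>): IndicatorIcon? =
    if (running.size >= 2) IndicatorIcon(count = running.size) // e.g. icon 350
    else null                                                  // no indicator shown
```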

Meanwhile, in an embodiment of the present invention, multitasking may additionally be performed on the same menu (or function). Therefore, when the list of already-executed menu items is displayed as described above, the same menu item may appear multiple times, or the number of instances of the same menu item being multitasked may be additionally displayed. An embodiment of the display method used when multitasking the same menu (or function) will be described later with reference to the other drawings.

Hereinafter, the display method of the above-described menu list will be described.

In particular, a method of displaying a multitaskable menu list will be described with reference to FIGS. 8A to 8D.

As described above, when the multitasking key is input, the controller 180 displays a multitaskable menu list.

However, among the menu items there are some that cannot be executed at the same time. For example, a video-related menu cannot be executed while a music-related menu is already executed, and a TV-related menu cannot be executed while a video-related menu is already executed. This is because the two menus (or functions) use the same output (e.g., the sound output module). There may also be menus that cannot be multitasked with each other due to the mechanical or circuit characteristics of the terminal.
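A hypothetical sketch of such a compatibility rule follows, using the shared sound output module from the example above (the resource names and menu names are assumptions):

```kotlin
// Menus that claim the same exclusive output cannot run concurrently.
val exclusiveOutput = mapOf(
    "video" to "soundOutput",
    "music" to "soundOutput",
    "tv"    to "soundOutput"
)

fun canRunTogether(first: String, second: String): Boolean {
    val a = exclusiveOutput[first] ?: return true  // no exclusive resource needed
    val b = exclusiveOutput[second] ?: return true
    return a != b                                  // conflict on a shared output
}

fun main() {
    println(canRunTogether("music", "video")) // false: both need the sound output
    println(canRunTogether("phone", "music")) // true: no shared exclusive resource
}
```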

As described above, when a user attempts to execute a second menu item (e.g., the TV-related menu) that cannot run at the same time as a first menu item (e.g., the video-related menu) that is already executed, the controller 180 may output a guide message indicating that the second menu item cannot be executed, or a guide message asking whether to stop execution of the first menu item.

Accordingly, the user must dismiss the guide message and then select another second menu item. However, this approach, in which the user selects another second menu item only after the guide message for a second menu item that cannot run concurrently has already been output, may be troublesome for the user.

Therefore, in the present invention, the convenience of the user can be improved by displaying information indicating whether multitasking is possible when displaying the menu list.

Hereinafter, embodiments of a method of displaying a menu list indicating whether multitasking is possible will be described.

FIG. 8A is an exemplary view illustrating a screen displaying only multitaskable menu items in a terminal according to the present invention.

When the multitasking key is input while a certain menu is already executed, the controller 180 may display only the menu items that can be multitasked with the first menu item that is already executed. For example, assuming that the menus that can be multitasked with an already-executed music-related menu are the phone-related menu 360 and the message-related menu 370, only those two menus may be displayed.

In this case, information about which menu items can be multitasked with each other, or which cannot, may be stored in the memory 160 in advance.

The mutually multitaskable menu items may include the same menu (or function). For example, if the message-related menu can be additionally multitasked, the message-related menu item may be displayed in the menu list even when a message-related menu is already executed.

In this case, the number of times the user can additionally multitask the same menu may be set in advance. Therefore, even if a menu has already been executed, the same menu may be additionally displayed as multitaskable as long as the preset number of instances has not been reached.
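A minimal sketch of this preset limit (the default maximum of three instances is an illustrative assumption):

```kotlin
// The same menu stays multitaskable only while its running-instance count
// is below the preset maximum. Names and the default limit are assumed.
fun sameMenuStillMultitaskable(
    menu: String,
    running: List<String>,
    maxInstances: Int = 3 // illustrative preset value
): Boolean = running.count { it == menu } < maxInstances
```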

The message-related menu item may additionally display information (e.g., an indicator) indicating that the same menu is already executed. For example, as shown in FIG. 8B, when three message-related menus are already executed, the message-related menu item 370 may display information 380 indicating that the number of message-related menu items already executed is three.

FIG. 8C is an exemplary view showing a screen that displays whether each menu item can be multitasked using activated and deactivated states in a terminal related to the present invention.

When the multitasking key is input as described above while a menu item (e.g., a music-related menu) is already executed, the controller 180 may display the menu items 390 that can run concurrently with (be multitasked with) the executing menu item (e.g., the phone-related and message-related menus) in an activated state, and display the menu items 400 that cannot run concurrently with the executing menu item in a deactivated state.

The activated and deactivated states may be represented by the degree of transparency or clarity of the corresponding menu item. For example, an activated menu item may be displayed more clearly, and a deactivated menu item more transparently.

In addition, the activated and deactivated states may be represented using the icon shape of the corresponding menu item. For example, as shown in FIG. 8D, a menu item in the activated state is displayed as a normal icon, while a menu item in the deactivated state is displayed as an icon 410 including an image (or text) of an × mark.

In addition, a color indicating that the icon cannot be used (e.g., red or gray) may be applied to the icon.
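A minimal sketch of these two visual treatments, with assumed style fields (the alpha value, overlay mark, and tint are illustration choices, not from the patent):

```kotlin
// Available items keep a clear icon; deactivated items are faded, marked
// with an x, and tinted with an "unusable" color.
data class MenuItemStyle(
    val alpha: Float,         // 1.0 = fully clear, lower = more transparent
    val overlayMark: String?, // e.g. "x" for deactivated items
    val tint: String?         // e.g. "gray" or "red" for deactivated items
)

fun styleFor(multitaskable: Boolean): MenuItemStyle =
    if (multitaskable) MenuItemStyle(alpha = 1.0f, overlayMark = null, tint = null)
    else MenuItemStyle(alpha = 0.4f, overlayMark = "x", tint = "gray")
```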

As described above, the menu items that can be operated simultaneously (multitaskable items) and those that cannot (non-multitaskable items) are displayed distinctly, so the user can know in advance which menu items cannot run concurrently and avoid selecting them, which is convenient.

However, even a menu item in the deactivated state is not entirely unselectable. That is, when the user selects a deactivated menu item, the controller 180 may stop executing the previously executed menu item (the first menu item) and execute the newly selected menu item (the second menu item) instead.

In this case, the controller 180 may output a guide message asking whether to stop execution of the previously executed first menu item. The user may then stop execution of the first menu item in response to the guide message and execute the second menu item.

In other words, by displaying the non-multitaskable menu items as described above, the second menu item is executed only when the user genuinely wants to execute it in place of the first menu item.

FIGS. 9A and 9B are exemplary views illustrating display screens of menu items that are already executed when the same menu is multitasked.

As described above, when the multitasking key is input while at least one menu is already being executed, a menu item for displaying the already executed menu items is additionally shown in the menu list. When the user selects the additionally displayed menu item, the controller 180 displays a list of the menu items that are already being executed. In this case, the total number of menu items being multitasked may be displayed on one side of the screen, and when identical menu items are being multitasked, the number of those items may also be displayed.

When the user selects an identical menu being multitasked, the controller 180 may display a detailed menu list, as illustrated in FIG. 9A. For example, assuming that the identical menu being multitasked is a memo-related menu and three memo-related menus are already being multitasked, the three memo-related menu items 420 are displayed as the detailed menu list of the memo-related menu.

The detailed menu list may be displayed in text form or in icon form, following the display method of the upper-level menu list.
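
The counting behavior described above (a total on one side of the screen plus a per-menu count for duplicates) reduces to a simple grouping; this sketch and its names are illustrative assumptions:

    fun runningSummary(runningMenuIds: List<String>): Pair<Int, Map<String, Int>> {
        val perMenu = runningMenuIds.groupingBy { it }.eachCount()
        return Pair(runningMenuIds.size, perMenu)
    }

    // e.g., runningSummary(listOf("memo", "memo", "memo", "music"))
    //       -> (4, {memo=3, music=1}): four tasks in total, three of them memo.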

Alternatively, as shown in FIG. 9B, the execution screen of each detailed menu item may be displayed in a separate window. For example, the same memo-related menu may be additionally executed while a memo-related menu is already being executed. In this case, whenever the memo-related menu is additionally executed, a separate window (431-433) may be generated to display its execution screen.

Each window of the same menu type being multitasked as described above may be internally distinguished using a depth value. For example, if the depth of the first generated window is 1 (Depth = 1), the depth of each subsequently generated window increases to 2, 3, and so on (Depth = 2, Depth = 3).

Since the depth is information for distinguishing windows internally, it does not necessarily need to be displayed externally. However, for the user's convenience, information for distinguishing each window (e.g., the window title, the window number, or the window border color) may be displayed on one side of the window.
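
A minimal sketch of the internal depth bookkeeping, assuming a hypothetical WindowStack that issues Depth = 1, 2, 3, ... per menu:

    class WindowStack {
        private val lastDepth = mutableMapOf<String, Int>()   // menu id -> last depth issued

        // The first window of a menu gets Depth = 1; each further window of
        // the same menu gets the next depth, so depths never repeat per menu.
        fun openWindow(menuId: String): Int {
            val depth = (lastDepth[menuId] ?: 0) + 1
            lastDepth[menuId] = depth
            return depth    // may feed a user-visible label, e.g. "Memo ($depth)"
        }
    }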

FIGS. 10A and 10B are exemplary views for explaining a method of additionally multitasking the same menu through separate windows.

As shown in FIG. 10A, the same menu 440 that is already being executed may be additionally selected and executed from the menu list. As described above, each time the same menu 440 is additionally executed, a new window with a different depth is automatically generated.

The newly created window may display the execution screen of the additionally executed same menu. In this case, the number of times the user can additionally multitask the same menu may be set in advance. That is, the maximum window depth up to which the same menu can be additionally multitasked may be set in advance.

In addition, as illustrated in FIG. 10B, a window creation menu (e.g., New) 450 may be provided on one side (e.g., the top) of each window. Each time the window creation menu is executed, new windows 461 and 462 with different depths may be generated. However, a new window cannot be created at the same depth for the same menu; in other words, windows of the same menu always have different depths.

The execution screen of the newly executed menu (for example, Message 2) may then be displayed in the newly created window 452. The window creation menu 450 is automatically created only while the same menu can still be additionally multitasked. Therefore, if a maximum multitaskable window depth is set in advance, the window creation menu 450 is not generated in windows created beyond that depth.
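
Gating the window creation menu on a preset maximum depth might be sketched as follows (maxDepth and both function names are assumptions):

    // A window offers the "New" menu only while its depth is below the preset
    // maximum; past the limit no further window of the same menu is created.
    fun windowShowsNewMenu(depth: Int, maxDepth: Int): Boolean = depth < maxDepth

    fun tryCreateWindow(currentDepth: Int, maxDepth: Int): Int? =
        if (currentDepth < maxDepth) currentDepth + 1 else null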

In addition, the window creation menu (e.g., New) may also be provided separately, apart from any individual window.

The separately provided window creation menu may perform the same function as the multitasking key. For example, when the window creation menu is selected, the multitaskable menus are displayed; when one of those menus is executed, a new window may be generated to display the execution screen of that menu.

In this case, as shown in FIGS. 10A and 10B, the number 451, 452 of identical menus (or functions) being multitasked may be displayed on one side of each window.

Hereinafter, a method of selecting one of the same menus as described above will be described.

FIGS. 11A and 11B are exemplary views illustrating a screen for selecting one of the same menus being multitasked.

As described above, when at least one menu has already been executed, a list of the executed menus may be displayed, and one of those menus may be selected to display its execution screen.

That is, when identical menus are being multitasked as described above, one item 421 may be selected from the detailed menu list 420 of the identical, already executed menus, as shown in FIG. 11A, so that its execution screen 422 is displayed.

Alternatively, when the identical menus being multitasked are displayed through respective windows 431-433 as shown in FIG. 11B, one of the windows 432 may be selectively activated using the touch screen, the navigation keys, or the user's specific instruction or command described above.

In this case, the activated window is displayed on a layer above the non-activated windows and may be displayed more sharply.

In addition, although not shown in the drawings, when a menu is additionally executed, the window of the previously executed menu may be automatically deactivated and the window of the newly executed menu automatically activated.
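
The layer-raising behavior of activation might be modeled with a simple z-order list; this WindowManager and its method names are illustrative assumptions:

    class WindowManager {
        private val zOrder = mutableListOf<String>()   // back-to-front window ids

        // Activating a window raises it to the top layer; the previously
        // active window is implicitly deactivated by falling below it.
        fun activate(windowId: String) {
            zOrder.remove(windowId)
            zOrder.add(windowId)
        }

        // A newly opened menu window is activated automatically.
        fun open(windowId: String) = activate(windowId)

        fun activeWindow(): String? = zOrder.lastOrNull()
    }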

In addition, according to an embodiment of the present invention, the above-described method may be implemented as processor-readable code on a medium in which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementation in the form of a carrier wave (for example, transmission over the Internet).

The above-described mobile terminal is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

FIG. 1 is a block diagram of a mobile terminal associated with one embodiment of the present invention.

FIG. 2A is a front perspective view of a mobile terminal according to an embodiment of the present invention.

FIG. 2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention.

FIGS. 3A and 3B are front views of a portable terminal for explaining an operation state of the portable terminal according to the present invention.

FIG. 4 is a conceptual diagram for explaining the proximity depth of a proximity sensor.

FIG. 5 is a conceptual diagram illustrating a control method for a touch operation in a configuration in which a pair of display units overlap.

FIG. 6 is a flowchart illustrating an example of a method for controlling a terminal according to the present invention.

FIGS. 7A to 7C are exemplary views showing a menu list displayed according to the terminal state when a multitasking key is input.

FIGS. 8A to 8D are exemplary screen views for explaining a method of displaying a multitaskable menu list in a terminal according to the present invention.

FIGS. 9A and 9B are exemplary views illustrating screens that display already executed menu items when the same menu is multitasked.

FIGS. 10A and 10B are exemplary views for explaining a method of additionally multitasking the same menu through separate windows.

FIGS. 11A and 11B are exemplary views illustrating a screen for selecting one of the same menus being multitasked.

Claims (17)

  1. A terminal comprising: a user input unit configured to receive an instruction or command for entering a multitasking mode;
    a display unit configured to display a menu list for multitasking; and
    a controller configured to control displaying, in the menu list, whether each menu item is multitaskable when the multitasking mode is entered.
  2. The terminal of claim 1, wherein the controller
    enters the multitasking mode upon receiving at least one of an input of a predetermined hardware key or software key, a user's voice command, a proximity touch, a gesture, or a rotation or shaking of the terminal.
  3. The terminal of claim 1, wherein the controller
    controls to display the number of identical menu items when identical menu items are being multitasked.
  4. The terminal of claim 1, wherein the controller,
    if a first menu is already being executed when the multitasking mode is entered,
    controls to display in the menu list only menu items capable of multitasking with the first menu.
  5. The terminal of claim 4, wherein the controller,
    if the already executed first menu is capable of further multitasking,
    controls to display a menu item identical to the first menu in the menu list.
  6. The terminal of claim 5, wherein the controller
    controls to display information indicating that the first menu is already being executed and the number of first menus already being executed.
  7. The terminal of claim 5, wherein the controller,
    when a menu identical to an already executed menu is multitasked, automatically controls to display a new execution-screen window having a different depth.
  8. The terminal of claim 1, wherein the controller
    controls to display multitaskable menus in an activated state and non-multitaskable menus in a deactivated state.
  9. The terminal of claim 1, wherein the controller
    controls to display, on a menu that cannot be multitasked, an image having a shape or color indicating that the menu cannot be used.
  10. A method of controlling a terminal, the method comprising: entering a multitasking mode;
    displaying a menu list for multitasking upon entering the mode; and
    displaying, in the menu list, whether each menu item is multitaskable.
  11. The method of claim 10, wherein the multitasking mode
    is entered upon receiving an input of a predetermined specific key or a user's instruction or command.
  12. The method of claim 10, wherein displaying the menu list comprises,
    if a first menu is already being executed, displaying only menu items capable of mutual multitasking with the first menu.
  13. The method of claim 12, wherein,
    if the already executed first menu is capable of additional multitasking, the first menu is displayed again in the menu list.
  14. The method of claim 13, wherein the menu list
    displays information indicating that each identical menu is already being executed and the number of times each identical menu has been executed.
  15. The method of claim 10, wherein displaying whether each menu item is multitaskable comprises
    displaying multitaskable menus in an activated state and non-multitaskable menus in a deactivated state.
  16. The method of claim 10, wherein displaying whether each menu item is multitaskable comprises
    displaying, on a menu that cannot be multitasked, an image having a shape or color indicating unavailability.
  17. The method of claim 13, wherein,
    when the already executed same menu is additionally multitasked,
    a new execution-screen window having a different depth is automatically generated and displayed each time the same menu is additionally executed.
KR1020080093835A 2008-09-24 2008-09-24 Terminal and method for controlling the same KR101451667B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080093835A KR101451667B1 (en) 2008-09-24 2008-09-24 Terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020080093835A KR101451667B1 (en) 2008-09-24 2008-09-24 Terminal and method for controlling the same

Publications (2)

Publication Number Publication Date
KR20100034608A true KR20100034608A (en) 2010-04-01
KR101451667B1 KR101451667B1 (en) 2014-10-16

Family

ID=42212727

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080093835A KR101451667B1 (en) 2008-09-24 2008-09-24 Terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR101451667B1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09319595A (en) * 1996-05-27 1997-12-12 Matsushita Electric Ind Co Ltd Multitask controller
KR20080016358A (en) * 2006-08-18 2008-02-21 삼성전자주식회사 Method for display of multitasking list in mobile terminal
KR101221910B1 (en) * 2006-11-17 2013-01-15 엘지전자 주식회사 Multi tasking method and mobile terminal using the same

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9569102B2 (en) 2010-01-06 2017-02-14 Apple Inc. Device, method, and graphical user interface with interactive popup views
US9513801B2 (en) 2010-04-07 2016-12-06 Apple Inc. Accessing electronic notifications and settings icons with gestures
US10101879B2 (en) 2010-04-07 2018-10-16 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications using a three-dimensional stack of images of open applications
US9052925B2 (en) 2010-04-07 2015-06-09 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US9052926B2 (en) 2010-04-07 2015-06-09 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US9058186B2 (en) 2010-04-07 2015-06-16 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US10156962B2 (en) 2010-04-07 2018-12-18 Apple Inc. Device, method and graphical user interface for sliding an application view by a predefined amount of sliding based on a touch input to a predefined button of a multifunction device
US9823831B2 (en) 2010-04-07 2017-11-21 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
KR20120035772A (en) * 2010-10-06 2012-04-16 엘지전자 주식회사 Mobile terminal and method for controlling icons on multi home screen thereof
US9244606B2 (en) 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
US10261668B2 (en) 2010-12-20 2019-04-16 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
US10007400B2 (en) 2010-12-20 2018-06-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
US9207838B2 (en) 2011-08-26 2015-12-08 Apple Inc. Device, method, and graphical user interface for managing and interacting with concurrently open software applications
US9983664B2 (en) 2011-11-16 2018-05-29 Samsung Electronics Co., Ltd. Mobile device for executing multiple applications and method for same
EP2595042A3 (en) * 2011-11-16 2014-07-02 Samsung Electronics Co., Ltd Mobile device for executing multiple applications and method for same
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9823839B2 (en) 2012-05-09 2017-11-21 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10191627B2 (en) 2012-05-09 2019-01-29 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9857897B2 (en) 2012-12-29 2018-01-02 Apple Inc. Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts
US10101887B2 (en) 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9658740B2 (en) 2013-03-15 2017-05-23 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US10310732B2 (en) 2013-03-15 2019-06-04 Apple Inc. Device, method, and graphical user interface for concurrently displaying a plurality of settings controls
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities

Also Published As

Publication number Publication date
KR101451667B1 (en) 2014-10-16

Similar Documents

Publication Publication Date Title
EP2397936B1 (en) Mobile terminal and method of controlling the same
KR101646254B1 (en) Method for removing icon in mobile terminal and mobile terminal using the same
KR101481556B1 (en) A mobile telecommunication terminal and a method of displying an object using the same
US8219152B2 (en) Mobile terminal and control method thereof
KR101608532B1 (en) Method for displaying data and mobile terminal thereof
KR20110125900A (en) Mobile terminal and operating method thereof
KR101626621B1 (en) Method for controlling data in mobile termina having circle type display unit and mobile terminal thereof
KR101453909B1 (en) Mobile terminal using touch screen and control method thereof
KR20100043476A (en) Mobile terminal with an image projector and method for controlling the same
KR20120125086A (en) Mobile device and control method for the same
KR20120070190A (en) Mobile terminal and operation control method thereof
KR20100040406A (en) Mobile terminal and display method thereof
US8548528B2 (en) Mobile terminal and control method thereof
KR20100044527A (en) A mobile telecommunication device and a method of scrolling a screen using the same
EP2428947A2 (en) Terminal and contents sharing method for terminal
KR20110123348A (en) Mobile terminal and method for controlling thereof
KR20090088597A (en) Mobile terminal including touch screen and operation control method thereof
US9804763B2 (en) Mobile terminal and user interface of mobile terminal
KR20120063694A (en) Mobile terminal and method for controlling thereof
KR101617461B1 (en) Method for outputting tts voice data in mobile terminal and mobile terminal thereof
KR102040611B1 (en) Mobile terminal and controlling method thereof
KR20100064873A (en) Terminal and method for controlling the same
KR101788051B1 (en) Mobile terminal and method for controlling thereof
EP2665243B1 (en) Mobile terminal and control method thereof
KR101537706B1 (en) Mobile terminal and control method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20170922

Year of fee payment: 4

LAPS Lapse due to unpaid annual fee