KR20140130853A - Terminal and method for controlling the same - Google Patents

Terminal and method for controlling the same Download PDF

Info

Publication number
KR20140130853A
Authority
KR
South Korea
Prior art keywords
content
item
screen
area
contents
Prior art date
Application number
KR20130049341A
Other languages
Korean (ko)
Inventor
안금주
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. filed Critical LG Electronics Inc.
Priority to KR20130049341A priority Critical patent/KR20140130853A/en
Publication of KR20140130853A publication Critical patent/KR20140130853A/en

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 - Scrolling or panning
    • G06F3/04855 - Interaction with scrollbars
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units

Abstract

The present invention relates to a portable terminal and a control method thereof, with which a user can display, in a desired shape, the execution screen of another desired content over the current content execution screen from among the simultaneously running contents by using a touch gesture.

Description

TECHNICAL FIELD [0001] The present invention relates to a portable terminal and a method for controlling the same.

The present invention relates to a portable terminal and a control method thereof in which the terminal can be used with greater consideration for the convenience of the user.

Terminals can be divided into mobile/portable terminals and stationary terminals depending on whether they can be moved. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

Such a terminal has various functions, for example an electronic book viewer function, a document viewer function, and composite functions such as capturing photos or videos, playing music or video files, playing games, and receiving broadcasts, and is therefore implemented in the form of a multimedia player.

In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and / or software parts of the terminal.

Currently, a mobile terminal is equipped with a multitasking function capable of simultaneously executing a plurality of functions, data, and other contents, and the user can execute and use as many contents as desired.

However, at present, the execution screens of two or more desired contents among the currently running contents cannot be displayed on one screen at the same time, so the user has the inconvenience of having to switch from the execution screen of the current content to the execution screen of another desired content.

In addition, at present, the execution screen of another content cannot be displayed, in a shape desired by the user, on the execution screen of a specific content among the currently running contents.

It is an object of the present invention to provide a portable terminal and a control method thereof which can display, on the execution screen of the current content, the execution screen of another content desired by the user from among the concurrently running contents.

According to an aspect of the present invention, there is provided a portable terminal including: a touch screen for displaying a first content among two or more simultaneously running contents on a screen; and a controller for displaying, on the screen, an item indicating at least one second content excluding the first content among the two or more contents, and for controlling the second content corresponding to the moved item to be displayed together with the first content when the item is moved into a display area of the first content.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: displaying a first content among two or more simultaneously running contents on a screen of a touch screen; displaying, on the screen, an item representing at least one second content excluding the first content among the two or more contents; and displaying the first content and the second content corresponding to the moved item together on the screen when the item is moved into the display area of the first content.

The mobile terminal and its control method according to the present invention provide the effect that the user can display, in a desired shape, the execution screen of another desired content on the current content execution screen from among the simultaneously running contents by using a touch gesture.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.
FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention.
FIG. 3 is a front view of a portable terminal for explaining an operation state of the portable terminal according to the present invention.
FIG. 4 is a flowchart illustrating a process of controlling the mobile terminal according to the present invention.
FIGS. 5 to 8 are explanatory views illustrating a process of controlling the portable terminal according to the present invention.

Hereinafter, a portable terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not in themselves have distinct meanings or roles.

The portable terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except for cases applicable only to portable terminals.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.

The portable terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable terminal 100 and a wireless communication system, or between the portable terminal 100 and a network in which the portable terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include a voice call signal, a video call signal, or various types of data according to text/multimedia message transmission and reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built into or externally attached to the mobile terminal 100. WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE (Long Term Evolution), and the like may be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for obtaining the position of the portable terminal 100, and a representative example thereof is a GPS (Global Positioning System) module.

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video communication mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100, the position of the portable terminal 100, whether the user is in contact with the terminal, and the orientation of the portable terminal, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is in the form of a slide phone, it is possible to sense whether the slide phone is opened or closed. It is also possible to sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141 according to the present invention.

The output unit 150 is for generating output related to visual, auditory, or tactile senses, and may include a display unit 151, an audio output unit 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) the information processed in the portable terminal 100. For example, when the portable terminal 100 is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with the call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received image, UI, or GUI is displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the portable terminal 100. For example, in the portable terminal 100, a plurality of display units may be spaced apart from one another or disposed integrally with each other on one surface, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the controller 180. Thus, the control unit 180 can know which area of the display unit 151 is touched or the like.

The proximity sensor 141 may be disposed in an inner region of the portable terminal 100 surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact-type sensor, and its utility is also higher.

Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without the pointer contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
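
On touch screens of this kind, a hovering pointer and an actual contact are typically delivered to software as separate event types. As a purely illustrative aid (not part of this disclosure), the following Kotlin sketch shows how the proximity touch / contact touch distinction could be observed on an Android-style view, assuming the device reports hover events for a non-contacting pointer:

    import android.view.MotionEvent
    import android.view.View

    // Illustrative only: treat hover events as "proximity touch" and
    // touch events as "contact touch".
    fun attachProximityAndContactListeners(view: View) {
        view.setOnHoverListener { _, event ->
            when (event.actionMasked) {
                MotionEvent.ACTION_HOVER_ENTER,
                MotionEvent.ACTION_HOVER_MOVE ->
                    println("proximity touch near (${event.x}, ${event.y})")
                MotionEvent.ACTION_HOVER_EXIT ->
                    println("proximity touch ended")
            }
            true
        }
        view.setOnTouchListener { _, event ->
            if (event.actionMasked == MotionEvent.ACTION_DOWN) {
                println("contact touch at (${event.x}, ${event.y})")
            }
            false // let the view handle the touch normally
        }
    }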

The audio output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output unit 152 also outputs sound signals related to functions performed in the portable terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.).

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than the video signal or the audio signal, for example, vibration. The video signal or the audio signal may be output through the display unit 151 or the audio output module 152 so that they may be classified as a part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as an effect produced by a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection port or a suction port, a brush against the skin surface, contact with an electrode, and an effect of reproducing a sensation of cold or warmth using an endothermic or exothermic element.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 154 may be provided according to the configuration of the portable terminal 100.

The projector module 155 is a component for performing an image projection function using the portable terminal 100, and can display, on an external screen or wall, an image identical to, or at least partially different from, the image displayed on the display unit 151, in accordance with a control signal of the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, and a lens (not shown) for enlarging and outputting the image to the outside at a predetermined focal distance. Further, the projector module 155 may include a device (not shown) capable of adjusting the image projection direction by mechanically moving the lens or the entire module.

The projector module 155 can be divided into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module, which enlarges and projects an image generated by reflecting light from the light source off a DMD (Digital Micromirror Device) chip, may be advantageous for miniaturization of the projector module 155.

Preferably, the projector module 155 may be provided on the side surface, the front surface, or the back surface of the portable terminal 100 in the longitudinal direction. It goes without saying that the projector module 155 may be provided at any position of the mobile terminal 100 as occasion demands.

The memory 160 may store programs for the processing and control of the control unit 180, and may temporarily store input/output data (e.g., a phonebook, messages, audio, still images, etc.). The memory 160 may also store the frequency of use of each of these data (for example, the frequency of use of each phone number, each message, and each item of multimedia). In addition, the memory 160 may store data on the vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic disk, and an optical disk. The portable terminal 100 may operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a passage to all external devices connected to the portable terminal 100. The interface unit 170 receives data or power from an external device and transmits it to each component in the portable terminal 100, or allows data in the portable terminal 100 to be transmitted to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the portable terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the portable terminal 100 is connected to an external cradle, the interface unit may serve as a passage through which power from the cradle is supplied to the portable terminal 100, or through which various command signals input by the user from the cradle are transmitted to the portable terminal 100. The various command signals or the power input from the cradle may operate as a signal for recognizing that the portable terminal 100 is correctly mounted on the cradle.

The controller 180 generally controls the overall operation of the portable terminal 100. For example, it performs related control and processing for voice calls, data communication, video calls, and the like. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the control unit 180, or may be implemented separately from the control unit 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, the embodiments described herein may be implemented by the control unit 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code is stored in the memory 160 and can be executed by the control unit 180.

FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention.

The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) which forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. A variety of electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding of a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the audio output unit 152, the camera 121, the user input units 131 and 132, the microphone 122, the interface 170, the proximity sensor 141, the contact sensor 143, and the like may be disposed in the terminal body, mainly on the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The audio output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive commands for controlling the operation of the portable terminal 100, and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion, and any manner of operation may be employed as long as the user operates them with a tactile feeling.

The commands input by the first or second operation unit 131 or 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the audio output unit 152 or switching the display unit 151 to the touch recognition mode of the touch screen.

In addition to the antenna for calls and the like, a broadcast signal receiving antenna 116 may be further disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), can be installed so that it can be drawn out from the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

Hereinafter, a related operation of the display unit 151 and the touch pad 135 will be described with reference to FIG. 3.

FIG. 3 is a front view of a portable terminal for explaining an operation state of the portable terminal according to the present invention.

Various types of visual information can be displayed on the display unit 151. These pieces of information can be displayed in the form of letters, numbers, symbols, graphics, or icons.

At least one of the letters, numbers, symbols, graphics, or icons may be displayed in a predetermined arrangement for inputting such information, thereby being implemented in the form of a keypad. Such a keypad may be referred to as a so-called "virtual keypad".

FIG. 3 illustrates the input of a touch applied to the virtual keypad through the front surface of the terminal body.

The display unit 151 may operate as an entire area or may be divided into a plurality of areas and operated. In the latter case, the plurality of areas can be configured to operate in association with each other.

For example, an output window 151a and an input window 151b are displayed on the upper and lower portions of the display unit 151, respectively. The output window 151a and the input window 151b are areas allocated for outputting or inputting information, respectively. In the input window 151b, a virtual keypad 151c is displayed in which a number for inputting a telephone number or the like is displayed. When the virtual keypad 151c is touched, numbers and the like corresponding to the touched virtual keypad are displayed on the output window 151a. When the first operation unit 131 is operated, call connection to the telephone number displayed on the output window 151a is attempted.
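
The output window / virtual keypad behavior just described can be modeled in a few lines. The following plain Kotlin sketch is only an illustrative model of that interaction (the class and function names are invented for the example and do not come from the patent):

    // Minimal model of the output window 151a and virtual keypad 151c interaction.
    class DialScreenModel {
        private val outputWindow = StringBuilder() // plays the role of 151a

        // Touching a virtual keypad key (151c) echoes the character to the output window.
        fun onKeypadTouched(key: Char) {
            if (key.isDigit() || key == '*' || key == '#') outputWindow.append(key)
        }

        // Operating the first operation unit 131 attempts a call to the displayed number.
        fun onFirstOperationUnitPressed(): String =
            "Attempting call connection to $outputWindow"
    }

    fun main() {
        val screen = DialScreenModel()
        "0101234".forEach { screen.onKeypadTouched(it) }
        println(screen.onFirstOperationUnitPressed())
    }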

In addition, the display unit 151 or the touch pad 135 may be configured to receive a touch input by scrolling. By scrolling the display unit 151 or the touch pad 135, the user can move a cursor or pointer located on an object displayed on the display unit 151, for example an icon. Further, when a finger is moved on the display unit 151 or the touch pad 135, the path along which the finger moves may be visually displayed on the display unit 151. This is useful for editing an image displayed on the display unit 151.

One function of the terminal may be executed in response to a case where the display unit 151 (touch screen) and the touch pad 135 are touched together within a predetermined time range. In the case of being touched together, there may be a case where the user clamps the terminal body using the thumb and index finger. The one function may be activation or deactivation of the display unit 151 or the touch pad 135, for example.

For convenience of explanation, it is assumed that the portable terminal 100 mentioned below includes at least one of the components shown in FIG. 1.

Also, an arrow or a finger-shaped graphic for pointing a specific object on the display unit 151 or selecting a menu on the display unit 151 is called a pointer or a cursor.

However, in the case of a pointer, it often means a finger or a stylus pen for a touch operation or the like. Therefore, in this specification, a graphic displayed on the display unit is referred to as a cursor, and a physical means for performing a touch, a proximity touch, and a gesture, such as a finger or a stylus pen, is referred to as a pointer.

Hereinafter, referring to FIGS. 4 to 8, the process according to the present invention by which the user displays, in a desired shape, the execution screen of another desired content on the current content execution screen from among the simultaneously running contents by using a touch gesture will be described in detail.

The content is anything executable in the portable terminal 100 and may include all data that provides the user with a specific function or specific information. For example, the content may include an application providing a specific function, a widget, a menu, data, multimedia, and the like.

FIG. 4 is a flowchart illustrating a process of controlling the mobile terminal according to the present invention.

FIGS. 5 to 8 are explanatory views illustrating a process of controlling the portable terminal according to the present invention.

Referring to FIGS. 4 to 8, in a multitasking state in which two or more contents are being executed by the user's operation, the controller 180 of the portable terminal 100 displays the first content selected by the user on the screen of the touch screen 151 [S110].

Then, the controller 180 displays, on the touch screen 151, an item representing at least one second content other than the first content among the currently running contents [S120].

At this time, the controller 180 displays the first content in the main information display area 210 of the screen of the touch screen 151, and arranges and displays the item in the scroll bar area 220 of the screen.

Preferably, if there are two or more items, the controller 180 may sort and display the items in the scroll bar area 220 in the time order in which the contents corresponding to the two or more items were executed.

Also, if there are two or more items, the controller 180 may sort and display the items in the scroll bar area 220 according to how frequently the user has used the contents corresponding to the items during a predetermined period.

Also, if there are two or more items, the controller 180 may preferentially sort and display, in the scroll bar area 220, the items corresponding to contents having a category associated with that of the first content among the contents corresponding to the two or more items.

In addition, if there are two or more items, the controller 180 may preferentially sort and display, in the scroll bar area 220, the items corresponding to contents that can interwork with the first content among the contents corresponding to the two or more items.

For example, as shown in FIG. 7A, assume that the first content is a "Naver" web page 351 and the currently running contents include a "blog" web page 352, a "Daum" web page 353, and a "Google" web page 354. Since the category of the "Naver" web page is "web page", the controller 180 can preferentially sort and display, in the scroll bar area 220, the items indicating the "blog" web page 352, the "Daum" web page 353, and the "Google" web page 354 among the contents excluding the first content 351.

Also, as shown in FIG. 7B, when the first content is a "schedule" 361 and the contents that can interwork with the "schedule" 361 among the currently running contents are a keypad 362 for entering a schedule, a phonebook 363 for registering or looking up a contact in the schedule, and a memo 364 to be attached to the schedule, the controller 180 can preferentially sort and display the items indicating the keypad 362, the phonebook 363, and the memo 364 in the scroll bar area 220.
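
The four ordering rules described above (execution order, usage frequency, category affinity, and interworking capability) can be expressed as simple comparators. The plain Kotlin sketch below is a self-contained illustration; the ContentItem fields and the interworking flag are assumptions made for the example, not definitions taken from the patent:

    // Illustrative model of an item shown in the scroll bar area 220.
    data class ContentItem(
        val name: String,
        val category: String,
        val launchedAtMillis: Long,
        val recentUseCount: Int,
        val interworksWithFirstContent: Boolean
    )

    // Rule 1: time order in which the corresponding contents were executed.
    fun byLaunchOrder(items: List<ContentItem>) =
        items.sortedBy { it.launchedAtMillis }

    // Rule 2: how often the user used each content during a given period.
    fun byUsageFrequency(items: List<ContentItem>) =
        items.sortedByDescending { it.recentUseCount }

    // Rule 3: contents sharing the first content's category come first.
    fun byCategoryAffinity(items: List<ContentItem>, firstContentCategory: String) =
        items.sortedByDescending { it.category == firstContentCategory }

    // Rule 4: contents that can interwork with the first content come first.
    fun byInterworking(items: List<ContentItem>) =
        items.sortedByDescending { it.interworksWithFirstContent }

    fun main() {
        val items = listOf(
            ContentItem("blog", "web page", 3L, 5, false),
            ContentItem("keypad", "input", 1L, 9, true),
            ContentItem("Google", "web page", 2L, 2, false)
        )
        println(byCategoryAffinity(items, "web page").map { it.name })
        println(byInterworking(items).map { it.name })
    }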

Referring again to FIG. 4, when a specific area is designated through the user's hand or a touch pen within the display area of the first content [S130] and the item is dragged and dropped into the designated specific area [S140], the control unit 180 displays the execution screen of the second content corresponding to the moved item in the specific area, thereby displaying the execution screens of the first and second contents together on the screen [S150].
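
Step S140 is, in effect, a drag and drop of the item into the user-designated region. The Kotlin sketch below shows one way this could look using Android's drag-and-drop framework; the designatedArea rectangle and the showSecondContentAt callback are illustrative assumptions rather than the patent's actual implementation:

    import android.content.ClipData
    import android.graphics.RectF
    import android.view.DragEvent
    import android.view.View

    // Start dragging the scroll-bar item that represents the second content.
    fun startItemDrag(itemView: View, contentId: String) {
        val data = ClipData.newPlainText("contentId", contentId)
        itemView.startDragAndDrop(data, View.DragShadowBuilder(itemView), null, 0)
    }

    // Accept drops on the first content's view, but only inside the designated area.
    fun acceptDropsInDesignatedArea(
        firstContentView: View,
        designatedArea: RectF,                        // region drawn by the user (S130)
        showSecondContentAt: (String, RectF) -> Unit  // display step (S150), assumed callback
    ) {
        firstContentView.setOnDragListener { _, event ->
            when (event.action) {
                DragEvent.ACTION_DROP -> {
                    val inside = designatedArea.contains(event.x, event.y)
                    if (inside) {
                        val id = event.clipData.getItemAt(0).text.toString()
                        showSecondContentAt(id, designatedArea)
                    }
                    inside
                }
                else -> true // accept the drag so we keep receiving events
            }
        }
    }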

At this time, the specific area designated through the user's hand or the touch pen may have a specific graphic shape.

When a specific area having a specific graphic shape is designated by the user, the control unit 180 crops the execution screen image of the second content corresponding to the item into the same shape as the specific graphic shape, and displays the cropped execution screen image of the second content within the designated specific area.
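
On a platform with bitmap compositing, this kind of crop is commonly done by masking a snapshot of the second content with the path the user drew. The Kotlin sketch below is a minimal illustration under that assumption (using Android's Canvas and PorterDuff masking), not the patent's actual implementation:

    import android.graphics.Bitmap
    import android.graphics.Canvas
    import android.graphics.Paint
    import android.graphics.Path
    import android.graphics.PorterDuff
    import android.graphics.PorterDuffXfermode

    // Crop a snapshot of the second content's execution screen to the user-drawn shape.
    fun cropToShape(secondContentSnapshot: Bitmap, userDrawnShape: Path): Bitmap {
        val result = Bitmap.createBitmap(
            secondContentSnapshot.width,
            secondContentSnapshot.height,
            Bitmap.Config.ARGB_8888
        )
        val canvas = Canvas(result)
        val paint = Paint(Paint.ANTI_ALIAS_FLAG)

        // 1) Paint the user-drawn shape as an opaque mask.
        canvas.drawPath(userDrawnShape, paint)

        // 2) Keep only the snapshot pixels that fall inside the mask.
        paint.xfermode = PorterDuffXfermode(PorterDuff.Mode.SRC_IN)
        canvas.drawBitmap(secondContentSnapshot, 0f, 0f, paint)
        return result
    }

The resulting bitmap can then be drawn at the position of the designated area over the first content.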

In addition, when the specific area is designated and the item is moved into the specific area, the control unit 180 may reduce the display area of the first content and display the second content corresponding to the item in the remaining area freed by the reduction.
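
One simple way to realize this shrink-and-share behavior, assuming both execution screens are ordinary views, is to give them layout weights inside a vertical container. A minimal, illustrative Kotlin sketch:

    import android.view.View
    import android.widget.LinearLayout

    // Shrink the first content and hand the freed portion of the screen to the second content.
    fun splitBetweenContents(
        container: LinearLayout,
        firstContentView: View,
        secondContentView: View,
        firstWeight: Float = 2f,   // e.g. first content keeps two thirds of the screen
        secondWeight: Float = 1f
    ) {
        container.orientation = LinearLayout.VERTICAL
        container.removeAllViews()
        container.addView(
            firstContentView,
            LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, 0, firstWeight)
        )
        container.addView(
            secondContentView,
            LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, 0, secondWeight)
        )
    }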

In addition, when the specific area is designated and the item is moved into the specific area, the control unit 180 may render the second content corresponding to the item transparently, so that the display area of the first content remains identifiable, and display it in the specific area. That is, by displaying the second content transparently in the specific area within the display area of the first content, the user can visually identify the contents of the first content and the second content together through the specific area.
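
The transparent variant is the simplest of the three display options: draw the second content's view with partial alpha over the designated region so the first content remains visible underneath. A minimal, illustrative Kotlin sketch:

    import android.view.View

    // Overlay the second content semi-transparently so the first content shows through.
    fun showSecondContentTransparently(secondContentView: View, alpha: Float = 0.6f) {
        secondContentView.alpha = alpha.coerceIn(0f, 1f)
        secondContentView.visibility = View.VISIBLE
        secondContentView.bringToFront()
    }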

In addition, when the specific area is designated and the item is moved into the specific area, the controller 180 may cut out the portion corresponding to the designated specific area within the display area of the first content and display the second content in the cut-out portion.

Also, when either the display area of the first content or the specific area is selected, the control unit 180 may display the content shown in the selected area on the entire screen.

Next, FIG. 5 shows an example in which the currently running contents are a web page 310, a message 320A, a messenger 330A, and a gallery 340A; the web page 310 is displayed as the first content, and the items representing the remaining message 320A, messenger 330A, and gallery 340A are displayed in an aligned manner in the scroll bar area 220.

That is, as shown in FIG. 5(a), when a first specific area 231 having a specific shape is designated on the web page 310 through the user's hand or a touch pen, and the item corresponding to the message 320A among the items is dragged and dropped into the first specific area 231, the controller 180 displays the execution screen image 320B of the message 320A in the first specific area 231, as shown in FIGS. 5(c) and 5(d).

At this time, as shown in FIG. 5(c), the controller 180 may display the execution screen image 320B of the message 320A in the first specific area 231 after rendering it transparently, so that the user can visually recognize both the portion of the web page 310 underlying the designated first specific area 231 and the contents of the execution screen image 320B of the message 320A.

Alternatively, the control unit 180 may cut out the portion corresponding to the designated first specific area 231 in the display area of the web page 310 and display the execution screen image 320B of the message 320A in the cut-out portion.

Also, as shown in FIG. 5(d), the controller 180 may reduce the web page 310 so that the web page 310 and the designated first specific area 231 do not overlap on the screen, and display the execution screen image 320B of the message 320A in the remaining area freed by reducing the web page 310.

Next, as shown in FIG. 6(a), while the web page 310 and the execution screen image 320B of the message 320A are displayed together on the screen, when another, second specific area 232 is designated on the screen and the item corresponding to the messenger 330A among the items is dragged and dropped into it as shown in FIG. 6(b), the control unit 180 displays the execution screen image 330B of the messenger 330A in the second specific area 232, as shown in FIG. 6(c).

Finally, as shown in FIG. 8(a), while the web page 310, the execution screen image 320B of the message 320A, and the execution screen image 330B of the messenger 330A are displayed together on the screen, when further third and fourth specific areas 233 and 234 are designated on the screen and the items corresponding to the music 372A and the moving picture 373A are moved into the third and fourth specific areas 233 and 234, respectively, the control unit 180 cuts out the third and fourth specific areas 233 and 234 from the screen, as shown in FIG. 8(b), and displays the execution screen images 372B and 373B corresponding to the music 372A and the moving picture 373A in the cut-out portions of the third and fourth specific areas 233 and 234, respectively.

The present invention described above can be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the controller 180 of the portable terminal.

Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

The above-described portable terminal and its control method are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

100: portable terminal 110: wireless communication unit
111: broadcast receiving module 112: mobile communication module
113: wireless Internet module 114: short-range communication module
115: location information module 120: A/V input unit
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output unit
151: display unit 152: audio output module
153: alarm unit 154: haptic module
155: projector module 160: memory
170: interface unit 180: controller
181: multimedia module
190: power supply unit

Claims (12)

A touch screen for displaying a first content among two or more contents simultaneously running on a screen;
and a controller for displaying, on the screen, an item indicating at least one second content excluding the first content among the two or more contents, and for controlling the second content corresponding to the moved item to be displayed together with the first content when the item is moved into a display area of the first content.
The portable terminal of claim 1,
Wherein the first content is displayed in a main information display area of the screen,
Wherein the item is displayed in a scroll bar area of the screen.
The portable terminal of claim 2,
Wherein, if there are two or more items, the control unit preferentially displays the items corresponding to contents having a category associated with the first content among the contents corresponding to the two or more items.
The portable terminal of claim 2,
Wherein, if there are two or more items, the control unit preferentially displays, in the scroll bar area, the items corresponding to contents that can interwork with the first content among the contents corresponding to the two or more items.
The portable terminal of claim 1,
Wherein, when the item is moved, the control unit reduces the display area of the first content on the screen and displays the second content in the remaining area freed by reducing the display area of the first content.
The portable terminal of claim 1,
Wherein a specific area is designated within the display area of the first content by a touch gesture of the user, and when the item is moved into the specific area, the control unit displays the second content corresponding to the moved item in the designated specific area.
The portable terminal of claim 6,
Wherein the control unit reduces the display area of the first content so as not to include the specific area on the screen and displays the second content in the specific area.
The portable terminal of claim 6,
Wherein the control unit transparently processes the second content so that the display area of the first content can be identified and displays the second content on the specific area.
The portable terminal of claim 6,
Wherein the control unit cuts out a portion corresponding to the designated specific area within the display area of the first content and displays the second content in the cut specific area.
The portable terminal of claim 6,
Wherein the specific area has a specific figure shape by the touch gesture of the user.
The portable terminal of claim 6,
Wherein the controller displays the content displayed on the selected area in full on the screen when any one of the display area of the first content and the specific area is selected.
A method of controlling a portable terminal, the method comprising:
Displaying a first content on a screen of a touch screen from among two or more simultaneously running contents;
Displaying, on the screen, an item representing at least one second content excluding the first content among the two or more contents; And
And displaying the first content and the second content corresponding to the moved item together on the screen when the item is moved to the display area of the first content.
KR20130049341A 2013-05-02 2013-05-02 Terminal and method for controlling the same KR20140130853A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20130049341A KR20140130853A (en) 2013-05-02 2013-05-02 Terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130049341A KR20140130853A (en) 2013-05-02 2013-05-02 Terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20140130853A true KR20140130853A (en) 2014-11-12

Family

ID=52452485

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130049341A KR20140130853A (en) 2013-05-02 2013-05-02 Terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20140130853A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108769376A (en) * 2018-04-26 2018-11-06 中国联合网络通信集团有限公司 Display methods, device, terminal and the computer readable storage medium of status bar
CN108769376B (en) * 2018-04-26 2021-02-19 中国联合网络通信集团有限公司 Status bar display method, device, terminal and computer-readable storage medium

Similar Documents

Publication Publication Date Title
KR101788051B1 (en) Mobile terminal and method for controlling thereof
KR101657122B1 (en) Mobile terminal and method for controlling the same
KR101863925B1 (en) Mobile terminal and method for controlling thereof
KR101772453B1 (en) Mobile terminal and method for controlling thereof
KR101911252B1 (en) Terminal and method for controlling the same
KR101695816B1 (en) Mobile terminal and method for controlling thereof
KR101873745B1 (en) Mobile terminal and method for controlling thereof
KR101660737B1 (en) Mobile terminal and method for controlling thereof
KR20110113844A (en) Mobile terminal and method for controlling thereof
KR20120134688A (en) Mobile terminal and method for controlling the same
KR101878141B1 (en) Mobile terminal and method for controlling thereof
KR101842198B1 (en) Mobile terminal and method for controlling thereof
KR101615983B1 (en) Mobile terminal and method of controlling thereof
KR101980702B1 (en) Mobile terminal and method for controlling thereof
KR20130078236A (en) Mobile terminal and controlling method thereof, and recording medium thereof
KR20140118061A (en) Terminal and method for controlling the same
KR20150008951A (en) Terminal and method for controlling the same
KR20140130853A (en) Terminal and method for controlling the same
KR20140008061A (en) Terminal and method for controlling the same
KR20140104316A (en) Terminal and method for controlling the same
KR101701840B1 (en) Mobile terminal and method for controlling thereof
KR101809950B1 (en) Mobile terminal and method for controlling thereof
KR101709509B1 (en) Mobile terminal and method for controlling thereof
KR101871711B1 (en) Mobile terminal and method for controlling the same
KR101904938B1 (en) Mobile terminal and method for controlling thereof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination