KR20140040457A - Mobile terminal and operating method for the same - Google Patents

Mobile terminal and operating method for the same

Info

Publication number
KR20140040457A
Authority
KR
South Korea
Prior art keywords
area
input
screen
touch input
objects
Application number
KR1020120107133A
Other languages
Korean (ko)
Inventor
박진영
김현주
김성은
이진희
Original Assignee
엘지전자 주식회사
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020120107133A
Publication of KR20140040457A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

A method for operating a portable terminal according to the embodiment of the present invention includes the steps of: displaying a first image on a display unit; receiving a multi-touch input which includes a first touch input and a second touch input; receiving a first drag input to increase a gap between the first touch input and the second touch input; and displaying the first image on a first region and a second region by dividing the screen of the display unit into first to third regions and displaying an object corresponding to a preset content on the third region arranged between the first region and the second region. Thereby, the screen is conveniently divided by one operation and the content is variously displayed. [Reference numerals] (110) Wireless communication unit; (111) Broadcast receiving module; (113) Mobile communication module; (115) Wireless internet module; (117) Near field communication module; (119) GPS module; (120) A/V input unit; (121) Camera; (123) Microphone; (130) User input unit; (140) Sensing unit; (141) Proximity sensor; (143) Pressure sensor; (145) Motion sensor; (150) Output unit; (151) Display unit; (153) Sound output module; (155) Alarming unit; (157) Haptic module; (160) Memory; (170) Interface unit; (180) Control unit; (181) Multimedia playing module; (190) Power supply unit

Description

Mobile terminal and its operation method {Mobile Terminal and Operating Method for the Same}

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a portable terminal and a method of operating the same, and more particularly, to a portable terminal capable of dividing a screen and efficiently using a plurality of contents, and a method of operating the same.

A portable terminal is a portable device having one or more functions such as performing voice and video calls, inputting and outputting information, and storing data. As the functions of portable terminals have diversified, they have come to support complex functions such as capturing photos and videos, playing music or video files, playing games, receiving broadcasts, and accessing the wireless Internet, and have been implemented as comprehensive multimedia devices.

To implement such complex functions in a portable terminal realized as a multimedia device, various attempts have been made in terms of both hardware and software. One example is the user interface environment that allows a user to easily and conveniently search for or select a function.

Accordingly, an object of the present invention is to provide a portable terminal and an operation method thereof with improved operation convenience.

It is also an object of the present invention to provide a mobile terminal and a method of operating the same, which can conveniently divide a screen and use various contents with only one operation.

According to an embodiment of the present invention, a method of operating a mobile terminal may include displaying a first image on a display unit; receiving a multi-touch input including a first touch input and a second touch input; receiving a first drag input in which the distance between the first touch input and the second touch input increases; and dividing the screen of the display unit into first to third areas, displaying the first image in the first area and the second area, and displaying objects corresponding to predetermined content in the third area disposed between the first area and the second area.

According to an exemplary embodiment of the present invention, a portable terminal includes a display unit that displays a first image, and a controller that divides the screen of the display unit into first to third areas when a multi-touch input including a first touch input and a second touch input is received together with a first drag input in which the interval between the first touch input and the second touch input increases. The controller displays the first image in the first area and the second area, and displays objects corresponding to predetermined content in the third area disposed between the first area and the second area.

A method of operating a mobile terminal according to an exemplary embodiment of the present invention includes displaying a plurality of objects on a display unit, receiving an input for selecting two or more objects among the plurality of objects, receiving a drag input for the selected objects, moving the selected objects in response to the drag input, and, when the distance between the selected objects decreases, dividing the screen and displaying the contents corresponding to the selected objects on the divided screens.

According to the present invention, it is possible to conveniently divide the screen or return to a previous screen with a single operation and to display various contents. Recently used content and multitasking can also be accessed more quickly. This improves the user's operating convenience.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention;
FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention;
FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating an embodiment of a method of operating a portable terminal according to the present invention;
FIGS. 5 to 11 are views for explaining various embodiments of a method of operating a portable terminal according to the present invention;
FIG. 12 is a flowchart illustrating an embodiment of a method of operating a portable terminal according to the present invention; and
FIGS. 13 and 14 are views for explaining various embodiments of a method of operating a portable terminal according to the present invention.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The portable terminal described in this specification may be a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a camera, a tablet computer, an e-book terminal, or the like. In addition, the suffixes "module" and "part" used for components in the following description are given merely for convenience of description and do not in themselves carry any special significance or role. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention. Referring to FIG. 1, the portable terminal according to an embodiment of the present invention will be described in terms of its functional components.

Referring to FIG. 1, the portable terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. When these components are implemented in practice, two or more components may be combined into one component, or one component may be subdivided into two or more components as necessary.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 113, a wireless Internet module 115, a short distance communication module 117, and a GPS module 119.

The broadcast receiving module 111 receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel. At this time, the broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may refer to a server for generating and transmitting at least one of a broadcast signal and broadcast related information and a server for receiving at least one of the generated broadcast signal and broadcast related information and transmitting the broadcast signal to the terminal.

The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal. The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network, in which case it can be received by the mobile communication module 113. Broadcast-related information can exist in various forms.

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, it can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). In addition, the broadcast receiving module 111 may be configured to be suitable not only for such digital broadcasting systems but also for all broadcasting systems that provide broadcast signals. The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 113 transmits and receives a radio signal to at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module 115 refers to a module for wireless Internet access, and the wireless Internet module 115 can be embedded in the mobile terminal 100 or externally. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and Near Field Communication (NFC) may be used as the short distance communication technology.

A GPS (Global Position System) module 119 receives position information from a plurality of GPS satellites.

The A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 123. The camera 121 processes image frames such as still images or moving images obtained by an image sensor in the video call mode or the photographing mode, and the processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the terminal.

The microphone 123 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 113 and then output. The microphone 123 may use a variety of noise reduction algorithms to remove noise generated in the course of receiving the external sound signal.

The user input unit 130 generates key input data that the user inputs to control the operation of the terminal. The user input unit 130 may include a key pad, a dome switch, or a touch pad (static pressure/capacitance) capable of receiving commands or information by the user's pressing or touching operation. The user input unit 130 may also be configured as a jog wheel or jog switch for rotating a key, a joystick, or a finger mouse. In particular, when the touch pad forms a mutual layered structure with the display unit 151 described later, it can be called a touch screen.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100 and the position of the portable terminal 100, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is a slide phone, it can sense whether the slide phone is opened or closed. It can also handle sensing functions related to whether the power supply unit 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, a motion sensor 145, and the like. The proximity sensor 141 can detect the presence of an object approaching the portable terminal 100 or an object existing near the portable terminal 100 without mechanical contact. The proximity sensor 141 can detect a nearby object by using a change in the alternating magnetic field or a change in the static magnetic field, or a rate of change in capacitance. In addition, it is possible to detect whether the user holds the surface of the portable terminal 100. The proximity sensor 141 may be equipped with two or more sensors according to the configuration.

The pressure sensor 143 can detect whether the pressure is applied to the portable terminal 100, the magnitude of the pressure, and the like. The pressure sensor 143 may be installed at a portion where the pressure of the portable terminal 100 is required depending on the use environment.

When the pressure sensor 143 is installed on the display unit 151, it is possible to distinguish a touch input through the display unit 151 from a pressure touch input in which a greater pressure is applied than in a touch input. Also, the magnitude of the pressure applied to the display unit 151 during a pressure touch input can be determined from the signal output by the pressure sensor 143.

If the pressure sensor 143 is disposed at the outer periphery of the portable terminal 100, it is possible to detect whether the user holds the surface of the portable terminal 100 by detecting the pressure.

The motion sensor 145 detects the position and movement of the portable terminal 100 using an acceleration sensor, a gyro sensor, or the like. An acceleration sensor that can be used for the motion sensor 145 is a device that converts an acceleration change in one direction into an electric signal and is widely used along with the development of MEMS (micro-electromechanical systems) technology.

There are various types of acceleration sensors, from those that measure large accelerations, such as the sensors built into automobile airbag systems to detect collisions, to those that measure minute accelerations, such as the sensors that recognize fine motions of a human hand for use as a game input means. Acceleration sensors are usually configured with two or three axes mounted in one package, and depending on the usage environment, only one axis, for example the Z axis, may be needed. Therefore, when an X-axis or Y-axis acceleration sensor must be used instead of a Z-axis sensor for some reason, the acceleration sensor may be mounted upright on the main substrate using a separate piece of substrate.

The gyro sensor is a sensor for measuring the angular velocity, and it can sense the direction of rotation with respect to the reference direction.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display unit 151, an audio output module 153, an alarm unit 155, and a haptic module 157.

The display unit 151 displays and outputs the information processed by the portable terminal 100. For example, when the portable terminal 100 is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed or received images can be displayed individually or simultaneously, and the UI and the GUI are displayed.

Meanwhile, as described above, when the display unit 151 and the touch pad form a mutual layered structure and constitute a touch screen, the display unit 151 can be used not only as an output device but also as an input device through which information can be entered by the user's touch.

If the display unit 151 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like. In this case, the touch screen panel is a transparent panel attached to the outside and can be connected to the internal bus of the portable terminal 100. The touch screen panel monitors contacts and, when there is a touch input, sends the corresponding signals to the touch screen panel controller. The touch screen panel controller processes the signals and transmits the corresponding data to the controller 180 so that the controller 180 can determine whether a touch input has occurred and which area of the touch screen has been touched.

The display unit 151 may be formed of electronic paper (e-paper). Electronic paper is a kind of reflective display and, like conventional paper and ink, has excellent visual characteristics such as high resolution, a wide viewing angle, and a bright white background. Electronic paper can be implemented on any substrate such as plastic, metal, or paper; because the image is retained even after the power is turned off, the battery life of the portable terminal 100 can be maintained longer. As the electronic paper, hemispherical twist balls, an electrophoresis method, microcapsules, and the like can be used.

In addition, the display unit 151 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. There may also be two or more display units 151 depending on the implementation form of the portable terminal 100. For example, the portable terminal 100 may include both an external display unit (not shown) and an internal display unit (not shown).

The audio output module 153 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 153 also outputs sound signals related to functions performed in the portable terminal 100, such as a call signal reception tone and a message reception tone. The sound output module 153 may include a speaker, a buzzer, and the like.

The alarm unit 155 outputs a signal for notifying the occurrence of an event in the portable terminal 100. Examples of events occurring in the portable terminal 100 include call signal reception, message reception, and key signal input. The alarm unit 155 outputs a signal for notifying the occurrence of an event in a form other than an audio or video signal, for example as a vibration. The alarm unit 155 can output a signal when a call signal is received or a message is received, and when a key signal is input, it can output a signal as feedback to the key signal input. The user can recognize the occurrence of an event through the signal output by the alarm unit 155. A signal for notifying the occurrence of an event in the portable terminal 100 may also be output through the display unit 151 or the sound output module 153.

The haptic module 157 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 157 is a vibration effect. When the haptic module 157 generates vibration with a haptic effect, the intensity and pattern of the vibration generated by the haptic module 157 can be converted, and the different vibrations can be synthesized and output or sequentially output.

In addition to vibration, the haptic module 157 can generate a variety of tactile effects, such as an effect of stimulation by a pin arrangement moving vertically against the contacted skin surface, an effect of stimulation by air injected or sucked through an injection port or a suction port, an effect of stimulation through contact with an electrode, an effect of stimulation using electrostatic force, and an effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat. The haptic module 157 can be implemented not only to transmit a tactile effect through direct contact but also to let the user feel a tactile effect through the muscular sense of a finger or an arm. Two or more haptic modules 157 may be provided according to the configuration of the portable terminal 100.

The memory 160 may store programs for the processing and control of the controller 180, and may also perform a function for temporarily storing input or output data (e.g., a phone book, messages, still images, and the like).

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM. In addition, the portable terminal 100 may operate a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as an interface with all external devices connected to the portable terminal 100. Examples of external devices connected to the portable terminal 100 include a wired/wireless headset, an external charger, a wired/wireless data port, a memory card, a SIM (Subscriber Identification Module) card, a UIM card, an audio input/output (I/O) terminal, a video I/O terminal, and an earphone. The interface unit 170 can receive data or power from such an external device and deliver it to each component inside the portable terminal 100, and can transmit data inside the portable terminal 100 to the external device.

When the portable terminal 100 is connected to an external cradle, the interface unit 170 can serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input by the user through the cradle are transmitted to the portable terminal 100.

The controller 180 typically controls the operation of each of the above units and thus the overall operation of the portable terminal 100. For example, it performs control and processing related to voice calls, data communication, video calls, and the like. In addition, the controller 180 may include a multimedia playback module 181 for playing multimedia. The multimedia playback module 181 may be configured as hardware within the controller 180, or as software separate from the controller 180.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The portable terminal 100 having such a configuration can be configured to be operable in communication systems capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

FIG. 2 is a front perspective view of a portable terminal according to an exemplary embodiment of the present invention, and FIG. 3 is a rear perspective view of the portable terminal shown in FIG. 2. Hereinafter, the portable terminal according to the present invention will be described with reference to FIGS. 2 and 3 in terms of its external components. For convenience of description, among various types of portable terminals such as folder, bar, swing, and slider types, a bar-type portable terminal having a front touch screen will be described as an example. However, the present invention is not limited to a bar-type portable terminal and can be applied to all types of portable terminals including the types mentioned above.

Referring to FIG. 2, the case constituting the outer appearance of the portable terminal 100 is formed by the front case 100-1 and the rear case 100-2. Various electronic components are incorporated in the space formed by the front case 100-1 and the rear case 100-2.

The display unit 151, a first sound output module 153a, a first camera 121a, and first through third user input units 130a, 130b, and 130c may be disposed on the main body, specifically on the front case 100-1. A fourth user input unit 130d, a fifth user input unit 130e, and the microphone 123 may be disposed on a side surface of the rear case 100-2.

The display unit 151 may be constructed such that the touch pad is overlapped with the layer structure so that the display unit 151 operates as a touch screen so that information can be input by a user's touch.

The first acoustic output module 153a may be implemented in the form of a receiver or a speaker. The first camera 121a may be implemented in a form suitable for capturing an image or a moving image of a user. The microphone 123 may be implemented in a form suitable for receiving a user's voice, other sounds, and the like.

The first through fifth user input units 130a, 130b, 130c, 130d, and 130e and the sixth and seventh user input units 130f and 130g described below may be collectively referred to as the user input unit 130, and any manner may be employed as long as it operates in a tactile manner.

For example, the user input unit 130 may be embodied as a dome switch or a touch pad capable of receiving commands or information by the user's pressing or touching operation, or as a wheel, a jog type, a joystick, or the like. In functional terms, the first to third user input units 130a, 130b, and 130c are for inputting commands such as start, end, and scroll, and the fourth user input unit 130d is for selecting an operation mode or the like. In addition, the fifth user input unit 130e may operate as a hot key for activating a special function in the portable terminal 100.

Referring to FIG. 3, a second camera 121b may be additionally mounted on the rear surface of the rear case 100-2, and sixth and seventh user input units 130f and 130g and the interface unit 170 may be disposed on a side surface of the rear case 100-2.

The second camera 121b has a photographing direction substantially opposite to that of the first camera 121a, and may have pixels different from those of the first camera 121a. A flash (not shown) and a mirror (not shown) may be additionally disposed adjacent to the second camera 121b. In addition, another camera may be installed adjacent to the second camera 121b to use it for shooting a three-dimensional stereoscopic image.

The flash illuminates the subject when the subject is photographed by the second camera 121b. The mirror enables the user to illuminate the user's own face or the like when the user intends to photograph (self-photograph) himself / herself using the second camera 121b.

A second sound output module (not shown) may be further disposed in the rear case 100-2. The second sound output module may implement the stereo function together with the first sound output module 153a, and may be used for talking in the speakerphone mode.

The interface unit 170 can be used as a path for exchanging data with an external device. An antenna for receiving broadcast signals (not shown) may be disposed in one area of the front case 100-1 and the rear case 100-2 in addition to the antenna for communication. The antenna may be installed to be capable of being drawn out from the rear case 100-2.

A power supply unit 190 for supplying power to the portable terminal 100 may be mounted on the rear case 100-2. The power supply unit 190 may be a rechargeable battery, for example, and may be detachably coupled to the rear case 100-2 for charging or the like.

Meanwhile, in the present embodiment, the second camera 121b and the like are described as being disposed in the rear case 100-2, but the present invention is not limited thereto. Also, even if the second camera 121b is not separately provided, the first camera 121a may be formed to be rotatable so that it can capture images in the photographing direction of the second camera 121b.

FIG. 4 is a flowchart illustrating an embodiment of a method of operating a portable terminal according to the present invention, and FIGS. 5 to 11 are views for explaining various embodiments of the method of operating a portable terminal according to the present invention.

Referring to the drawings, first, a predetermined first image 510 is displayed on the display unit 151 (S410). The type of the first image 510 is not limited; for example, the first image 510 may be a home screen, an execution screen of a predetermined application, a web browsing screen, a video, a photo, or the like.

Subsequently, a multi-touch input including a first touch input and a second touch input is received (S420), and first drag inputs 520 and 530 in which the distance between the multi-touch inputs, more specifically the interval between the first touch input and the second touch input, increases are received (S430).

That is, the first drag inputs 520 and 530 may refer to an input made by multi-touching the display unit 151 with two fingers and then spreading the touched fingers apart, and such an input may be referred to as a pinch-out, pinch-open, or the like.

Referring to FIG. 5, the first drag inputs 520 and 530, in which the interval between the multi-touch inputs increases, start at the initial positions 521 and 531 of the first touch input and the second touch input and are dragged away from each other to the end positions 522 and 532. Accordingly, the interval between the first touch input and the second touch input increases from d1 to d2.
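
As an aid to reading the above, the following framework-agnostic Kotlin sketch shows one way such a pinch-out could be recognized from the start and end positions of the two touch points. It is only an illustration of the described gesture, not the patent's implementation, and all names are hypothetical.

```kotlin
import kotlin.math.hypot

// Illustrative sketch of steps S420-S430: detect a two-finger pinch-out
// (a first drag input in which the gap between the two touch points grows).
// Names and structure are hypothetical, not taken from the patent.
data class TouchPoint(val x: Float, val y: Float)

fun gap(a: TouchPoint, b: TouchPoint): Float = hypot(a.x - b.x, a.y - b.y)

/** Returns true when the gap grows from the start (d1) to the end (d2) of the drag. */
fun isPinchOut(
    firstStart: TouchPoint, secondStart: TouchPoint,   // initial positions 521, 531
    firstEnd: TouchPoint, secondEnd: TouchPoint        // end positions 522, 532
): Boolean {
    val d1 = gap(firstStart, secondStart)
    val d2 = gap(firstEnd, secondEnd)
    return d2 > d1
}

fun main() {
    val start1 = TouchPoint(300f, 500f); val start2 = TouchPoint(420f, 500f)
    val end1 = TouchPoint(100f, 500f); val end2 = TouchPoint(620f, 500f)
    println(isPinchOut(start1, start2, end1, end2)) // true: d2 > d1
}
```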

In this specification, a drag input is illustrated simply in the form of an arrow for convenience. The starting point of the arrow is the touch input point, the direction of the arrow indicates the dragging direction, and the end point of the arrow indicates the end position of the drag input, that is, the dropped position.

Meanwhile, the controller 180 may control to divide the screen of the display unit 151 into first to third regions 610, 630, and 620 based on the first drag input, as shown in FIG. 6. (S450)

In FIG. 6, the screen is divided into left and right directions, but the present invention is not limited thereto. For example, when the first drag input moves away in the vertical direction, the screen may be divided in the vertical direction.

Alternatively, the screen may be divided according to a predetermined direction / arrangement irrespective of the direction of the first drag input.

The controller 180 displays the first image A in the first area 610 and the second area 630, and may display objects 621, 622, 623, and 625 corresponding to predetermined content in the third area 620 disposed between the first area 610 and the second area 630 (S460).

That is, the current screen of the first image A being displayed is displayed by being torn to both sides through a gesture of opening by two fingers of the user, and a plurality of objects are displayed in the resulting space.

Therefore, by dividing the screen so as to correspond to a gesture such as opening or tearing the user's screen to the left or right, an intuitive user experience may be provided.

In operation S460, the same screen A of the first image may be displayed on the first area 610 and the second area 630. In this case, the user may view the same content regardless of whether the first area 610 or the second area 630 is viewed.

Alternatively, screens obtained by dividing the first image may be displayed in the first area 610 and the second area 630. For example, the left portion of the first image may be displayed in the first region 610, and the remaining portion of the first image may be displayed in the second region 630.
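
A minimal sketch of this three-way division (steps S450 and S460) is given below. The equal one-third widths are an illustrative assumption, since the patent does not fix the region proportions, and all names are hypothetical.

```kotlin
// Illustrative sketch of steps S450-S460: split the display into first/third/second
// regions (left, center, right). The equal one-third widths are an assumption made
// only for illustration; the patent does not specify the region proportions.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

data class SplitLayout(val firstRegion: Rect, val thirdRegion: Rect, val secondRegion: Rect)

fun splitHorizontally(screen: Rect): SplitLayout {
    val w = (screen.right - screen.left) / 3
    return SplitLayout(
        firstRegion = Rect(screen.left, screen.top, screen.left + w, screen.bottom),
        thirdRegion = Rect(screen.left + w, screen.top, screen.left + 2 * w, screen.bottom),
        secondRegion = Rect(screen.left + 2 * w, screen.top, screen.right, screen.bottom)
    )
}

fun main() {
    // First and second regions show the first image (the same screen or its two halves);
    // the third region shows objects such as recent content or clipboard data.
    println(splitHorizontally(Rect(0, 0, 1080, 1920)))
}
```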

The objects 621, 622, 623, and 625 corresponding to the predetermined content may include an object representing at least one of a video, a photo, and an application.

Also, the objects 621, 622, 623, and 625 corresponding to the predetermined content may include an object corresponding to the most recently used content.

In addition, the objects 621, 622, 623, and 625 corresponding to the predetermined content may include a graphic or text object 625 corresponding to data placed on a clipboard by cut or copy operations.

For example, as shown in FIG. 6, when the multi-touch interval is increased by a spreading gesture with both hands, the recently executed content list 621, 622, and 623 and the copied memo 625 may be displayed in the center area 620 of the screen.

The displaying of the objects (S460) may be performed when the first drag input satisfies a preset condition. The method may further include determining whether the first drag input satisfies a preset condition (S440).

That is, when the first drag input is received, the controller 180 determines whether the first drag input satisfies a preset condition (S440), and may control the screen to be divided when the preset condition is satisfied.

Here, the input satisfying the preset condition may be an input exceeding a reference set based on touch / drag time, touch / drag length, touch / drag area, touch / drag pressure, and the like.

For example, the preset condition may be a case in which at least one of the area of the first touch input and the second touch input, the drag length of the first drag input, and the interval between the first touch input and the second touch input is larger than a reference value. The user or the manufacturer may therefore set various combinations of conditions.

Referring to FIG. 5, the condition may be when the interval d2 at the end of the first drag inputs 520 and 530 is greater than the reference value, or when the drag length d3 + d4 is greater than the reference value.
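
The threshold test of step S440 might be sketched as follows. The specific reference values and the OR-combination of the end gap d2 and the drag length d3 + d4 are assumptions, since the patent leaves the combination of conditions open, and the names are hypothetical.

```kotlin
import kotlin.math.hypot

// Illustrative check for step S440: the drag qualifies when the end gap d2 or the
// total drag length d3 + d4 exceeds a reference value. The reference values and the
// OR-combination are assumptions; the patent also allows area- or pressure-based conditions.
data class TouchPoint(val x: Float, val y: Float)

fun dist(a: TouchPoint, b: TouchPoint): Float = hypot(a.x - b.x, a.y - b.y)

fun satisfiesPresetCondition(
    firstStart: TouchPoint, firstEnd: TouchPoint,
    secondStart: TouchPoint, secondEnd: TouchPoint,
    gapReference: Float = 600f,        // assumed reference for d2
    dragLengthReference: Float = 400f  // assumed reference for d3 + d4
): Boolean {
    val d2 = dist(firstEnd, secondEnd)
    val d3 = dist(firstStart, firstEnd)
    val d4 = dist(secondStart, secondEnd)
    return d2 > gapReference || (d3 + d4) > dragLengthReference
}

fun main() {
    val ok = satisfiesPresetCondition(
        firstStart = TouchPoint(400f, 800f), firstEnd = TouchPoint(100f, 800f),
        secondStart = TouchPoint(600f, 800f), secondEnd = TouchPoint(950f, 800f)
    )
    println(ok) // true: d2 = 850 > 600 and d3 + d4 = 650 > 400
}
```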

The condition may be a case where an area of the first touch input and the second touch input is larger than a reference value.

For example, a person's thumbs are larger than the other fingers, so the finger used by the user can be inferred from the area of the touch input. The controller 180 may determine the area of the touch input and perform the screen division and image display according to the present invention when the pinch-out operation is performed with the thumbs, while performing another matching operation when the pinch-out operation is performed with other fingers; by setting the operations in this way, a greater variety of inputs becomes possible.

Alternatively, when using a touch input tool such as a stylus pen, the operation can be set according to the touch area, the type of the input tool, and the like.

Meanwhile, as described above with reference to FIG. 1, the sensing unit 140 may include the proximity sensor 141 and the pressure sensor 143, and the proximity sensor 141 or the pressure sensor 143 can detect which side of the portable terminal 100 the user is holding.

Therefore, the method of operating a portable terminal according to an embodiment of the present invention may further include sensing a grip state of the portable terminal 100 through the sensing unit 140, and the preset condition may be a case in which grips on two opposing sides of the portable terminal 100 are sensed.

As shown in FIG. 9A, the proximity sensor 141 or the pressure sensor 143 may be disposed on two opposing side surfaces 910 of the portable terminal 100 to detect whether the user grips the portable terminal 100. Alternatively, the proximity sensor 141 or the pressure sensor 143 may be disposed on all four sides (upper, lower, left, and right) of the portable terminal 100 to determine which side the user is holding while using the terminal; when grips on two opposing sides are detected, for example, it can be determined that the condition is satisfied.
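
A sketch of this opposing-side grip condition follows. The four-side boolean model stands in for readings of the proximity or pressure sensors and is an assumption; the names are hypothetical.

```kotlin
// Illustrative sketch of the grip condition: the screen division is allowed only when
// grips are sensed on two opposing sides of the terminal. The four-side boolean model
// is an assumed stand-in for the proximity/pressure sensor readings.
data class GripState(
    val top: Boolean, val bottom: Boolean,
    val left: Boolean, val right: Boolean
)

fun hasOpposingGrip(grip: GripState): Boolean =
    (grip.left && grip.right) || (grip.top && grip.bottom)

fun main() {
    println(hasOpposingGrip(GripState(top = false, bottom = false, left = true, right = true)))  // true
    println(hasOpposingGrip(GripState(top = true, bottom = false, left = false, right = true)))  // false
}
```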

Meanwhile, referring to FIG. 9B, the preset condition may be a case in which the first drag input is terminated in predetermined areas 930 and 940. That is, the screen may be divided and the images and objects displayed only when the pinch-out operation is performed up to the predetermined areas 930 and 940 of the screen.

In this case, the predetermined areas 930 and 940 may be set as areas within the same predetermined distance from the screen side lines 931 and 941 adjacent to the bezel.
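
The end-position condition above can be sketched as a simple margin test against the left and right screen side lines; the margin value is an assumption and the names are hypothetical.

```kotlin
// Illustrative sketch of the end-position condition: the pinch-out only counts when
// both drag end points fall within a fixed margin of the left/right screen side lines
// (areas 930 and 940). The margin value is an assumption.
data class TouchPoint(val x: Float, val y: Float)

fun endsInEdgeAreas(
    leftEnd: TouchPoint, rightEnd: TouchPoint,
    screenWidth: Float, edgeMargin: Float = 80f
): Boolean {
    val inLeftArea = leftEnd.x <= edgeMargin
    val inRightArea = rightEnd.x >= screenWidth - edgeMargin
    return inLeftArea && inRightArea
}

fun main() {
    println(endsInEdgeAreas(TouchPoint(30f, 900f), TouchPoint(1050f, 900f), screenWidth = 1080f)) // true
    println(endsInEdgeAreas(TouchPoint(300f, 900f), TouchPoint(700f, 900f), screenWidth = 1080f)) // false
}
```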

Meanwhile, the method of operating a mobile terminal according to an embodiment of the present invention may further include receiving a second drag input of dragging one of the objects corresponding to the predetermined content to the first area 610 or the second area 630 (S470), and displaying the content corresponding to the dragged object in a fourth area in which the area corresponding to the second drag input and the third area are merged (S480).

For example, referring to FIGS. 6 and 7, when one of the objects 621, 622, 623, and 625 corresponding to the predetermined content is dragged to the first area 610, the controller 180 may adjust the divided screens and the displayed images.

The controller 180 may control the content B corresponding to the dragged object 621 to be displayed in the fourth area 400 in which the first area 610, to which the object was dragged, and the third area 620 are merged. If the content B corresponding to the dragged object 621 is an application, an execution screen of the application B may be displayed.

Meanwhile, the image displayed in the remaining area can be maintained. In the example of FIG. 7, the A image may be continuously displayed in the second region 630.

That is, when one of the recent contents (B) is selected and dropped on one of the two screens, that screen is switched to the selected content, while the previous screen A may remain displayed on the remaining screen. In addition, the ratio of the A and B screens can be adjusted by dragging the boundary between them.
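
The state change produced by this second drag input (steps S470 and S480) could be sketched as follows; the enum, the string placeholders for contents A and B, and the function names are hypothetical.

```kotlin
// Illustrative sketch of steps S470-S480: dropping an object from the third region onto
// the first (or second) region merges that region with the third region into a fourth
// region, which then shows the dropped content; the other region keeps image A.
// All names are hypothetical.
enum class TargetRegion { FIRST, SECOND }

data class SplitScreenState(val leftContent: String, val rightContent: String)

fun applySecondDrag(current: String, dropped: String, target: TargetRegion): SplitScreenState =
    when (target) {
        // First + third regions merge into the fourth region on the left.
        TargetRegion.FIRST -> SplitScreenState(leftContent = dropped, rightContent = current)
        // Second + third regions merge into the fourth region on the right.
        TargetRegion.SECOND -> SplitScreenState(leftContent = current, rightContent = dropped)
    }

fun main() {
    // Image A was on screen; object B is dropped onto the first (left) region.
    println(applySecondDrag(current = "A", dropped = "B", target = TargetRegion.FIRST))
    // -> left shows B (fourth region), right keeps A (second region)
}
```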

Meanwhile, the drag input of the user may match another operation. For example, when a user opens a gesture with two hands, the user may return to a previously executed screen, or may further display options, editing functions, or tag information related to content on the current screen.

Meanwhile, the method of operating a mobile terminal according to an embodiment of the present invention may further include receiving an input for dragging the content displayed in the fourth area to the upper or lower edge, and switching the screen divided left and right to a screen divided up and down.

In this case, when the content displayed in the fourth area is dragged to the upper edge, the content displayed in the fourth area may be displayed on the upper side, and when it is dragged to the lower edge, the content displayed in the fourth area may be displayed on the lower side.

For example, as shown in FIG. 7, if there is an input for dragging the content B displayed in the fourth area 710 to the upper edge 721, an image of the B application may be displayed in the upper area 810 of the screen divided up and down as shown in FIG. 8, and an image of the A application may be displayed in the lower area 820.

Also, if there is an input for dragging the content B displayed in the fourth area 710 to the lower edge 722, an image of the A application may be displayed in the upper area 1010 of the screen divided up and down as shown in FIG. 10, and an image of the B application may be displayed in the lower area 1020.

That is, the input for dragging a divided screen to an edge of the portable terminal may be an input corresponding to a change in the screen division direction, and whether the content is dragged to the upper or lower edge may determine where the dragged content is displayed on the newly divided screen.
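
The direction switch described above might look like the following sketch; the enum and names are hypothetical and only illustrate the mapping from the dropped edge to the final up/down arrangement.

```kotlin
// Illustrative sketch: dragging the content of the fourth area to the top or bottom edge
// switches the left/right split to an up/down split, with the dragged content placed on
// the side of the edge it was dropped on. Names are hypothetical.
enum class Edge { TOP, BOTTOM }

data class VerticalSplit(val upperContent: String, val lowerContent: String)

fun switchToVerticalSplit(dragged: String, other: String, edge: Edge): VerticalSplit =
    when (edge) {
        Edge.TOP -> VerticalSplit(upperContent = dragged, lowerContent = other)    // as in FIG. 8
        Edge.BOTTOM -> VerticalSplit(upperContent = other, lowerContent = dragged) // as in FIG. 10
    }

fun main() {
    println(switchToVerticalSplit(dragged = "B", other = "A", edge = Edge.TOP))    // B above A
    println(switchToVerticalSplit(dragged = "B", other = "A", edge = Edge.BOTTOM)) // A above B
}
```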

According to the present invention, if the displayed content (B) is dragged and dropped on the upper edge above the other content (A), the two contents (A, B) can be displayed up and down at the same time and executed simultaneously.

In addition, the two contents (A, B) can be switched up and down, and the aspect ratio can be adjusted. For example, by touching and dragging the boundary regions between the divided screens in the up, down, left, and right directions, the area of the divided screen and the ratio between the respective areas may be adjusted.
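
A boundary-drag resize of this kind could be sketched as below; the clamping to a minimum region size is an assumption made to keep both areas visible, and the names are hypothetical.

```kotlin
// Illustrative sketch: dragging the boundary between two divided screens moves the
// boundary and thereby changes each screen's size and the ratio between the areas.
// The minimum-region clamp is an assumption so that neither region disappears.
fun moveBoundary(
    boundary: Float,            // current boundary position (e.g. x for a left/right split)
    dragDelta: Float,           // how far the boundary was dragged
    screenExtent: Float,        // total width (or height) of the screen
    minRegionSize: Float = 100f
): Float = (boundary + dragDelta).coerceIn(minRegionSize, screenExtent - minRegionSize)

fun main() {
    println(moveBoundary(boundary = 540f, dragDelta = 200f, screenExtent = 1080f)) // 740.0
    println(moveBoundary(boundary = 540f, dragDelta = 900f, screenExtent = 1080f)) // 980.0 (clamped)
}
```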

According to the present invention, when there is a drag input satisfying a preset condition, the controller 180 can divide the screen and display content such as a recently used application, an application running in the background, or a video on one of the divided screens. In addition, when the user selects such content, the corresponding operation may be performed by multitasking, and an operation screen corresponding to its execution may be displayed in one area of the screen.

Meanwhile, the method of operating a mobile terminal according to an embodiment of the present invention may further include receiving an input for dragging the content displayed in the fourth area to the remaining area of the screen divided left and right, and displaying the images displayed on each of the left and right divided screens interchanged with each other.

Referring to FIG. 11, when the content B displayed in the fourth area 1110 is dragged to the remaining area 1120 of the screen divided left and right, the A application screen may be displayed in the fourth area 1110 and the B application screen may be displayed in the right area 1120.

Meanwhile, the method of operating a mobile terminal according to an embodiment of the present invention may further include receiving a drag input in which the interval between a plurality of touch input points decreases, and switching to a predetermined initial screen (home screen).

Alternatively, the drag input in which the intervals of the plurality of touch input points become smaller may be the exit command of the screen division mode described above, and the portable terminal 100 may return to the screen before the screen division in response to this.

Here, the drag input in which the distance between the plurality of touch input points decreases means an input made by multi-touching the display unit 151 with touch input tools such as two fingers or a stylus pen and then narrowing the interval between the touch input tools, and it may be referred to as a pinch-in input.

FIG. 12 is a flowchart illustrating an embodiment of a method of operating a portable terminal according to the present invention, and FIGS. 13 and 14 are views for explaining various embodiments of the method of operating a portable terminal according to the present invention.

Referring to the drawings, a plurality of objects 1311, 1312, and 1313 corresponding to content such as an application, a video, an image, and the like are displayed on at least a partial area 1310 of the display unit 151 (S1210).

Meanwhile, objects corresponding to a predetermined content may be displayed in another area 1320 in addition to the main area 1310.

Thereafter, an input for selecting two or more objects 1312 and 1313 among the plurality of objects 1311, 1312, and 1313 displayed on the display unit 151 is received (S1220), and a drag input for the selected objects 1312 and 1313 is received (S1230).

The selection and drag input may be a movement input operation connected to a predetermined object by an input of touching and dragging the object by a touch input means such as a user's finger or a stylus pen.

Meanwhile, the controller 180 may control to display the selected objects 1312 and 1313 while moving in response to the selection and drag input (S1240).

The controller 180 may divide the screen of the display unit 151 as soon as the interval between the selected objects 1312 and 1313 is reduced while the selected objects 1312 and 1313 are moved (S1260).

Alternatively, when the interval is smaller than the reference value R (S1250), it may be controlled to divide the screen of the display unit 151 (S1260).

For example, the reference value R may be set to '0'. In this case, the screen may be divided when the interval between the objects 1312 and 1313 becomes '0', that is, when the objects 1312 and 1313 overlap and merge.

The number of split screens may be equal to the number of selected objects. For example, as shown in FIG. 13, when two objects 1312 and 1313 are selected, the display may be divided into two screens 1350 and 1360, and when three objects are selected, the display may be divided into three screens.
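
One way to read steps S1250 to S1270 is sketched below: while the selected objects are dragged toward each other, the screen is divided into as many screens as there are selected objects once their mutual gap drops below the reference value R. The data model and names are hypothetical; R = 0 corresponds to the overlap case mentioned above.

```kotlin
import kotlin.math.hypot

// Illustrative sketch of steps S1250-S1270: the screen is divided (into as many screens
// as selected objects) once the gap between the dragged objects falls below the reference
// value R. The data model and names are illustrative assumptions.
data class ObjectPosition(val id: String, val x: Float, val y: Float)

fun maxPairwiseGap(objects: List<ObjectPosition>): Float {
    var maxGap = 0f
    for (i in objects.indices) for (j in i + 1 until objects.size) {
        val g = hypot(objects[i].x - objects[j].x, objects[i].y - objects[j].y)
        if (g > maxGap) maxGap = g
    }
    return maxGap
}

/** Returns the number of screens to divide into, or null if the merge has not happened yet. */
fun splitCountIfMerged(selected: List<ObjectPosition>, referenceR: Float): Int? =
    if (selected.size >= 2 && maxPairwiseGap(selected) < referenceR) selected.size else null

fun main() {
    val objects = listOf(ObjectPosition("B", 100f, 100f), ObjectPosition("C", 105f, 102f))
    println(splitCountIfMerged(objects, referenceR = 20f)) // 2 -> divide into two screens
}
```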

The controller 180 may control to display the contents B and C corresponding to the selected objects 1312 and 1313 on the divided screens 1350 and 1360 of the display unit 151, respectively. (S1270)

That is, as shown in FIG. 13, when objects such as icons corresponding to a plurality of contents are displayed, for example on a recent content list screen, and two contents are merged, the two contents can be displayed one above the other and run simultaneously as multitasking.

Meanwhile, two contents can be switched up and down, and the aspect ratio can also be adjusted.

For example, if the user drags the boundary area 1355 as much as the user desires, the size and ratio of the divided screens may be changed while the boundary area 1355 moves.

In addition, the arrangement of the divided screen may be the same as the arrangement of the selected objects.

For example, when two objects 1312 and 1313 arranged vertically as shown in FIG. 13 are selected, the screens 1350 and 1360 may be divided in the vertical direction.

Alternatively, when two objects 1421 and 1422 are selected from among the objects 1421, 1422, 1423, and 1424 arranged left and right as illustrated in FIG. 14, the screen may be divided into left and right screens 1450 and 1460, and images corresponding to the D and E contents may be displayed on the respective screens.

Alternatively, when one object displayed on the upper first area 1410 and one object displayed on the lower second area 1420 are selected and merged, the screen may be divided up and down.
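
The dependence of the split direction on how the selected objects are arranged could be sketched as follows; the tie-break rule and names are assumptions.

```kotlin
import kotlin.math.abs

// Illustrative sketch: the divided screens follow the arrangement of the two selected
// objects - objects stacked vertically give an up/down split, objects placed side by
// side give a left/right split. Names and the tie-break rule are assumptions.
enum class SplitDirection { UP_DOWN, LEFT_RIGHT }

data class ObjectPosition(val x: Float, val y: Float)

fun splitDirectionFor(a: ObjectPosition, b: ObjectPosition): SplitDirection =
    if (abs(a.y - b.y) >= abs(a.x - b.x)) SplitDirection.UP_DOWN else SplitDirection.LEFT_RIGHT

fun main() {
    println(splitDirectionFor(ObjectPosition(200f, 300f), ObjectPosition(210f, 900f))) // UP_DOWN
    println(splitDirectionFor(ObjectPosition(100f, 500f), ObjectPosition(700f, 520f))) // LEFT_RIGHT
}
```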

As described with reference to FIGS. 1 to 14, the present invention can conveniently divide a screen and display various contents by only one operation.

According to the present invention, it is possible to quickly check other contents, such as recent contents, without disturbing use of the current screen, and to display two contents simultaneously on the display unit 151.

In addition, one or more contents can be simultaneously executed without screen switching, and multitasking can be performed through quick and easy switching between contents.

Meanwhile, a plurality of contents can be displayed at a similar ratio so that both contents are used as main windows, or the ratio of the two divided content screens can be adjusted so that they are used as a main window and a sub window, respectively.

In addition, it is possible to provide a clipboard that can drag not only recent content but also previously copied content, and can be used for memo, document work, and simple editing work.

The portable terminal and the method of operating the same according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the present invention can be implemented as a processor-readable code in a processor-readable recording medium provided in a portable terminal. The processor-readable recording medium includes all kinds of recording apparatuses in which data that can be read by the processor is stored. Examples of the recording medium readable by the processor include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and also a carrier wave such as transmission over the Internet. In addition, the processor readable recording medium may be distributed over networked computer systems so that code readable by the processor in a distributed manner can be stored and executed.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. It will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the present invention.

110: wireless communication unit 120: A/V input unit
130: user input unit 140: sensing unit
150: output unit 151: display unit
160: memory 170: interface unit
180: controller

Claims (20)

Displaying a first image on a display unit;
Receiving a multi-touch input comprising a first touch input and a second touch input;
Receiving a first drag input in which an interval between the first touch input and the second touch input is increased; And
Dividing the screen of the display unit into first to third regions, displaying the first image in the first region and the second region, and displaying objects corresponding to predetermined content in the third region disposed between the first region and the second region.
The method of claim 1,
The displaying of the objects may be performed when the first drag input satisfies a preset condition.
3. The method of claim 2,
The preset condition may be a case where at least one of an area of the first touch input and the second touch input, a drag length of the first drag input, and an interval between the first touch input and the second touch input is larger than a reference value. A method of operating a mobile terminal, characterized in that.
3. The method of claim 2, further comprising:
Detecting a grip state of the portable terminal;
Wherein the predetermined condition is a case where a grip on two opposite sides of the portable terminal is detected.
3. The method of claim 2,
The preset condition is a case in which the first drag input is terminated in a predetermined region.
The method of claim 1,
The objects corresponding to the predetermined contents may include objects representing at least one of a video, a photo, and an application.
The method of claim 1,
And the objects corresponding to the predetermined contents are objects corresponding to the most recently used contents.
The method of claim 1,
And displaying the same screen of the first image on the first area and the second area.
The method of claim 1,
Receiving a second drag input of dragging one of the objects corresponding to the predetermined content to the first area or the second area; and
And displaying content corresponding to the dragged object in a fourth region in which the region corresponding to the second drag input and the third region are merged.
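The following Kotlin sketch is a hedged reading of claim 9, with all names assumed: dropping one of the third-region objects onto the first or second area merges that area with the third area into a fourth area that shows the dropped object's content.

// Assumed model of claim 9: an object dragged from the third area into the
// first or second area produces a fourth area (target area + third area merged)
// that displays the dragged object's content; the other area is left unchanged.
enum class DropTarget { FIRST, SECOND }

data class ThreeRegionState(
    val firstContent: String,
    val secondContent: String,
    val thirdObjects: List<String>
)

data class MergedState(
    val fourthContent: String,    // content corresponding to the dragged object
    val remainingContent: String  // content kept by the untouched area
)

fun onObjectDropped(state: ThreeRegionState, dragged: String, target: DropTarget): MergedState {
    require(dragged in state.thirdObjects) { "the dragged object must come from the third area" }
    return when (target) {
        DropTarget.FIRST -> MergedState(dragged, state.secondContent)   // first + third merge
        DropTarget.SECOND -> MergedState(dragged, state.firstContent)   // second + third merge
    }
}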
10. The method of claim 9,
Receiving an input for dragging a content displayed in the fourth area to an upper or lower edge; and
And switching the screen divided left and right into a screen divided up and down.
11. The method of claim 10,
A method of operating a mobile terminal, characterized in that, in the screen switching step, when the content displayed in the fourth area is dragged to the upper corner, the content displayed in the fourth area is displayed on the upper side, and when the content displayed in the fourth area is dragged to the lower corner, the content displayed in the fourth area is displayed on the lower side.
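As an assumed illustration of claims 10 and 11 (the names and types are invented for the example), the Kotlin sketch below turns the left/right split into a top/bottom split and places the dragged content on the side it was dragged toward.

// Hypothetical orientation switch for claims 10 and 11: dragging the fourth
// area's content to the upper or lower edge converts the left/right split into
// an up/down split, keeping the dragged content on the side it was dragged to.
enum class Edge { TOP, BOTTOM }

data class VerticalSplit(val upperContent: String, val lowerContent: String)

fun switchOrientation(fourthContent: String, otherContent: String, draggedTo: Edge): VerticalSplit =
    when (draggedTo) {
        Edge.TOP -> VerticalSplit(upperContent = fourthContent, lowerContent = otherContent)
        Edge.BOTTOM -> VerticalSplit(upperContent = otherContent, lowerContent = fourthContent)
    }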
8. The method of claim 7,
Receiving an input for dragging the content displayed on the fourth area to the remaining area of the screen divided left and right; and
And displaying the images shown on the left and right divided screens with their positions interchanged.
The method of claim 1, further comprising:
Receiving a drag input in which a distance between a plurality of touch input points is reduced; and
Switching to a preset initial screen.
A mobile terminal comprising:
A display unit configured to display a first image; and
A controller configured to divide a screen of the display unit into first to third areas when receiving a multi-touch input including a first touch input and a second touch input, and a first drag input in which an interval between the first touch input and the second touch input is increased,
Wherein the controller controls to display the first image in the first area and the second area, and to display objects corresponding to predetermined content in the third area disposed between the first area and the second area.
15. The mobile terminal of claim 14,
And the controller divides the screen of the display unit when the first drag input satisfies a preset condition.
15. The mobile terminal of claim 14,
When the controller receives a second drag input of dragging one of the objects corresponding to the predetermined content to the first area or the second area, the controller controls to display content corresponding to the dragged object in a fourth area in which the area corresponding to the second drag input and the third area are merged.
17. The mobile terminal of claim 16,
When receiving an input for dragging the content displayed in the fourth area to the upper or lower corner, the controller controls to switch the screen divided left and right into a screen divided up and down.
Displaying a plurality of objects on a display unit;
Receiving an input for selecting two or more objects of the plurality of objects;
Receiving a drag input for the selected objects;
Moving the selected objects in response to the drag input; And
And displaying the content corresponding to the selected objects on the divided screen when the interval between the selected objects becomes small.
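As a rough Kotlin illustration of claims 18 to 20 (the names, the purely horizontal layout, and the use of each object's x position are assumptions), bringing two or more selected objects together divides the screen into one region per object, arranged in the same order as the objects.

// Assumed model of claims 18-20: one screen region per selected object, ordered
// so that the arrangement of the regions mirrors the arrangement of the objects.
data class SelectedObject(val content: String, val x: Float)
data class ScreenSlot(val left: Float, val right: Float, val content: String)

fun splitForObjects(selected: List<SelectedObject>, screenWidth: Float): List<ScreenSlot> {
    require(selected.size >= 2) { "two or more objects must be selected" }
    val width = screenWidth / selected.size            // number of regions == number of objects
    return selected
        .sortedBy { it.x }                             // keep the objects' on-screen order
        .mapIndexed { i, obj -> ScreenSlot(i * width, (i + 1) * width, obj.content) }
}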
19. The method of claim 18,
And the number of divided screens is the same as the number of the selected objects.
19. The method of claim 18,
The arrangement of the divided screens is the same as the arrangement of the selected objects.
KR1020120107133A 2012-09-26 2012-09-26 Mobile terminal and operating method for the same KR20140040457A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120107133A KR20140040457A (en) 2012-09-26 2012-09-26 Mobile terminal and operating method for the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120107133A KR20140040457A (en) 2012-09-26 2012-09-26 Mobile terminal and operating method for the same

Publications (1)

Publication Number Publication Date
KR20140040457A true KR20140040457A (en) 2014-04-03

Family

ID=50650650

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120107133A KR20140040457A (en) 2012-09-26 2012-09-26 Mobile terminal and operating method for the same

Country Status (1)

Country Link
KR (1) KR20140040457A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11249592B2 (en) 2015-01-16 2022-02-15 Samsung Electronics Co., Ltd. Method of splitting display area of a screen and electronic device for processing the same
CN111324398A (en) * 2018-12-14 2020-06-23 Oppo广东移动通信有限公司 Recent content processing method, device, terminal and storage medium
CN111324398B (en) * 2018-12-14 2024-02-09 Oppo广东移动通信有限公司 Method, device, terminal and storage medium for processing latest content

Similar Documents

Publication Publication Date Title
KR101252169B1 (en) Mobile terminal and operation control method thereof
KR101657549B1 (en) Mobile terminal and control method thereof
KR20120001476A (en) Mobile terminal and operation control method thereof
KR101767504B1 (en) Mobile terminal and operation method thereof
KR20140080007A (en) Image display apparatus and method for operating the same
KR20140143610A (en) Mobile terminal and operation method thereof
KR20150068175A (en) Operating Method for Mobil terminal
KR101657543B1 (en) Mobile terminal and operation control method thereof
KR101720498B1 (en) Mobile terminal and operation control method thereof
KR101263861B1 (en) Mobile communication terminal and operation method thereof
KR20150039999A (en) Mobile terminal and operation method thereof
KR20150051757A (en) Mobile terminal and operation method thereof
KR20140040457A (en) Mobile terminal and operating method for the same
KR20140144562A (en) Method of operating a Mobile Terminal
KR101883179B1 (en) Mobile terminal and operation method thereof
KR101716133B1 (en) Mobile terminal and operation control method thereof
KR20150010182A (en) Mobile terminal and operation method thereof
KR101982775B1 (en) Mobile terminal and Operationg method thereof
KR101799320B1 (en) Mobile terminal and operation method thereof
KR101758177B1 (en) Mobile terminal and operation control method thereof
KR101679572B1 (en) Electronic device and operation control method thereof
KR101694157B1 (en) Mobile terminal and operation control method thereof
KR101687550B1 (en) Mobile terminal and operation method thereof
KR20140133073A (en) Method of operating a Mobile Terminal
KR20150086764A (en) Mobile terminal and operation method thereof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination