KR20140044433A - Mobile terminal and controlling method of mobile terminal - Google Patents

Mobile terminal and controlling method of mobile terminal

Info

Publication number
KR20140044433A
KR20140044433A (application KR1020120110372A)
Authority
KR
South Korea
Prior art keywords
page
function
mobile terminal
hidden
home screen
Prior art date
Application number
KR1020120110372A
Other languages
Korean (ko)
Other versions
KR102046462B1 (en)
Inventor
정재훈
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020120110372A
Publication of KR20140044433A
Application granted
Publication of KR102046462B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Abstract

The present invention relates to a mobile terminal and a control method of the mobile terminal.
According to the present invention, the mobile terminal stores a hidden function in association with each page constituting the home screen, and when a specific touch gesture requesting execution of a hidden function is input on a particular page of the home screen, the mobile terminal executes the hidden function associated with that page.

Description

Mobile terminal and controlling method of mobile terminal

The present invention relates to a mobile terminal and a control method of the mobile terminal.

A terminal can be divided into a mobile terminal and a stationary terminal depending on whether it is movable. A mobile terminal may in turn be divided into a handheld terminal and a vehicle-mounted terminal according to whether the user can carry it directly.

Recently, with the increasing use of smartphones, tablet PCs, and the like, various methods, such as the home screen, have been proposed for efficiently managing the applications installed in a terminal. On the home screen, the user may arrange and use items such as icons and widgets that correspond to frequently used functions and applications.

Meanwhile, as more and more applications are installed in terminals, the number of items arranged on the home screen is also increasing. Accordingly, methods for supporting faster and more convenient access to the applications or functions frequently used by a user have been proposed.

An object of the present invention is to provide a mobile terminal and a method of controlling the mobile terminal for improving accessibility to an application or a function frequently used by a user.

According to an aspect of the present invention, a mobile terminal includes: a memory configured to store, for each of at least one page constituting a home screen, a hidden function that is not displayed on the screen; a touch screen; and a controller configured to enter a hidden function setting mode for the home screen when a specific user input is received in a home screen editing mode, to store a function selected in the hidden function setting mode as the hidden function corresponding to one of the at least one page, and, when a touch gesture requesting execution of a hidden function is input while one of the pages constituting the home screen is displayed, to execute the hidden function corresponding to the displayed page.

A control method of a mobile terminal according to another aspect of the present invention includes: entering a home screen editing mode; entering a hidden function setting mode based on a user input received in the home screen editing mode; when any one function is selected in the hidden function setting mode, storing the selected function as a hidden function corresponding to one of at least one page constituting the home screen; displaying one of the at least one page constituting the home screen; and, when a touch gesture requesting execution of a hidden function is input on the displayed page, executing the hidden function corresponding to the displayed page.

According to the mobile terminal and the control method of the mobile terminal of the present invention, the mobile terminal supports a hidden function for each page of the home screen and allows the hidden function to be executed with only a simple touch operation, which has the effect of improving accessibility to frequently used applications or functions.

1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
2 is a flowchart illustrating a control method of a mobile terminal according to a first embodiment of the present invention.
3 and 4 illustrate an example of setting a user-customized hidden function in a mobile terminal according to the first embodiment of the present invention.
5 illustrates an example of setting a multitasking function as a hidden function in a mobile terminal according to the first embodiment of the present invention.
6 to 9 illustrate examples of executing a hidden function in a mobile terminal according to the first embodiment of the present invention.
10 is a flowchart illustrating a control method of a mobile terminal according to a second embodiment of the present invention.
11 and 12 illustrate an example of rearranging items in a home screen editing mode in a mobile terminal according to a second embodiment of the present invention.
13 is a flowchart illustrating a control method of a mobile terminal according to a third embodiment of the present invention.
14 and 15 illustrate an example of rearranging items in a home screen editing mode in a mobile terminal according to a third embodiment of the present invention.
16 is a flowchart illustrating a control method of a mobile terminal according to a fourth embodiment of the present invention.
17 and 18 illustrate an example of setting a multitasking function on a hidden page in a mobile terminal according to a fourth embodiment of the present invention.
19 illustrates an example of executing a multitasking function using a hidden page in a mobile terminal according to a fourth embodiment of the present invention.
20 is a flowchart illustrating a control method of a mobile terminal according to a fifth embodiment of the present invention.
21 illustrates an example of displaying a previous activity in a preview form in a mobile terminal according to the fifth embodiment of the present invention.

The above objects, features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. It is to be understood, however, that the invention is not limited to the disclosed embodiments, but is intended to cover various modifications and equivalents. Like reference numerals designate like elements throughout the specification. In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail. In addition, ordinal numbers (e.g., first, second, etc.) used in the description of the present invention are merely identifiers for distinguishing one component from another component.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no other elements in between.

In addition, the suffixes "module" and "part" for the components used in the following description are given or used interchangeably only for ease of preparing the specification, and do not have distinct meanings or roles by themselves.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

1 is a block diagram of a mobile terminal according to embodiments of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules for enabling wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, the broadcast receiving module 111 may use digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H (Digital Video Broadcast-Handheld), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems that provide broadcast signals as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 refers to a module for wireless Internet access, and the wireless Internet module 113 can be embedded in the mobile terminal 100 or externally. WLAN (WiFi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The location information module 115 is a module for confirming or obtaining the location of the mobile terminal. A typical example of the location information module is a GPS (Global Positioning System) module. According to the current technology, the GPS module 115 calculates distance information from three or more satellites to one point (object) and information on the time at which the distance information was measured, and then applies trigonometry to the calculated information to obtain three-dimensional position information of the point according to latitude, longitude, and altitude at a given time. Furthermore, a method of calculating position and time information using three satellites and correcting the error of the calculated position and time information using another satellite is also used. The GPS module 115 continuously calculates the current position in real time and uses it to calculate speed information.

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by an image sensor in the video call mode or the photographing mode. The processed image frames may be displayed on the display module 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the terminal.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form that can be transmitted to a mobile communication base station through the mobile communication module 112 and then output. Various noise removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, and the orientation of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, the sensing unit 140 may sense whether the slide phone is opened or closed. In addition, it may be responsible for sensing functions related to whether the power supply unit 190 supplies power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor.

The output unit 150 generates output related to the visual, auditory, or tactile senses and may include a display module 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display module 151 displays information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display module 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This may be referred to as a transparent display. A typical example of the transparent display is a transparent LCD or the like. The rear structure of the display module 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display module 151 of the terminal body.

There may be two or more display modules 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display modules may be spaced apart or integrally arranged on one surface, and may be disposed on different surfaces, respectively.

In a case where the display module 151 and a sensor for sensing a touch operation (hereinafter, 'touch sensor') form a mutual layer structure (hereinafter, 'touch screen'), the display module 151 may also be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display module 151 or a capacitance generated in a specific portion of the display module 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display module 151 has been touched.

Referring to FIG. 1, a proximity sensor may be disposed in an inner region of the mobile terminal surrounded by the touch screen or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface, or an object existing nearby, using the force of an electromagnetic field or infrared rays without mechanical contact. The proximity sensor has a longer life span than a contact sensor, and its utility is also high.

Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

When the touch screen is a capacitive type, it is configured to detect the proximity of a pointer based on a change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without the pointer contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer is proximity-touched.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than the video signal or the audio signal, for example, vibration. The video signal or audio signal may also be output through the display module 151 or the audio output module 152.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as an effect of stimulation by a pin arrangement moving vertically with respect to the contacted skin surface, an effect of stimulation by an air jetting force or suction force through a jet opening or a suction opening, an effect of stimulation through contact with an electrode, an effect of stimulation by an electrostatic force, and an effect of reproducing a cold or warm sensation using a heat absorbing or heat emitting element.

The haptic module 154 may be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscle sense of a finger or an arm. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), and a programmable read-only memory (PROM). The mobile terminal 100 may operate in association with a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip that stores various kinds of information for authenticating the usage right of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user to the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as a signal for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs related control and processing for voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing functions. In some cases, such embodiments may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that perform at least one function or operation. The software code may be implemented by a software application written in a suitable programming language. In addition, the software code may be stored in the memory 160 and executed by the controller 180.

Embodiments disclosed in this document may be implemented in the mobile terminal 100 described with reference to FIG. 1. Hereinafter, the operation of the mobile terminal 100 for the implementation of the embodiments disclosed in this document will be described in more detail.

In this document, the display module 151 is assumed to be a touch screen 151. As described above, the touch screen 151 can perform both the information display function and the information input function. However, it should be clearly understood that the present invention is not limited thereto.

In this document, a touch gesture refers to a gesture implemented by a contact touch or a proximity touch on the touch screen 151, and a touch input means an input received by a touch gesture.

Touch gestures are classified into tapping, dragging, flicking, press, multi-touch, pinch-in, and pinch-out according to the operation.

Tapping refers to a touch gesture of lightly pressing and releasing the touch screen 151 once, like a mouse click on a general computer.

Dragging refers to an operation of touching the touch screen 151 and moving the touch to a specific position while maintaining contact; when an object is dragged, the object is continuously displayed while moving in the drag direction.

Flicking refers to an operation of touching the touch screen 151, moving the contact in a specific direction (up, down, left, right, or diagonal), and then releasing the contact point. When a touch input is received by flicking, the mobile terminal 100 processes a specific operation based on the flicking direction, speed, and the like. For example, a page-turning operation of an e-book may be performed based on the flicking direction.

A press refers to an operation of touching the touch screen 151 and then continuously holding the touch for a predetermined time or longer.

In addition, the multi-touch means an operation of touching a plurality of points of the touch screen 151 at the same time.

Pinch-in refers to an operation of dragging a plurality of multi-touching pointers on the touch screen 151 in directions that bring them closer to each other. That is, a pinch-in is a drag that starts from at least one of a plurality of multi-touched points on the touch screen 151 and proceeds in a direction in which the multi-touched points approach each other.

Pinch-out refers to an operation of dragging a plurality of multi-touching pointers on the touch screen 151 in directions that move them away from each other. That is, a pinch-out is a drag that starts from at least one of a plurality of multi-touched points on the touch screen 151 and proceeds in a direction in which the multi-touched points move away from each other.
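
To make the pinch distinction concrete, the following sketch classifies a two-pointer drag as a pinch-in or a pinch-out by comparing the distance between the pointers at the start and at the end of the drag. This is an illustrative sketch only; the jitter threshold and all names are assumptions and are not prescribed by the patent.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

enum class PinchGesture { PINCH_IN, PINCH_OUT, NONE }

private fun distance(a: Point, b: Point) = hypot(a.x - b.x, a.y - b.y)

// Compare the pointer separation before and after the drag. If the pointers
// moved toward each other the gesture is a pinch-in; away from each other, a pinch-out.
fun classifyPinch(
    startA: Point, startB: Point,
    endA: Point, endB: Point,
    slopPx: Float = 24f          // assumed tolerance against touch jitter
): PinchGesture {
    val delta = distance(endA, endB) - distance(startA, startB)
    return when {
        delta < -slopPx -> PinchGesture.PINCH_IN
        delta > slopPx  -> PinchGesture.PINCH_OUT
        else            -> PinchGesture.NONE
    }
}

fun main() {
    // Pointers moving toward each other: pinch-in (e.g. used to enter the edit mode).
    println(classifyPinch(Point(100f, 500f), Point(400f, 500f),
                          Point(200f, 500f), Point(300f, 500f)))
    // Pointers moving apart: pinch-out.
    println(classifyPinch(Point(200f, 500f), Point(300f, 500f),
                          Point(100f, 500f), Point(400f, 500f)))
}
```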

Hereinafter, a method of controlling a mobile terminal and an operation of the mobile terminal for implementing the same according to the first embodiment of the present invention will be described in detail with reference to the accompanying drawings.

2 is a flowchart illustrating a control method of a mobile terminal according to the first embodiment of the present invention.

Referring to FIG. 2, the controller 180 associates a hidden function with each page of the home screen and stores it in the memory 160 (S101). The home screen is a screen for arranging and managing items, such as widgets and icons, that serve as shortcuts to specific functions or applications. The home screen may include at least one page. A hidden function is a function that is not displayed on the home screen but is executed when a specific touch gesture is input.
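
As a minimal illustration of the page-to-hidden-function association of step S101, the sketch below keeps an in-memory map from a page index to a hidden function and runs the stored function when execution is requested on that page. All names (HiddenFunctionStore, HiddenFunction, the default data-usage preview) are illustrative assumptions, not part of the patent; a real terminal would persist this mapping in the memory 160.

```kotlin
// Illustrative sketch only: the patent does not prescribe this data structure.
// A hidden function is modeled as a label plus the action to run.
data class HiddenFunction(val label: String, val action: () -> Unit)

class HiddenFunctionStore(private val defaultFunction: HiddenFunction? = null) {
    private val byPage = mutableMapOf<Int, HiddenFunction>()

    // Corresponds to step S101: associate a hidden function with a home screen page.
    fun setForPage(pageIndex: Int, function: HiddenFunction) {
        byPage[pageIndex] = function
    }

    // Corresponds to steps S103/S104: when the gesture is received on the currently
    // displayed page, look up and run the associated hidden function, falling back
    // to a default (e.g. a data-usage preview) when nothing is set for the page.
    fun executeForPage(pageIndex: Int): Boolean {
        val function = byPage[pageIndex] ?: defaultFunction ?: return false
        function.action()
        return true
    }
}

fun main() {
    val store = HiddenFunctionStore(
        defaultFunction = HiddenFunction("Data usage preview") { println("Showing data usage") }
    )
    store.setForPage(0, HiddenFunction("Today's schedule preview") { println("Showing schedule") })

    store.executeForPage(0) // prints "Showing schedule"
    store.executeForPage(3) // no entry for page 3: prints "Showing data usage"
}
```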

Meanwhile, various embodiments of the present disclosure may provide a method of storing a hidden function corresponding to each page of a home screen.

First, the controller 180 may automatically select the hidden function corresponding to each page according to the items arranged on that page. In this case, when a page includes an item for which related information can be previewed, the controller 180 may select the related-information preview function of the corresponding item as the hidden function. For example, the controller 180 may select a preview function for today's schedule information as the hidden function of a page on which a calendar widget is arranged. Likewise, the controller 180 may select a preview function for the current battery status information as the hidden function of a page on which a power management widget is arranged, and a preview function for currently running task information as the hidden function of a page on which a task manager widget is arranged. On the other hand, for a page on which a preview of related information is unnecessary, or which contains only items that do not support a preview, such as games, the controller 180 may select a common function, for example, a preview function for data usage information, as the hidden function.
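
A hedged sketch of the automatic selection rule described above: the hidden function is derived from the widgets placed on the page, with a common default for pages that contain only items that do not support a preview. The widget categories and function labels are assumptions chosen to mirror the examples in the text.

```kotlin
// Illustrative widget types; the patent names calendar, power management and
// task manager widgets as examples that support a related-information preview.
enum class WidgetType { CALENDAR, POWER_MANAGER, TASK_MANAGER, GAME, OTHER }

data class HomePage(val index: Int, val widgets: List<WidgetType>)

// Pick a hidden function for a page from the items arranged on it.
fun selectHiddenFunction(page: HomePage): String = when {
    WidgetType.CALENDAR in page.widgets      -> "Preview today's schedule"
    WidgetType.POWER_MANAGER in page.widgets -> "Preview battery status"
    WidgetType.TASK_MANAGER in page.widgets  -> "Preview running tasks"
    // Pages with only non-previewable items (e.g. games) get a common default.
    else                                     -> "Preview data usage"
}

fun main() {
    println(selectHiddenFunction(HomePage(0, listOf(WidgetType.CALENDAR, WidgetType.GAME))))
    println(selectHiddenFunction(HomePage(1, listOf(WidgetType.GAME))))
}
```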

In addition, the controller 180 may automatically select the hidden function according to the area in which the touch gesture requesting execution of the hidden function is input. In this case, the controller 180 may set a hidden function for each area of the page in advance according to the item arrangement information, or step S101 may be omitted and a hidden function may be selected whenever a touch gesture is input. In the former case, the controller 180 may divide the page into areas for the items arranged on it and select the hidden function of each area according to the item arranged in that area. For example, the preview function for information related to the item arranged in each area may be selected as the hidden function of that area, as in the sketch below.
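
Where the hidden function depends on the area in which the gesture is input, the page can be partitioned into regions derived from the item layout and the touch coordinates matched against those regions. The region rectangles, pixel values, and function names below are hypothetical and serve only to illustrate the lookup.

```kotlin
// A screen region associated with the item placed there and its preview function.
data class Region(
    val left: Int, val top: Int, val right: Int, val bottom: Int,
    val hiddenFunction: String
) {
    fun contains(x: Int, y: Int) = x in left until right && y in top until bottom
}

// Resolve the hidden function from the coordinates of the gesture on the page.
fun hiddenFunctionAt(regions: List<Region>, x: Int, y: Int): String? =
    regions.firstOrNull { it.contains(x, y) }?.hiddenFunction

fun main() {
    val regions = listOf(
        Region(0, 0, 540, 400, "Preview today's schedule"),   // calendar widget area
        Region(0, 400, 540, 800, "Preview battery status")    // power widget area
    )
    println(hiddenFunctionAt(regions, 100, 120))  // schedule preview
    println(hiddenFunctionAt(regions, 100, 600))  // battery preview
    println(hiddenFunctionAt(regions, 100, 900))  // null: no region matched
}
```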

In addition, the controller 180 may provide a user interface for setting a hidden function, and through this, may set a user-specific hidden function corresponding to each page.

3 and 4 illustrate an example of setting a user-customized hidden function.

Referring to FIG. 3, as the controller 180 enters the home screen edit mode, a home screen edit screen 3, in which a plurality of pages constituting the home screen are collected on one screen, is displayed on the touch screen 151, as shown in FIG. 3(a). The controller 180 displays the home screen edit screen 3 using thumbnails of the plurality of pages P1 to P6 constituting the home screen. In addition, the controller 180 may display icons h1 to h3 indicating which hidden functions correspond to the pages P1 to P3, among the plurality of pages, for which a hidden function is set.

Meanwhile, the home screen edit screen includes a button B1 for requesting the setting of a hidden function, as shown in FIG. 3(a). If the button B1 is selected while a specific page P6 is selected, the controller 180 enters the hidden function setting mode for the selected page P6. Accordingly, the controller 180 displays on the touch screen 151 a list 4 including at least one function that can be set as a hidden function for the selected page P6. Referring to FIG. 3(b), a quick voice function, a quick memo function, and a user-defined multitasking function may be set as hidden functions. In addition, functions such as displaying the remaining battery time, the current data usage and remaining amount, terminal software update information, the memory usage status, and the recent call history can be set as hidden functions. Meanwhile, FIG. 3 merely illustrates examples of functions that can be set as hidden functions, and the present invention is not limited thereto. According to the present invention, more or fewer functions than those in the function list 4 shown in FIG. 3(b) may be configured to be settable as hidden functions.

Referring to FIG. 4, as the battery remaining time display function 4a is selected from the function list 4 of functions that can be set as hidden functions, as shown in FIG. 4(a), the controller 180 associates the selected battery remaining time display function 4a, as a hidden function, with the page P6 selected on the home screen edit screen. Accordingly, the controller 180 displays an icon h4 indicating that the battery remaining time display function is set as a hidden function on the home screen edit screen 3, as shown in FIG. 4(b).

The controller 180 may set a function of executing a plurality of applications by multitasking as a user-customized hidden function based on a user's selection input.

5 illustrates an example of setting a multitasking function as a hidden function.

Referring to FIG. 5, as the user-defined multitasking function 4b is selected from the function list 4 of functions that can be set as hidden functions, as shown in FIG. 5(a), the controller 180 displays on the touch screen 151 a menu screen 5 for selecting the applications to be executed by multitasking, as shown in FIG. 5(b). When the applications App1 to App4 are selected on the menu screen 5, the multitasking function for the applications App1 to App4 is set as the hidden function. Meanwhile, when the multitasking applications are selected, the controller 180 may set a priority for each application, and the set priority may be displayed in combination with each application App1 to App4, as shown in FIG. 5(b). The controller 180 may set the priority of each application based on the order in which the applications to be executed by multitasking are selected, or based on the characteristics of each selected application. In the former case, the controller 180 may set a higher priority for an application that is selected earlier.
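
A minimal sketch of the priority rule mentioned above, assuming priority is derived from the order in which the applications are selected (the earlier the selection, the higher the priority). The App1 to App4 names follow the figure description; the entry type is an assumption.

```kotlin
data class MultitaskEntry(val appName: String, val priority: Int)

// Assign priorities from selection order: the first selected app gets priority 1.
fun buildMultitaskSet(selectionOrder: List<String>): List<MultitaskEntry> =
    selectionOrder.mapIndexed { index, app -> MultitaskEntry(app, priority = index + 1) }

fun main() {
    val entries = buildMultitaskSet(listOf("App1", "App2", "App3", "App4"))
    entries.forEach { println("${it.appName} -> priority ${it.priority}") }
}
```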

Referring back to FIG. 2, the controller 180 displays a specific page of the home screen on the touch screen 151 based on a user control input (S102).

While the home screen is displayed, when the controller 180 receives a specific touch gesture requesting execution of the hidden function (S103), the controller 180 checks the hidden function corresponding to the currently displayed page and executes it (S104).

In step S103, the touch gesture for requesting execution of the hidden function is distinguished from the touch gesture for requesting execution of an item arranged on the home screen. For example, the touch gesture for requesting execution of the hidden function may be a touch gesture of touching the touch screen 151 more than once within a preset time, or a touch gesture of touching an area of the home screen where no item is arranged for a preset time or longer.
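
A rough sketch of how the "two touches within a preset time" gesture could be told apart from an ordinary single tap. The 400 ms interval is an assumed value, not one specified in the patent, and the detector class is purely illustrative.

```kotlin
// Detects two consecutive taps that arrive within a preset interval.
class DoubleTapDetector(private val maxIntervalMs: Long = 400) {
    private var lastTapTimeMs: Long? = null

    // Call on every tap; returns true when the tap completes a double tap,
    // in which case the hidden function of the displayed page should run.
    fun onTap(nowMs: Long): Boolean {
        val last = lastTapTimeMs
        val isDoubleTap = last != null && nowMs - last <= maxIntervalMs
        lastTapTimeMs = if (isDoubleTap) null else nowMs
        return isDoubleTap
    }
}

fun main() {
    val detector = DoubleTapDetector()
    println(detector.onTap(nowMs = 0))    // false: first tap of a sequence
    println(detector.onTap(nowMs = 300))  // true: second tap within 400 ms
    println(detector.onTap(nowMs = 1000)) // false: too late, starts a new sequence
}
```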

6 to 9 show examples of executing the hidden function.

Referring to FIG. 6, as the page 6 of the home screen is touched twice consecutively within a preset time, as shown in FIG. 6(a), the controller 180 receives a request to execute the hidden function. Accordingly, the controller 180 executes the schedule information display function, which is the hidden function corresponding to the page 6, and displays today's schedule information 6a in a preview form, as shown in FIG. 6(b). Here, the schedule information display function may have been set as the hidden function of the currently displayed page 6 by a user input, or it may have been selected by the controller 180 in association with the calendar widget because the calendar widget is included in the currently displayed page 6.

Referring to FIG. 7, as the page 7 of the home screen is touched twice consecutively within a preset time, as shown in FIG. 7(a), the controller 180 receives a request to execute the hidden function. Accordingly, the controller 180 executes the battery information display function, which is the hidden function corresponding to the page 7, and displays the battery information 7a in a preview form, as shown in FIG. 7(b). Here, the battery information display function may have been set as the hidden function of the currently displayed page 7 by a user input, or it may have been selected by the controller 180 in association with the power management widget because the power management widget is included in the currently displayed page 7.

Referring to FIG. 8, as the page 8 of the home screen is touched twice consecutively within a preset time, as shown in FIG. 8(a), the controller 180 receives a request to execute the hidden function. Accordingly, the controller 180 executes the memory usage status display function, which is the hidden function corresponding to the page 8, and displays the memory usage status 8a in a preview form, as shown in FIG. 8(b). Here, the memory usage status display function may have been set as the hidden function of the currently displayed page 8 by a user input, or it may have been selected by the controller 180 in association with the task manager widget because the task manager widget is included in the currently displayed page 8.

Referring to FIG. 9, as the page 9 of the home screen is touched twice consecutively within a preset time, as shown in FIG. 9(a), the controller 180 receives a request to execute the hidden function. Since there is no hidden function corresponding to the current page 9, the controller 180 executes the data usage and remaining amount display function set as the default hidden function, and displays the data usage and remaining amount information 9a in a preview form, as shown in FIG. 9(b). Meanwhile, FIG. 9 describes, as an example, the case where a default function is executed as the hidden function when no hidden function corresponds to the currently displayed page, but the present invention is not limited thereto. According to the present invention, if execution of a hidden function is requested while no hidden function corresponds to the currently displayed page, the controller 180 may execute no function, or may display guide information indicating that no hidden function is set.

Meanwhile, although FIGS. 6 to 9 illustrate two consecutive touch inputs within a preset time as the touch gesture for requesting the hidden function, the present invention is not limited thereto.

Hereinafter, a method of controlling a mobile terminal and an operation of a mobile terminal for implementing the same according to a second embodiment of the present invention will be described in detail with reference to the accompanying drawings.

10 is a flowchart illustrating a control method of a mobile terminal according to a second embodiment of the present invention. 11 and 12 are diagrams for describing the control method of FIG. 10.

Referring to FIG. 10, the controller 180 enters a home screen edit mode as a specific control input is received while a home screen is displayed (S201). For example, the home screen edit mode is entered as a pinch-in input to the home screen is received.

As the home screen edit mode is entered, the controller 180 displays each page using thumbnails of the plurality of pages constituting the home screen, as shown in FIG. 3.

Thereafter, the controller 180 selects one page from among the plurality of pages displayed on the home screen edit screen based on a user control input, and enlarges and displays the selected page (S202). Meanwhile, when a specific page is enlarged, the controller 180 enters an item editing mode in which an item disposed on the enlarged page can be moved, deleted, or resized.

In addition, one item is selected from the enlarged page based on a user control input (S203). When the selected item is dragged to another page (S204), the item is moved to and arranged on the page to which it was dragged (S205).

11 and 12 illustrate an example of rearranging items in a home screen editing mode.

Referring to FIG. 11, as the controller 180 enters the home screen editing mode, a home screen edit screen 10, in which a plurality of pages constituting the home screen are collected on one screen, is displayed on the touch screen 151, as illustrated in FIG. 11(a). The controller 180 displays the home screen edit screen 10 using thumbnails of each of the plurality of pages constituting the home screen.

Meanwhile, the home screen edit screen 10 includes an icon 10a corresponding to the page enlargement function, as shown in FIG. 11(a). As the icon 10a corresponding to the page enlargement function is dragged onto a specific page 10b, as shown in FIG. 11(a), the controller 180 enlarges and displays the corresponding page 10b, as shown in FIG. 11(b). As the page 10b is enlarged, the controller 180 automatically enters an item editing mode in which the items arranged on the page 10b can be moved or deleted.

Referring to FIG. 12, in a state where the specific page 10b is enlarged, as a specific item 11a disposed on the page 10b is dragged to another page 10c, as illustrated in FIG. 12(a), the controller 180 receives a request to move the selected item 11a. Accordingly, the controller 180 moves the item from the enlarged page 10b to the page 10c to which the selected item 11a was dragged and arranges it there, as shown in FIG. 12(b).

Meanwhile, although FIG. 11 illustrates, as an example, a case in which the icon corresponding to the enlargement function is dragged onto a page in order to enlarge that page in the home screen editing mode, the present invention is not limited thereto. According to the present invention, when the icon corresponding to the enlargement function is selected in the home screen edit mode and a specific page of the home screen is then touched, the controller 180 may enlarge and display the touched page.

Referring back to FIG. 10, in step S204, when the selected item is dragged to a blank page or to a blank area corresponding to a new page creation function, a new page may be automatically created and the moved item may be placed on the newly created page, as sketched below.
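
The following is a small sketch of that drop handling, assuming a list of pages in which dropping an item onto a designated blank slot first appends a new page and then places the item there. All names and the sample items are illustrative, not taken from the patent.

```kotlin
// Each home screen page simply holds the names of the items arranged on it.
class HomeScreenPages {
    private val pages = mutableListOf(mutableListOf("Clock widget"), mutableListOf("Mail icon"))

    // Index used for the blank slot that represents "create a new page here".
    val blankSlotIndex: Int get() = pages.size

    // Drop an item from sourcePage onto targetPage; dropping on the blank slot
    // automatically creates a new page first, then the item is moved there.
    fun moveItem(item: String, sourcePage: Int, targetPage: Int) {
        val destination = if (targetPage == blankSlotIndex) {
            mutableListOf<String>().also { pages.add(it) }   // auto-create new page
        } else {
            pages[targetPage]
        }
        pages[sourcePage].remove(item)
        destination.add(item)
    }

    fun dump() = pages.forEachIndexed { i, p -> println("Page $i: $p") }
}

fun main() {
    val home = HomeScreenPages()
    home.moveItem("Mail icon", sourcePage = 1, targetPage = home.blankSlotIndex)
    home.dump()   // "Mail icon" now sits on a newly created page 2
}
```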

According to the second embodiment of the present invention described above, the user can easily change the arrangement of items using the page enlargement function in the home screen editing mode, in which the plurality of pages constituting the home screen are displayed on one screen. That is, in the home screen editing mode, since each page is displayed as a thumbnail, the items arranged on each page appear so small that they are difficult to select or move; the page enlargement function solves this problem.

Hereinafter, a method of controlling a mobile terminal according to a third embodiment of the present invention and an operation of the mobile terminal for implementing the same will be described in detail with reference to the accompanying drawings.

13 is a flowchart illustrating a control method of a mobile terminal according to a third embodiment of the present invention. 14 and 15 are diagrams for describing the control method of FIG. 13.

Referring to FIG. 13, the controller 180 enters a menu screen editing mode as a specific control input is received while a menu screen is displayed (S301). For example, when a pinch-in input is received while the menu screen is displayed, the menu screen edit mode is entered.

As the menu screen edit mode is entered, the controller 180 displays each page by using thumbnails of the plurality of pages constituting the menu screen.

Thereafter, the controller 180 selects any one page from among the plurality of pages displayed on the edit screen based on a user control input, and enlarges and displays the selected page (S302). Meanwhile, when a specific page is enlarged, the controller 180 enters an item editing mode in which the items arranged on the enlarged page can be moved.

In addition, any one item is selected from the enlarged page based on the user control input (S303). When any one item is selected, the controller 180 displays the home screen on the touch screen 151 instead of the menu screen (S304), and displays the selected item on the home screen.

Thereafter, the selected item is placed on the home screen based on the point at which the touch on the item is released (S305). That is, the item selected on the menu screen moves over the home screen while touched, and the controller 180 then places the item based on its position at the moment the touch on the selected item is released.
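
The release-point placement can be pictured as snapping the release coordinates to a cell of the home screen grid, as in this sketch. The 4x5 grid and the pixel dimensions are assumed values used only for illustration.

```kotlin
data class GridCell(val column: Int, val row: Int)

// Map the coordinates at which the touch was released to a home screen grid cell.
// Screen and grid dimensions here are assumptions, not values from the patent.
fun cellForRelease(
    x: Float, y: Float,
    screenWidth: Float = 1080f, screenHeight: Float = 1800f,
    columns: Int = 4, rows: Int = 5
): GridCell {
    val column = (x / (screenWidth / columns)).toInt().coerceIn(0, columns - 1)
    val row = (y / (screenHeight / rows)).toInt().coerceIn(0, rows - 1)
    return GridCell(column, row)
}

fun main() {
    // An item released near the lower-right corner lands in the last cell.
    println(cellForRelease(x = 1000f, y = 1700f))  // GridCell(column=3, row=4)
}
```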

14 and 15 illustrate an example of rearranging items in a home screen editing mode.

Referring to FIG. 14, as the controller 180 enters the menu screen edit mode, an edit screen 14, in which a plurality of pages constituting the menu screen are collected on one screen, is displayed on the touch screen 151, as shown in FIG. 14(a). The controller 180 displays the edit screen 14 using thumbnails of the plurality of pages constituting the menu screen.

Meanwhile, the edit screen 14 includes an icon 14a corresponding to the page enlargement function, as shown in FIG. 14(a). As the icon 14a corresponding to the page enlargement function is dragged onto a specific page 14b constituting the menu screen, as shown in FIG. 14(a), the controller 180 enlarges and displays the page 14b, as shown in FIG. 14(b).

Referring to FIG. 15, in a state where the specific page 14b of the menu screen is enlarged, as a specific item 15a disposed on the page 14b is selected, as illustrated in FIG. 15(a), the controller 180 displays the home screen 15b on the touch screen 151, as shown in FIG. 15(b). In addition, the item 15a selected on the menu screen is moved onto and displayed on the home screen 15b. Thereafter, when the touch on the selected item 15a is released, the controller 180 places the item 15a on the home screen 15b.

Meanwhile, in FIG. 14, a case in which an icon corresponding to an enlargement function is dragged to a corresponding page in order to enlarge a specific page in the menu screen editing mode is described as an example, but the present invention is not limited thereto. According to the present invention, when the icon corresponding to the enlargement function is selected in the menu screen editing mode, and then a specific page is touched, the controller 180 may enlarge and display the touched page.

According to the third embodiment of the present invention described above, the user does not need to switch through each page of the menu screen to select an item to be placed on the home screen; using the page enlargement function while the menu screen is collected on one screen, the user can easily move a desired item to the home screen.

Hereinafter, a method of controlling a mobile terminal and an operation of the mobile terminal for implementing the same according to the fourth embodiment of the present invention will be described in detail with reference to the accompanying drawings.

16 is a flowchart illustrating a control method of a mobile terminal according to a fourth embodiment of the present invention. 17 to 19 are diagrams for describing the control method of FIG. 16.

Referring to FIG. 16, the controller 180 sets a multitasking function on a hidden page of the home screen (S401). Here, a hidden page means a page whose display is deactivated in the home screen display mode and activated only when the home screen edit mode is entered.

The controller 180 enters the home screen editing mode as a specific control input is received on the home screen (S402). For example, the home screen edit mode is entered as a pinch-in input to the home screen is received.

As the home screen edit mode is entered, the controller 180 displays each page by using thumbnails of each of the plurality of pages constituting the home screen. It also displays at least one hidden page.

Thereafter, when one hidden page is selected (S403), the controller 180 executes a multitasking function corresponding to the selected page (S404). That is, a plurality of applications set corresponding to the selected hidden page are executed by multitasking.
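
As a hedged illustration of steps S403 and S404, the sketch below models a hidden page as a list of application entries with priorities and "launches" them in priority order when the page is selected. The launch callback is a stub rather than a real platform API, and all type names are assumptions.

```kotlin
data class AppEntry(val name: String, val priority: Int)

class HiddenPage(val id: String, private val apps: List<AppEntry>) {
    // Corresponds to step S404: when this hidden page is selected in the home
    // screen edit mode, start every associated application, highest priority first.
    fun executeMultitasking(launch: (String) -> Unit) {
        apps.sortedBy { it.priority }.forEach { launch(it.name) }
    }
}

fun main() {
    val hp2 = HiddenPage(
        id = "HP2",
        apps = listOf(
            AppEntry("KakaoTalk", priority = 1),
            AppEntry("Twitter", priority = 2),
            AppEntry("Melon", priority = 3),
            AppEntry("Naver", priority = 4)
        )
    )
    // A stub launcher; a real terminal would hand these off to the platform.
    hp2.executeMultitasking { app -> println("Launching $app") }
}
```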

17 and 18 illustrate an example of setting a multitasking function on a hidden page.

Referring to FIG. 17, as the controller 180 enters the home screen edit mode, the controller 180 displays a home screen edit screen 17, in which a plurality of pages constituting the home screen are collected on one screen, on the touch screen 151. The controller 180 displays the home screen edit screen 17 using thumbnails of each of the plurality of pages constituting the home screen.

In addition, when entering the home screen editing mode, the controller 180 includes, on the home screen edit screen 17, the hidden pages HP1 and HP2 which constitute the home screen but are not displayed in the home screen display mode. A multitasking function may be set for each of the hidden pages HP1 and HP2. If any one hidden page is selected in the home screen editing mode, the controller 180 may execute the applications corresponding to the selected page by multitasking. Meanwhile, the controller 180 displays icons 17a representing the multitasking applications on each of the hidden pages HP1 and HP2, so that the user can intuitively identify which applications are set to be multitasked on which hidden page.
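To make the composition of the edit screen 17 concrete, here is a minimal sketch, under assumed names, of how regular page thumbnails and hidden pages annotated with the icons 17a of their multitasking applications might be gathered into one edit-screen model; the disclosure does not prescribe this data structure.

```kotlin
// Hypothetical view model for the home screen edit screen of FIG. 17:
// regular page thumbnails plus hidden pages annotated with app icons.

data class PageThumbnail(val label: String, val isHidden: Boolean, val appIcons: List<String>)

fun buildEditScreen(
    regularPages: List<String>,
    hiddenPages: Map<String, List<String>>  // hidden page id -> names of its multitasking apps
): List<PageThumbnail> =
    regularPages.map { PageThumbnail(it, isHidden = false, appIcons = emptyList()) } +
        hiddenPages.map { (id, apps) -> PageThumbnail(id, isHidden = true, appIcons = apps) }

fun main() {
    val editScreen = buildEditScreen(
        regularPages = listOf("Page 1", "Page 2", "Page 3"),
        hiddenPages = mapOf("HP1" to listOf("Mail", "Calendar"), "HP2" to listOf("Chat", "Music"))
    )
    // The icons let the user see at a glance which apps each hidden page will launch.
    editScreen.filter { it.isHidden }.forEach { println("${it.label}: ${it.appIcons}") }
}
```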

Referring to FIG. 18, when one hidden page HP2 is touched on the home screen edit screen 17 for more than a preset time, as shown in (a) of FIG. 18, the controller 180 displays, on the touch screen 151, a menu screen 18 for selecting applications to be executed by multitasking, as shown in (b) of FIG. 18. When the applications 18a to 18d are selected on the menu screen 18, the controller 180 sets the multitasking function of the applications 18a to 18d to correspond to the selected hidden page HP2. In addition, as shown in (c) of FIG. 18, the icons of the multitasking applications 18a to 18d are displayed on the hidden page HP2.

Meanwhile, when the multitasking applications are selected, the controller 180 may set a priority for each application, and the set priority may be displayed in combination with each application during the selection process, as shown in (b) of FIG. 18. The controller 180 may set the priority of each application based on the order in which the applications to be executed by multitasking are selected, or based on the characteristics of each selected application. In the former case, the controller 180 may assign a higher priority to an application that is selected earlier.
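The selection-order priority rule described above could be expressed as in the sketch below; the alternative ranking by application characteristics is shown only with an assumed criterion (a foreground-heavy flag), since the disclosure does not fix which characteristics are used.

```kotlin
// Hypothetical priority assignment for the selected multitasking applications.

data class App(val name: String, val isForegroundHeavy: Boolean = false)

// Former case: priority follows the order of selection (1 = highest priority).
fun priorityBySelectionOrder(selected: List<App>): Map<App, Int> =
    selected.mapIndexed { index, app -> app to index + 1 }.toMap()

// Latter case (assumed criterion): apps flagged as foreground-heavy rank higher.
fun priorityByCharacteristics(selected: List<App>): Map<App, Int> =
    selected.sortedByDescending { it.isForegroundHeavy }
        .mapIndexed { index, app -> app to index + 1 }
        .toMap()

fun main() {
    val selected = listOf(App("Chat"), App("Browser"), App("Music", isForegroundHeavy = true))
    println(priorityBySelectionOrder(selected))  // Chat=1, Browser=2, Music=3
    println(priorityByCharacteristics(selected)) // Music=1, then the others
}
```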

Meanwhile, in FIG. 18, in order to distinguish the touch gesture for executing a multitasking function by selecting a hidden page from the touch gesture for setting a multitasking function for the hidden page, the touch gesture for setting the multitasking function has been described as a long touch gesture in which the touch is maintained for more than a preset time, but the present invention is not limited thereto.

FIG. 19 illustrates an example of executing a multitasking function using a hidden page.

Referring to FIG. 19, when one hidden page HP2 is touched on the home screen edit screen 17, the controller 180 receives a request for executing the multitasking function corresponding to the touched page HP2. Accordingly, the controller 180 executes, by multitasking, the applications set for the multitasking function on the selected hidden page HP2, such as KakaoTalk, Twitter, Melon, and Naver.

According to the fourth embodiment of the present invention described above, the mobile terminal sets a multitasking function on a specific page, so that several applications can be executed at once when the corresponding page is selected.

Hereinafter, a method of controlling a mobile terminal and an operation of a mobile terminal for implementing the same according to a fifth embodiment of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 20 is a flowchart illustrating a control method of a mobile terminal according to a fifth embodiment of the present invention. FIG. 21 is a diagram for explaining the control method of FIG. 20.

Referring to FIG. 20, the controller 180 executes a first activity and accordingly displays a first screen on the touch screen 151 (S501). Thereafter, the controller 180 executes a second activity based on a user control input, and accordingly switches the display from the first screen to a second screen (S502).

When a specific control input requesting a preview of the activity executed immediately before is received while the second screen is displayed (S503), the controller 180 reads, from the memory 160, the first screen, which is the execution screen of the first activity executed immediately before, and displays it on the second screen in a preview form (S504). That is, the first screen is reduced and overlaid on a portion of the second screen in the form of a pop-up window.

Thereafter, when the second screen is touched in a state where the preview image of the first screen is displayed (S505), the controller 180 ends the preview display of the first screen (S506).
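A compact sketch of steps S501 to S506 follows, under assumed names: the screen of the previously executed activity is retained (standing in for the memory 160) and overlaid as a popup when a preview is requested, then dismissed when the current screen is touched. Nothing here is mandated by the disclosure.

```kotlin
// Hypothetical model of the fifth embodiment (S501-S506): retaining the previous
// activity's screen and showing it as a popup preview over the current screen.

data class Screen(val name: String)

class ActivityController {
    private var previousScreen: Screen? = null  // stands in for the memory 160
    private var currentScreen: Screen? = null
    var previewVisible = false
        private set

    // S501 / S502: executing a new activity replaces the current screen and
    // remembers the screen that was shown immediately before.
    fun showScreen(screen: Screen) {
        previousScreen = currentScreen
        currentScreen = screen
        previewVisible = false
    }

    // S503 / S504: a preview request (e.g. a double tap) overlays the previous screen.
    fun onPreviewRequested() {
        val previous = previousScreen ?: return
        previewVisible = true
        println("Showing ${previous.name} as a popup over ${currentScreen?.name}")
    }

    // S505 / S506: touching the current screen dismisses the preview.
    fun onCurrentScreenTouched() {
        if (previewVisible) {
            previewVisible = false
            println("Preview closed; back to ${currentScreen?.name}")
        }
    }
}

fun main() {
    val controller = ActivityController()
    controller.showScreen(Screen("first web page 21a"))
    controller.showScreen(Screen("second web page 21b"))
    controller.onPreviewRequested()
    controller.onCurrentScreenTouched()
}
```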

FIG. 21 illustrates an example of displaying a previous activity in a preview form.

Referring to FIG. 21, in a state where the first web page 21a is displayed as shown in (a) of FIG. 21, the controller 180 receives a user input requesting page switching and switches to the second web page 21b as shown in (b) of FIG. 21. Thereafter, when the touch screen 151 is touched twice in succession within a preset time while the second web page 21b is displayed, the controller 180 receives a previous activity preview request and accordingly displays the first web page 21a, which was displayed immediately before, on the second web page 21b in the form of a pop-up window. Thereafter, when the second web page 21b is touched, the controller 180 ends the display of the first web page 21a and returns to the original state.

According to the fifth embodiment of the present invention described above, the mobile terminal provides the user with preview information of the activity executed immediately before by using a simple touch gesture. Accordingly, when the user wants to check a previously performed action, such as a web page visited before, the previous activity can be checked by a simple touch operation while the current activity remains active, without having to terminate the current activity and execute the previous activity again.

The above-described method of controlling a mobile terminal according to the present invention can be provided by being recorded in a computer-readable recording medium as a program for execution in a computer.

The control method of the mobile terminal according to the present invention can be executed through software. When executed in software, the constituent elements of the present invention are code segments that perform the necessary tasks. The program or code segments may be stored in a processor-readable medium or transmitted by a computer data signal combined with a carrier wave over a transmission medium or communication network.

The computer-readable recording medium includes all types of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable recording device include ROM, RAM, CD-ROM, DVD-ROM, DVD-RAM, magnetic tape, floppy disks, hard disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. The present invention is not limited to the drawings, and all or some of the embodiments may be selectively combined so that various modifications may be made.

Claims (7)

A memory configured to store a hidden function which corresponds to each of at least one page constituting a home screen and which is not displayed on the screen;
A touch screen; and
A controller that enters a hidden function setting mode for the home screen when a specific user input is received in a home screen editing mode; controls, when any one function is selected in the hidden function setting mode, the selected function to be stored as a hidden function corresponding to any one of the at least one page; and executes the hidden function corresponding to a displayed page when a touch gesture requesting execution of the hidden function is input while one page constituting the home screen is displayed,
the mobile terminal comprising the memory, the touch screen, and the controller.
The mobile terminal of claim 1,
wherein the controller, when the home screen editing mode is entered, displays a thumbnail of each of the at least one page and displays, in combination with the thumbnail of each page, an icon representing the hidden function corresponding to that page.
The mobile terminal of claim 1,
wherein the controller executes a hidden function set as a default when a touch gesture requesting execution of a hidden function is input while a page, among the at least one page, to which no hidden function corresponds is displayed.
The mobile terminal of claim 1,
wherein the hidden function includes a function of executing a plurality of applications by multitasking.
The mobile terminal of claim 4,
wherein the controller displays an application list on the touch screen when the hidden function setting mode is entered, and sets a multitasking function of a plurality of applications selected from the application list as the hidden function.
The mobile terminal of claim 5,
wherein the controller sets the priority of the plurality of selected applications based on the order in which the applications are selected from the application list.
Entering a home screen editing mode;
Entering a hidden function setting mode based on a user input received in the home screen editing mode;
If any one function is selected in the hidden function setting mode, storing the selected function as a hidden function corresponding to one of at least one page constituting the home screen;
Displaying any one of at least one page constituting the home screen; and
Executing a hidden function corresponding to the displayed page when a touch gesture for requesting execution of the hidden function is input to the displayed page;
A method of controlling a mobile terminal, comprising the steps above.

KR1020120110372A 2012-10-05 2012-10-05 Mobile terminal and controlling method of mobile terminal KR102046462B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120110372A KR102046462B1 (en) 2012-10-05 2012-10-05 Mobile terminal and controlling method of mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120110372A KR102046462B1 (en) 2012-10-05 2012-10-05 Mobile terminal and controlling method of mobile terminal

Publications (2)

Publication Number Publication Date
KR20140044433A true KR20140044433A (en) 2014-04-15
KR102046462B1 KR102046462B1 (en) 2019-11-19

Family

ID=50652390

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120110372A KR102046462B1 (en) 2012-10-05 2012-10-05 Mobile terminal and controlling method of mobile terminal

Country Status (1)

Country Link
KR (1) KR102046462B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022177298A1 (en) * 2021-02-16 2022-08-25 장경호 Program execution control system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120070666A (en) * 2010-12-22 2012-07-02 엘지전자 주식회사 Electronic device and control method for electronic device

Also Published As

Publication number Publication date
KR102046462B1 (en) 2019-11-19

Similar Documents

Publication Publication Date Title
EP2667293B1 (en) Mobile terminal and control method thereof
US9928028B2 (en) Mobile terminal with voice recognition mode for multitasking and control method thereof
KR101867513B1 (en) Mobile terminal and control method thereof
KR102044460B1 (en) Mobile terminal and method for controlling of the same
KR20110123348A (en) Mobile terminal and method for controlling thereof
KR20130029475A (en) Mobile terminal and control method therof
KR20140045060A (en) Mobile terminal and method for controlling thereof
KR101906415B1 (en) Mobile terminal and control method for mobile terminal
KR20130080102A (en) Mobile terminal and control method therof
KR101842198B1 (en) Mobile terminal and method for controlling thereof
KR20140092694A (en) Method for muitiple selection using multi touch and the terminal thereof
US20160196058A1 (en) Mobile terminal and control method thereof
KR101904940B1 (en) Mobile terminal and method for controlling thereof
KR20130006777A (en) Mobile terminal and control method for mobile terminal
KR102046462B1 (en) Mobile terminal and controlling method of mobile terminal
KR20150064523A (en) Electronic device and control method thereof
KR20130082200A (en) Mobile terminal and control method for mobile terminal
KR20140118061A (en) Terminal and method for controlling the same
KR20130089476A (en) Mobile terminal and control method for mobile terminal
KR101917074B1 (en) Mobile terminal and control method for mobile terminal
KR101632993B1 (en) Mobile terminal and message transmitting method for mobile terminal
KR101701840B1 (en) Mobile terminal and method for controlling thereof
KR20150048532A (en) Mobile terminal and control method for the same
KR101917073B1 (en) Mobile terminal and controlling method thereof
KR101818112B1 (en) Mobile terminal and method for moving object thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant